Thursday, September 24, 2009
Staving off the Cosmic Malthus
The Darwinian argument may be overcome if culture keeps evolving faster than genes and can thereby keep outpacing future genetic adaptations. (Richard Dawkins argues that we can overcome our selfish genes.) It may be rebutted that units of culture (what Dawkins calls "memes") are themselves Darwinian competitors and thus also face Malthusian limits, or that future computerized minds may reproduce very quickly and evolve as fast as culture. I won't elaborate on these arguments further here, as Robin has got me into a hyper-futuristic mood and I'd like to suggest another way in which we might achieve more "room at the bottom".
Hanson counts atoms in order to estimate the density of information (or of minds) that might be created. But just as Freeman Dyson, Gerard O'Neill, and others argued that planets are a waste of mass -- that technologically mature civilizations won't have planets -- I'll argue here that atoms are a waste of mass-energy, and that technologically mature civilizations may not have very many of them. Instead, information may be stored in photons, and collections of electrons and positrons (for example geonium) may handle most information processing and form the substrate of minds.
Given that a photon can come in a vast number of possibly distinguishable frequencies -- the spectrum spans more than 20 orders of magnitude -- we may be able to store at least 10^30 bits per photon. One approach to creating photons is simply to capture the energy of solar nuclear fusion as photons, as we already know how to do; this should give us about 10^95 bits' worth of photons with the average energy of blue light. But we'd have to either wait billions of years for all these fusion reactions to occur naturally in the sun or accelerate them somehow. More completely, the neutrons and protons in the sun, if converted into photons of that average energy, would give us 10^97 bits, and we might not have to wait billions of years if we can figure out how to bring about this hypothetical conversion. This is a fascinating but very speculative bit of physics which I will explore further below.
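As a rough sanity check on these orders of magnitude, here is a back-of-envelope sketch in Python. Every input is an assumption taken from the text (the ~10^57 nucleon count of the sun, the ~2.5 eV blue photon, the speculative 10^30 bits per photon, and fusion's ~0.7% mass-to-energy yield), and the outputs land within an order of magnitude or two of the 10^95 and 10^97 figures above:

```python
import math

# Back-of-envelope check of the photon-bit estimates above. All inputs are
# assumptions taken from the text, not measured values.
NUCLEONS_IN_SUN = 1e57            # rough count of the sun's protons + neutrons
NUCLEON_REST_ENERGY_EV = 0.94e9   # ~1 GeV per nucleon
BLUE_PHOTON_ENERGY_EV = 2.5       # "average energy of blue light"
BITS_PER_PHOTON = 1e30            # the speculative figure from the text
FUSION_FRACTION = 0.007           # fusion releases ~0.7% of rest mass as energy

def order(x):
    """Nearest order of magnitude."""
    return round(math.log10(x))

photons_full = NUCLEONS_IN_SUN * NUCLEON_REST_ENERGY_EV / BLUE_PHOTON_ENERGY_EV
print("full conversion: ~10^%d photons, ~10^%d bits"
      % (order(photons_full), order(photons_full * BITS_PER_PHOTON)))
print("fusion only:     ~10^%d bits"
      % order(photons_full * FUSION_FRACTION * BITS_PER_PHOTON))
```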
Of course, we will still need some electrons or positrons around to actually process that information and recycle photons. And we will still need some neutrons and protons around to fuse for energy, to make up for the waste heat emitted to the extent that geonium computations are less than perfectly reversible. Unless we are very clever and figure out how to make solid structures out of electrons and positrons that don't blow apart, we will also need some magnetic tanks made of traditional heavy atoms to hold the geonium. Worse, the strong tendency for baryon number to be conserved makes cracking protons difficult and perhaps impossible. Protons are made of three quarks, and while cracking quarks is quite possible (particles with two quarks but zero net baryon number decay spontaneously into particles with no quarks), the conservation of baryon number at the energy levels used by current particle accelerators suggests that cracking the proton, if we can figure out how to do it at all, may require vast amounts of energy, so that only a tiny fraction of the sun's neutrons and protons might be converted before we run out of energy from fusing the remaining nuclei. Right now we know how to crack the neutron, which decays into a proton, an electron, and an antineutrino, but we don't know how to crack the proton. For the conversion to be feasible, we will have to discover a way to "catalyze" proton decay, by analogy to how catalysts lower the activation energies of chemical reactions.
If feasible, converting wasteful atoms into more useful photons would give us many orders of magnitude more room at the bottom. Staving off Malthus then becomes a question of how much information can be stored in a photon, and of how quickly electrons or positrons can process those photons.
We still face Heisenberg uncertainty as a limit on how quickly these photonic memories can be recalled. The uncertainties in a photon's measured time of arrival and its measured energy (and thus in the number of distinguishable frequencies) have a fixed minimum product -- if we measure the time with greater precision, we can distinguish fewer frequencies, and vice versa. This sets a finite limit on the rate at which we can process the information stored in the photons. Seth Lloyd has calculated that 1 kilogram of mass converted into energy can perform at most about 10^50 operations per second. So going photonic can only stave off Malthus temporarily -- Malthus will still eventually catch up, assuming Darwinian competition in reproduction remains.
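For the curious, here is a minimal sketch of the arithmetic behind Lloyd's figure, using the Margolus-Levitin bound of at most 2E/(πħ) elementary operations per second for a system of average energy E; the one-kilogram mass is the only input:

```python
import math

# Margolus-Levitin bound: a system of average energy E can perform at most
# 2E / (pi * hbar) elementary operations per second. Lloyd applied this to
# one kilogram of mass fully converted to energy.
HBAR = 1.0546e-34      # reduced Planck constant, J*s
C = 2.998e8            # speed of light, m/s
MASS_KG = 1.0

energy_j = MASS_KG * C**2                      # E = m c^2
ops_per_second = 2 * energy_j / (math.pi * HBAR)
print("%.1e ops/s" % ops_per_second)           # ~5.4e50, i.e. roughly 10^50
```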
In addition to classical bits stored as photon frequencies, an exponentially greater number of quantum bits (qubits) might be stored in the entangled states of these photons. However, using some number of these qubits requires destroying an exponentially larger number of them. Thus, against exponential population growth, memory storage itself remains cheap, but recalling memories or thinking about things becomes exponentially expensive. Qubit minds might stave off Malthus by hibernating for exponentially longer periods of time, waking up only to observe an exponentially decreasing number of interesting events.
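A hedged illustration of the asymmetry (my own counting, not a calculation from the post): n two-level photons span a Hilbert space of dimension 2^n, yet by the Holevo bound a measurement on them yields at most n classical bits, so storage capacity grows exponentially faster than readout:

```python
import math

# Illustrative counting only: n two-level photons have 2^n entangled basis
# states, but at most n classical bits can be extracted per measurement
# (the Holevo bound).
for n in (10, 100, 1000):
    log10_states = n * math.log10(2)   # log10 of the 2^n distinct states
    print("n=%4d photons: ~10^%d distinct states, %d classical bits per readout"
          % (n, round(log10_states), n))
```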
My argument that we may figure out how to crack three-quark particles like neutrons and protons into photons relies on the probability, given the imbalance of protons and anti-protons (and neutrons and anti-neutrons) in the observable universe, that baryon number (a property of quarks) is not necessarily conserved. It is falsifiable in that sense: if, for example, we discover with better telescopes that the amount of antimatter in the universe is the same as the amount of matter, that will at least strongly suggest that even at Big Bang energies baryon number is conserved, rendering the possibility of ever converting the quarks which constitute most of the mass of neutrons and protons into non-quarkish things (like electrons, positrons, or photons) extremely unlikely. It's also somewhat imminently testable: if the LHC and similar colliders continue to fail to crack the proton, that further dims the prospects. Feasibility, however, is not so testable: one could argue that, even if baryon number were not conserved in the Big Bang, and even if we soon discover how to crack the proton in high-energy colliders, we may never figure out a method, analogous to catalysis in chemical reactions, to crack protons at economically low energies, or to productively recycle the energies used to perform the conversions rather than letting them disperse as waste heat.
(h/t: the phrase "Cosmic Malthus" to describe Hanson's theory is from commenter Norman at Robin's blog).
Wednesday, September 16, 2009
Nondeterminism and legal procedure
A deterministic process is one in which, for any state of the world -- a state being a theoretical description of everything that might change the future -- there is only one next state. The omniscient Laplace's demon could in principle predict everything about the future if the universe were deterministic. In a nondeterministic process, there can be more than one future state, and not even Laplace's demon can know for sure, and may not know at all, which one will happen. We can model simple processes as "state machines": in the present the process is in one state, in the next instant it may have transitioned to another state, and so on.
Here's a picture of a deterministic process -- one with only one possible future:
Here's a picture of a nondeterministic process:
If as in the picture above there are more than two possible future states, this can also be modeled as a sequence of binary events:
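For concreteness, here is a minimal Python sketch (my own illustration, not from the post) of the two transition structures, with random.choice standing in for genuine chance:

```python
import random

# Deterministic: each state has exactly one successor.
DETERMINISTIC = {"A": "B", "B": "C", "C": "C"}

# Nondeterministic: a state may have several possible successors, and not
# even Laplace's demon knows which will occur.
NONDETERMINISTIC = {"A": ["B", "C"], "B": ["A", "C"], "C": ["C"]}

def run_deterministic(state, steps):
    for _ in range(steps):
        state = DETERMINISTIC[state]
    return state    # the same on every run

def run_nondeterministic(state, steps):
    for _ in range(steps):
        # each multi-way branch can also be decomposed into binary events
        state = random.choice(NONDETERMINISTIC[state])
    return state    # may differ from run to run

print(run_deterministic("A", 5))     # always "C"
print(run_nondeterministic("A", 5))  # "A", "B", or "C", unpredictably
```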
An event can be natural or a human act. If it's a human act, the decision to act often can or should be based on good estimates of which state(s) the world is or was in. In legal procedure, for example, an arrest should generally only be made based on an estimate that the person arrested in fact committed a specific crime.
If causally related nondeterministic processes repeat themselves often enough, we can develop a probabilistic model of them. Physicists have developed probability density models for very small-scale phenomena in quantum mechanics, for example.
Practical nondeterminism stems from at least four sources: (1) some of the physical world is inherently nondeterministic, (2) even where deterministic in principle, the configuration of physical states in the world is vastly more complex than we can completely describe -- nobody and no computer comes anywhere close to resembling Laplace's demon, (3) people are far, far more complex than we can mutually comprehend -- especially if you get more than a Dunbar number of us together, and (4) the words and phrases we communicate with are often ambiguous.
Most of the nondeterminism in legal procedure stems from questions of who did what, when, and where, and of the legal consequences that should ensue based on codes and judicial opinions written in ambiguous language. Law summarizes this uncertainty with a number of qualitative probabilities often called "burdens of proof". The following list is roughly, but not strictly (the burdens come from different legal domains and are not necessarily comparable), in order from lesser to greater burden of proof:
- Colorability
- Air of Reality
- Reasonable suspicion
- Prima facie case
- Probable cause
- Preponderance of the evidence
- Clear and convincing evidence
- Beyond reasonable doubt
- (To reverse a jury verdict) No reasonable jury could have reached the verdict
These label the probabilities -- not in the sense of numbers between 0 and 1, but in the sense of kinds of evidence and degrees of convincing argument -- required for various decisions of legal procedure to be made: for a search warrant to issue, for an arrest to be made, for property to be confiscated, for a legal motion to be accepted or denied, for liability to accrue in a civil trial, for a sentence of guilty in a criminal trial, for decisions about jurisdiction, and so on.
It's useful to look at these, not merely as classical probabilities, but in the style of quantum physics, as a superposition of states. When a nondeterministic event -- or a deterministic event for which we lack important knowledge -- has happened in the past, we can treat it as a superposition of all the possible events that might have happened. When a person or persons undertake a procedural act -- arrest a person, issue a verdict, and so on -- under law they should be doing so based on a judgment, to some degree of certainty, that one particular set of facts occurred that justifies the act. We can thus see a criminal defendant, for example, as in the state "guilty or not guilty" until a jury "collapses" this probability distribution to a verdict (which collapse, however, unlike quantum mechanics, can sometimes be reversed by an appeals court if deemed erroneous). A suspect is in the state "beyond reasonable suspicion" or "not beyond reasonable suspicion" until a police officer acts in a way that requires reasonable suspicion, for example by pulling over your car on the highway. In principle, at least, this decision too should be reversible (for example, if the officer pulled over your car without reasonable suspicion and noticed an open beer bottle, that evidence could be thrown out of court based on the lack of reasonable suspicion in the original stop).
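Here is a toy model of that procedural "collapse", purely my own illustration of the analogy: the defendant occupies a set of possible states, the jury collapses it to one, and an appeals court can sometimes reopen it:

```python
# A toy model of procedural "collapse": before the verdict the defendant
# is in a superposition of possible states; a jury collapses it to one;
# unlike quantum mechanics, an appeals court can sometimes undo the collapse.

class Defendant:
    def __init__(self):
        self.states = {"guilty", "not guilty"}    # superposition

    def jury_verdict(self, verdict):
        assert verdict in self.states
        self.states = {verdict}                   # "collapse" to one outcome

    def appellate_reversal(self):
        # burden: "no reasonable jury could have reached the verdict"
        self.states = {"guilty", "not guilty"}    # question reopened

d = Defendant()
print(d.states)            # {'guilty', 'not guilty'}
d.jury_verdict("guilty")
print(d.states)            # {'guilty'}
d.appellate_reversal()
print(d.states)            # both possibilities again
```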
Legal procedure needs to control nondeterminism so that people can operate in an environment of reasonably reliable legal rights. Think, for example, about how inefficient the economy would be if most property were in imminent danger of being taken from one owner and given to another due to frequent decisions reversing old court cases, or how full of worry our lives would be if we could be taken off the street and put in jail at random. Thus there is, for example, a strong presumption in English-law countries that a jury's decision is final, and this is effected by setting the burden of proof for reversing the decision high -- "no reasonable jury could have reached the verdict" -- a burden in a criminal case much higher than the jury's own "beyond reasonable doubt."
Saturday, September 05, 2009
The Coase Theorem in action
A nice illustration of the major flaw I have described in the Coase Theorem. Much of what is valuable in the above link I actually wrote in the comments, and I will now foreground it with some minor edits:
A music store is next door to a doctor's office. The music store would prefer (if the office of a rich doctor who wants quiet for his patients did not exist next door) to let its customers test its electric guitars at volume VM1 > 0. The doctor prefers it to be quieter (VD < VM1). Coase theory assumes that the only possible choices lie within the range [VD, VM1], i.e. any volume of electric guitar testing between or including these two preferences. In the absence of transaction costs, and given only this range, one can indeed conclude that the music store and the doctor will bargain to an efficient outcome. But these aren't the only choices. The music store can, at additional cost to itself C, turn up the volume knobs on its amplifiers and play the music at volume VM2 > VM1. If the doctor is willing to pay the music store P1 to change the volume from VM1 to VD, and P2 > P1 + C to turn the volume down from VM2 to VD, the music store has an incentive to play the music at volume VM2 instead of VM1, or to threaten to do so, in order to extract for itself a greater benefit from the situation.
In other words, the same physical effect that produced the externality gives rise to an opportunity and incentive to play a negative-sum game. Here it changes the music store's preferred volume, in the absence of a rich doctor next door, from VM1 to VM2 > VM1, due to the opportunity to extort extra payments from the doctor by creating an even less bearable din, which the doctor is willing to pay even more to avoid. The music store is willing to incur an extra cost C to itself in order to extract the greater payment P2 from the doctor. For the overall game the payment P2 is a wash, and C makes it negative-sum. (In the music store example, the cost C comes from the music store chasing away some of its own customers, albeit at a slower rate than it chases away the doctor's customers, by testing its guitars more noisily than it would prefer in the absence of the doctor.)
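A minimal payoff sketch of this game, with made-up numbers (only the inequality P2 > P1 + C comes from the example above), shows why the threat pays and why society is worse off by exactly C:

```python
# Payoffs in the music store / doctor game. The numbers are illustrative
# assumptions; only the inequality P2 > P1 + C is taken from the text.
P1 = 100   # doctor's payment for the store to go from VM1 down to VD
P2 = 180   # doctor's payment for the store to go from VM2 down to VD
C = 30     # store's own cost of running at VM2 (its own lost customers)

gain_honest = P1        # store plays its true preference VM1, then bargains
gain_extort = P2 - C    # store turns up to VM2 (or threatens to), then bargains

print("extortion pays:", gain_extort > gain_honest)   # True, since P2 > P1 + C
# The payment is a transfer between the parties and cancels out;
# the cost C does not, so the game is negative-sum by exactly C.
print("net social loss:", C)
```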
If, as in reality, there are transaction costs that sometimes prevent bargains from being reached, the outcome is even worse, since noise VM2 is costlier, perhaps far costlier, to the doctor's practice than VM1: such outcomes are often far worse than the range of possible outcomes that Coaseians contemplate.
Of course, more generally, in the absence of proper prior legal allocation of rights, the doctor and the music store could threaten each other in other ways: the doctor could threaten to poison the guitar frets, the music store could call in the mob on the doctor, and so on.
(Furthermore, even with tort law preventing these other negative-sum games, the music store has an incentive to falsely "reveal" preference VM2 instead of VM1 to the doctor and to the judge -- a common problem that good tort law usually, though hardly perfectly, tackles.)
The example of the music store and its amplifier volume shows that the externality itself contains potential or actual coercion: the same physical effect that causes the externality often makes negative-sum games possible, and in the absence of prior legal limits on the externality, the opportunities and incentives for those coercive games are inherent in it. Analyses of such externalities with the Coase Theorem, which assumes such games don't exist, will therefore often lead to misleading or false conclusions.
The game being played here by the music store is negative-sum for the same reasons a tax is: firstly, because the music store's coercion distorts the behavior of the doctor and his patients. Assuming the doctor is helpless to stop the noise without making the payoff (e.g. we artificially assume he can't order a mob hit on the music store, or poison its customers, or emit any other such "extreme" externality to avenge or deter the music store's excess externality), he will go golfing more, and see fewer patients, if he is paying P2 to the store instead of P1. Fewer patients will be healed: a net loss of welfare. Since we assume the music store is rational, it will demand only the Laffer-maximum amount of extortion, but Laffer-maximum taxes still have plenty of distortive effects that produce inefficiencies compared to the no-taxation case. Secondly, the behavior of the music store is also distorted, because it has excess profits to spend. It will invest its extra money in opening new music stores and concert halls next to other doctors' offices, nursing homes, and the like, because that is a lucrative source of profit, and so other activities that prefer quiet will be distorted in turn. It is often unreasonable to assume that Coaseian payees are spending their extra money efficiently. Interestingly, Gary Becker assumed that the Coaseian payor's behavior was not distorted and that the Coaseian payee was spending its extra profits efficiently, and used this Coaseian reasoning to argue that governments themselves are efficient outcomes of Coaseian bargaining. Becker's argument is wrong for the same reason that [anarcho-capitalist David] Friedman's [Coaseian] argument is wrong for legal protection agencies: it doesn't account for the economic distortions caused by coercion.
To see where these negative-sum games lead, let's take the case of roving loudspeakers. Pickup trucks drive through the city, parking in front of every business in turn and demanding large payments to take their noise elsewhere. The optimal extortion for the extortors in this case is nearly 100% of all business wealth in the city (again assuming the victims are defenseless), because if extortor A doesn't extort any remaining wealth, extortor B will be happy to come in and take it. The economy is so distorted that practically nothing gets produced or distributed, and the city's economy collapses. This is the "roving bandit" case studied by Mancur Olson. Two stores next to each other, neither of which can move, constitute "stationary bandits", as do gangs or governments with "monopolies of coercion" over fixed territories. As the roving loudspeakers case illustrates, rational stationary bandits collect a far lower percentage of their victims' profits in taxes than do roving bandits. (But stationary bandits, at their much lower rate, end up collecting a far higher absolute amount -- recall the Laffer curve.) If on the other hand we assume the victims are not defenseless, we have negative-sum games like hawk/dove, negative tit-for-tat, etc., which again are paradigmatically very different from voluntary Coaseian bargains.
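To illustrate the Laffer-curve point, here is a toy sketch with an assumed (not derived) functional form for how extortion suppresses production:

```python
# A toy Laffer-style curve. The functional form is a made-up assumption:
# production shrinks as the extortion rate rises, so revenue = rate * output
# peaks well below a 100% rate.
def output(rate):
    return 1.0 - rate**2          # assumed distortion from extortion

def revenue(rate):
    return rate * output(rate)

rates = [i / 100 for i in range(101)]
best = max(rates, key=revenue)
print("stationary bandit: optimal rate %.2f, revenue %.2f" % (best, revenue(best)))
print("roving bandit:     rate 1.00, revenue %.2f" % revenue(1.0))  # collapse
```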
We can measure the effectiveness of an excess (or coercive) externality for extracting super-Coaseian payoffs by how great a harm the externality can produce for the least cost to the emitter. The ubiquity in our world of technology that is very effective at producing the greatest harm for the least cost, i.e. weapons, should be a very good clue that our world is not Coaseian. Music volume, spark emission, and so on beyond the "preferred" level that Coaseians falsely assume to be maximal are logically weapons. Their harm/cost ratio is lower than that of guns, tanks, bombers, missiles, flamethrowers, herbicides, and so on, but they have the advantage of being physically hard to distinguish from merely Coaseian externalities, which would come in handy in a world where judges and other lawmakers actually based law on the Coase Theorem (the good news is that they mostly don't).