A deterministic process is one in which, for any state of the world -- a state being a theoretical description of everything that might change the future -- there is only one possible next state. If the universe were deterministic, the omniscient Laplace's demon could in principle predict everything about the future. In a nondeterministic process there can be more than one possible future state, and not even Laplace's demon can know for sure -- and may not be able to know at all -- which one will happen. We can model simple processes as "state machines": in the present the process is in one state; in the next instant it may have transitioned to another state; and so on.
Here's a picture of a deterministic process -- one with only one possible future:
Here's a picture of a nondeterministic process:
If, as in the picture above, there are more than two possible future states, these can also be modeled as a sequence of binary events:
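The two kinds of process above can be sketched in code. This is a minimal illustration, with invented states and transitions: a deterministic transition function returns exactly one next state, while a nondeterministic one returns a set of possible next states, from which the world "chooses" in a way no observer can predict (modeled here with a random draw).

```python
import random

# Hypothetical transition tables for illustration.
DETERMINISTIC = {"A": "B", "B": "C", "C": "C"}            # one next state each
NONDETERMINISTIC = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"C"}}

def step_deterministic(state):
    # Only one possible future: the next state is fully determined.
    return DETERMINISTIC[state]

def step_nondeterministic(state, rng=random):
    # More than one possible future: model the unpredictable choice
    # with a random draw over the set of possible next states.
    return rng.choice(sorted(NONDETERMINISTIC[state]))

print(step_deterministic("A"))      # always "B"
print(step_nondeterministic("A"))   # "B" or "C" -- not predictable
```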
An event can be natural or a human act. If it's a human act, the decision to act often can or should be based on good estimates of which state(s) the world is or was in. In legal procedure, for example, an arrest should generally be made only based on an estimate that the person arrested in fact committed a specific crime.
If causally related nondeterministic processes repeat themselves often enough, we can develop a probabilistic model of them. Physicists have developed probability density models for very small-scale phenomena in quantum mechanics, for example.
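The idea of building a probabilistic model from repetition can be sketched as follows: observe a repeating nondeterministic binary event many times and estimate the probability of each outcome from its frequency. The event here is an invented stand-in, not a real physical model.

```python
import random
from collections import Counter

def observe(rng=random):
    # Stand-in for some repeating nondeterministic binary event.
    return "heads" if rng.random() < 0.5 else "tails"

N = 10_000
counts = Counter(observe() for _ in range(N))
estimates = {outcome: n / N for outcome, n in counts.items()}
print(estimates)   # each estimate should be near 0.5
```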
Practical nondeterminism stems from at least four sources: (1) some of the physical world is inherently nondeterministic, (2) even where deterministic in principle, the configuration of physical states in the world is vastly more complex than we can completely describe -- nobody and no computer comes anywhere close to resembling Laplace's demon, (3) people are far, far more complex than we can mutually comprehend -- especially if you get more than a Dunbar number of us together, and (4) the words and phrases we communicate with are often ambiguous.
Most of the nondeterminism in legal procedure stems from questions of who did what, when, and where, and from the legal consequences that should ensue based on codes and judicial opinions written in ambiguous language. Law summarizes this uncertainty with a number of qualitative probabilities often called "burdens of proof". The following list is roughly, but not strictly (the standards come from different legal domains and are not always comparable), in order from lesser to greater burden of proof:
- Air of Reality
- Reasonable suspicion
- Prima facie case
- Probable cause
- Preponderance of the evidence
- Clear and convincing evidence
- Beyond reasonable doubt
- (To reverse a jury verdict) No reasonable jury could have reached the verdict
These label the probabilities -- not in the sense of numbers between 0 and 1, but in the sense of kinds of evidence and degrees of convincing argument -- required for various decisions of legal procedure to be made: for a search warrant to issue, for an arrest to be made, for property to be confiscated, for a legal motion to be accepted or denied, for liability to accrue in a civil trial, for a verdict of guilty in a criminal trial, for decisions about jurisdiction, and so on.
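One way to picture these qualitative probabilities is as an ordered scale gating procedural acts. The sketch below treats the burdens as an ordered enumeration; as noted above the ordering is only approximate, and the names and numeric ranks here are illustrative, not a real statutory scheme.

```python
from enum import IntEnum

class Burden(IntEnum):
    # Rough, illustrative ordering -- the standards come from different
    # legal domains and are not strictly comparable.
    AIR_OF_REALITY = 1
    REASONABLE_SUSPICION = 2
    PRIMA_FACIE = 3
    PROBABLE_CAUSE = 4
    PREPONDERANCE = 5
    CLEAR_AND_CONVINCING = 6
    BEYOND_REASONABLE_DOUBT = 7
    NO_REASONABLE_JURY = 8

def may_proceed(evidence_strength: Burden, required: Burden) -> bool:
    """A procedural act is justified only if the evidence meets its burden."""
    return evidence_strength >= required

# An arrest (probable cause) is not justified by mere reasonable suspicion:
print(may_proceed(Burden.REASONABLE_SUSPICION, Burden.PROBABLE_CAUSE))  # False
```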
It's useful to look at these, not merely as classical probabilities, but in the style of quantum physics, as a superposition of states. When a nondeterministic event -- or a deterministic event for which we lack important knowledge -- has happened in the past, we can treat it as a superposition of all the possible events that might have happened. When a person or persons undertakes a procedural act -- arrests a person, issues a verdict, and so on -- under law they should be doing so based on a judgment, to some degree of certainty, that one particular set of facts occurred that justifies the act. We can thus see a criminal defendant, for example, as in the state "guilty or not guilty" until a jury "collapses" this probability distribution to a verdict (which collapse, however, unlike quantum mechanics, can sometimes be reversed by an appeals court if deemed erroneous). A suspect is in the state "beyond reasonable suspicion" or "not beyond reasonable suspicion" until a police officer acts, for example to pull over your car on the highway, in a way that requires reasonable suspicion. In principle, at least, this decision too should be reversible (for example, if the officer pulled over your car without reasonable suspicion and noticed an open beer bottle, that evidence could be thrown out of court based on the lack of reasonable suspicion in the original stop).
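This superposition-and-collapse picture can be put in code. All class and method names below are invented for illustration: before the verdict the defendant's status is a set of possible states; the verdict "collapses" it to one; and, unlike quantum mechanics, an appeals court can sometimes restore the superposition.

```python
class LegalStatus:
    """Illustrative model of a legal question as a superposition of states."""

    def __init__(self, possible_states):
        self.possible = set(possible_states)  # the superposition
        self.collapsed = None                 # no verdict yet

    def collapse(self, verdict):
        # A jury's verdict picks one state out of the superposition.
        assert verdict in self.possible
        self.collapsed = verdict

    def reverse_on_appeal(self):
        # Unlike quantum collapse, this one can sometimes be undone.
        self.collapsed = None

defendant = LegalStatus({"guilty", "not guilty"})
defendant.collapse("guilty")
print(defendant.collapsed)     # guilty
defendant.reverse_on_appeal()
print(defendant.collapsed)     # None -- back in superposition
```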
Legal procedure needs to control nondeterminism so that people can operate in an environment of reasonably reliable legal rights. Think, for example, about how inefficient the economy would be if most property were in imminent danger of being taken from one owner and given to another due to frequent decisions reversing old court cases, or how full of worry our lives would be if we could be taken off the street and put in jail at random. Thus there is, for example, a strong presumption in English-law countries that a jury's decision is final, and this is effected by setting a high burden of proof for the court reversing the decision: "no reasonable jury could have reached the verdict", a burden of proof in a criminal case much higher than the jury's own "beyond reasonable doubt."
Increased uncertainty leads to increased hedging. I suspect this inefficiency is part of the reason for our intrinsic dislike of information asymmetry even when it is of net benefit.
When a nondeterministic event -- or a deterministic event for which we lack important knowledge -- has happened in the past, we can treat it as a superposition of all the possible events that might have happened.
Isn't the past already determined? That is to say, if it's already happened, our uncertainty is just a matter of ignorance in both cases?
Tina, good catch, thanks. Nazgulnarsil, when is information asymmetry of intrinsic benefit?
doesn't the division of labor imply that information asymmetry is beneficial due to mere volume? perhaps I am over applying a concept postulated for specific (game theoretic) circumstances?
Am I incorrect in thinking that Probable Cause should be higher on the list that Prima Facie Case?
gah! I meant to say, ". . . higher on the list than Prima Facie. . ."
nazgulnarsil, that's a very good point -- economies of specialization create (lesser, at equilibrium) transaction costs due to information asymmetry. This suggests that the more efficient and populous a civilization is, the higher its transaction costs as a proportion of the economy. A corollary is that social nondeterminism increases with the higher division of labor of a more advanced economy -- we become more mutually unpredictable. Very interesting.
It's worthwhile thinking, then, about why division of labor is valuable. If it's valuable for taming the nondeterminism of nature (for example, the vast complexity of the human body, in order to improve our health), then one can speculate that we will reach a point in the future when any further gains against nature from division of labor will be offset by increased transaction costs, and division of labor will have reached a natural maximum.
Of course, this might already have occurred in some industries -- perhaps in some service industries, for example, where value is primarily social rather than material.
xon, you're probably right, but the two are somewhat incomparable in that the prima facie case refers to only one partisan's presentation of the case (i.e., it ignores rebuttal evidence), whereas in probable cause the argument must withstand the other parties' rebuttal evidence. Another difference is that probable cause refers more to the reasonable beliefs of an actor (e.g., a police officer deciding to make an arrest), not to the totality of the evidence.
Here's an example ripped from Wikipedia but extended by me. A police officer walks into a bar and sees Jim, John, and Joe all standing next to the smoking gun and the murder victim. That information alone is insufficient for probable cause to arrest any one of them. However, a prima facie argument "Joe was standing next to the smoking gun" sounds pretty good without the rebuttal evidence that so were Jim and John.
However, that's a somewhat concocted example, so prima facie may well belong below probable cause on the list.
I like your general point about nondeterminism; it's hard to see who would seriously argue for nondeterminism in law.
I've never been a big fan of precedent as a primary tool for enforcing determinism + predictability of the laws; there's something very Kafkaesque about having to wait for a trial to go through before being confident you understand what a particular law will actually do.
It seems a better long-term approach (blue-sky) would be to force laws to be written like source code is today:
- with a formal spec for what the overall body of laws is attempting to accomplish, including some attempt at metrics for checking if it is accomplishing what it is intended to
- with attached comments as to the intent of the writers (which would be gamed, of course, for political gain, but are better than no such comments); why does each writer include this clause?
- with attached unit tests (list of example scenarios and a summary of what the law does/does not entail in that context, including commentary)
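The proposal above -- spec, intent comments, and unit tests attached to a law -- can be sketched as a toy example. The statute, the speed limit, the exemption, and the scenarios below are all invented for illustration:

```python
# Formal spec: drivers must not exceed this speed (hypothetical figure).
SPEED_LIMIT_MPH = 65

def violates_speed_limit(speed_mph: float, is_emergency_vehicle: bool = False) -> bool:
    """Writers' intent: deter unsafe driving; emergency responders are exempt."""
    if is_emergency_vehicle:
        return False
    return speed_mph > SPEED_LIMIT_MPH

# Attached "unit tests": example scenarios and what the law does/does not entail.
assert violates_speed_limit(70) is True                              # over the limit
assert violates_speed_limit(60) is False                             # under the limit
assert violates_speed_limit(80, is_emergency_vehicle=True) is False  # exemption applies
print("all scenarios pass")
```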
Sure, some stuff will always slip through; in cases where a law fails to determine an outcome you'll need a court system to settle -- in a consistent fashion -- on which of the possible interpretations is the operative one.
It just seems you'd be best off coming up with a better way of writing laws.
All that said: it'd also be nice to see the gradation of levels-of-proof semi-formalized, and the actual legal system shown to be such that its outcomes become less and less nondeterministic the further you go through the system; e.g., that at each step of the way, revisiting a previous step requires a higher degree of evidence than was originally needed to complete that step.
Are there any places right now where the trend to determinism goes backwards for some step?