Friday, October 19, 2007

Calveley nukes Amazon one-click patent

Peter Calveley has succeeded in getting the U.S. Patent and Trademark Office (USPTO) to throw out most of the claims in Amazon's infamous one-click patent, including the broadest claims. Amazon now has an opportunity to respond and convince the USPTO to change its mind, but its prospects are dim. From Peter's report:
In a recent office action, the USPTO has rejected the claims of the Amazon.com one-click patent following the re-examination request that I filed on 16 February 2006.

My review resulted in the broadest claims of the patent being ruled invalid.

In its Office Action released 9 October 2007, the Patent Office found that the prior art I found and submitted completely anticipated the broadest claims of the patent, U.S. Patent No. 5,960,411.

I had only requested the USPTO look at claims 11, 14, 15, 16, 17, 21 and 22 but the Office Action rejects claims 11-26 and claims 1-5 as well!

I reported on this soon after it got started and am proud to have assisted Peter in this endeavor. Here is Peter's full report.

Tuesday, October 09, 2007

The new gigaprojects

A new era of commercial gigaprojects is upon us, and the source is almost entirely our insatiable demand for hydrocarbons. Despite high political risks, global warming and "peak oil" fears, and shortages of skilled employees, there are dozens of energy-related projects in the works or on the drawing boards in the multi-billion dollar range, with several topping $10 billion. Is this the sign of a petroleum bubble, or does it simply reflect a new reality of long-term higher worldwide demand for hydrocarbons? Here are some of the more interesting projects. Note that the dollar cost of these projects is often a bit higher than quoted here, because the dollar has depreciated against most currencies since my sources' estimates were made.

One of the biggest expansions into relatively new territory is that of tar sands. It is estimated that over $100 billion will be invested in Canadian tar sands projects over the next decade. Two of these projects are Petro-Canada's Fort Hills Project ($25 billion) and the Chinese/Canadian Northern Lights Project ($4.5 billion).

Many of the biggest projects are to extract natural gas (methane). $11 billion of investment was recently completed for Qatargas II, and China is cooperating with Iran on the $16 billion project to develop the Iranian North Pars field. Woodside Petroleum's $9 billion investment in the Pluto project is expected to eventually top out at over $30 billion. Sakhalin 2 is a $20 billion project of Shell and Gazprom to extract oil and gas from beneath the seas off Sakhalin Island, on Russia's east coast.

Oil tends to be a bit more dispersed than natural gas, so individual oil projects tend to be smaller. But in total, OPEC countries have $254 billion worth of oil field development and expansion underway or planned. The largest projects tend to be pipelines: a proposed $12-$16 billion pipeline from Siberia to China and the Sea of Japan; a $7 billion pipeline Malaysia wants to build to save oil tankers from the troubles and risks of sailing through the Strait of Malacca; and a $4 billion pipeline being built between Chad and Cameroon. But other oil-related gigaprojects are in the works. Kuwait is planning a $14 billion monster oil refinery. CNOOC and Shell have agreed to build a $4.3 billion plastics (ethylene and propylene) factory in Guangdong Province, China. Investors have ponied up more than $1 billion simply to search for oil in New Zealand's Great South Basin. And oil revenues have allowed Saudi Arabians to dream of building a city from scratch for a cool $26 billion. But topping our list of gigaprojects, the overall cost of developing the Kashagan fields in Kazakhstan is now estimated at $136 billion, which may make it economically unviable.

Meanwhile, there are also numerous offshore oil operations being developed in the $500 million - $2 billion range, with a few such projects (such as the Hebron field off Newfoundland, $5.6 billion) surpassing that range. More on deep sea projects -- now including not just oil, but diamonds and other minerals -- here.

Friday, September 28, 2007

Secure time broadcast

From my article Advances in Distributed Security:
Broadcasts using sound or radiation, from sources such as bell towers, radio towers, satellites, and pulsars, must send out the same value to every receiver. A remote beacon such as a pulsar has perfect security: the access structure is any party, and its complement, the attack structure, is the empty set. For human controlled broadcasts, the attack structure consists only of the broadcaster and the access structure is any receiver.

Natural broadcasts are thus immune to the problem, described in the discussion of the Byzantine Generals problem below, of a transmitter sending different values to different receivers. Indeed, as we will see below, distributed systems researchers have gone to great lengths just to recreate this simple property on the Internet with logical broadcast protocols.

Nature provides clocks that are oblivious to the malicious intentions of any outside parties. In the case of a remote high-energy system such as a pulsar, this means anybody. Pulsars are many orders of magnitude more accurate than random delays that face attackers on the Internet. If critical Internet servers were synchronized to natural clocks in a secure and timely fashion, they would be immune to attacks that relied on uncertainties in timing.

More here.

Detecting pulsars is still, alas, a difficult process, but not beyond the reach of amateurs, who have had some success tracking pulsars with software defined radio and quagi antennas combined with digital signal processing.

Wednesday, September 19, 2007

Real options analysis for space projects

For the purposes of estimating the value of a project, risk has for simplicity often been treated as spread out evenly over the life of the project. For normal projects these risks can usually be fairly approximated by estimating risk per unit time. This occurs in, for example, the net present value (NPV) formula for estimating the value of a project, where a constant interest rate risk premium is used. In a typical space mission, however, risk tends to be concentrated at particular events such as launches, burns, flybys, and landings. Just floating out in space is relatively low risk, which is why we can fly to Jupiter and Saturn with failure rates not significantly greater than missions to the moon or Mars. All four of the spacecraft that have so far intentionally been flown beyond Saturn -- Pioneers 10 and 11, and Voyagers 1 and 2 -- had many initial glitches. But since passing beyond Saturn they have travelled on for nearly two decades, far beyond Pluto, with no substantial and unexpected failure. The Voyagers continue to send back valuable data from the very edge of the solar system, where the solar wind gives way to interstellar influence -- more than 80 times as far from the sun as our own earth.

Component failures tend to follow a Poisson process, so their cumulative probability does grow with mission duration. But the risk in even earth-orbiting satellites is dominated by launch and orbit-insertion failures, and other failures at the start of the satellite's lifetime, which are unrelated to the satellite's expected lifetime.

Thus the vast majority of the risk of most space projects does not grow exponentially with their duration, and indeed is usually not closely correlated to their duration in any way. We would thus get an exponentially wrong answer from a line of reasoning that estimated the risk of a generic deep space mission as X%/year and deduced, by plugging that risk into the net present value (NPV) equation, that an 8 year mission is substantially costlier (due to a risk that grows exponentially over time) than a 2 year mission. An example of this fallacy is NPV analysis that assumes a constant risk premium for comparison of futuristic lunar and asteroid mining scenarios. All such papers that I've seen (e.g. this one) fall victim to this fallacy. To use NPV properly we need to account for the risks of particular events in the mission (in the mining scenario primarily launches, burns, and mining operations) to estimate a total risk, and divide that total risk by the duration of the mission to get a risk premium. The risk premium per year for the longer mission will thus probably be substantially lower than for a shorter mission (implying an overall risk slightly higher for the longer mission, all other things being equal).
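To make the difference concrete, here is a minimal sketch in which all the numbers -- payoff, rates, and event success probabilities -- are hypothetical. It compares a constant risk premium, which compounds with mission duration, against risk concentrated at discrete events:

```python
# Toy comparison (illustrative numbers only): valuing a payoff of $1B
# delivered at the end of a 2-year vs. an 8-year deep space mission.

RISKLESS_RATE = 0.03   # time value of money
RISK_PREMIUM = 0.20    # hypothetical "X%/year" mission risk premium
PAYOFF = 1_000_000_000

def npv_constant_premium(payoff, years):
    """NPV with risk spread evenly over time: the risk compounds each year."""
    return payoff / (1 + RISKLESS_RATE + RISK_PREMIUM) ** years

def npv_event_risks(payoff, years, event_success_probs):
    """NPV with risk concentrated at discrete events (launch, burns, etc.):
    discount the payoff once by total mission success probability, and by
    the riskless rate alone for time value."""
    p_success = 1.0
    for p in event_success_probs:
        p_success *= p
    return payoff * p_success / (1 + RISKLESS_RATE) ** years

# Identical event risks for both missions: launch, two burns, orbit insertion.
events = [0.95, 0.98, 0.98, 0.97]

short = npv_constant_premium(PAYOFF, 2), npv_event_risks(PAYOFF, 2, events)
long_ = npv_constant_premium(PAYOFF, 8), npv_event_risks(PAYOFF, 8, events)
print(f"2-year: constant-premium ${short[0]:,.0f}  event-based ${short[1]:,.0f}")
print(f"8-year: constant-premium ${long_[0]:,.0f}  event-based ${long_[1]:,.0f}")
```

Under the constant premium the 8-year mission appears several times less valuable than the 2-year mission; under event-based risk, the two differ only by six extra years of riskless discounting, since the risky events are the same.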

An even more accurate method for evaluating risk in space projects is called real options analysis. It has changed valuation from the old static NPV model to a dynamic model based on specific risks. One improvement this brings is removing the assumption of constant risk, which we've seen is wildly inappropriate for deep space missions. Another idea real options brings us is that designing a project to postpone choices adds value to the project when there will be better information in the future. A science mission example: if a scientific event could occur at either target A or target B, it's best to postpone the choice of target until we know where the event is going to occur. If that's possible, we now have a scientifically more valuable mission.

A shorthand way to think of real options analysis is that the project is modeled as a game tree, with each node on the tree representing a choice (i.e. a real option) that can be made by the project managers. The choices are called "real options" because the math is the same as for financial options (game trees, Black-Scholes, etc.) but they represent real-world choices: for example, the option of a vineyard to sell its wine this year or let it age at least another year, or the option of a mine to open or close, or expand or shrink its operations.
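A minimal sketch of such a game tree, with made-up science values and probabilities: choice nodes take the maximum over the controllers' options, chance nodes take the expectation over nature's outcomes, and the flexible plan is worth more than the fixed trajectory:

```python
# Minimal sketch of a real-options trajectory tree (science values and
# probabilities are hypothetical). Chance nodes model uncertain discoveries;
# choice nodes are retargeting windows where mission controllers pick the
# best branch.

def value(node):
    kind = node["kind"]
    if kind == "leaf":
        return node["science_value"]
    if kind == "choice":   # controllers pick the best available option
        return max(value(child) for child in node["options"])
    if kind == "chance":   # nature decides; take the expectation
        return sum(p * value(child) for p, child in node["outcomes"])
    raise ValueError(kind)

# Retargeting window: a volcano erupts on moon A with 25% probability, in
# which case the controllers may divert there; otherwise the craft
# continues to moon B as originally planned.
tree = {
    "kind": "chance",
    "outcomes": [
        (0.25, {"kind": "choice", "options": [
            {"kind": "leaf", "science_value": 100},  # divert to erupting moon A
            {"kind": "leaf", "science_value": 40},   # stick with moon B
        ]}),
        (0.75, {"kind": "leaf", "science_value": 40}),  # no eruption: moon B
    ],
}

fixed_plan_value = 40              # single pre-plotted trajectory to moon B
flexible_plan_value = value(tree)  # trajectory tree with the retargeting option
print(flexible_plan_value)         # 0.25*100 + 0.75*40 = 55.0
```

The 15-point difference is the value of the real option to retarget; against it we would weigh the extra delta-v and time needed to keep the retargeting window open.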

The orbital planning I've seen tends to plan for a fixed mission ahead of time. Real options analysis says that the project may gain value if we design in options to change the project's course in the future. For orbital mechanics that means designing the trajectory to allow retargeting at certain windows, even at some cost of delta-v and time. (Whether the tradeoffs between delta-v, time, and particular real options are worth it can be analyzed using real options mathematics, if we can compare different scientific targets using a common utilitarian scale).

In the context of a Jupiter orbiter swinging by various Jovian moons, such options might include hanging around Europa longer if a volcano is going off there (like the one discovered on the similar moon Enceladus) or if some evidence of life is found (or leaving sooner if not), or changing the trajectory so that the next target is Europa instead of Ganymede if a volcano suddenly springs up on Europa, or Io if an interesting volcano springs up there. The idea is that instead of plotting a single trajectory, we plot a tree of trajectories, with various points where the mission controllers can choose trajectory A or trajectory B (sometimes with further options C, D, etc.) based on mission opportunities. Other trajectory options might include hanging around a particular moon for longer or changing the view angle to the target. We may trade off extra delta-v, extra time, or both in order to enable future changes in the trajectory plan.

Here is more on real options analysis. Real options analysis is also quite valuable for the research and development phase of a project. Here is a good paper on real options analysis for space mission design. My thanks to Shane Ross and Mark Sonter for their helpful comments on space project evaluation and planning.

UPDATE: This post is featured in the Carnival of Space #22.

Sunday, September 16, 2007

Tamper evident numbers

From The Playdough Protocols:
Evolving beyond clay tokens, accounting was the first use of external marks and started to take a familiar form. Along with the tamper evident clay, the Sumerians developed a kind of virtual tamper evidence. It took the form of two sets of numbers. On the front of the tablet, each group of commodities would be recorded separately -- for example, on the front would be recorded 120 pots of wheat, 90 pots of barley, and 55 goats. On the reverse would simply be recorded "265" -- the same objects counted again, probably in a different order, and without bothering to categorize them. The scribe, or an auditor, would then verify that the sum was correct. If not, an error or fraud had occurred. Note the similarity to tamper evident seals -- a broken seal likewise meant that error or fraud had occurred. The breaker of the seals, or the scribe who recorded the wrong numbers, or the debtor who paid the wrong amounts of commodities would be called on the carpet to answer for his or her discrepancy.

Checksums still form the basis of modern accounting. Indeed, the principle of double entry bookkeeping is based on two sets of independently derived numbers that must add up to the same number. Below, we will see that modern computers, using cryptographic methods, can now compute unspoofable checksums.
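The Sumerian scheme and its modern cryptographic descendant can be sketched in a few lines (using the commodity counts from the quote above):

```python
import hashlib

# The Sumerian scheme: categorized counts on the front of the tablet,
# an independently taken grand total on the back.
front = {"pots of wheat": 120, "pots of barley": 90, "goats": 55}
back_total = 265

assert sum(front.values()) == back_total, "error or fraud has occurred"

# The modern analogue: a cryptographic hash is an unspoofable checksum.
# Any tampering with the record changes the digest.
record = ";".join(f"{item}={n}" for item, n in sorted(front.items())).encode()
digest = hashlib.sha256(record).hexdigest()
print(digest[:16])  # keep this fingerprint; recompute later to detect tampering

tampered = record.replace(b"goats=55", b"goats=54")
assert hashlib.sha256(tampered).hexdigest() != digest  # tampering detected
```

The arithmetic check can be fooled by anyone who adjusts both sides consistently; the hash cannot, which is the sense in which cryptographic checksums are unspoofable.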

Sunday, September 09, 2007

Mining the vasty deep (iii)

In the first two parts of this series I described undersea oil and diamond mining operations. In addition, at least two startup companies, Neptune Minerals and Nautilus Minerals, are moving into mining the seafloor for metals. They're planning to dig into extinct black smokers for copper, and possibly also zinc, gold, silver, and other minerals. David Heydon, CEO of Nautilus Minerals, says of the remarkably high quality of the ores, "we haven’t seen these types of mineral deposits since the beginning of modern mining."

Nautilus and Neptune don't plan to mine live black smokers, which would be dangerous (black smokers are hot springs with temperatures well above 200 centigrade, kept from boiling by the high pressure) and environmentally questionable (live black smokers teem with exotic chemosynthetic life). Rather, they plan to mine the probably far greater number of extinct and lifeless black smokers that populate the oceans.
Nautilus paydirt:
Cutting of an extinct black smoker at 1,600 meters.


Geologist Steven Scott, a longtime advocate of sea mining, describes the geology of seafloor massive sulfides (SMS's). "These deposits are essentially metalliferous mud formed from hot, dense brines." They form within and around black smokers, the famous deep-sea vents discovered in 1979. Black smokers deposit "sinter-like mounds and chimneys of metal sulfides, oxides, silica and sulfates. Veins, disseminations and stockworks of relatively low metal grade impregnate the underlying lavas."

Curiously, we may already be mining black smokers: billion-year-old, long-extinct black smokers whose remains now lie on dry land. "The [current live black smoker] deposits have similarities to so-called volcanogenic massive sulfide ores being mined on land in Canada and elsewhere and which formed in ancient oceans as much as 2700 million years ago. Elements of potential commercial interest in both the modern and ancient deposits are copper, zinc, lead, silver, gold and barium. About 150 seafloor sites are known, most of which lie between 1500 and 3500 meters water depth."

Neptune paydirt, topside.

Both Neptune and Nautilus recently performed assays of several promising extinct smokers. Nautilus' assay was performed by two methods: a drill based on a ship (but dropped through 1600 meters of water) and a remotely operated vehicle (ROV) that took cutting samples. Here are at least some of the results. Heydon observes that "with 97% of the ocean floor yet to be extensively explored, it is likely that numerous deposits remain undiscovered."

Here is more from Steven Scott.

Business Plans

Nautilus estimates that it could cost as much as $300 million to ramp up to full-scale mining. Its plans include a $120 million specialized mining ship, the Jules Verne. This ship is similar to the FPSO used for deep sea oil, but will separate ore from sludge and water rather than oil from water. A large ROV, based on the trencher ROV used to lay pipes in the deep sea oil industry, but adding a large rotating drum scraper like those used in coal mining, will scrape the smokers, creating a sludge that will be pumped up to the Jules Verne and processed. Since the most expensive FPSOs can cost upwards of $800 million, the $120 million is a relative bargain. Indeed, the basic mothership/ROV setup is taken straight from the deep sea oil industry's FPSO/ROV methodology.
Nautilus will contribute $120 million to develop undersea tools, including pumps, pipes and two subsea crawler-miners. It will also partner with a group of engineering companies with expertise in drilling and geophysics. If Jules Verne is in place by 2009, it could stay at the Papua New Guinea site until 2014, before moving on to exploit other deposits. Nautilus estimates that it will be able to mine 6000 tonnes a day from its target site.

Although there are technological hurdles to be overcome, ... 'the technology already exists; it just hasn't been integrated in this way. One beauty of the sea floor model is that a floating equipment production chain is our one main investment. Once we have the chain we can easily bring it up and redeploy it to another area.'
(More on Nautilus' plans here).

Nautilus' ROV with cutting arm to take samples.

Like many start-ups, Nautilus has a speculative and risky business plan, but the payoff could be vast:
[Nautilus] doesn't expect to reach full-scale production until 2009. There's no comparable traditional mining business model, since many of the individual deposits underwater are too small to be economically viable if they were on land. But because offshore drilling operations can move, whereas mines on land are stuck, the combined value of deposits in a wider area makes the operation worthwhile.
...
The worth of the oil, gas and minerals in the world's oceans is estimated to be in the trillions of dollars. If Mr. Heydon's estimates are correct, deep sea mining could have the potential to supply the world's growing demand for gold, copper and silver, among other metals. The resulting revenue could be in the billions of dollars for deep sea mining companies, of which only two currently exist.
...
Disputes over who owns what in the ocean have been a fact of global politics for decades. For reasons of security, potential resources and sometimes just pride, countries are constantly claiming control over new chunks of underwater property. As an indicator of just how rare it is to be able to mine hassle-free in the ocean, the exploration licence Papua New Guinea granted Nautilus was a world first.
More here. See also here.

In forthcoming posts in this series, we will look at some of the technological, environmental, political, and legal considerations involved in this new endeavor.

Saturday, September 08, 2007

Partial and total orders

From Advances in Distributed Security:
A basic issue of security and fault tolerance that must be resolved is the secure determination of the order in which events occurred. If a contract specifies a deadline and it goes down to the wire, how can a relying party or third party adjudicator determine whether the deadline was met? The outcome itself, and its fairness, may rest on fairly deciding who came first. If Alice tries to double-spend a piece of digital cash [C82], only the recipient who checks with the bank first is entitled to its value. But if the bank servers are replicated, which of the two recipients, Bob or Charles, checked with the bank first? In the case of a replicated property title service [S98] we have a similar problem -- if Alice transfers her title to two other owners, which new owner actually received the deed? If property is homesteaded on a first-come first-serve basis, which of two or more "land rushers" competing for a lucrative parcel is entitled to the land?

Lamport (Causal) Order

Imagine a network where computers don't know how to keep time very well -- they are always getting out of synchronization. (Alas, all you have to really think of here is the actual Internet with PCs). Such a network, called an asynchronous network, lacks an accurate and secure global clock time by which computers can determine the order in which events, which might be messages sent or instructions executed on a particular local machine, have happened. Lamport [L78] was among the first to tackle the problem of how to determine the order of events in such a network.

A partial order means that we know in what order some of the elements are, but we aren't sure about some of the others, or some of the others may be equal. An example is the "less than or equal to" relationship among a group of integers, some of which can repeat. Some of the integers we know are less than some others, but an integer paired with itself is equal. A total order, on the other hand, is like the "less than" relationship among unique integers -- we can always tell when one integer is less than another -- there is no ambiguity left. In the case of events, a partial order means that for some pairs of events we know whether one occurred before another, and for some others we don't know. We use the same symbols as we would use for the analogous case of the integers, so that "x <= y" means "x either occurred before y or we don't know whether it occurred before or after y". In a total order of events, we know for any two events which one happened first. We write "x < y", meaning "x occurred before y."

Lamport's answer to the event ordering problem was to show that parties (or, we use the terms equivalently here, nodes on the network) can agree on a partial order of events based on causal relationships between these events -- or at least the subset of events where we can determine that causation could occur. On a network, parties influence each other by talking to each other -- in other words, by sending each other messages. Lamport used these messages as the basic building block for constructing his partial order, according to the following rules:

  • 1. If an event is local to node P, every nonfaulty node agrees on P's opinion of it.
  • 2. Every correct node agrees that every message was sent before it was received.
  • 3. If we agree that event A occurred before event B and that event B occurred before event C, then we agree that event A occurred before event C. In other words, this partial order is transitive.
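These rules are realized by Lamport's logical clocks. Here is a minimal sketch, with events tagged by (timestamp, node name): local events increment a counter, every message carries its sender's timestamp, and a receive event takes max(local, received) + 1, so a send always orders before its receive and the resulting order is transitive.

```python
# A minimal Lamport logical clock.

class Node:
    def __init__(self, name):
        self.name = name
        self.clock = 0

    def local_event(self):
        self.clock += 1
        return (self.clock, self.name)

    def send(self):
        self.clock += 1
        return self.clock          # the timestamp travels with the message

    def receive(self, msg_ts):
        # Rule 2: a message is received after it was sent, so advance past
        # the sender's timestamp.
        self.clock = max(self.clock, msg_ts) + 1
        return (self.clock, self.name)

p, q = Node("P"), Node("Q")
a = p.local_event()   # (1, 'P')
ts = p.send()         # P sends at time 2
b = q.local_event()   # (1, 'Q') -- concurrent with a: the partial order is silent
c = q.receive(ts)     # (3, 'Q') -- causally after the send, per rule 2
assert a[0] < c[0]    # causally ordered events get ordered timestamps
print(a, b, c)
```

Note that a and b carry equal timestamps: the causal partial order cannot rank them, which is exactly the ambiguity the next section's tie-breaking addresses.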

Breaking Ties -- Creating a Fair Total Order

The partial order leaves us with the need to agree on how to break ties -- how to resolve the ambiguities where we can't agree which event took place first -- and thus create a total order of events. We want to do so in a way that is fair, in other words, in a way that cannot be manipulated to the advantage of any particular party.

An unfair way to create a total order would be to impose a certain predictable rule for breaking ties. For example, we could decide on a total order for the processes and break ties in the causal order by referring to this total order.

However, such a procedure creates a bias that may, depending on the application, favor certain servers over others, and therefore allow those servers to favor certain clients over others.

One way to break ties fairly is have the participants toss fair coins -- in other words, generate random numbers in a way that cannot be manipulated and then assign those random numbers to events. There are several ways to toss fair coins over a network and we describe one such way below.
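One standard construction for such a coin toss (sketched here as an illustration; not necessarily the specific protocol the article describes) is commit-reveal: each party publishes a hash commitment to a random value, reveals only after seeing the other's commitment, and the shared coin is taken from the XOR of the reveals.

```python
import hashlib, secrets

# Commit-reveal fair coin sketch: neither party can bias the result without
# breaking the hash, since each must commit before seeing the other's value.

def commit(value: bytes, nonce: bytes) -> str:
    return hashlib.sha256(nonce + value).hexdigest()

# Round 1: each party picks a secret random value and publishes a commitment.
a_val, a_nonce = secrets.token_bytes(16), secrets.token_bytes(16)
b_val, b_nonce = secrets.token_bytes(16), secrets.token_bytes(16)
a_commit = commit(a_val, a_nonce)
b_commit = commit(b_val, b_nonce)

# Round 2: both reveal; each side checks the other's reveal against the
# earlier commitment before accepting it.
assert commit(a_val, a_nonce) == a_commit
assert commit(b_val, b_nonce) == b_commit

# The shared randomness is the XOR of the reveals: unbiased as long as at
# least one party chose randomly.
shared = bytes(x ^ y for x, y in zip(a_val, b_val))
coin = shared[0] & 1   # one unbiased bit, usable for breaking one tie
print(coin)
```

A refinement is needed in practice for a party who aborts rather than reveal a losing value; protocols in the literature handle this with timeouts or secret sharing.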

Another way to break ties fairly is to have the participants agree to a global clock time that is more accurate than the message delays faced by those who would manipulate timing in favor of some party. This entails using a network with very predictable message lag for the clock synchronization protocol and a less predictable one for the other services.
More here.

Thursday, September 06, 2007

Institutional changes prerequisite to the industrial revolution

In the ongoing debate at Marginal Revolution over Gregory Clark's new book A Farewell to Alms -- which I highly recommend, despite my criticisms -- he wrote the following:

"The widespread impression that between 1300 and 1800 England experienced significant institutional improvements is just wrong. There were changes, yes. But not improvements."

To which I responded:
Some important institutional changes -- some of them arguably radical improvements -- in England between 1300 and 1800:

* The mechanical clock, 14th century. The resulting rise of clock culture and the time wage may have slowly but radically improved the coordination and work habits of Europeans. Earlier adaptation to clock culture, a process that may take centuries to evolve, may explain the large discrepancies between European and many non-European laborer work habits that Clark cites.

* The printing press and the rise of book consciousness, which radically decreased the costs of teaching economically important knowledge to both children and adults. The rise of book consciousness, reflected in the literacy and book cost data Clark graphs, explains the most prominent puzzle revealed by Clark's data: the fact that skills and innovation rose dramatically even as the rewards to skills were stagnant or declined.

* Nationalization of the Church in England and secularization of family law, 16th century.

* The incorporation of the Lex Mercatoria into the common law, and the resulting rise of modern contract law, 18th century. Indeed, much of this occurred in the same decades as the start of the industrial revolution.

* The "Romanization" of property law, rendering land more freely saleable, divisible, and mortgageable, which Adam Smith noted was an important improvement still in process at his time.

* The rise of marine insurance (e.g. Lloyd's of London) and the associated rise of colonialism and world trade, 17th-18th century.

* The decline of guilds and monopolies, 16th-18th centuries. Medieval England was certainly not a highly competitive market economy. Commerce in goods was dominated rather by monopolies and a variety of price and quality controls instituted by guilds and towns.
Here is more of the debate at MR.

Tuesday, September 04, 2007

Secure property titles

From Secure Property Titles with Owner Authority:
In all cases of property rights there is a defined space, whether a namespace or physical space, and the task is to agree on simple attributes of or rights to control subdivisions of that space. In some cases a name or other symbol corresponds to a person or object owned or controlled by that person. For example, Internet users must agree on which domain name corresponds to which web site operator. In other cases we are simply concerned with control over a subdivision of the space. With real estate we must agree on who owns various rights (to occupy the surface, to mine the minerals under, etc.) to a piece of land. With radio spectrum we must agree on who owns what range of frequencies and in what physical space (or transmitting power as an easily observed approximation of physical space used).

...all such [multiparty problems of] control over the semantics of symbols, to be made and respected across trust boundaries, are problems of agreeing on and maintaining property rights...

...New advances in replicated database technology will give us the ability to securely maintain and transfer ownership for a wide variety of kinds of property, including not only land but chattels, securities, names, and addresses. This technology will give us public records which can "survive a nuclear war", along the lines of the original design goal of the Internet. While thugs can still take physical property by force, the continued existence of correct ownership records will remain a thorn in the side of usurping claimants...

The ideal title database would have the following properties:

(1) Current owner Alice should be able to transfer her title to only a single relying counterparty (similar to the "double spending" problem in digital cash)

(2) Servers should not be able to forge transfers

(3) Servers should not be able to block transfers to or from politically incorrect parties.

...Using these results [of Byzantine quorum systems] it looks like we can approach our ideal title database as follows:

(1) Alice signs the title and Bob's public key, and sends this message to 2f+1 servers, committing her to transfer title to Bob. Bob checks at least 2f+1 servers before relying on Alice's transfer.

(2) No collusion of servers can forge Alice's signature (we achieve at least this property ideally!)

(3) A conspiracy of >=(1/4)n servers can block a transfer. Alice's recourse is to use some other channels to broadcast her intention, demonstrating that the registry did not follow her wishes, and hoping the alternative channels are more reliable. Bob only has similar recourse if he signed a document with Alice demonstrating their intentions to transfer title from Alice to Bob. The most basic recourse is a correct subset of servers which exits the property club and establishes a new one, then advertises its correctness (and proves the incorrectness of its rival group) as described above.
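The quorum arithmetic of step (1) can be sketched as follows. This is a toy model under stated assumptions: "signatures" are just tagged tuples, servers are in-memory sets, and we take n = 3f + 1 servers with at most f faulty.

```python
# Toy sketch of the quorum transfer rule: Alice writes her signed transfer
# to a write quorum of 2f + 1 servers, and Bob relies on the transfer only
# if at least 2f + 1 servers report it.

F = 1            # number of faulty servers tolerated
N = 3 * F + 1    # total servers

servers = [set() for _ in range(N)]   # each server's record of transfers

def alice_transfer(title, new_owner_key):
    # Stand-in for Alice signing (title, Bob's public key); no real crypto.
    record = ("signed-by-alice", title, new_owner_key)
    for s in servers[: 2 * F + 1]:    # write to a quorum of 2f + 1 servers
        s.add(record)
    return record

def bob_checks(record):
    # Bob counts how many servers report the transfer before relying on it.
    confirmations = sum(record in s for s in servers)
    return confirmations >= 2 * F + 1

rec = alice_transfer("blackacre", "bob-pubkey")
print(bob_checks(rec))   # True: a write quorum confirms the transfer
```

Because any two quorums of size 2f + 1 out of 3f + 1 servers intersect in at least f + 1 servers, at least one honest server lies in the intersection, which is what prevents Alice from convincing two different counterparties with disjoint quorums.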

More here.

Sunday, September 02, 2007

Bilinear group cryptography

An important recent development in public key cryptography is the bilinear group, which for abstract algebra wonks is defined as follows (if you're not into abstract algebra feel free to skip to below):
Bilinear groups are a set of three abstract algebraic groups, G1, G2 and GT, together with a deterministic function e, called a bilinear map, that takes as input one element from G1 and one element from G2 and outputs an element in GT. Suppose all three groups have order Q, element g1 generates group G1, and element g2 generates group G2. Then, one special property called bilinearity of the map e is that for all a, b < Q, we have that e(g1^a, g2^b) = e(g1, g2)^ab. This new element, e(g1, g2)^ab, is in group GT. The key observation is what happens to the exponents, a and b, during the mapping: they are multiplied. The group GT is distinct from both G1 and G2; thus the output of the mapping cannot be fed back into the map e as input.
Elliptic curves are generally used for the groups, although bilinear schemes in at least some other algebras are also possible.
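For illustration only, here is a degenerate toy bilinear map: take G1 = G2 = GT to be the additive group of integers mod q, with e(x, y) = xy mod q. The map satisfies bilinearity but is completely insecure, since discrete logs in these groups are trivial; real schemes use pairings on elliptic curves.

```python
# Toy bilinear map: additive groups of integers mod q, e(x, y) = x*y mod q.
# Bilinearity holds -- the "exponents" (here, scalars) multiply -- but this
# construction has no cryptographic value.

q = 2**31 - 1   # a prime group order
g1 = g2 = 1     # generators of the additive toy groups

def e(x, y):
    return (x * y) % q

a, b = 123456, 654321
lhs = e(a * g1 % q, b * g2 % q)        # e(a·g1, b·g2)
rhs = (a * b) % q * e(g1, g2) % q      # ab · e(g1, g2)
assert lhs == rhs                      # the scalars a and b multiplied
print(lhs == rhs)
```

In real bilinear groups, computing a from a·g1 (the discrete log) is hard, which is what makes the exponent-multiplying property cryptographically useful rather than trivial.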

Two of the main applications of bilinear groups are proxy re-signatures and proxy re-encryption. In proxy re-signatures, a semi-trusted party transforms Alice's public key signature into Bob's. The proxy does not have, cannot derive, and thus cannot sign with either Bob's secret key or with Alice's, but can only transform Alice's signature into Bob's. The proxy re-signer is thus "semi-trusted" -- it is trusted with some things we normally would trust a proxy signer with, but not with others. For example it is not trusted with either Alice's or Bob's private key, only with a special key that allows the signature transformation.

The target signature could also be a group signature. Thus, for example, Alice could sign her e-mail with her own digital signature, and a proxy re-signer sitting on the corporate e-mail firewall could re-sign the e-mail with the corporate group signature.

Proxy re-signers can be chained in a series, so that signature A is transformed by proxy AB into signature B, which is transformed by proxy BC into signature C, and so on. The last signature Z proves that the message was signed by each proxy in the chain in order. Proxy re-signers can also be chained together in a tree or directed acyclic graph. (Note that threshold signatures by contrast do not require or prove that the signatures took place in a particular order).

Proxy re-encryption is the same idea for public key encryption, with the added bonus that the re-encryptor can't read the message. So, for example, we could have the following scheme to restrict the distribution of content:

(1) Content owner Alice encrypts her content with her public key and publishes it to proxies P1, P2, etc., along with re-encryption keys AC1, AC2, etc. for each customer.

(2) Proxy allows customer to access the content only if paid. When paid, the proxy re-encrypts to the customer using the re-encryption key for that customer.

The proxies themselves are trusted neither with an ability to view the content nor with the discretion to distribute to additional customers not desired by content owner Alice. The proxy is trusted only to restrict access to a customer. (I present this scheme mainly just to illustrate what proxy re-encryption does. As an application, this particular content distribution scheme seems to me to only be useful if it somehow lowers transaction costs to route all payments through proxies rather than paying Alice directly, the latter which could be done by normal public-key cryptography, and of course it doesn't protect against a cheating customer re-publishing the content to the world).
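The flavor of the scheme can be sketched with the classic ElGamal-style proxy re-encryption of Blaze, Bleumer, and Strauss (1998), which predates bilinear groups but illustrates the same trust structure. The tiny parameters below are illustrative only; real use requires large primes:

```python
# Toy ElGamal-style proxy re-encryption (Blaze-Bleumer-Strauss 1998 flavor).
# Tiny parameters for illustration only; real deployments need large primes.
p, q, g = 23, 11, 2           # p = 2q + 1; g generates the order-q subgroup
a, b = 3, 7                   # Alice's and Bob's secret keys
rk = (b * pow(a, -1, q)) % q  # re-encryption key b/a mod q, given to the proxy

m = pow(g, 5, p)              # message, encoded as a subgroup element
k = 4                         # ephemeral randomness
c1 = (m * pow(g, k, p)) % p   # encrypt to Alice: (m * g^k, g^(a*k))
c2 = pow(g, (a * k) % q, p)

c2_bob = pow(c2, rk, p)       # proxy: g^(a*k) -> g^(b*k), never seeing m, a, or b
g_k = pow(c2_bob, pow(b, -1, q), p)    # Bob strips his key, recovering g^k
recovered = (c1 * pow(g_k, -1, p)) % p # ... and divides it out of c1
assert recovered == m
```

The proxy holds only rk = b/a, from which neither key can be derived on its own; that is what makes it merely "semi-trusted". (In this particular scheme a proxy colluding with Bob could recover Alice's key, a weakness that later pairing-based schemes address.)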

I suspect proxy re-encryption could simplify the design of digital mix schemes like onion routing -- this is left as an exercise for the cryptographically inclined reader.

This thesis is my source for most of this blog post; it discusses bilinear group cryptography for proxy re-encryption, proxy re-signing, and for reducing the trust needed for blinded offline digital cash.

Legal caveat: many, if not most protocols based on bilinear groups seem to have been recently patented.

Thursday, August 30, 2007

Money and the problem of cooperation

From "Shelling Out: The Origins of Money":
Evolutionary psychology starts with a key mathematical discovery of John Maynard Smith [D89]. Using models of populations of co-evolving genes, from the well-developed area of population genetics, Smith posited genes that can code for strategies, good or bad, used in simple strategic problems (the "games" of game theory). Smith proved that these genes, competing to be propagated into future generations, will evolve strategies that are Nash equilibria to the strategic problems presented by the competition. These games include the prisoner's dilemma, a prototypical problem of cooperation, and hawk/dove, a prototypical problem of aggression and its mitigation.

Critical to Smith's theory is that these strategic games, while played out between phenotypes proximately, are in fact games between genes at the ultimate level -- the level of competition to be propagated. The genes -- not necessarily the individuals -- influence behavior as if they were boundedly rational (coding for strategies as optimal as possible, within the limits of what phenotypes can express given the biological raw materials and previous evolutionary history) and "selfish" (to use Richard Dawkins' metaphor). Genetic influences on behavior are adaptations to the social problems presented by genes competing through their phenotypes. Smith called these evolved Nash equilibria evolutionarily stable strategies.

The "epicycles" built on top of the earlier individual selection theory, such as sexual selection and kin selection, disappear into this more general model which, in a Copernican manner, puts the genes rather than individuals at the center of the theory. Thus Dawkins' metaphorical and often misunderstood phrase, "selfish gene", to describe Smith's theory.

Few other species cooperate on the order of even Paleolithic humans. In some cases -- brood care, the colonies of ants, termites, and bees, and so forth -- animals cooperate because they are kin -- because they can help copies of their "selfish genes" found in their kin. In some highly constrained cases, there is also ongoing cooperation between non-kin, which evolutionary psychologists call reciprocal altruism. As Dawkins describes it [D89], unless an exchange of favors is simultaneous (and sometimes even then), one party or the other can cheat. And they usually do. This is the typical result of the game theorists call the Prisoner's Dilemma -- if both parties cooperated, both would be better off, but if one cheats, he gains at the expense of the sucker. In a population of cheaters and suckers, the cheaters always win. However, sometimes animals come to cooperate through repeated interactions and a strategy called Tit-for-Tat: start cooperating and keep cooperating until the other party cheats -- then defect yourself. This threat of retaliation motivates continued cooperation.
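The Tit-for-Tat dynamic is easy to simulate. The sketch below uses the conventional prisoner's dilemma payoffs (mutual cooperation 3/3, mutual defection 1/1, sucker 0, temptation 5), chosen for illustration:

```python
# Iterated prisoner's dilemma: Tit-for-Tat vs. Always-Defect.
# Payoffs are the conventional values: (row player, column player).
PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        # each strategy sees only the opponent's history
        move_a, move_b = strat_a(hist_b), strat_b(hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

def tit_for_tat(opponent):
    # cooperate first, then mirror the opponent's last move
    return 'C' if not opponent else opponent[-1]

def always_defect(opponent):
    return 'D'
```

Two Tit-for-Tat players cooperate throughout and do well; against Always-Defect, Tit-for-Tat is suckered exactly once and then retaliates every round after, limiting the cheater's gain.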

The situations where such cooperation in fact occurs in the animal world are highly constrained. The main constraint is that such cooperation is restricted to relationships where at least one of the participants is more or less forced to be in the proximity of the other. The most common case is when parasites, and hosts whose bodies they share, evolve into symbiotes. If the interests of the parasite and the host coincide, so that both working together would be more fit than either on their own (i.e. the parasite is also providing some benefit to the host), then, if they can play a successful game of Tit-for-Tat, they will evolve into symbiosis -- a state where their interests, and especially the exit mechanism of genes from one generation to the next, coincide. They become, in effect, a single organism. However, there is much more than cooperation going on here -- there is also exploitation. They occur simultaneously. The situation is analogous to an institution humans would develop -- tribute -- which we will analyze below.

Some very special instances occur that do not involve parasite and host sharing the same body and evolving into symbiotes. Rather, they involve non-kin animals and highly constrained territory. A prominent example Dawkins describes is the cleaner fish. These fish swim in and out of the mouths of their hosts, eating the bacteria there, benefiting the host fish. The host fish could cheat -- it could wait for the cleaner to finish its job, then eat it. But they don't. Since they are both mobile, they are both potentially free to leave the relationship. However, the cleaner fish have evolved a very strong sense of individual territoriality, and have stripes and dances that are difficult to spoof -- much like a difficult-to-forge brand logo. So the host fish know where to go to get cleaned -- and they know that if they cheat, they will have to start over again with a new, distrustful cleaner fish. The entrance costs, and thus the exit costs, of the relationship are high, so that it works out without cheating. Besides, the cleaner fish are tiny, so the benefit of eating them is not large compared to the benefit of a few cleanings, or even one.

One of the most pertinent examples is the vampire bat. As their name suggests, they suck the blood of prey mammals. The interesting thing is that, on a good night, they bring back a surplus; on a bad night, nothing. Their dark business is highly unpredictable. As a result, the lucky (or skilled) bats often share blood with the less lucky (or skilled) bats in their cave. They vomit up the blood and the grateful recipient eats it.

The vast majority of these recipients are kin. Out of 110 such regurgitations witnessed by the strong-stomached biologist G.S. Wilkinson, 77 were cases of mothers feeding their children, and most of the other cases also involved genetic kin. There were, however, a small number that could not be explained by kin altruism. To demonstrate these were cases of reciprocal altruism, Wilkinson combined the populations of bats from two different groups. Bats, with very rare exception, only fed old friends from their original group. [D89]. Such cooperation requires building a long-term relationship, where partners interact often, recognize each other, and keep track of each other's behavior. The bat cave helps constrain the bats into long-term relationships where such bonds can form.

We will see that some humans, too, chose highly risky and discontinuous prey items, and shared the resulting surpluses with non-kin. Indeed, they accomplished this to a far greater extent than the vampire bat. How they did so is the main subject of our essay. Dawkins suggests, "money is a formal token of delayed reciprocal altruism", but then pursues this fascinating idea no further. We will.

Among small human groups, public reputation can supersede retaliation by a single individual to motivate cooperation in delayed reciprocation. However, reputational beliefs can suffer from two major kinds of errors -- errors about which person did what, and errors in appraising the value or damages caused by that act.

The need to remember faces and favors is a major cognitive hurdle, but one that most humans find relatively easy to overcome. Recognizing faces is easy, but remembering that a favor took place, when that memory needs to be recalled, can be harder. Remembering the specifics that gave a favor a certain value to the favored is harder still. Avoiding the resulting disputes and misunderstandings can be prohibitively difficult.

The appraisal or value measurement problem is very broad. For humans it comes into play in any system of exchange -- reciprocation of favors, barter, money, credit, employment, or purchase in a market. It is important in extortion, taxation, tribute, and the setting of judicial penalties. It is even important in reciprocal altruism in animals. Consider monkeys exchanging favors -- say pieces of fruit for back scratches. Mutual grooming can remove ticks and fleas that an individual can't see or reach. But just how much grooming versus how many pieces of fruit constitutes a reciprocation that both sides will consider to be "fair", or in other words not a defection? Is twenty minutes of backscratching worth one piece of fruit or two? And how big a piece?

Even the simple case of trading blood for blood is more complicated than it seems. Just how do the bats estimate the value of blood they have received? Do they estimate the value of a favor by weight, by bulk, by taste, by its ability to satiate hunger, or by other variables? Just the same, measurement complications arise even in the simple monkey exchange of "you scratch my back and I'll scratch yours".

For the vast majority of potential exchanges, the measurement problem is intractable for animals. Even more than the easier problem of remembering faces and matching them to favors, the ability of both parties to agree with sufficient accuracy on an estimate of the value of a favor in the first place is probably the main barrier to reciprocal altruism among animals.

Just the stone tool-kit of even early Paleolithic man that has survived for us to find was in some ways too complicated for brains of our size. Keeping track of favors involving them -- who manufactured what quality of tool for whom, and therefore who owed whom what, and so on -- would have been too difficult outside the boundaries of the clan. Add to that, quite likely, a large variety of organic objects, ephemeral services (such as grooming), and so on that have not survived. After even a small fraction of these goods had been transferred and services performed, our brains, as inflated as they are, could not possibly keep track of who owed what to whom. Today we often write these things down -- but Paleolithic man had no writing. If cooperation occurred between clans and even tribes, as the archaeological record indicates in fact occurred, the problem gets far worse still, since hunter-gatherer tribes were usually highly antagonistic and mutually distrustful.

If clams can be money, furs can be money, gold can be money, and so on -- if money is not just coins or notes issued by a government under legal tender laws, but rather can be a wide variety of objects -- then just what is money anyway? And why did humans, often living on the brink of starvation, spend so much time making and enjoying those necklaces when they could have been doing more hunting and gathering? Nineteenth-century economist Carl Menger [M1892] first described how money evolves naturally and inevitably from a sufficient volume of commodity barter. In modern economic terms the story is similar to Menger's.

Barter requires a coincidence of interests. Alice grows some pecans and wants some apples; Bob grows apples and wants some pecans. They just happen to have their orchards near each other, and Alice just happens to trust Bob enough to wait between pecan harvest time and apple harvest time. Assuming all these conditions are met, barter works pretty well. But if Alice were growing oranges, even if Bob wanted oranges as well as pecans, they'd be out of luck -- oranges and apples don't both grow well in the same climate. If Alice and Bob didn't trust each other, and couldn't find a third party to be a middleman [L94] or enforce a contract, they'd also be out of luck.

Further complications could arise. Alice and Bob can't fully articulate a promise to sell pecans or apples in the future, because, among other possibilities, Alice could keep the best pecans to herself (and Bob the best apples), giving the other the dregs. Comparing the qualities as well as the quantities of two different kinds of goods is all the more difficult when the state of one of the goods is only a memory. Furthermore, neither can anticipate events such as a bad harvest. These complications greatly add to the problem of Alice and Bob deciding whether separated reciprocal altruism has truly been reciprocal. These kinds of complications increase the greater the time interval and uncertainty between the original transaction and the reciprocation.

A related problem is that, as engineers would say, barter "doesn't scale". Barter works well at small volumes but becomes increasingly costly at large volumes, until it becomes too costly to be worth the effort. If there are n goods and services to be traded, a barter market requires a price for every pair of goods -- n(n-1)/2 prices. Five products would require ten prices, which is not too bad, but 500 products would require 124,750 prices, far beyond what is practical for one person to keep track of. With money, there are only n prices -- 500 products, 500 prices. Money for this purpose can work either as a medium of exchange or simply as a standard of value -- as long as the number of money prices themselves does not grow too large to memorize or change too often. (The latter problem, along with an implicit insurance "contract" and the lack of a competitive market, may explain why prices were often set by long-evolved custom rather than proximate negotiation).
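The scaling gap is simple arithmetic. Counting one price per unordered pair of goods under barter, versus one price per good when everything is priced in a single money:

```python
# Number of prices a market must carry: one per pair of goods under barter,
# one per good when everything is priced in a single money.
def barter_prices(n):
    return n * (n - 1) // 2   # unordered pairs of n goods

def money_prices(n):
    return n

small = barter_prices(5), money_prices(5)      # (10, 5)
large = barter_prices(500), money_prices(500)  # (124750, 500)
```

Barter's price count grows quadratically while money's grows linearly, which is the "doesn't scale" point in engineering terms.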

Barter requires, in other words, coincidences of supply or skills, preferences, time, and low transaction costs. Its cost increases far faster than the growth in the number of goods traded. Barter certainly works much better than no trade at all, and has been widely practiced. But it is quite limited compared to trade with money.

Primitive money existed long before large scale trade networks. Money had an even earlier and more important use. Money greatly improved the workings of even small barter networks by greatly reducing the need for credit. Simultaneous coincidence of preference was far rarer than coincidences across long spans of time. With money Alice could gather for Bob during the ripening of the blueberries this month, and Bob hunt for Alice during the migration of the mammoth herds six months later, without either having to keep track of who owed whom, or trust the other's memory or honesty. A mother's much greater investment in child rearing could be secured by gifts of unforgeable valuables. Money converts the division of labor problem from a prisoner's dilemma into a simple swap.

The proto-money used by many hunter-gatherer tribes looks very different from modern money, now serves a different role in our modern culture, and had a function probably limited to small trade networks and other local institutions discussed below. I will thus call such money collectibles instead of money proper. The terms used in the anthropological literature for such objects are usually either "money", defined more broadly than just government printed notes and coins but more narrowly than we will use "collectible" in this essay, or the vague "valuable", which sometimes refers to items that are not collectibles in the sense of this essay.

Reasons for choosing the term collectible over other possible names for proto-money will become apparent. Collectibles had very specific attributes. They were not merely symbolic. While the concrete objects and attributes valued as collectible could vary between cultures, they were far from arbitrary. The primary and ultimate evolutionary function of collectibles was as a medium for storing and transferring wealth. Some kinds of collectibles, such as wampum, could be quite functional as money as we moderns know it, where the economic and social conditions encouraged trade. I will occasionally use the terms "proto-money" and "primitive money" interchangeably with "collectible" when discussing pre-coinage media of wealth transfer.
More here.