The American
The most revolutionary invention in history is so ingrained in our daily lives that we scarcely consider it an invention at all.
One technology replaces another only when the new technology is better or cheaper (or both) than the old. When an invention reduces the price of an important part of the economic system radically enough, the result is an economic—and therefore political and social—revolution. The printing press greatly lowered the cost of dispersing knowledge, spelling the end of the already waning Middle Ages, just as the microprocessor has reshaped the world by making the storage, retrieval, and manipulation of information inexpensive.
Such world-transforming inventions are usually material in nature, but not always. The scientific method developed in the 17th century is an intellectual tool of transcendent strength that helped powerfully to make the West the world’s dominant culture. And like the scientific method, some of the most important inventions are not the creation of one person or team, but the creation of the collective genius of the human race.
Consider another invention, one so deeply embedded in our daily lives that we seldom think of it as an invention at all: money. It was a creation not of an individual but of the free market. And while money was once exclusively a material object, it is today increasingly abstract.
Human beings are, uniquely, trading animals. Trade has allowed both individuals and entire countries to specialize in what they are good at or have in abundance, selling the surplus to acquire what they lacked. Trade allows us to exploit what economists call “comparative advantage.” Because both parties to a transaction value what they receive more than what they trade away, wealth is created. And nothing has facilitated trade more—thus fostering the enrichment of the world—than the invention of money.
Adam Smith thought that the “propensity to truck, barter, and exchange one thing for another” was part of the very fabric of human nature. Certainly archaeological evidence shows that humans have been trading with one another extensively, and over very considerable distances, since long before the dawn of civilization. Seashells of species endemic to the Red Sea have been found in Paleolithic graves in Switzerland. Amber beads from the Baltic have been found in Italy.
By Roman times, Western civilization had a highly integrated trading system centered on the Mediterranean. Egyptian wheat fed the Roman masses; British tin and Cypriot copper made Roman bronze statues possible. Rome traded directly as far away as India for spices and other commodities. The “global economy” of the Roman Empire through which all these commodities passed was only possible because of the existence of another commodity: money. Although, by definition, anything of which there is a supply and for which there is a demand is a commodity, most people do not think of money as a commodity at all. After all, money can’t be eaten or used for building. Money, by itself, is useless. Yet it is, again by definition, accepted in exchange for all the commodities that will feed you and shelter you.
Without money, people who want to trade what they have for something else they want must barter with someone with a complementary interest. If a man has apples and wants oranges, he must find a person who has oranges and wants apples. Economists, with their usual talent for turning an unmemorable phrase, call this a “double coincidence of wants.”
But barter only works effectively when the number of goods being traded is limited. When humans in Neolithic times abandoned hunting and gathering in favor of agriculture, they settled down in one place for long periods and began developing ever more technologies (including pottery and beer fermentation). As the number of potential trade goods increased rapidly, the burden of finding the “double coincidence of wants” increased exponentially, too.
As trade increased—spurred on by the appearance of professional merchants, who traded for a living rather than simply traded the goods they manufactured themselves—the need for a means of comparing the value of different commodities also grew. With the number of regularly traded commodities growing into the thousands, no one could possibly remember the value of every commodity in terms of every other commodity. Slowly, people began to talk about the value of all goods in terms of one standard good, usually an item of high value.
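The arithmetic behind that impossibility is simple combinatorics. With n goods priced pairwise, a trader must keep track of n(n − 1)/2 separate exchange rates; pricing everything against one standard good requires only n − 1. A quick sketch in Python (the function names and sample commodity counts are illustrative, not from the article):

```python
def pairwise_rates(n):
    """Exchange rates needed when every good is priced against every other good."""
    return n * (n - 1) // 2

def standard_good_rates(n):
    """Rates needed when every good is priced against one standard good."""
    return n - 1

for n in (10, 100, 1000):
    print(n, pairwise_rates(n), standard_good_rates(n))
# 10 goods:   45 pairwise rates vs.   9 against a standard good
# 100 goods:  4,950 vs.  99
# 1000 goods: 499,500 vs. 999
```

A thousand regularly traded commodities would demand nearly half a million remembered ratios under pure barter, but fewer than a thousand once everything is quoted in a single unit of account.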
Many commodities came to serve as this unit of account, one of the vital functions of money. In northern Europe in the Bronze Age, tools served as an early marker of value. On Yap Island in the Pacific, enormous, doughnut-shaped limestone “coins” weighing hundreds of pounds were used as a form of money from prehistoric times until well into the 20th century. Wampum, beads made from clam shells and strung on leather belts, was used in much of colonial North America as a unit of account until the middle of the 18th century.
In the Near East and Mediterranean, cattle often were used as a standard measure of value, as they still are by the cattle-herding Masai of East Africa. Indeed, cattle served as a unit of account for so long that echoes of the practice can still be heard in the English language. The word pecuniary, meaning of or pertaining to money, derives from the Latin word for cattle, pecus. Similarly, the word fee comes from the Old English word feoh, which also means cattle or livestock.
With cattle established as the unit of account, buyers and sellers could know the value of their goods in the marketplace much more precisely and thus exchange them more easily. It soon became clear that a standard commodity such as cattle could also function as a way around the tyranny of the double coincidence of wants.
Because cattle had become the standard unit of value, they were more readily accepted in the marketplace than any other trade good. Even people who had no need for cattle would accept a cow in trade because they knew that it could be easily swapped for something they wanted. If a trader had four cows’ worth of wheat and wanted to buy barley, but couldn’t find a barley owner who wanted wheat, he might use the wheat to buy cattle instead and then trade them for barley. Thus cattle (or whatever the standard good was) became not only a unit of account but also a medium of exchange, the second vital function of money.
And while the wheat trader was haggling with the barley merchants, the cattle also served as a store of value, the third vital function of money. Wheat would fluctuate in price depending on growing conditions and the time of year. Usually it was worth more in the spring, when supplies were low, and less at harvest time when supplies peaked. But the value of cattle would vary much more slowly, both because the supply remained more constant (wheat was a dietary staple, beef a rare luxury) and because of the sheer psychological inertia caused by the fact that cattle were not only cattle but also the unit of account. Traders, in their own self-interest, wanted the price of cattle to remain steady. In economics as in politics, if enough people want something to be true, it often is true.
But while many commodities can serve perfectly well as a unit of account, most have grave disadvantages as a medium of exchange and a store of value. Cattle were a high-value commodity in the ancient world, but they could not be broken down into smaller units. Half a cow, after all, isn’t a cow at all; it’s a side of beef. Using cattle as a medium of exchange was rather like a modern economy maintaining a money supply with nothing smaller than thousand-dollar bills. That’s fine if you’re in the market for an automobile or a house, but lunch might be a problem.
Cattle also must be fed, watered, and watched over. Worse, they get sick or grow old and die. Because of cattle’s limited utility as a medium of exchange and store of value, when the New Stone Age slowly gave way to the Bronze Age, traders began to employ other commodities instead: metals.
Gold was the first metal to be used by human beings, from about 6000 B.C., followed by copper about 1,800 years later. Both are easily smelted and worked but too soft for most practical uses. And metal had many advantages over cattle as a standard commodity. It is easily divisible and cannot be destroyed. It is easily transported and stored. But perhaps its biggest advantage is that it is fungible. A pound of pure copper is exactly the same as any other pound of pure copper, whereas cattle vary tremendously in weight, fecundity, and health.
Metal changed hands between traders on the basis of weight, making a scale a necessity for transactions. But trading was quickly simplified by casting various metals in standard-size bars, called ingots. Egypt was using this system as early as the fourth millennium B.C. The system is still in use, although nowadays only governments settle accounts between themselves with bars of metal. (A substantial portion of the world’s monetary gold is stored deep beneath the Federal Reserve Building on Liberty Street in New York. There, gold bars each weighing 400 troy ounces—about 27.4 pounds avoirdupois—are regularly trundled back and forth between cages that hold each nation’s gold.)
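The weight quoted for those bars can be checked with the standard unit definitions (the constants below are the exact modern definitions of the troy ounce and the avoirdupois pound, in grams; the variable names are mine):

```python
TROY_OUNCE_GRAMS = 31.1034768         # exact definition of the troy ounce
AVOIRDUPOIS_POUND_GRAMS = 453.59237   # exact definition of the avoirdupois pound

bar_grams = 400 * TROY_OUNCE_GRAMS            # a standard 400-troy-ounce gold bar
bar_pounds = bar_grams / AVOIRDUPOIS_POUND_GRAMS
print(round(bar_pounds, 1))  # 27.4
```

The 400-troy-ounce bar thus works out to about 27.4 avoirdupois pounds, as the article states.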
But precious metals in bars of standard weight are not money. They are bullion. It was only a short step, though, from metal bars of uniform weight to coins of uniform value. The main difference is that while bullion is traded on the basis of weight, coins pass strictly on the basis of what economists call tale (from the same root as “tally”) and what everyone else thinks of as the coins’ face value.
Metal discs that may have been coins have been found in the ruins of Minoan Crete and date from the 13th century B.C. But the first undoubted coins in the Western world were minted in the kingdom of Lydia in what is now Turkey, perhaps as early as the ninth century B.C., but more likely around 650 B.C. China and India also started minting coins about this time, independently coming up with one of humankind’s brighter ideas.
The first coins, made of electrum, an alloy of silver and gold, had no other function but to serve as a medium of exchange, a store of value, and a unit of account. But there was one other thing needed to make these first coins true money: universal acceptance, the sine qua non of money.
It is always the marketplace, not government, that determines what is money. Governments may declare something “legal tender,” but if sellers in the marketplace won’t take it, or won’t take it at face value, then it is not money. If the marketplace decides to designate something as money, then it is. In Germany after World War II, with the old reichsmark worthless and the new deutschmark not yet introduced, American cigarettes functioned as money for small transactions. Cigarettes couldn’t be counterfeited, came in convenient denominations of a single cigarette, a pack, and a carton, and were even inflation-proof, as owners would smoke them if they declined in value. And they were universally accepted in exchange for other goods; in other words, cigarettes were money.
The first coins were almost certainly minted by merchants, probably those with the largest businesses and the best reputations. But it wasn’t long before governments were minting coins, and the right to do so soon became a zealously guarded government monopoly. One reason for this was to guarantee a uniform currency and ensure economic stability. But another was that rulers soon learned that they could make money by making money.
The Greek drachma was originally defined as 1/6000th of a talent of silver (a talent was a unit of weight equal to about 56 pounds). But Solon, the leader of Athens at the beginning of the sixth century B.C., ordered the minting of 6,300 one-drachma coins from each talent of silver, an instant profit of 5 percent. (In fairness, it should be pointed out that setting the face value of the coinage slightly above the bullion value also helped safeguard the money supply, because people would not be tempted to melt down the coins for the metal content.) The profit from minting coins with a face value greater than the bullion value is called seigniorage, a word that clearly implies the sovereign nature of the power to coin money.
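The seigniorage on Solon’s mintage follows directly from the numbers in the passage: a talent defined as 6,000 drachmas by weight, minted into 6,300 coins, yields 300 extra drachmas per talent. A minimal check (variable names are mine):

```python
drachmas_per_talent = 6000   # the drachma's defined weight: 1/6000 of a talent
coins_minted = 6300          # Solon's mintage per talent of silver

# Seigniorage: the excess face value relative to the bullion content
seigniorage = (coins_minted - drachmas_per_talent) / drachmas_per_talent
print(f"{seigniorage:.0%}")  # 5%
```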
But if the money supply soon came under the exclusive jurisdiction of the sovereign, the sovereign, all too often, became the money supply’s greatest threat. For while governments possess the power to coin money, they have always been hard-pressed to pay their bills—proof, if any were needed, that money and wealth are by no means the same thing.
Rome didn’t mint its own coins until 268 B.C., some 400 years after coins first appeared in the Mediterranean. Rome financed its war needs with massive short-term borrowing—often involuntarily loaned—as well as taxes. After each of the first two Punic Wars (264–241 B.C. and 218–201 B.C.), the Roman state paid back these debts by the simple expedient of calling in the coinage, reminting it with a much lower silver content, and paying off its creditors with the new, debased coinage.
Ordinary Romans, of course, weren’t stupid, and despite the psychological inertia regarding the value of money, the new denarius, as the coin was called, simply bought fewer goods and services than had the old denarius with a higher silver content. In other words, prices, reckoned in denarii, rose steeply, an effect that today is called inflation.
But with its nemesis Carthage destroyed (the third Punic War, 149–146 B.C., was, comparatively, a mop-up operation), the Roman Empire grew to encompass the entire Mediterranean world. With the loot of conquest pouring in, Rome and its coinage remained stable for centuries. But the flow of loot ebbed after the first century of the Christian era, even as the cost of the legions defending the frontiers remained, and Roman emperors began, once again, to lower the precious-metal content of the coinage. By the middle of the third century, the once-proud silver denarius had been reduced to a copper coin only thinly plated with silver.
Inflation soared and prices rose tenfold between 258 and 275 A.D. Meanwhile, gold and silver were hoarded and barter again became a major means of trading. Unable to raise the needed funds in taxes, the government took to requisitioning supplies for the army and paying the soldiers in kind instead of in money.
The emperor Diocletian, who reigned from 284 to 305 A.D., was one of the great men of late antiquity and managed to reform the Roman coinage and taxation system, making it more just, if no less onerous. He even created the empire’s first annual budgets. But Diocletian lacked enough gold and silver to create an adequate money supply in these metals, and he was forced to issue base-metal coins as well, giving them an artificial value.
It didn’t work. What would later be called Gresham’s Law—“bad money drives out good”—made it impossible. If there are two forms of money, with one perceived as being more valuable than the other, the more valuable money will be hoarded and the less valuable spent. Diocletian, unable to contain inflation by setting the price of money, tried to do it by setting the price of everything else. Price controls didn’t work, either, despite the liberal use of the death penalty to enforce them. Goods simply went into hiding or were bartered to evade the legal prices, and black markets sprang up—the usual result when governments try to control free markets. As an old Vietnamese proverb has it, “Trying to stop a market is like trying to stop a river.” But Diocletian’s other reforms stabilized the Roman economy and gave the empire another 150 years of life.
The so-called Dark Ages that followed the final collapse of Roman power in Western Europe can be seen as a deep and protracted economic depression. What caused this depression was the disappearance of money when coinage largely ceased. Without money, long-distance trade withered and the once tightly integrated economy of the Roman world collapsed into an infinity of local economies that were mostly conducted on a barter basis. Only when coinage was revived under the Carolingians in the eighth century—and especially after the year 1000 as new, powerful kingdoms emerged in Western Europe—did the economies there rebound and long-distance trade return.
In the 16th century, the Spanish conquest of Mexico and Peru caused a flood of gold and silver to enter the European economy. This had the effect of greatly increasing the money supply relative to the amount of goods and services available. As a result, prices in Europe generally rose over the course of the century by about 400 percent, proof that money is just a commodity and its price is set by supply and demand like every other commodity.
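The mechanism at work here is the quantity-of-money identity, M·V = P·Q: if the money supply M grows while the velocity of money V and the quantity of goods Q stay roughly constant, the price level P must rise in proportion. A hedged illustration (the growth figures are hypothetical round numbers chosen to match a 400 percent rise, not historical estimates):

```python
def new_price_level(p0, money_growth, output_growth, velocity_growth=1.0):
    """Quantity-of-money identity M*V = P*Q, rearranged for the price level."""
    return p0 * money_growth * velocity_growth / output_growth

# Illustrative only: money supply quintuples, goods and velocity unchanged
p1 = new_price_level(100, money_growth=5.0, output_growth=1.0)
print(p1)  # 500.0 — a rise of 400 percent from the base level of 100
```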
Because of the inflation, and because of the great increase in the gold supply, coins of higher denomination began to be issued and gold coins became common for the first time in Europe since the height of the Roman Empire. But keeping significant amounts of gold coins around was dangerous. So merchants and wealthy individuals began depositing their gold with goldsmiths, who had the facilities to safeguard it. When someone deposited a sum of gold with a goldsmith, the smith would give him a receipt, noting the gold’s value. Very soon, something simple, obvious, and profound began to happen: The receipts—the symbol of gold on deposit but simpler to handle and less likely to attract the attention of a robber—began to be used for transactions instead of the gold itself. Money had started to become abstract.
Goldsmiths, of course, soon noticed this phenomenon and seized on its implications. Before long, when goldsmiths wanted to make a loan—secured by collateral such as real property, an increasingly common practice—instead of lending the gold itself, the smiths simply wrote out a receipt, which the borrower could then take into the marketplace and use as he wished. As long as the goldsmith’s reputation was a good one, his receipts circulated as money. By doing so, the smith was, quite literally, creating money out of ink and paper. The gold in his safe was still money, but now so was the receipt for it.
Soon the goldsmiths figured out that they could make loans not only on their own assets but on the gold on deposit with them as well. After all, as long as the receipts for gold passed as money, nobody wanted to truck around the actual metal. So the goldsmiths, who by this point had become de facto bankers, could safely issue more receipts than there was gold to back them. (This wasn’t dishonest, although a lot of very smart people, including John Adams and Thomas Jefferson, thought so. The collateral that secured the loans made up the difference.)
There was, of course, one big catch. The market had to have faith in the integrity and, even more important, the solvency of a particular goldsmith’s bank. If it lost that faith, people would try to redeem their receipts while the redeeming was good, and the last in line would be out of luck. The banker, meanwhile, would be out of business. So prudent bankers always kept adequate reserves on hand to redeem any receipts that were tendered for gold. As long as everyone who asked for his gold got it, few would ask.
Until the late 17th century, gold-backed banks were loosely scattered across Europe, turning up wherever traders needed them; England was the first country to develop a fully functioning central bank, the Bank of England, in 1694. Though it remained privately held, the Bank of England was closely tied to the government and became a major force behind Britain’s economic expansion in the 18th and 19th centuries. The gold standard, begun in 1821 when the Bank of England guaranteed to buy or sell unlimited quantities of pounds sterling at a fixed price in gold, carried on the tradition established several millennia earlier: It was simply a means of pricing numerous commodities—in this case various national currencies instead of wheat and barley, and so forth—in terms of a single commodity, this time gold instead of cattle. As countries across the world followed England’s lead, the relative value of each currency could be instantly established, and trade—the only reason money has ever existed—was greatly facilitated, this time across international boundaries. The globalizing economy, so much in the news today, really began 200 years ago as world trade increased by orders of magnitude, thanks to both the Industrial Revolution and the gold standard.
War is hell on money as well as people. Desperate to pay the vast costs of war, governments beginning in 1914 went off the gold standard to help finance the military effort with inflation. By the outbreak of World War II, the domestic gold standard had been largely abandoned (most countries still backed their international dealings with gold until 1971). Gold-backed money was replaced by fiat money, the value of which was simply decreed by government. When a national economy is healthy, fiat currency works efficiently, but in a struggling economy, fiat currency often results in disaster. After all, there is a physical limit to how much metal coinage can be debased, but there is no practical limit to how much paper money can be printed. In 1923, Germany was issuing trillion-mark notes and they were worthless.
But as long as governments keep the printing presses under control, paper money retains its vouched-for value. The world’s currency markets, now essentially unified, are a powerful force exerting discipline on governments to control their money supplies. By the late 20th century, paper currency had truly come to be regarded as money in a psychological way, its history as a receipt for precious metals largely forgotten. The abstraction of money was thus much advanced. Part of the reason for this is that bank deposits—a completely abstract, ledger-entry form of money—had greatly increased. In the 19th century, only the rich had checking accounts and few families had savings accounts. Most people received their pay in cash and kept their savings under the mattress. But by the mid-20th century, members of the middle class as well were keeping their funds in bank accounts and paying their bills increasingly with checks, which are technically simple bills of exchange.
Today, a new form of money, plastic, is completing the process of money’s abstraction. Checks, of course, were never money in the strict sense because they were not universally accepted. But debit cards are money. They are not legal tender, but everyone, or very nearly everyone, will take them in exchange for goods and services. Even prostitutes often accept plastic these days in exchange for their services.
Debit cards instantly transfer sums from one account, the buyer’s, to another, the seller’s. But credit cards are also, for all intents and purposes, money. And banks, which a few decades ago were very picky as to whom they extended credit, now routinely extend it, in the form of credit cards, to virtually everyone who is not a proven deadbeat. The ability of individual governments to control this credit creation, and thus the money supply, is limited under existing law, and the laws that would be needed to do so would not be tolerated in a democratic society. People will not stand for the idea of government telling them what to do with their plastic any more than they would for government telling them what to do with the cash in their wallet. It is banks, by setting personal credit limits, that control the money supply, to the extent that it is controlled by institutions at all. In a deeper sense it is individual self-discipline that controls the world’s money supply—a profoundly democratic development, one nearly as revolutionary as the invention of money itself.
John Steele Gordon is the author of “An Empire of Wealth: The Epic History of American Economic Power” (HarperCollins).
© 2014 American Enterprise Institute for Public Policy Research