
Money, Resources, and the Myth of the Gold Standard

Markets do not exist in a vacuum; they are a product of rules and social order. Markets function only because governments have created a system of institutions and rules that allow them to work. Government created property titles, courts, police, money, and taxation, all of which are necessary for markets to function optimally. Without this governmental backdrop, markets could not exist, much less function efficiently.

The Origins of Money

The classical economic narrative of the origin of money, found in works such as Adam Smith's Wealth of Nations and Carl Menger's On the Origins of Money, assumes that people originally traded directly through barter and gradually adopted precious metals as a medium of exchange for the sake of convenience. Gold eventually became the universal medium of exchange because it had inherent value. Banks then developed as warehouses in which people could store gold, and "dollar bills" were issued as warehouse receipts for the gold stored at the bank. Indeed, the term dollar originally designated a specific amount of gold. Over time, people began trading the warehouse receipts themselves in place of the gold, and this was the origin of paper money. While this narrative is elegant and seems a logically satisfying explanation of the origin of money, archeological evidence and anthropological research have thoroughly disproven it. In reality, the earliest forms of money were government-issued and credit-based currencies, not commodity-based mediums of exchange emerging spontaneously from market consensus. And while the term dollar did originally designate a certain amount of gold, the "dollar bill" was not actually a warehouse receipt; it was a promissory note representing a credit valued at a dollar's worth of gold. It was never actually backed by gold.

In Debt: The First 5,000 Years, his anthropological study of the history of money, David Graeber explains that money historically took two primary forms: virtual credit and metal bullion or coin. The earliest form of money was actually the virtual credit form, while the metallic form came later. In the earliest agrarian societies, money was actually much more like modern "fiat" currency than like so-called commodity money. People kept records of debts on clay tablets. The money didn't physically exist but was just a record on a tablet, similar to the way the money in our modern bank accounts doesn't physically exist but is just a record in a computer. Over time, some of these records morphed into tradeable promissory notes, simply specifying that the debt was to be paid "to the bearer" rather than to some specific individual.

The use of precious metals for currency first came about when kings had their loyal armies invade and conquer neighboring communities. The soldiers would pillage and loot, taking all the valuable possessions that they could find. Precious metals were commonly used in the jewelry stolen from conquered peoples, so kings would have the metal melted down and minted into coins, which were then given to the soldiers. So the first form of money to arise was the virtual credit form, which arose in relatively peaceful agrarian societies; the second form, metallic money, arose later in history with the emergence of warring city-states that would plunder neighboring communities. Modern money systems are actually a mixture of these two forms, combining virtual credits created by private loans with government-issued currency.

The Myth of the Gold Standard

You may be thinking, "Well, what about the gold standard?" There never was a "gold standard" such as the one we learned about in school. In Milton Friedman's words, for most of the gold standard era the United States "though ostensibly on a gold standard…actually was on a fiat standard combined with a government program for pegging the price of gold." The government kept a reserve of gold, but it was never anywhere near enough to redeem the whole money supply. What the "gold standard" did provide was fixed international exchange rates and a price-specie-flow mechanism to automatically correct trade imbalances between nations; this balance-of-payments adjustment mechanism was actually the primary function of gold under the gold standard. The idea that the dollar was backed by gold under the gold standard is fundamentally a misunderstanding. Under the gold standard, the government created a fixed exchange rate between the dollar and gold: the Coinage Act of 1873 and the Gold Standard Act of 1900 artificially pegged the dollar to a specific amount of gold.

The dollar was never backed by gold, but merely arbitrarily pegged to it. The dollar is a promissory note to pay in the future (i.e. a debt), not a warehouse receipt of some sort. When the government originally issued currency, it didn’t say “Here’s a receipt for so-much gold;” instead, it said, “We don’t have enough gold to pay you right now, so here’s an IOU.” These IOUs, in theory, were for an amount of gold, but gold naturally fluctuates in value and so do paper notes when used as currency. The dollar bills would be traded and have their own market value independent of the market value of gold. When we were on the “gold standard,” a certain amount of gold was used as a unit of account and artificially pegged to the dollar—government price fixing! It was an attempt to keep the value of money stable by government decree as if by magic. It didn’t work, which is why we ended up having to abandon the gold standard.

The United States never really had a gold standard in the sense that most people imagine; and, prior to the late 19th century, we actually had a de facto silver standard. The U.S. Constitution (Article 1, § 10) says, "No State shall…make any Thing but gold and silver Coin a Tender in Payment of Debts." This left open the possibility of the federal government changing the currency system or issuing fiat paper money, but it also set us on the course for bimetallism. The American monetary system, prior to the late 19th century, was not a gold standard but a bimetallist system in which gold and silver were equally regarded as legal tender. Upon Alexander Hamilton's recommendation, the legal tender exchange rate was set at "15 times as much for an ounce of gold as for an ounce of silver, whence the ratio of 15 to 1."

However, by the end of the 1700s, the market value of gold relative to silver changed, and gold could no longer feasibly be used as legal tender. "But shortly [after 1792] the world price ratio went above 15 to 1 and stayed there (see Jastram 1981, pp. 63–69). As a result, anyone who had gold and wanted to convert it to money could do better by first exchanging the gold for silver at the market ratio and then taking the silver to the mint, rather than taking the gold directly to the mint." In effect, silver ended up being the only metal actually used as currency in America until 1834. This episode in monetary history demonstrates that even precious metals are not stable, and that the price level would fluctuate even on a gold standard unless steps were taken to stabilize it through regulation of some sort. Inflation and deflation result from any change in supply and demand; and the supply of, as well as demand for, precious metals will change if new mines open up, if new non-monetary uses are discovered, and so on. The bimetallist system instituted by the Founding Fathers functioned by allowing consumers and bankers to switch between a de facto silver standard and a de facto gold standard based on the market prices of the metals.
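The arbitrage described in the quotation above is simple arithmetic, and can be sketched in a few lines. This is an illustration only: the 15:1 mint ratio comes from the text, while the 15.5:1 market ratio and the quantity of gold are hypothetical round numbers chosen to make the mechanics visible.

```python
# Illustrative arithmetic for the bimetallic arbitrage described above.
# The mint ratio is the historical figure from the text; the market ratio
# and quantity are hypothetical.

MINT_RATIO = 15.0    # legal tender: 1 oz gold = 15 oz silver (Hamilton's ratio)
MARKET_RATIO = 15.5  # hypothetical market price: 1 oz gold trades for 15.5 oz silver

gold_oz = 100.0

# Route A: take the gold straight to the mint.
coined_value_direct = gold_oz * MINT_RATIO  # value measured in oz of coined silver

# Route B: sell the gold for silver at the market ratio, then mint the silver.
coined_value_via_silver = gold_oz * MARKET_RATIO

profit = coined_value_via_silver - coined_value_direct
print(f"Direct minting: equivalent of {coined_value_direct:.0f} oz silver")
print(f"Via silver first: {coined_value_via_silver:.0f} oz silver")
print(f"Arbitrage gain: {profit:.0f} oz silver per {gold_oz:.0f} oz gold")
```

Whenever the market ratio sits above the mint ratio, route B dominates, so gold never reaches the mint and the country drifts onto a de facto silver standard, exactly as described above.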

The inherent instability of price levels on a metal standard led people to prefer the paper notes of the Bank of the United States as a medium of exchange. When Andrew Jackson and his cohorts sought to bring down the Bank of the United States, all they had to do was pass legislation forcing the Bank to accept gold as a substitute for its notes at the legally established ratio, which differed from the actual market rate. The Bank had only really been giving silver (one of the two legal tenders) in exchange for its notes because the market price of gold was much higher than the monetary value of gold artificially established through legislation. As a result, forcing the Bank to give gold rather than silver in exchange for its notes ensured that the Bank would fail, and the central bank ended up being abolished.

Eventually, more gold was discovered in California and in Australia, devaluing gold relative to silver. This resulted in the market changing its preference from silver to gold as the standard medium of exchange. This led to the emergence of the so-called “gold standard.” But the gold standard never really was a gold standard. Banknotes were always in use and these notes tended to have a market value independent of the value of the amount of gold that they were supposed to be pegged to. The Coinage Act of 1873 ended bimetallism and effectively put us legally on a gold standard. This proved to be a terrible decision. As the mines started to “dry up” and the supply of gold stopped increasing, sharp deflation followed, which resulted in an economic depression by the time William Jennings Bryan was running for President in the mid-1890s.

Classical and Austrian School economists tell us that the modern banking system has a fraudulent basis. Banks were supposed to be gold warehouses, but the practice of fractional-reserve banking led to the creation of more gold receipts than actual gold. This caused the dollar bills to be devalued relative to gold, which created instability and led to the collapse of the gold standard. In this narrative, it is falsely assumed that gold backs the currency and that the dollar bill is (or was) a redeemable warehouse receipt. The Classical and Austrian School economists considered the bankers to be committing fraud because they were telling people that they could redeem these receipts even though the bank didn’t actually have enough gold to redeem them if everyone came to redeem them at once. The reality, however, is that the dollar was never advertised as a warehouse receipt. It has always been a promissory note or IOU from the government or from a bank. When I take out a $500 loan and promise to pay it back, my promise to pay it back does not imply that I currently have the funds to do so. It only assumes that I have the capacity to acquire the funds and pay off the debt at a future time. There is no fraud here if you actually understand what is going on.

How Monetary Systems Work

Money is created by governments. A dollar is an IOU from the government, a credit token, which the government promises to accept as payment for its services. Governments create money by spending it into existence. Money is analogous to subway tokens. The municipality accepts its own subway tokens as payment for a ride on its subway trains. However, before they can accept subway tokens as payment, they must first issue those tokens. They first create the tokens, giving them out to people, then collect them back in exchange for their services. National governments do the same thing with money. Governments first created money and gave it out to the people, then later collected it back in exchange for their services. If there are 200 people who need to use the subway this morning, and the train has the capacity to carry that many people, but there are currently only 50 tokens in existence, then the municipality needs to increase the supply of tokens in order to meet the demand. As long as there is room on the train for all the people that want tokens, the subway can create more tokens without it devaluing the tokens. They will simply create an additional 150 tokens and sell them for the same price as the existing tokens. Likewise, a government must supply enough money to meet aggregate demand; and, as long as the increase in supply does not exceed the capacity of the economy to meet the demands of consumers, there will be no inflation. The subway issuing new tokens will not devalue the tokens unless they issue more tokens than they have the capacity to honor.

If, however, the subway train only has the capacity to transport 300 people and there are 600 people needing to ride the train at the moment, it does little good to just print 600 tokens. The additional tokens will now just devalue the tokens overall. Each person will now only have a 50% chance of being able to redeem their token in exchange for a ride on the train, which means that the real value of the tokens is half of what it was when a token gave one a 100% guarantee of access to the train. In this scenario, half the tokens would be worthless, so the people who need access to the train most and have the most resources at their disposal would scramble to buy up more tokens in order to secure their place on the train. And, in all probability, the subway station would end up raising its price to 2 tokens per ride. By increasing the supply beyond the capacity of the subway to honor the tokens at their current value, they have actually devalued the tokens. Likewise, the government can simply print more dollars (credit tokens) to meet the demand of consumers as long as the economy has the capacity to meet that demand. This will cause no inflation, just as creating new tokens for subway rides causes no inflation as long as the subway has the capacity to let all the people ride on the train. However, if the economy does not have the resources to meet the demand of consumers and the government prints more money, the dollar will lose some of its value. The constraint on how much money the government can spend into existence without causing inflation is not how much revenue it can bring in through taxation. The only real limit on the government's capacity to spend without causing inflation is real resource availability.
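The token analogy in the last two paragraphs can be written out as a tiny model. This is a sketch under the text's own simplifying assumption that a token's real value is the chance it can be redeemed for a ride; the function name and numbers are taken from, or chosen to match, the scenarios above.

```python
def token_value(tokens_issued, train_capacity):
    """Probability that any given token can be redeemed for a ride,
    used here as a stand-in for the token's real value."""
    if tokens_issued <= 0:
        return 0.0
    return min(1.0, train_capacity / tokens_issued)

# Scenario 1: 200 riders, a 200-seat train, only 50 tokens in existence.
# Issuing 150 more tokens meets demand without devaluing anything.
print(token_value(50, 200))   # 1.0 — every token redeemable
print(token_value(200, 200))  # 1.0 — still fully redeemable after issuance

# Scenario 2: 600 tokens issued against a 300-seat train.
print(token_value(600, 300))  # 0.5 — only a 50% chance of getting a seat
```

The value stays pinned at 1.0 for as long as issuance remains within capacity, and only falls once issuance exceeds it, which is the analogy's point: the binding constraint is capacity (real resources), not the count of tokens.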

Conventional wisdom tells us that the government must first tax the people before it can spend. This is false: government must first spend before it can tax, because government spending is the mechanism through which money is created. If the government does not first spend money into existence, there will be no money for it to take back in taxes. The term revenue literally means "that which has been returned"; it is that portion of the money the government has created which has now been returned to the treasury. The government does not spend tax revenue. Taxation and spending are both monetary policies: government creates money by spending it into existence, thereby expanding the money supply, and it removes some of that money from circulation via taxation, thereby contracting the money supply. These are two of the mechanisms through which the government can regulate the supply of money. The government must constantly be collecting and issuing money, just as the municipality must sometimes issue new subway tokens and sometimes collect tokens and remove them from circulation.

Sometimes tokens are lost, deflating the supply, so the municipality will have to counteract this by creating new tokens to reinflate the supply. If there is an excess of demand for public transit and the subway expands to increase its capacity to transport people, the municipality will need to issue new tokens to meet the demand up to the level of the trains’ capacity to meet that demand, but not beyond that point. So, too, the government must increase the money supply by spending more money into existence if the economy grows and real resource availability increases. If the government does not do so, then there will be “too many hands chasing too few dollar bills”—there will not be enough money to meet the capacity of the economy and demand of consumers. If the government refuses to spend more money into existence at this point, this will result in a recession. There will not be enough money in existence for the people to purchase the goods and services that are available, resulting in a general glut, where the goods produced by society will not be bought up by consumers. During a recession, there is a disequilibrium of supply and demand—there is not enough demand to clear the market. If the deficiency in demand is simply a result of a lack of money (e.g. if the only reason people aren’t buying the excess products is because they don’t have enough money to do so), then the government simply needs to spend more money into existence in order to rectify this problem.

This is why fiscal conservatism makes no sense at all. Every dollar bill is a credit token from the government, and credit is debt! What is a credit to the private sector is a debit to the public sector. It is an IOU from the government, which it promises to accept back as payment for its services. If the government paid off all of its debt, there would be no money in existence. Not having a national debt means not having a monetary system: the abolition of the market economy! Not only is paying off the federal debt absurd, but so is the idea of a balanced budget. If the government does not run a deficit, no new money is being created. If government spending does not exceed what is collected back via taxation, the economy cannot possibly grow; a balanced budget prevents the economy from being able to operate at full capacity. For the economy to grow, the government must spend in excess of what it generates in revenue in order to supply the people with enough currency to purchase the resources that are actually available. As long as the economy is not operating at full capacity, using up all the real resources available, the government does not need to collect taxes in order to be able to spend. Government only needs to use taxation to offset spending if it is spending more money into existence than can be readily redeemed in real resources. The constraint on government spending is not revenue but rather real resource availability.
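The accounting identity behind "what is a credit to the private sector is a debit to the public sector" can be sketched in one line of arithmetic. This is an illustration of the chapter's framing, not a model of any real budget; the figures are hypothetical.

```python
def net_money_created(spending, taxes):
    """Per the framing above: spending creates money, taxation removes it,
    so the deficit equals the net new money left in private hands."""
    return spending - taxes

# A deficit adds net financial assets to the private sector...
print(net_money_created(100, 80))   # 20
# ...a balanced budget adds none...
print(net_money_created(100, 100))  # 0
# ...and a surplus drains them.
print(net_money_created(100, 120))  # -20
```

On this view, the public sector's deficit and the private sector's net saving are two entries for the same flow, which is why the chapter treats a balanced budget as zero net money creation.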