Inflation is the parent of unemployment and the unseen robber of those who saved. – Margaret Thatcher
The late 19th century saw one of the most dramatic battles in technological history: the fight to dominate the future of electricity. On one side stood Thomas Edison, already a household name and pioneer of direct current (DC). On the other side were George Westinghouse and Nikola Tesla, champions of alternating current (AC). This was no mere competition over technology—it was a clash for supremacy in an industry that would define the modern world. Edison, fiercely protective of his early lead, went to extraordinary lengths to maintain control, even as the advantages of his system waned.
Edison’s strategy was aggressive and theatrical. Knowing that AC technology threatened his DC system, he launched a public campaign to paint it as dangerous and unfit for use. He orchestrated shocking public demonstrations where animals were electrocuted using AC, framing it as a lethal and uncontrollable force. He even lent his support to the invention of the electric chair, powered by AC, to further associate the technology with death and destruction. This smear campaign was not merely technical—it played on public fears, leveraging Edison’s reputation to maintain trust in DC.
Meanwhile, Westinghouse and Tesla faced an uphill battle. AC was clearly more efficient and capable of transmitting electricity over vast distances with minimal loss, unlike DC, which required power plants every mile. However, Edison’s fearmongering made it difficult to win public trust. Westinghouse persevered, pouring resources into developing safer AC systems and partnering with Tesla, whose innovations in electrical engineering became crucial to their success. The challenge wasn’t just technological—it was overcoming the perception war waged by Edison.
The turning point came in 1893 at the Chicago World’s Fair. Westinghouse secured the contract to power the event, beating Edison and his backers. The Fair became a dazzling showcase for AC, lighting up the grounds and demonstrating the system’s reliability and potential. Millions witnessed the triumph of AC firsthand, cementing its place as the superior technology. Shortly after, AC became the standard for electrical systems in America and beyond, marking a decisive end to the battle.
Though a genius in his own right, Edison found himself on the losing side of history. His efforts to discredit AC had delayed its adoption but could not stop its inevitability. The battle between Edison and Westinghouse remains a vivid example of how entrenched players often resist innovation that threatens their dominance. It also underscores how fear and misinformation can be powerful tools in such fights.
Ultimately, the war between DC and AC serves as a testament to the resilience of better ideas. Progress may face resistance, but superior technology and innovation prevail, even when powerful incumbents try to suppress them. It’s a story of courage, ingenuity, and the relentless push for a brighter, more electrified future.
The battle between Edison and Westinghouse wasn't just about electricity—it was about entrenched systems resisting innovation, a theme that echoes in marketing and communications departments to this day. Much like Edison's fearmongering, modern institutions often push back against disruptive forces. Yet attempts to hold back disruption rarely succeed; they usually end in what Joseph Schumpeter called "creative destruction," where those who adapt to new market circumstances outlast those who resist change.
There is near-total consensus that printing endless amounts of money and allowing government debt to pile up will likely lead to problems at some point. The exception might be enthusiasts of Modern Monetary Theory (MMT), who view money creation as an essential cornerstone of a functioning economy. Yet most agree that money is to our economy what fuel is to a car: without fuel, there's no way to drive.
However, with the rise of Bitcoin (and, to some extent, gold) and various economic events in the current century, interest in scarce, hard assets as a store of value has grown. In my opinion, this shift raises another, much bigger question: is the current "fuel" the best for producing good economic outcomes and sustained growth?
While the first question centers on investment choices, the second challenges a fundamental principle underpinning modern finance and economics: the claim that the total quantity of money must grow over time to keep prices "stable" or that gradual price increases are necessary to incentivize productive economic activity.
Several lines of reasoning support this claim. First, some factors of production, like wages, are considered "sticky," meaning they don't easily adjust downward in nominal terms. While many input goods are priced dynamically according to market conditions, wages in "stable" economies are typically adjusted only once a year.
Consequently, falling prices can pose significant challenges for businesses. Employees may resist accepting lower nominal wages, undermining profitability. Furthermore, deflation increases the real debt burden, as loans must be repaid at nominal value. Many fear this could reduce investment, lead to layoffs, and stifle innovation if prices don't rise gradually.
Another claim is that deflation often coincides with recessions and can exacerbate them. The usual example cited is the Great Depression, where deflation set off a downward spiral: consumers hoarded cash, and businesses held back on investments. This is why central banks, like the Federal Reserve, expand the money supply and slash interest rates during recessions or when inflation falls "too low," according to their judgment.
To avoid deflation, central banks define "stable prices" as a gradual increase in the general price level of around 2%. But how do central banks achieve this? And how does the current monetary system work? Let me give you a brief summary.
Currently, central banks set the price of money for short-term borrowing, known as short-term interest rates. While interest rates, like all prices, are generally a function of supply and demand, central banks actively intervene in the market to keep these rates within a targeted range. They do so by buying or selling financial assets—historically government bonds, but today also other securities such as mortgage-backed securities (MBS) or, in the case of the ECB, corporate bonds. This intervention ensures that short-term rates remain within a defined lower and upper bound. Longer-term interest rates, in contrast, are determined by the market and shaped by supply and demand, as well as expectations around inflation, growth, and compensation for duration risk.
This system directly impacts private sector borrowing, which is where most money creation originates. Businesses borrow money from banks, and the banks, operating under fractional reserve requirements, expand the money supply. When banks issue loans, they effectively create new money: the borrowed funds become new deposits elsewhere in the system, while banks are only required to hold a small fraction of those deposits as reserves.
In essence, the money that customer A deposits and which banks lend to customer B creates a form of double counting: both A and B now have claims on the same initial deposit. Banks have long understood that depositors typically withdraw only a small portion of their funds at any time. This system functions smoothly as long as banks maintain sufficient liquidity to meet withdrawal demands. Problems arise when rumors of reckless behavior or financial instability trigger mass withdrawals, or "bank runs," as depositors demand their money en masse.
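To make the mechanics concrete, here is a minimal sketch of the textbook money-multiplier logic: a deposit is partly lent out, the loan is spent and redeposited, and the cycle repeats. The function name and numbers are hypothetical, and real-world banking is shaped by capital rules, currency drain, and borrower demand rather than a mechanical multiplier.

```python
def broad_money(initial_deposit: float, reserve_ratio: float, rounds: int) -> float:
    """Sum of all deposits created after `rounds` of lending and redepositing."""
    total_deposits = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total_deposits += deposit        # each redeposit adds to broad money
        deposit *= (1 - reserve_ratio)   # bank keeps reserves, lends out the rest
    return total_deposits

# With a 10% reserve ratio, $100 of base deposits approaches $1,000 of broad money:
print(round(broad_money(100.0, 0.10, 100), 2))  # ~999.97, close to 100 / 0.10
```

In the limit, total deposits approach initial_deposit / reserve_ratio, which is exactly why A and B (and later C, D, and so on) can simultaneously hold claims on what began as a single deposit.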
To be fair, fractional reserve banking isn't inherently evil, nor does it have to be abolished. While philosophical arguments can be made—such as questioning how two people can have simultaneous claims on the same money—it is worth noting that fractional reserve banking historically evolved without extensive regulation. It became an established practice long before stringent oversight was introduced.
Moreover, fractional reserve banking is not inherently inflationary. When loans are repaid, the money supply contracts, and if overall economic output grows, the result could even be deflation. For example, fractional reserve banking was widespread in the 19th century, yet that period often experienced price deflation.
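The arithmetic behind that last point is straightforward. A back-of-the-envelope way to see it is the equation of exchange, MV = PQ, with velocity held constant (a textbook simplification); the growth figures below are the ones Rothbard cites for 1869 to 1879, quoted later in this piece.

```python
# Equation of exchange (M*V = P*Q) with velocity held constant:
# if money grows more slowly than real output, the price level must fall.
money_growth = 0.027   # ~2.7% annual money supply growth, 1869-1879 (per Rothbard, quoted below)
output_growth = 0.068  # ~6.8% annual real output growth over the same period

implied_inflation = (1 + money_growth) / (1 + output_growth) - 1
print(f"{implied_inflation:.1%}")  # -> -3.8%: gentle deflation despite a growing money supply
```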
What makes the current system inflationary is the central banks' deliberate use of interest rate policies to encourage money creation. This approach stems from the belief that deflation hinders economic progress and can trigger economic depression by discouraging business investment and consumer spending. The prevailing argument is that controlled inflation—gradually devaluing money—motivates people to invest and drives economic productivity.
However, I’m not convinced that the evidence supporting this claim is robust. Much relies on concepts like the Phillips curve or the idea that lower real borrowing costs drive productive economic activity. My honest assessment is that the actual impact is, at best, limited.
What’s often overlooked is that constant inflation has consequences that aren’t immediately obvious. Inflation increases the marginal utility of money, creating stronger incentives for individuals to accumulate wealth—even at very high levels. Inflation also has significant distributional effects, a phenomenon known as the "Cantillon Effect."
Wealthier individuals, who typically have better access to inflation-resistant assets like stocks and real estate, are less impacted by inflation. They can use their existing assets as collateral to borrow more and acquire additional wealth, driving up asset prices. Meanwhile, those with little or no assets find it increasingly difficult to purchase such assets as their savings lose value over time.
One way to adapt to inflation is by moving further out on the risk curve—borrowing money to buy assets or items you might not have otherwise purchased. This behavior extends beyond investments to include higher-cost consumer goods, distorting entire industries in the process.
Consider the auto market, for instance. The widespread availability of car loans and leases encourages the production of more new vehicles. As a result, resources are disproportionately allocated toward manufacturing new cars, often at the expense of durability and longevity. Cars are designed for shorter lifespans, reflecting a broader trend: as interest rates decline, time preferences shift, with more people choosing immediate consumption over saving for the future.
The underlying mechanism is that the first recipients of newly created money can purchase goods and services at “old” prices before inflationary pressures adjust the broader market equilibrium. Beneficiaries of this system include the wealthy, who have easy access to credit, government agencies, bureaucrats, and social security recipients. Wealthy individuals with solid connections to policymakers benefit even more, as they can lobby for regulations that grant them additional advantages.
In this way, a system of gradual inflation disproportionately benefits risk-takers, wealthy individuals, and those dependent on government support, often at the expense of risk-averse and less affluent individuals. Put more simply, it favors borrowers over savers.
In this context, it's unsurprising that, apart from mainstream economists and advocates of a more interventionist state, many investors and traders also favor such a system. A quick glance at the stock market over the past thirty years explains why. Investors have been able to compound wealth by holding a beta portfolio (such as a stock index ETF), partly because the continuous inflow of newly created money lifts overall market performance, an effect of money supply expansion that is often overlooked. Additionally, investors benefit from the Cantillon Effect.
Given these factors, it’s understandable why so many believe a modern economy requires a constantly growing supply of money. No one alive has witnessed a period of consistent deflation. The modern monetary system has always been inflationary, and, as many economists and investors argue, this inflation is justified.
But is this belief truly warranted? Why must the money supply grow in a modern economy? Is deflation as harmful as it’s often portrayed, and do we really need central banks to manage the money supply actively? I’ve been considering these questions for some time now, and my conclusion differs significantly from the traditional view.
To explore this, let’s look at a period of prolonged deflation. The most relevant example, which closely mirrors a modern economy, is the second half of the 19th century in the United States—an era known as the "Gilded Age." During this time of industrialization, the U.S. economy experienced robust growth.
From 1869 to 1879, the US economy grew at an annual rate of 6.8% in real NNP (GNP minus capital depreciation) and 4.5% in real NNP per capita. The economy repeated this growth in the 1880s, when the wealth of the nation grew at an annual rate of 3.8% while GDP doubled.
Throughout this period, the U.S. economy also experienced continuous price deflation. Economists like Paul Krugman have argued that the Panic of 1873 led to a six-year depression in the 1870s. However, much of this claim rests on an alleged monetary contraction at the time. In reality, this was simply another phase of deflation, one that actually drove up real wages, as Murray Rothbard observed:
Yet what sort of “depression” is it which saw an extraordinarily large expansion of industry, of railroads, of physical output, of net national product, or real per capita income? As Friedman and Schwartz admit, the decade from 1869 to 1879 saw a 3-percent-per-annum increase in money national product, an outstanding real national product growth of 6.8 percent per year in this period, and a phenomenal rise of 4.5 percent per year in real product per capita. Even the alleged “monetary contraction” never took place, the money supply increasing by 2.7 percent per year in this period.
This brings me to another common argument against a deflationary monetary system: that deflation causes more frequent recessions. Advocates of inflationary money often point to the U.S. economy of the 19th century to support their claims. They refer to the National Bureau of Economic Research (NBER), which indicates ten recessions between 1860 and 1900.
However, there's evidence to suggest that this claim is flawed. The NBER may have mistakenly interpreted periods of price deflation as downturns in real output. Joseph H. Davis examined these periods, analyzing industrial production and mining data to gauge what was actually happening in the economy. His findings suggest that many of these supposed recessions were not recessions at all. In fact, Davis argues that NBER's claim that the Federal Reserve "stabilized the business cycle" is most likely incorrect.
Thus, the Gilded Age doesn't suggest that prolonged deflation leads to more recessions or a perpetual depression. In fact, the findings indicate that price deflation is a natural outcome in the absence of a continuous expansion of the money supply. Though the money supply increased during the Gilded Age, it grew more slowly than output.
With price deflation and growing real output, it’s no surprise that interest rates fell during this period. Real interest rates, however, remained relatively stable throughout. This brings us to another argument made by proponents of the modern monetary system: that deflation leads to massive business failures and stifles long-term investment.
At first glance, this argument seems logical, as loans must be repaid at their nominal value plus interest. In deflationary times, the real value of debt increases, which can strain businesses and consumers. However, I would argue that this only holds in an inherently inflationary system like today's.
In a hypothetical economy characterized by constant deflation, lending terms would simply adjust: banks could agree to loans where less than the nominal amount is repaid, and might even offer negative nominal interest rates, as long as they expect the real value of the repayment to exceed that of the original loan.
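A quick numeric sketch using the Fisher relation shows why a negative nominal rate can still pay the lender; the rates below are purely hypothetical.

```python
# Fisher relation: (1 + real) = (1 + nominal) / (1 + inflation)
nominal_rate = -0.01  # hypothetical: bank charges a negative nominal rate
inflation = -0.03     # hypothetical: prices fall 3% per year (deflation)

real_rate = (1 + nominal_rate) / (1 + inflation) - 1
print(f"{real_rate:.2%}")  # -> 2.06%: the money repaid buys more than the money lent out
```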
Furthermore, alternatives to traditional bank loans could proliferate. For example, "appreciation-indexed loans" or equity-based financing products could combine debt and profit-sharing, allowing banks to benefit from the borrower's success while reducing reliance on traditional interest rates.
Under such a system, the banking industry and business sectors would look quite different. Banks would need to shift their focus from expecting depreciation of money’s purchasing power to anticipating appreciation. Risk management would also shift, with banks concentrating on the potential for asset appreciation rather than relying on inflation to ensure returns.
Businesses offering solid returns relative to their risk would likely benefit most, as banks would favor less risky projects. However, this reduction in risky bank lending might be offset by other economic actors, such as savers organizing high-risk investment vehicles or venture capital.
Of course, the winners and losers in this system would change. Excessive borrowing would be penalized while saving and having a lower time preference would become more valuable. This would lower the marginal utility of money for the wealthy, who might choose to invest in society rather than chase one opportunity after another. Or perhaps that’s a hopeful, nostalgic view?
In such an environment, businesses would be incentivized to be more cost-efficient and innovative rather than coasting on past performance. This would lead to greater competition, ultimately benefiting consumers, as the advantage larger businesses enjoy through easier access to financing would likely diminish.
Wage earners, too, would benefit, as the real value of their wages would increase over time, especially if they saved some of their income. In my view, the claim that workers wouldn’t accept lower nominal wages is unconvincing. With falling prices for goods and services, their purchasing power would rise. I think it’s reasonable to assume that such a system would benefit those with less wealth far more than the rich.
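To put a hypothetical number on that: even a nominal wage frozen for a decade gains purchasing power under steady mild deflation.

```python
# Hypothetical: nominal wage unchanged for 10 years while prices fall 2% per year
years = 10
deflation = 0.02

real_wage_gain = (1 / (1 - deflation)) ** years - 1
print(f"{real_wage_gain:.1%}")  # -> 22.4%: purchasing power rises without any nominal raise
```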
If an economy can adapt to annual price increases, it should be able to adjust to annual price decreases, provided those decreases are gradual on average. This brings me to my final point: could gold or Bitcoin serve as the cornerstone of such a system?
At present, the answer clearly favors gold, primarily because Bitcoin is still highly volatile. That volatility alone makes it an unlikely candidate for a stable currency right now. Gold, by contrast, has a relatively stable supply growth rate and a long history of steady price appreciation. Could this change in the future? It's possible, but it depends on several factors.
First, Bitcoin adoption would need to increase substantially. This doesn't necessarily mean it must be used for trade, but rather that it must gain enough users to become a more widely accepted asset. More demand for Bitcoin would drive up its price, and much of that price movement would stem from its extreme scarcity. During its price discovery phase, high volatility is to be expected.
Second, as more people acquire Bitcoin and some use it for everyday transactions, its liquidity will increase. This transition is crucial, as Bitcoin would need to shift from being a store of value to a medium for buying and selling goods and services. While it’s still far from becoming a currency, I believe it remains a possibility—provided innovations like the Lightning Network enable faster transactions.
However, several challenges might be difficult to overcome. Governments would lose control over currency, and large corporations would be at a disadvantage compared to the current system. Both would lose easy access to capital, a significant obstacle to implementing a deflationary system. Too many powerful entities would lose their advantages, making it unlikely for such a system to gain traction. As a result, it’s more realistic to view Bitcoin as an asset for now rather than as a currency. It's not just prices that would be "seeing red."
Read me all my rights, I’ll never grow tired of your great advice,
Won’t somebody tell me, what I believe,
I’m sorry for your sacrifice, I guess I must’ve sounded like the anti-christ,
Won’t somebody tell me, won’t somebody tell me - what I believe?

Architects – Seeing Red
Have a great weekend!
Fabian Wintersberger
Thank you for taking the time to read! If you enjoy my writing, you can subscribe to receive each post directly in your inbox. Additionally, sharing it on social media or giving the post a thumbs-up would be greatly appreciated!
All my posts and opinions are purely personal and do not represent the views of any individuals, institutions, or organizations I may be or have been affiliated with, whether professionally or personally. They do not constitute investment advice, and my perspective may change over time in response to evolving facts. IT IS STRONGLY RECOMMENDED TO SEEK INDEPENDENT ADVICE AND CONDUCT YOUR OWN RESEARCH BEFORE MAKING INVESTMENT DECISIONS.