Not far from food riots of his own and loath to unleash such a bacterium against his own homeland, a “Sealed Train” carrying Vladimir Ilyich Lenin and 31 dissidents departed from exile in Switzerland on April 9, compliments of the Kaiser.
The “War to End all Wars” entered its third year in 1917, seeming as though it would go on forever. Neither side seemed able to gain strategic advantage at the front. The great battles of 1916, in which a single day’s fighting could produce more casualties than every European war of the preceding 100 years combined, seemed only yesterday. At home, the social fabric of the combatant nations was unraveling.
By 1916 it was generally understood in Germany that the war effort was “shackled to a corpse”, referring to Germany’s Austro-Hungarian ally. Italy, nominally the third member of the “Triple Alliance”, had defected to the Entente side in 1915. On the Triple Entente side, the French countryside was literally torn to pieces, and the English economy was close to collapse. The Russian Empire, the largest nation on the planet, stood at the edge of the precipice.
With the American declaration of war in April 1917, both sides understood that the balance was about to shift. For Kaiser Wilhelm’s Germany, it was time to throw a knockout punch, before the US arrived in force.
Imperial Russia had seen the first of what would be two revolutions back in February, when food riots led to the overthrow and exile of the Imperial family. Full-scale civil war broke out in 1918, resulting in the Bolshevik murder of the Czar and Czarina, together with their children, servants and dogs.
In the midst of this chaos, the Kaiser calculated that all he had to do was “kick the door in”, the Russian Republic would collapse, and they would be out of the war. He was right.
Following the overthrow of the Romanov dynasty, the more moderate Menshevik “Whites” vowed to continue the war effort. The split which had begun with the failed revolution of 1905 had grown more pronounced by this time, with the radical Bolsheviks (“Reds”) taking the extremist road. While Reds and Whites both wanted to bring socialism to the Russian people, the Mensheviks argued for predominantly legal methods and trade union activism, while the Bolsheviks favored armed violence.
In a small town in the northeast of Sweden, there is a train station. A bronze plaque on a blue tile wall proclaims: “Here Lenin passed through Haparanda on April 15, 1917, on his way from exile in Switzerland to Petrograd in Russia”.
Lenin was in exile at this time, and Imperial Germany was at war with Russia. British historian Edward Crankshaw writes that the German government saw “in this obscure fanatic one more bacillus to let loose in tottering and exhausted Russia to spread infection”.
Not far from food riots of his own and loath to unleash such a bacterium against his own homeland, a “Sealed Train” carrying Vladimir Ilyich Lenin and 31 dissidents departed from exile in Switzerland on April 9, compliments of the Kaiser. Leaving Zurich Station amid the jeers and insults of 100 or so assembled Russians shouting “Spies!” “Traitors!” “Pigs!” “Provocateurs!”, Lenin turned to a friend. “Either we’ll be swinging from the gallows in three months, or we shall be in power.”
North through Germany and across the Baltic Sea, the group traveled the length of Sweden, crossing at the border village of Haparanda into Russian-ruled Finland. The group arrived at Finlandsky Vokzal (Finland Station) in Petrograd on the evening of April 16, 1917. Like the handful of termites that brought down the mighty oak, that small faction inserted into the picture that April would help to radicalize the population and consolidate power on the Bolsheviks’ side.
By October, Russia would experience its second revolution in a year. The Kaiser’s Germany could breathe easier. The “Russian Steamroller” was out of the war. Chief of the General Staff Paul von Hindenburg and his deputy Erich Ludendorff could move their divisions westward, in time to face the arrival of the AEF.
Since the end of the Soviet era, Russian historians have come to believe that Vladimir Ilyich (Ulyanov) Lenin personally ordered the murder of the czar and his family, and that the Lenin era was every bit as bloody as that of his successor, Josef Stalin.
Lenin called for “Mass Terror” during the civil war of 1918, resulting in executions in the tens of thousands. Historian Alexander Margolis had the last word on the subject, if not the understatement of the century: “If they had arrested Lenin at the Finland Station, it would have saved everyone a lot of trouble”.
Most of us remember the names of the great monsters of history. Who remembers the name of the man who saved seven times the number of lives this whole Parade of Horribles took, put together?
Too often, history is measured in terms of its monsters.
President Robert Mugabe of Zimbabwe once orchestrated the murder of 20,000 civilians from a single province, after that province failed to deliver him a single vote. Josef Stalin deliberately starved as many as ten million Ukrainians in a “terror famine” known as the Holodomor. Pol Pot and a Communist cadre of nine – the Angka – killed between 1.7 and 2.5 million fellow citizens of 1970s Cambodia: about a fifth of the population. Mao Tse-Tung’s policies and political purges killed between 49 and 78 million of his own people between 1949 and 1976.
You’re really playing in the Big Leagues when they can’t get your body count any closer than the nearest twenty-nine million.
From Adolf Hitler to Idi Amin, from Enver Pasha to Hideki Tojo and Leopold II of Belgium, the top ten dictators of the last 150 years account for the loss of nearly 150 million souls. Most of us remember the names of the great monsters of history. Who remembers the name of the man who saved seven times the number of lives this whole Parade of Horribles took, put together?
Today, we live in a time and place where the National Institutes of Health (NIH) writes “The U.S. is one of the wealthiest countries in the world and accordingly has high obesity rates; one-third of the population has obesity plus another third is overweight”.
It wasn’t always so. In 1820, 94% of the world’s population lived in “absolute poverty.” The American economic historian and scientist Robert Fogel, winner (with Douglass North) of the 1993 Nobel Prize in Economics, wrote that: “Individuals in the bottom 20% of the caloric distributions of France and England near the end of the eighteenth century, lacked the energy for sustained work and were effectively excluded from the labor force.”
It’s hard to get our heads around the notion of “food insecurity”. I’m not talking about what’s in the fridge. This is the problem of acute malnutrition, of epidemic starvation, of cyclical famine and massive increases in mortality due to starvation and hunger-induced disease.
Norman Ernest Borlaug was born this day in 1914, on his grandparents’ farm near Cresco, Iowa. The boy’s grandfather, Nels Olson Borlaug, once told the boy “You’re wiser to fill your head now if you want to fill your belly later on.”
A farm kid educated during the Great Depression, Borlaug periodically put his studies on hold in order to earn money. As a leader in the Civilian Conservation Corps, working with unemployed people on Federal projects, he saw many of his co-workers face near-catastrophic levels of hunger. He later recalled, “I saw how food changed them … All of this left scars on me”.
Borlaug earned his Bachelor of Science in Forestry in 1937. Nearing the end of his undergraduate education, he attended a lecture by Professor Elvin Charles Stakman discussing plant rust, a parasitic fungus that feeds on phytonutrients in wheat, oat, and barley crops. Stakman was exploring special breeding methods resulting in rust-resistant plants. The research greatly interested Borlaug, who later enrolled at the University of Minnesota to study plant pathology under Stakman. Borlaug earned a Master of Science degree in 1940, and a Ph.D. in plant pathology and genetics in 1942.
Borlaug attempted to enlist in the military following the attack on Pearl Harbor, but his application was rejected under wartime labor regulations. He was put to work in a lab, doing research for the United States armed forces.
Between 1939 and ’41, Mexican farmers suffered major crop failures due to stem rust. In July 1944, Borlaug declined an offer to double his salary, traveling instead to Mexico City to head a new program focusing on soil development, maize and wheat production, and plant pathology.
“Pure line” (genotypically identical) plant varieties possess only one to a handful of disease-resistance genes. Random mutations of rusts and other plant diseases overcome pure-line survival strategies, resulting in crop failures. “Multi-line” plant breeding involves backcrossing and hybridizing plant varieties, transferring multiple disease-resistance genes into recurrent parents. In the first ten years Borlaug worked for the Mexican agricultural program, he and his team made over 6,000 individual crossings of wheat. Mexico transformed from a net importer of food into a net exporter.
In the early sixties, Borlaug’s dwarf spring wheat strains went out for multi-location testing around the world, in a program administered by the US Department of Agriculture. In March 1963, Borlaug himself traveled to India with Dr. Robert Glenn Anderson, along with 220 lbs of seed from four of the most promising strains.
The Indian subcontinent was experiencing famine and starvation at this time, held in check only by US shipments of one-fifth of its wheat production into the region in 1966–’67. Despite resistance from Indian and Pakistani bureaucracies, Borlaug imported 550 tons of seeds.
Biologist Paul Ehrlich wrote in his 1968 bestselling book The Population Bomb, “The battle to feed all of humanity is over … In the 1970s and 1980s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now.” Ehrlich said, “I have yet to meet anyone familiar with the situation who thinks India will be self-sufficient in food by 1971…India couldn’t possibly feed two hundred million more people by 1980.”
He could not have been more comprehensively wrong.
Borlaug’s initial yields were higher than those of any crop ever harvested in South Asia. Countries from Pakistan to India to Turkey imported 80,000 tons and more of seeds. By the time of Ehrlich’s book release in 1968, William Gaud of the US Agency for International Development was calling Borlaug’s work a “Green Revolution”. Massive crop yields replaced famine and starvation with a host of new problems. There were labor shortages to harvest the crops, and insufficient numbers of bullock carts to haul the harvest to the threshing floor. Jute bags were needed, along with trucks, rail cars, and grain storage facilities. Some local governments even closed school buildings to use them for grain storage.
Borlaug won the Nobel Peace Prize in 1970, for his contributions to the world food supply. The man’s name is nearly synonymous with the Green Revolution.
According to former Director General of the International Water Management Institute David Seckler, “The environmental community in the 1980s went crazy pressuring the donor countries and the big foundations not to support ideas like inorganic fertilizers for Africa.” The Rockefeller and Ford foundations withdrew funding, along with the World Bank.
Well-fed environmentalist types congratulated themselves on their “success” as the Ethiopian famine of 1984–’85 destroyed over a million lives. Millions more were left destitute, on the brink of starvation.
Borlaug became involved in 1984, at the invitation of Ryoichi Sasakawa, chairman of the Japan Shipbuilding Industry Foundation (now the Nippon Foundation). Sasakawa wondered why methods used so successfully in Asia were not being employed in Africa. Since that time, the Sasakawa Africa Association (SAA) has trained over 8 million farmers in SAA farming techniques. Maize yields in participating African countries have tripled, along with increased yields of wheat, sorghum, cassava, and cowpeas.
Norman Ernest Borlaug died of lymphoma in 2009, at the age of 95. Prime Minister of India Manmohan Singh and President Pratibha Patil paid tribute, saying “Borlaug’s life and achievement are testimony to the far-reaching contribution that one man’s towering intellect, persistence and scientific vision can make to human peace and progress”. The United Nations’ Food and Agriculture Organization (FAO) described Borlaug as “… a towering scientist whose work rivals that of the 20th century’s other great scientific benefactors of humankind”. American author and journalist Gregg Easterbrook wrote in 1997, “[The] form of agriculture that Borlaug preaches may have prevented a billion deaths.”
The world population when Ehrlich released his book in 1968 was about 3.53 billion. Today, that number stands at 7.6 billion and, when we hear about starvation, such events are almost exclusively man-made. The American magician and entertainer Penn Jillette described Norman Borlaug as “The greatest human being who ever lived…and you’ve probably never heard of him.” Let that be the answer to the self-satisfied, well-fed environmentalist types.
“I now say that the world has the technology—either available or well advanced in the research pipeline—to feed on a sustainable basis a population of 10 billion people. The more pertinent question today is whether farmers and ranchers will be permitted to use this new technology? While the affluent nations can certainly afford to adopt ultra low-risk positions, and pay more for food produced by the so-called ‘organic’ methods, the one billion chronically undernourished people of the low income, food-deficit nations cannot.” – Norman Borlaug, 2000
“Something has gone seriously awry with this Court’s interpretation of the Constitution”, Thomas wrote. “Though citizens are safe from the government in their homes, the homes themselves are not”.
In 1775, Connecticut Governor Jonathan Trumbull proposed a fortification at the port of New London, situated on the Thames River and overlooking Long Island Sound. The fort was completed two years later and named for the Governor. During the Revolution, Fort Trumbull was attacked and occupied by British forces, for a time commanded by the turncoat American general Benedict Arnold.
By the early 20th century, the Fort Trumbull neighborhood consisted of 90 or so single and multi-family working class homes, situated on a peninsula along the fringes of a mostly industrialized city center.
In 1996, chemists working at Pfizer Corporation’s research facility in England were studying compound UK-92,480, or “Sildenafil Citrate”, synthesized for the treatment of a range of cardiovascular conditions. Study subjects were expected to return unused medication at the end of the trial. Women showed no objection to doing so, but a significant number of male subjects refused to give it back. It didn’t take long to figure out what was happening. The chemical compound which would one day bear the name “Viagra” had revealed itself to be useful in other ways.
For the newly divorced paramedic Susette Kelo, the house overlooking the Fort Trumbull waterfront was the home of her dreams. Long abandoned and overgrown with vines, the little Victorian cottage needed a lot of work, but where else was she going to find a waterfront view at such a price? It was 1997, about the time that Connecticut and New London politicians resurrected the long-dormant New London Development Corporation (NLDC), in an attempt to revitalize the city’s waterfront.
Susette Kelo sanded her floors on hands and knees as Pfizer Corporation, already occupying the largest office complex in the city, was looking at a cataract of new business based on their latest chemical compound. The company was recruited to become the principal tenant in a “World Class” multi-use waterfront campus, including high-income housing, hotels, shopping and restaurants, all centered around a 750,000 sq. ft. corporate research facility.
Connecticut College professor and NLDC President Dr. Claire Gaudiani liked to talk about her “hip” new development project. Fort Trumbull residents were convinced that stood for “High Income People”. With an average income of $22,500, that didn’t mean them.
Most property owners agreed to sell, though not exactly “voluntarily”. There was considerable harassment of the reluctant ones, including late-night phone calls, waste dumped on properties, and tenants locked out of apartments during cold winter weather.
Seven homeowners holding fifteen properties refused to sell, at any price. Wilhelmina Dery was in her eighties. She was born in her house and she wanted to die there. The Cristofaro family had lost another New London home in the ’70s, taken by eminent domain during yet another “urban renewal” program. They didn’t want to lose this one, too.
In 2000, Susette Kelo came home from work the day before Thanksgiving, to find an eviction notice taped to her door.
Letters were written to editors and protest rallies were held, as NLDC and state officials literally began to bulldoze homes. Holdout property owners were left trying to prevent personal injury and property damage, from flying demolition debris.
Facing a prolonged legal battle which none of the homeowners could afford, the group got a boost when the libertarian law firm Institute for Justice took their case pro bono. There was cause for hope. Retired homeowner Vera Coking had faced a similar fight against now-President Donald Trump’s development corporation back in 1993, when the developer and Atlantic City, New Jersey authorities attempted to have her house condemned to build a limousine lot.
Eminent domain exists for a purpose, but the most extreme care should be taken in its use. Plaintiffs argued that this was not a “public use”, but rather a private corporation using the power of government to take their homes for economic development, a violation of both the takings clause of the 5th Amendment and the due process clause of the 14th.
Vera Coking won her case against the developer and the municipality. The casino itself later failed and closed its doors. The New London trial court, with Susette Kelo as lead plaintiff, “split the baby”, ruling that 11 of the 15 takings were illegal and unconstitutional. At that point, the ruling wasn’t good enough for the seven homeowners. They had been through too much. All of them would stay, or they would all go.
Connecticut’s highest court reversed the decision, throwing out the baby AND the bathwater in a 4-3 decision. The United States Supreme Court agreed to hear the case, argued before the seven justices then in attendance on February 22, 2005.
SCOTUS ruled in favor of New London in a 5-4 decision, Justices Stevens, Kennedy, Souter, Ginsburg and Breyer concurring. Seeing the decision as a reverse Robin Hood scheme that would steal from the poor to give to the rich, Sandra Day O’Connor wrote “Any property may now be taken for the benefit of another private party, but the fallout from this decision will not be random. The beneficiaries are likely to be those citizens with disproportionate influence and power in the political process, including large corporations and development firms“.
Clarence Thomas took an originalist view, stating that the majority opinion had confused “Public Use” with “Public Purpose”. “Something has gone seriously awry with this Court’s interpretation of the Constitution“, Thomas wrote. “Though citizens are safe from the government in their homes, the homes themselves are not“. Antonin Scalia concurred, seeing any tax advantage to the municipality as secondary to the taking itself.
In the end, most of the homes were destroyed or relocated. State and city governments spent $78 million and bulldozed 70 acres. The 3,169 new jobs and the $1.2 million in new tax revenue anticipated from the waterfront development, never materialized. Pfizer backed out of the project, moving 1,400 existing jobs to a campus it owns in nearby Groton. The move was completed around the time when tax breaks were set to expire, raising the company’s tax bill by 500%.
Susette Kelo sold her home for a dollar to Avner Gregory, a preservationist who dismantled the little pink house and moved it across town. A monument to what Ambrose Bierce once called “The conduct of public affairs for private advantage”.
In 2011, the now-closed redevelopment area became a dumping ground for debris left by Hurricane Irene. The only residents were feral cats.
A convincing case may be made that it was the reduction of government spending in the years following WWII that put wealth back in the pockets of the people who created it in the first place, finally ending the Depression.
Warren Harding entered office on March 4, 1921, in the midst of the sharp recession following WWI.
Harding’s Treasury Secretary, Andrew Mellon, believed that money was driven underground or overseas as income tax rates increased. Mellon held what was, for that time, a heretical belief: that lower tax rates led to greater levels of economic activity and that, as people had more of their own money to work with, the resulting higher activity would increase tax revenues.
Based on Mellon’s advice, Harding cut taxes, starting in 1922. The top marginal rate was reduced annually in four stages from 73% in 1921 to 25% in 1925. Taxes were cut for lower incomes starting in 1923.
Vice President Calvin Coolidge became President in August 1923, following Harding’s untimely death. Coolidge followed Harding’s economic policies of low taxation and high growth; the result would become the “Roaring 20s”.
Revenues to the treasury increased substantially, resulting in a 36% reduction of the national debt. President Kennedy tried the same tactic with the same result in the 1960s.
Opponents called it “Voodoo Economics” when President Reagan used the same tactic in the 1980s, but the results were the same. Same thing when President Bush the younger did it in the 2000s.
Economists and historians debate, because that’s what they do, but the results speak for themselves.
Unemployment and inflation both declined throughout the 1920s, while wages, profits and productivity increased. The decline in what Carter-era economists called the “Misery Index”, was the sharpest in history.
The twenties became a time of wealth and excess, and speculation in the stock market increased exponentially. New investors poured into the market in the belief that, like the housing market of the 2000s, prices could never go down. It was a nine-year run in which the Dow Jones Industrial Average increased tenfold, peaking at 381.17 on September 3, 1929.
Rising share prices encouraged more people to invest, even if they didn’t have the money to do so. Brokers were routinely lending investors up to two-thirds of the face value of stocks. Over $8.5 billion was out on such loans, more than the amount of currency circulating in the entire country at that time.
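The margin arithmetic helps explain the fragility. A buyer borrowing two-thirds of a stock's price holds only one-third equity, so a one-third price decline wipes that equity out entirely. A minimal sketch, using illustrative dollar figures that are not from the historical record:

```python
# Margin-loan arithmetic: the broker's loan is fixed, so every dollar of
# price decline comes out of the buyer's equity. Illustrative numbers only.

def equity_after_decline(price_paid: float, loan_fraction: float, decline: float) -> float:
    """Buyer's remaining equity after the stock falls by `decline` (a fraction)."""
    loan = price_paid * loan_fraction        # debt owed to the broker, unchanged
    value = price_paid * (1 - decline)       # market value of the shares now
    return value - loan                      # what the buyer has left

# $300 of stock: $200 borrowed from the broker, $100 of the buyer's own money.
print(equity_after_decline(300, 2 / 3, 0.10))   # a 10% dip costs the buyer $30
print(equity_after_decline(300, 2 / 3, 1 / 3))  # a one-third decline wipes out the equity
```

A three-to-one leveraged position thus faces total loss on a 33% decline, one reason waves of forced selling followed the first breaks in October 1929.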
As with 2007–08, there were early tremors showing the bubble was about to burst. Then as now, such signals were seen only in hindsight, as the crescendo that was 1929 continued.
There was a brief contraction in March, but the Crash proper began on “Black Thursday”, October 24, 1929. The market lost 11% at the opening bell amidst heavy trading. To quell the frenzy, Wall Street financial firms Morgan Bank, Chase National and National City Bank of New York stepped up and bought large blocks of US Steel and other “blue chip” stocks, at prices well above where they were trading.
The tactic had the effect of stopping the slide, much as it did during the Panic of 1907. This time however, the relief would be short lived. “Black Tuesday”, October 29, saw the Dow Jones contract by 12% on a volume record which would stand unbroken for forty years. The president of the Chase National Bank said at the time “We are reaping the natural fruit of the orgy of speculation in which millions of people have indulged. It was inevitable, because of the tremendous increase in the number of stockholders in recent years, that the number of sellers would be greater than ever when the boom ended and selling took the place of buying“.
Fears of the Smoot-Hawley tariff act fueled a further contraction in the following weeks. Apparently, for good reason. When President Hoover signed the protectionist measure into law in 1930, American imports and exports shriveled by more than half.
Historians debate whether the stock market crash led to the Great Depression, or if the two events coincided. Only 16% of US households were actually invested in the stock market at the time, but the psychological effect was profound.
Easy credit and unbounded confidence had led to a speculative bubble which had finally burst.
Economists still argue about the interventionist policies which followed. The guy who needed to support his family was grateful to be put to work on a WPA project, but the government doesn’t produce wealth. Every dollar spent had first to be extracted from the wealth-producing, or “private”, part of the economy. You can’t fill a swimming pool by draining one end of it into the other.
The stock market and unemployment rates staggered throughout the 1930s. It was WWII that finally put people back to work.
Yet that was merely activity, from an economic point of view. War production wasn’t growth; it was more like giving sugar to the kids and watching them run around the house. A convincing case may be made that it was the reduction of government spending in the years following WWII that put wealth back in the pockets of the people who created it in the first place, finally ending the Depression.
An indicator of that wealth, the Dow Jones Industrial Average wouldn’t retake the high ground of 1929 until 1954.
Paper money crashed in the post-Revolutionary Articles of Confederation period, when you could buy a sheep for two silver dollars, or 150 paper “Continental” dollars. Creditors hid from debtors, not wanting to be repaid in worthless paper currency. For generations after our founding, a worthless thing could be described as “not worth a Continental”.
You’ve worked all your life. You’ve supported your family, paid your taxes, and paid your bills. You’ve even managed to put a few bucks aside, in hopes of a long and happy retirement. “Inflation” is such a bloodless term. What if you hadn’t touched that “nest egg”, and its purchasing power was suddenly diminished…by 10%…40%…70%?
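The erosion compounds quietly, year over year. As a sketch of the arithmetic (the rates and the $100,000 figure are illustrative, not drawn from the text), a fixed nominal sum loses purchasing power by the factor (1 + inflation) raised to the number of years:

```python
# Compound erosion of a fixed nominal sum by steady inflation.
# All figures here are illustrative assumptions.

def real_value(nominal: float, annual_inflation: float, years: int) -> float:
    """Purchasing power, in today's terms, of a fixed sum after `years` of inflation."""
    return nominal / (1 + annual_inflation) ** years

nest_egg = 100_000.0
for rate in (0.02, 0.05, 0.10):
    print(f"{rate:.0%} inflation for 20 years leaves "
          f"${real_value(nest_egg, rate, 20):,.0f} of purchasing power")
```

Even a "modest" 5% rate, held for twenty years, quietly takes more than 60% of the nest egg's purchasing power.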
Throughout Roman antiquity, coinage retained a high silver content as a matter of law. Precious metal made the coins themselves objects of value and, for 500 years, the Roman economy remained relatively stable. Republic morphed into Empire over the 1st century BC, leading to a conga line of Emperors minting mountains of coins in their own likenesses. Slaves were worked to death in Spanish silver mines, to satisfy an endless need for the metal. Birds fell from the sky over vast smelting fires, yet there was never enough silver. Silver content was inexorably reduced, until the currency itself became worthless. Roman currency collapsed in the 3rd century reign of Diocletian. An Empire and its citizens were left to barter as best they could, in a world where money no longer had any value.
In the waning days of the Civil War, the Confederate dollar wasn’t worth the paper it was printed on. Paper money crashed in the post-Revolutionary Articles of Confederation period as well, when you could buy a sheep for two silver dollars, or 150 paper “Continentals”. Creditors hid from debtors, not wanting to be repaid in worthless paper currency. For generations after our founding, a worthless thing could be described as “not worth a Continental”.
The assistance of French King Louis XVI was invaluable to Revolutionary-era Americans, but French state income was only about 357 million livres at the time, against expenses of over half a billion. France descended into its own Revolution as the government printed “assignats”, notes purportedly backed by 4 billion livres in property expropriated from the church. The 912 million livres in circulation in 1791 rose to almost 46 billion by 1796, in notes whose purchasing power had diminished by 99%.
The money in their pockets was literally, not worth the paper it was printed on. One historian described the economic policy of the Jacobins, the leftist radicals behind the reign of terror, as: “[A]n utter exhaustion of the present at the expense of the future”.
In each of these historic cases, nothing defined and established the value of a currency except what a willing buyer and a willing seller agreed it was worth. There was no “there”, there. It all sounds depressingly familiar.
The Austro-Hungarian Empire was on the losing side of WW1, and broken up after the war. Lacking the governmental structures of established states, the newly independent nation of Hungary began to experience inflation. Before the war, a US Dollar would have bought you 5 Kronen. In 1924, it was 70,000.
Hungary replaced the Kronen with the Pengö in 1926, pegged at a rate of 12,500 to one, which restored a dollar exchange rate of roughly 5.6 Pengö, close to the prewar ratio.
Hungary became a battleground in the latter stages of WW2, between the military forces of Nazi Germany and the USSR. 90% of Hungarian industrial capacity was damaged, half of it destroyed altogether. Transportation became difficult, with most of the nation’s rail capacity damaged or destroyed. What remained was either carted off to Germany, or seized by the Soviets, as reparations.
The loss of all that productive capacity led to scarcity of goods, and prices began to rise. The government responded by printing money. Total currency in circulation in July 1945 stood at 25 billion Pengö. Money supply rose to 1.65 trillion by January, 65 quadrillion in April and 47 septillion in July. That’s a Trillion Trillion. Twenty-four zeroes.
Banks received low rate loans, so that money could be loaned to companies to rebuild. The government hired workers directly, giving out loans to others and in many cases, outright grants. The country was flooded with money, the stuff virtually grew on trees, but there was nothing to back it up.
Inflation took a straight line into the stratosphere. The item that cost you 379 Pengö in September 1945 cost 1,872,910 by March, 35,790,276 in April, and 862 billion in June. Daily inflation ultimately exceeded 200%, prices more than tripling each day, as the currency became all but worthless. Massive printing of money had accomplished the cube root of zero. The worst hyperinflation in history peaked on July 10, 1946, when that 379 Pengö item from September cost you 1,000,000,000,000,000,000,000,000.
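Taking the article's own endpoints, 379 Pengö in September 1945 and 10^24 Pengö on July 10, 1946, a rough compound-growth calculation shows what sustained daily rate that implies; the span of roughly 310 days is my approximation, and the worst weeks were far more extreme than this average:

```python
# What constant daily price increase turns 379 Pengö into 10**24 Pengö
# in roughly 310 days? (The day count is an approximation.)

start_price = 379.0      # September 1945, per the article
end_price = 1e24         # July 10, 1946, per the article
days = 310               # approximate span between the two prices

daily_factor = (end_price / start_price) ** (1 / days)
print(f"average daily multiplier: {daily_factor:.3f}")       # about 1.17
print(f"average daily inflation: {daily_factor - 1:.1%}")    # about 17% per day
```

An *average* of 17% per day, compounded for ten straight months, is what it takes to add twenty-one orders of magnitude to a price.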
The government responded by changing the name, and the color, of the currency. The Pengö was replaced by the Milpengö (1,000,000 Pengö), which was replaced by the Bilpengö (1,000,000,000,000 Pengö), and finally by the (supposedly) inflation-indexed Adopengö. This spiral resulted in the largest denomination common currency note ever printed, the Milliard Bilpengö. A Billion Trillion Pengö.
The thing was worth twelve cents.
One more currency replacement and all that Keynesian largesse would finally stabilize the currency, but at what cost? Real wages were reduced by 80% and creditors wiped out. The fate of the nation was sealed when communists seized power in 1949. Hungarians could now share in that old Soviet joke: “They pretend to pay us, and we pretend to work”.
The ten worst hyperinflations in history have all occurred within the last hundred years, including Zimbabwe in 2008, Yugoslavia in 1994, Germany in 1923, Greece in 1944, Poland in 1921, Mexico in 1982, Brazil in 1994, Argentina in 1981, and Taiwan in 1949. The common denominator in all ten was massive government debt, and a currency with no intrinsic worth.
In 2015, Boston University economist Laurence Kotlikoff testified before the Senate Budget Committee. “The first point I want to get across” he said, “is that our nation is broke. Our nation’s broke, and it’s not broke in 75 years or 50 years or 25 years or 10 years. It’s broke today”.
Kotlikoff went on to describe the “fiscal gap”, the difference between the US government’s projected revenue and the obligations with which it has saddled the taxpayer. “We have a $210 trillion fiscal gap at this point”, Kotlikoff testified: 11.6 times GDP, the total of all goods and services produced in the United States.
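As a quick check on the multiple, the two figures are consistent with 2015 US GDP of roughly $18.1 trillion; that GDP figure is my assumption and does not appear in the testimony:

```python
# Kotlikoff's fiscal-gap multiple: $210 trillion measured against
# approximate 2015 US GDP (the GDP figure is an assumption).

fiscal_gap = 210e12      # dollars, per the testimony
gdp_2015 = 18.1e12       # dollars, approximate 2015 US GDP

print(f"fiscal gap is {fiscal_gap / gdp_2015:.1f}x GDP")   # about 11.6x
```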
On top of that, the United States owes something close to twenty trillion dollars in fiscal operating debt, and our currency is unmoored from anything of inherent value. We spend a lot of time talking about politics. Perhaps we should be talking about math, instead.