Where are the rulers of the nations,
and those who lorded it over the animals on earth;
those who made sport of the birds of the air,
and who hoarded up silver and gold in which people trust,
and there is no end to getting;
those who schemed to get silver, and were anxious,
but there is no trace of their works? They have vanished and gone down to Hades, and others have arisen in their place.
—Baruch 3:16–19
*
Churches . . . have their problems with a Jesus whose only economics are jokes. A savior undermines the foundations of any social doctrine of the Church. But that is what He does, whenever He is faced with money matters. According to Mark 12:13 there was a group of Herodians who wanted to catch Him in His own words. They ask “Must we pay tribute to Caesar?” You know His answer: “Give me a coin—tell me whose profile is on it!” Of course they answer “Caesar’s.”
The drachma is a weight of silver marked with Caesar’s effigy.
A Roman coin was no impersonal silver dollar; there was none of that “trust in God” or adornment with a presidential portrait. A denarius was a piece of precious metal branded, as it were, like a heifer, with the sign of the personal owner. Not the Treasury, but Caesar owns the currency. Only if this characteristic of Roman currency is understood, one grasps the analogy between the answer to the devil who tempted Him with power and to the Herodians who tempt Him with money. His response is clear: abandon all that which has been branded by Caesar; but then, enjoy the knowledge that everything, everything else is God’s, and therefore is to be used by you.
The message is so simple: Jesus jokes about Caesar. He shrugs off his control. And not only at that one instance . . . Remember the occasion at the Lake of Capharnaum, when Peter is asked to pay a twopenny tax. Jesus sends him to throw a line into the lake and pick the coin he needs from the mouth of the first fish that bites. Oriental stories up to the time of Thousand Nights and One Night are full of beggars who catch the fish that has swallowed a piece of gold. His gesture is that of a clown; it shows that this miracle is not meant to prove him omnipotent but indifferent to matters of money. Who wants power submits to the Devil and who wants denarii submits to the Caesar.
—Ivan Illich
Republicans are fighting Democrats. Both parties are led by scoundrels and buffoons. From afar, it seems there’s a similar dynamic across the water in the UK between Tories and Labour, and only God knows what the hell is going on in Canada. The “wokes” are in a pitched battle with the “woke-panicked.” In churches, the “trads” are in a war with the “progressives.”
Jean, it seems to me that many people are only saying the things they think that I want to hear. Accordingly, what I am getting is not information, it’s fucking bullshit!
—apocryphal French commander
Okeydoke
What I’m about to write was provoked by a tweet from John Milbank—a theologian I follow in spite of our disagreements—wherein he, a conservative socialist referring to himself as a “natural Tory,” denounced the Tories’ desperate turn to Brexit and Thatcher-worship. It would appear, at least from way over here, that the UK is experiencing what the US is—the outworking of a longue durée economic and political shit-storm. From Professor Milbank’s comments, it appears that Labour’s latest combat sally is badly recycled Blairism.
In the US, the Democratic Party is likewise “stuck in the nineties,” which has been (too broadly) characterized by our phalangist culture warriors as an affinity for “technocratic rule.” As an Illichian Catholic and a subsistence socialist, I somewhat agree with that assessment. It’s necessary yet insufficient; and when the postmodern American phalangist uses it, it’s fucking bullshit.
The Atlantic anglophone culture war generates enough battleground smoke to conceal the true sources of our metacrisis, and one might suspect that’s exactly the point. Those of us who’ve lived in the South for any amount of time recognize the folk adage: “Don’t fall for the okeydoke.” It means, “Those MFs are trying to trick your ass.”
No one’s talking about our global encirclement by speculative/rent capital, or “finance capital,” to which all parties contending for political position still bend their collective knee. That’s because finance capital has the world and its misleadership by the short hairs . . . and beyond financialization, two greater Balrogs are poised ominously just outside our door—nuclear war and biospheric collapse.
In a tweet the same day from Carlo Lancellotti, he said, “One of the hardest things is to convince old leftists [think Bernie Sanders or Jeremy Corbyn—SG] that they cannot have FDR's left without FDR's country (in terms of religion, family structure, educational system etc).” It’s true, of course, but again “necessary yet insufficient” to explain the political conjuncture. This does not mean, however, that we can’t or shouldn’t still advocate for basic social democratic measures, necessarily including the re-imposition of “rentier repression.”
Of what?
Rent
“Rent-seeking is the act of growing one's existing wealth by manipulating the social or political environment without creating new wealth.” Not just landlords, but bond-traders, usurers, stock traders, hedge funds, “banks,” etc. People who acquire wealth through rent-seeking are called rentiers. (If you want to sound as French as the word, you’ll say ron-tee-AY.)
Prevention of the re-imposition of New Deal rentier repression has long been a priority of both US political parties. Right-wing populism is just one inchoate reflex against financialization’s street-level consequences. The “establishment Republicans”—or what’s left of them after Trumpism—are still committed to the rentier class. Likewise, the toothless, gerontocratic Democratic and Labour parties (now tactically “woke”) still carry water for finance capital.
In 2020, the Democratic Party put as much effort and treasure into blocking social democrats during the primary process as it did in defeating Trump, who was actually defeated by a genetically-engineered virus created by the US National Institutes of Health in a Chinese lab.
The institutions and instruments of high finance flourished throughout the Trump interregnum (he’s mostly bluster, that guy), the pandemic, inflation, the outbreak of a European land war, and now Israel’s genocidal destruction of Gaza. The entire world economy is still the rentier class’s captive.
The interests of speculative/rent capital, ever since Bill Clinton’s government, constitute the indispensable bipartisan consensus—even in the face of popular opposition. When the Republican renegades mounted their campaigns, they went after immigrants as their scapegoats and speculative/rent capital’s “free-trade” deals as a policy focus. When the Sanders/Corbyn insurgencies took hold, they too took aim at neoliberal economics. The latter at least placed the blame where it belonged, on rent capital instead of brown people fleeing war, famine, climate emergencies, and homegrown thuggery.
Republicans and Tories are on the rocks now, having suffered what we called a populist “perimeter penetration” in my Vietnam occupation days—a tactical disaster that cannot be contained. Liz Cheney—rightly identified as a neoconservative (as is Hillary Clinton)—is seldom spoken of as a diehard defender of Wall Street; but this is precisely why her stock has gone up (so to speak), not with the electorate, but with the neoliberal press. Her shrinking “wing” within the Republican Party is a last-ditch attempt to wrest control back for Wall Street, whose fortunes are now lashed to the almost-as-ridiculous Democrats.
What would work, hypothetically, in the US (can’t speak for our UK cousins) would be a new party that advocates the cancellation of neoliberal trade deals and the re-implementation of social democratic rescue packages (like single-payer medical care [with limits!] and a Federal jobs guarantee program), and at the same time refuses to bow to absurd petite bourgeois inventions like gender ideology and NGO “equity” narratives. There’s a big “middle” there, shut out by finance capital’s “wealth primaries,” that would love to choose someone apart from these sycophantic scoundrels and bloviating buffoons. But finance capital’s not having it. It needs saying: Trump, for all his lying-ass bombast, did nothing to repress finance capital—quite the contrary. Trump is himself part of what some refer to as the FIRE sector (rent capital): finance, insurance, real estate.
In a moment, I’ll support my assertions here with evidential details, but I’ll have to break away from the UK at one point to delve more deeply into a particularity in the American economy which is at least as significant as finance capital: our war economy. (America has come to rule the world at the behest of finance capital, but we support ourselves with war-making.)
What do I mean by speculation and rent?
Investopedia defines speculative capital as “funds that are considered expendable in exchange for the opportunity to generate outsized gains.” This describes a motive, but I’m going to revert to defining “capital” as simultaneously a collection of particular people (the investors) and a “social relation” between them and those upon whom they parasitically depend.
In political economy, “rent” means money paid not for a product but as a royalty. The rent for a house, the interest on a loan, etc. Speculative rents are acquired by gambling, buying stocks cheap and selling them dear, for example, with a bunch of Brooks Brothers bookies as middlemen. Their royalties are not part of any actual production process, like making water bottles or toys or cars or tools or appliances. Rents are money accumulated for producing nothing, nada, jackshit.
History
During the 1970s and early 1980s, global economic forces were shaping a new form of international economy, with corresponding changes in United States foreign policy. In 1973, as a protest against the US rescue of Israel from an impending defeat by the Egyptians in the Yom Kippur War, Arab nations implemented an oil embargo against the US, creating day-long gas lines that broke up only when filling stations pumped out their last drop of gasoline.
Oil prices rose dramatically, creating a tremendous windfall profit for oil producing states. Oil was denominated in US dollars, and those additional dollars were “recycled” through Wall Street (the world center of gravity for speculative finance), the only market large enough to absorb the mountains of money being accumulated by the same oil producers who were withholding gasoline from the US.
Wall Street doesn’t sit on money. Wall Street firms, as rentier capitalists, accumulate money through interest on loans, stock dividends, financial “derivatives,” and whatnot, instead of production; and so the glut of petrodollars from the Arab oil states was converted by Wall Street into vast “development” loans for poorer countries, especially in Latin America. These loans, not unlike the financial derivatives stirred together with subprime mortgages that would crash the economy in 2008, had adjustable interest rates. The people making the loans were allowed to raise the interest rates during the life of the loan. Latin American governments, squeezed by an American-led international financial order, had taken huge loans for “development,” then more emergency loans to service the expanding debt.
During Jimmy Carter’s presidency, beginning in 1977, the United States suffered something the economists’ faulty models couldn’t anticipate: simultaneous lack of “growth”—stagnation—and the rapid loss of purchasing power-per-dollar—inflation. This “anomaly” was called stagflation.
The reason economic models could neither predict nor account for stagflation is, well, that the models are wrong; their failure to do either is exactly what proved it. Those same models are still being used today, even by the US Federal Reserve, despite the fact that in the wake of the 2008 meltdown they continue to fail—both predictively and prescriptively.
Fed
The Federal Reserve is the United States’ central bank. It’s a quasi-medieval institution that runs things beyond any democratic control. It’s run by bankers, with a chief appointed by the state. In recent decades, the “Fed,” as it is called, has tweaked the US economy by using the Federal funds interest rate as a tool to create unemployment as a way of taming inflation. Bankers dislike inflation—X amount of money losing its relative purchasing power (rising prices, in other words)—because banks and other lenders thrive on debt repayment in the future, which, drawn out over time, means that each passing day reduces the purchasing power of the money owed them. They have long enshrined their class warfare from above in this economic pseudoscience, which leads them to conclude that inflation is caused principally by workers receiving too much money—for them, a double problem.
(The fact is, inflation is—whatever its cause—rising prices. Price rises are caused by a host of factors, not just “too much money chasing too few goods.” That univocal account is the biggest bunch of bullshit in the pseudoscience of economics. Monopoly, price gouging, consumer mania, you name it, can all drive up prices.)
First, full employment means workers have more to spend, which the finance capitalists believe stimulates sellers to charge more for goods, which leads to inflation. Second, and perhaps more importantly from a class standpoint, when workers are fully employed and there are more jobs than people to fill them, the workers are in a stronger economic position vis-à-vis employers. Employers have to compete, via higher wages and interpersonal decency, to convince workers to choose them. This translates into greater social and political power for the working class vis-à-vis the employing class. When money is more available (via lower interest rates), more people build more businesses—so the economic logic goes—and therefore create more jobs, which expands profits overall, but also sets up that seller’s market for workers. More jobs for the same number of people. So when inflation takes off, the Fed raises interest rates, cutting off the creation of new jobs until unemployment reaches a certain threshold that transforms the worker-friendly market into an employer-friendly market again.
This “logic” has dictated that the Fed raise interest rates until unemployment is acceptably high, then drop the rates if unemployment threatens to slow profitability when unemployed people begin to hoard their money in order to pay for things like housing, food, running water, and electricity. Balance between inflation on the one hand and stagnation on the other, which are seen as antithetical—one can only exist apart from the other—is the ostensible goal of the Fed. That’s why stagflation presented the Fed with a conundrum (and a rebuttal to their theories).
This idea is still the “economic wisdom,” even though this “logic” has serially failed—a record that has nonetheless not disabused economists of their “theory.” After 2008, the most recent instance, it failed so spectacularly that the Fed was forced to leave interest rates below two percent for ten years (and under one percent for eight and a half of those years). What they know how to do does not work.
Behind this superficial account of the Fed, however, there are a number of historical and economic complexities, not the least of which has been moribund capitalism’s transfer of political power from industrial/commercial to financial capital.
Poles
Industrial/commercial capital cobbles together some money, buys its infrastructure, hires workers, and makes widgets or bricks or computers or studio dance lessons or taco dinners. Then it sells its stuff at a profit, pays off its loans, and lives happily ever after (until the market is saturated, or a new widget comes along, or raw materials run out, and it’s over).
Financial capital doesn’t accumulate money from producing anything. It accumulates money parasitically as royalties. Financial capital loans money to industrial/commercial capital, but it also issues credit cards, lends for houses and college, and engages in all kinds of financial gambling, some riskier than others.
As to gambling, the stock market, derivatives markets, etc., are finance capital’s casinos. This gambling, on a massive scale, can create dramatic instability, as we learned from the Great Depression, whereupon the state implemented a policy of financial-capital repression, beginning with the enforced separation of savings and loan institutions from speculative financial enterprises.
That wall between these two “poles” of capital was eventually eroded (by derivatives, keep reading), and Bill Clinton drove the last nail in finance-pole repression’s coffin with the repeal of the Glass-Steagall Act during his administration. By then, the finance capitalists, headquartered at Wall Street, had pretty much taken over the government and the global economy, which they repeatedly crashed through financial bubbles without any consequences (to them).
In fact, they had become “too big to fail,” and taxpayers were repeatedly forced to bail their sorry asses out. (There have been 995 enterprises bailed out by the US government.) The last crash was 2007–8, and the much-hyped Obama slow-recovery—during which interest rates remained near zero—was accomplished by the Fed creating billions of new dollars to buy . . . wait for it . . . mortgage-backed securities . . . (trash assets), a scheme called quantitative easing (QE). No one is ever going to buy this trash back, and everyone knows it. It’s just a gift to finance capital for its incompetence. (More on this further along, because QE has proven something that the rentiers don’t want anyone to know . . . the government can print plenty of money without causing inflation.)
This has served only to reflate the speculative bubble that popped fifteen years ago, ensuring that the next crash will be a whopper. Economic historians, honest ones at least, will tell you that the US and global economies have blundered from one crisis to the next ever since the tail end of the US invasion and occupation of Vietnam.
But back to the stagflation crisis . . .
Conditionalities
In response to the stagflation crisis, Federal Reserve Chair Paul Volcker implemented something called the “Volcker Shock”: since inflation was the greater danger to the rentier capitalists, he raised the interest rate from 7.5 percent to 21.5 percent, doubling US unemployment rates while making large creditors whole. These elevated interest rates were passed along, via Wall Street institutions using those arbitrarily adjustable interest rates, to the Latin American countries that had received the aforementioned development loans. The US, as metropolitan center, exported its crisis to the poorer margin—creating a crisis in Latin America. (The Volcker shock doctrine lasted from 1979 to 1982.)
When President Ronald Reagan was in office in 1982, Mexico announced that it was going to default on its Wall Street loans—that is, refuse to pay what was unpayable—stranding Wall Street with more than $100 billion in losses. This panicked the Washington-Wall Street nexus.
Not for the first time, and certainly not for the last, the US government stepped in to bail out Wall Street’s rentiers. This came in the form of a “bailout” loan from the US to Mexico, but the ultimate intent was to ensure that Wall Street didn’t get hammered by the Mexican default. The vehicle for loans to cover the previous loans to Mexico was the International Monetary Fund (IMF), an international institution formed in the latter years of World War II, in which the US exercises a near-dictatorial role. But this time, the bailout loans had something attached to them in addition to interest, called “conditionalities.” Like a loan shark saying, “You’ll pay the loan, the vig, and you’ll give me a controlling interest in your business, too.”
Ultimatums—“do this or else”—demanded (1) that Mexico’s internal markets be opened to US-based investors, including US multinational corporations, (2) that labor and environmental standards be rolled back to increase the rate of profit in order to pay back the restructured loans, and (3) that regressive taxes be imposed to parasitize Mexico’s general population . . . all to assist in the payback of the loans. A structural imperative, though not one of the specified conditions, was also that Mexican enterprises—in particular, Mexico’s agriculture—be converted from production for local consumption to (mostly agricultural) export products to get more of the US dollars required to service their restructured but now vastly expanded external debt.
Dollars, after all, were the world’s general currency, exchangeable for goods anywhere in the world, and required for the international payment of debts and especially for petroleum, the fuel for virtually all modern production. Mexico’s business class, through debt, was being enclosed, and in turn, they enclosed Mexico’s peasantry to convert their small holdings into combined, industrial, monoculture export crops. This created a huge pool of unlanded peasants who were forced to work for starvation wages on what was formerly their own land, seek survival in city sweatshops, or head north in search of work.
This dollar arrangement was named “dollar hegemony” by Chinese economist Henry C.K. Liu:
World trade is now a game in which the US produces dollars and the rest of the world produces things that dollars can buy. The world’s interlinked economies no longer trade to capture a comparative advantage; they compete in exports to capture needed dollars to service dollar-denominated foreign debts and to accumulate dollar reserves to sustain the exchange value of their domestic currencies. To prevent speculative and manipulative attacks on their currencies, the world’s central banks must acquire and hold dollar reserves in corresponding amounts to their currencies in circulation. The higher the market pressure to devalue a particular currency, the more dollar reserves its central bank must hold. This creates a built-in support for a strong dollar that in turn forces the world’s central banks to acquire and hold more dollar reserves. Dollar hegemony is created by the geopolitically constructed peculiarity that critical commodities, most notably oil, are still largely denominated in dollars (Russian oil now trades in yuan, too, but only with China). Everyone accepts dollars because dollars can buy oil. The recycling of petro-dollars is the price the US has extracted from oil-producing countries for US tolerance of the oil-exporting cartel since 1973.
There’s a popular internet-fueled theory that the Chinese, the Russians, and others will end dollar hegemony by banding together, which will somehow crash the dollar and collapse the US. While it’s true that many countries seeking “multipolarity” want to reduce their dependency on US dollars (as China is slowly trying to do now), there is quite simply no other currency that can replace the dollar’s global seigniorage. Currency in volume sufficient for international trade must be backstopped by an economy with sufficient scale to absorb it, and no other economy—not even the EU, which has the additional problem of being spread over several semi-sovereign nation-states—approaches that of the US. Anyway . . .
Using similar crises, the IMF proceeded over the next few years to impose these “conditionalities”—called structural adjustment programs, or SAPs—on the majority of nations on the global margins, effectively undermining their national sovereignty inasmuch as the IMF, the World Bank, and the World Trade Organization, all US-dominated pre-market institutions that manage the so-called “free” market, came to dictate the economic policies of these structurally-adjusted nations.
While these were originally emergency measures used to overcome the Mexican debt crisis, the Reagan administration soon realized that they had stumbled onto a model that could be employed around the world to open previously protected home markets to US investment under conditions that were extremely advantageous to US investors. Moreover, it was a way to capture the political leadership of debtor nations in a dollar-dominated system, which would come to be known as . . . here’s that word . . . neoliberalism.
Neoliberalism
This is a flyover at several thousand feet, and we’re necessarily overlooking many of the details of this process, but we need to establish a kind of historical context wherein this neoliberalism can be understood. Once we understand neoliberalism’s outlines, we can begin to analyze a great deal about US foreign policy, which has until very recently been largely formed by the imperatives of neoliberal policy, ever since the 1980s. It is the crisis of neoliberalism which is provoking the urge in many places to “decouple” from the world economy—easier said than done.
The four leaders now most associated with the origins of neoliberalism were Reagan, Thatcher, Pinochet, and Deng Xiaoping.
Neoliberalism itself—the world financial power structure now in its death throes—also created tens of trillions of dollars of fictional value in loans to fuel runaway speculation, resulting in a series of so-called financial “bubbles.”
By fictional value, we mean money that’s not matched in value by actual commodities in the home market (which should cause inflation, no?). Money then becomes not just a destructive “sign without a referent,” as Alf Hornborg called it, which dissolves ecosystems and communities; it becomes an entitlement for the rich, but an entitlement that obeys what my friend Dennis O’Neil calls “the cartoon law of gravity”: like Wile E. Coyote, the rentier economy runs along madly, shoots over the cliff, then suddenly realizes there is no ground underfoot, whereupon it drops disastrously.
(The problem is, in the US and elsewhere, Wile E. Coyote is never left to suffer the consequences. After every crash, the very rentier capitalists who made their parasitic fortunes as reckless speculators were “bailed out.” Their stranded assets were bought up by the government, and the presses ran overtime to print money for the rentiers, to replace all that fictional value. It’s like gambling in a casino where the house gives the biggest gamblers back their losses. That’s because—as we’ll get to eventually—these sociopathically irresponsible parasites choose who does or does not hold public office in the US.)
The US domestic economy has steadily, over the last five decades, lost productive power and increased the share of total wealth—held by fewer and fewer people—accumulated through rents, or “capital gains.” In 1960, US manufacturing jobs accounted for 28 percent of all US jobs. That’s fallen to seven percent. Meanwhile, adjusted GDP per capita since 1960 has risen eight percent. That widening gap has been filled by capital gains (rentier returns on investment). During this same process, wealth inequality has risen astronomically. These combined trends jumped into high gear just after 1980 with the emergence of the rentier neoliberal order. In 1978 the top 0.1 percent owned 7 percent of total US wealth. Today, they own 22 percent (a figure ominously mirroring 1929).
The Flip
The US-dominated financial system, called the “Dollar-Wall Street regime” by Peter Gowan and Susan Strange, also found a way to exercise managerial control over first world economies like Western Europe and emerging market economies like China and Brazil. This power was exercised not in the US role as creditor, but paradoxically in the US role as debtor, which requires a bit of an explanation.
This story actually begins at the end of World War II and continues to the present. The Soviet Union—itself savagely wounded by the war—attempted to secure a post-war partnership with its capitalist war allies in order to regroup. More than 27 million Soviet citizens had been killed, and cities as well as farms were in ruins all the way to Stalingrad. Tensions and mutual suspicion, as well as a struggle between the US and the USSR over the governance of postwar Europe, led to hostility.
The Truman administration opted for the National Security State as an industrial strategy that could capitalize on the ramp-up for the war, and it needed an enemy to justify the expenditures of what Eisenhower would later call the “military-industrial complex.” The overtures from the USSR for a post-war peace were rejected in favor of official hostility by Truman—who had dropped atomic bombs on two cities as a message to the Soviets. This provocative posture locked Western Europe into a military alliance with the US, called the North Atlantic Treaty Organization (NATO), and put an official stamp on the US foreign policy of Soviet “containment.”
This inaugurated a long period of proxy wars, the first in Korea, later in Vietnam. The US was enjoying the fruits of post-war dollar dominance, Keynesian high employment, and a robust trade surplus. But the militarization of US domestic and foreign policy simultaneously created a mounting national debt. The United States was indebting itself to other “developed” nations, borrowing money from Europeans to finance its military adventures in Asia, then printing extra money at home to make up the difference. Because the dollar’s value was fixed for redemption at 1/35th of an ounce of gold, the US could print money without fear of draining the dollar of its value, which was then being used for investment in Europe.
In the classical theoretical market, the value of a currency is determined by how it balances against a total aggregate of commodities. Too few units of currency and prices fall. Too many units of currency and prices rise. The latter, they say, is inflation.
So the dollar was losing purchasing power on the market, because it was being printed faster than actual commodities were being made, even as it remained exchangeable for European currencies at the same fixed rate. If you held Deutschmarks, for example, with the DM trading at four to one against the dollar—4 DM = $1—and one dollar bought four widgets on Monday but only three on Tuesday, then everyone in Germany paid a dollar on Monday was losing a widget’s worth of buying power in Tuesday’s exchange. And they had to accept that exchange as if the dollar still bought four widgets, whether they liked it or not.
The United States was printing more money, the dollar still fixed to gold, and the Europeans were watching their markets flooded with overvalued dollars, which they were then forced to accept. The market may have been saying that a dollar should be redeemable for francs or marks or pounds at one rate, but the post-war currency-control regime determined that Europeans had to continue to give away purchasing power with every currency exchange for devalued dollars. The US was exporting its inflation to Europe by repaying its military expansion debts to European lenders in dollars that exchanged the same while buying less.
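For concreteness, the widget arithmetic above can be run as a toy calculation. All the numbers here are the purely illustrative ones from the example (4 DM = $1, four widgets Monday, three Tuesday), nothing more:

```python
# Toy sketch of the fixed-rate purchasing-power squeeze described above.
# All figures are the illustrative ones from the text, not historical data.

FIXED_RATE_DM_PER_DOLLAR = 4  # exchange rate, fixed by the postwar regime

widgets_per_dollar_monday = 4   # what a dollar buys on Monday
widgets_per_dollar_tuesday = 3  # what that same dollar buys on Tuesday

# A German holder still surrenders goods priced at 4 DM for each dollar,
# but the dollar now commands one widget less on the market:
loss_in_widgets = widgets_per_dollar_monday - widgets_per_dollar_tuesday
loss_fraction = loss_in_widgets / widgets_per_dollar_monday

print(f"Buying power lost per exchanged dollar: "
      f"{loss_in_widgets} widget ({loss_fraction:.0%})")
```

The point of the arithmetic is simply that the fixed rate hides the loss: the DM-holder’s side of the trade is unchanged while the dollar’s side shrinks, and the difference is exported to whoever must accept the dollars.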
When the first Special Forces advisors went to Vietnam in 1957, the system which seemed robust on the surface was already generating the conditions for its next crisis.
The Europeans, buying gold elsewhere at well above $35 an ounce, held onto their dollar denominated assets, hoping to redeem their dollars at something approaching their initial investment later.
But by 1967, with the Vietnam War driving the US deficit to record levels, France realized this was a bad gamble and started cashing US dollars in for US gold, draining off the US gold stocks. France was calling the US bluff. The Keynesian system of tightly controlling rentier capitalists, which included fixed currency exchange rates pegged to a gold-backed dollar, began to collapse in the face of (ever more expensive) US militarism.
On March 31, 1968, millions of Americans heard Lyndon Johnson announce on television that he would not seek re-election, and that he would not substantially escalate the Vietnam War after the strategic setback of the Tet Offensive nearly two months earlier.
Unknown by the public at large, the depletion of the US gold holdings had abruptly altered the country’s military policy. As financial historian Michael Hudson noted, “The European financiers were forcing peace on us. For the first time in American history, our European creditors had forced the resignation of an American president.”
When the 1968 elections arrived, we saw a scenario that mirrored our more recent quagmire in Afghanistan. Democrats could not publicly argue for an end to the war, because withdrawal would mark the destruction of the myth of US military invincibility.
The options available in response to the collapse of the US Gold Pool were to (1) withdraw from Vietnam, (2) continue the war and accept further losses of gold—and with it the erosion of US global power—or (3) force the abandonment of the entire Bretton Woods regime—the post-World War II international financial order—beginning with the gold standard.
Because the Democrats had alienated a huge fraction of their base by refusing to oppose the war (as they are now alienating a big fraction with Democratic support for Israel’s genocidal ethnic cleansing of Gaza), Republican Richard Nixon was elected.
In 1971, Nixon selected Option 3. He abandoned the gold standard. Now the dollar was truly a fiat currency. This was a staggering checkmate against the US’s alleged global allies. They then had to do something with their trainloads of dollars (and dollars owed to them by the US) to prevent the dollar’s uncontrolled devaluation.
Quoting Hudson:
By going off the gold standard at the precise moment that it did, the United States obliged the world’s central banks to finance the US balance-of-payments deficit by using their surplus dollars to buy US Treasury bonds [loans to the US Government], whose volume quickly exceeded America’s ability or intention to pay.
Twenty-five years [after WWII], the United States [discovered] the inherent advantage of being a world debtor. Foreign holders of any nation’s promissory notes are obliged to become a market for its exports as the means of obtaining satisfaction of their debts.
As the old saying goes, “if you owe the bank a thousand dollars, you have a problem. If you owe the bank a million dollars, the bank has a problem.”
Nixon had not only erased volumes of US debt held by allies and forced perpetual European support for US military expenditures with the threat of tearing everyone’s financial house down, he had opened the door for rentier capitalists to escape the limitations put on them during the New Deal. That is precisely why Peter Gowan referred to Nixon’s risky destruction of the Bretton Woods’ gold standard and subsequent abandonment of fixed currency exchange rates as the “global gamble.”
Hot Money
Susan Strange referred to the new way as “casino capitalism.” The rentier capitalists were free to gamble without constraints; but more importantly, the US government, in collusion with Wall Street, had inadvertently discovered a new weapon to use against recalcitrant “allies.”
We need to divert here for a moment to take up a new term: “hot money.”
Fiat currency increases the velocity of exchange (computers have now become a force multiplier for this exchange velocity). Karl Polanyi warned that money was a dangerous thing to turn into a commodity itself, but with multiple currencies, and the ability of people to accumulate money by gambling on shifting exchange rates, money itself came to be bought and sold on global markets.
When I was in El Salvador in 1985, the official (bank) exchange rate was four colones to one dollar; but the demand for dollars by people involved in international trade—legal and illegal—was so great that one could sell dollars for colones on the street through a mafiosi-network of street-corner money-changers for 8-10 colones per dollar. Who’s going to exchange them at the bank for four to one?
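That street-market spread can be put in a few lines of arithmetic; the rates are the author's recollection from 1985, not official figures.

```python
# Official vs. street exchange rates in the El Salvador example:
# 4 colones per dollar at the bank vs. 8-10 on the street (figures from the text).

OFFICIAL_RATE = 4.0                   # colones per dollar at the bank
STREET_LOW, STREET_HIGH = 8.0, 10.0   # colones per dollar on the street

def street_premium(street_rate: float, official_rate: float) -> float:
    """Fractional gain from selling a dollar on the street instead of at the bank."""
    return street_rate / official_rate - 1.0

print(street_premium(STREET_LOW, OFFICIAL_RATE))   # 1.0 -> a 100% premium
print(street_premium(STREET_HIGH, OFFICIAL_RATE))  # 1.5 -> a 150% premium
```

A 100 to 150 percent premium is exactly why nobody exchanged at the bank: the official rate was a fiction the street had already repriced.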
Ultra-rich rentiers do the same thing on world markets. They look at whole national currencies, at whether they expect them to go up or down against dollars, and they buy or sell accordingly. Speculators can also buy national bonds and other variable-interest instruments in order to cash out during the rise and fall of that asset’s value. With derivatives (more on these later), you can even bet on whether these assets will rise or fall.
This money that chases across the globe looking for a good bet and a fast return is called “hot money.”
Investment for production is cool money. The two poles—productive and speculative—can work together, because the financial/speculative pole often bets on stocks and so forth that are invested in some productive activity. But economies get into trouble when hot money dominates cool money. That’s why hot money has to be “repressed” . . . hard.
The Great Depression (1929-39) was the longest and harshest economic downturn in US history—one that dragged down a good part of the rest of the world. Rampant speculation led to a massive stock bubble that burst in October 1929, causing half of all US banks to fail, wiping out the life savings of millions, creating mass unemployment as people quit spending, and vaulting Franklin Roosevelt into the White House. The Roosevelt Administration’s restorative policies included a legal firewall between speculation and savings institutions called the Glass-Steagall Act.
What we discovered during the Great Depression is that people like to gamble more than they like to work, something every casino owner already knows; and when you let the “cool money” of enterprise investment mix too freely with the “hot money” of speculation, the hot money takes over. Money is not infinite (except apparently during rich-people bailouts). So, when everyone seems to be accumulating rapidly, there’s an incentive to get into the market “while it’s hot.”
The speculative pole then siphons off investment in the productive pole and comes to dominate the market, because more and more people with power and money are part of the speculative pole. A herd mentality takes over in these “capital markets.” Speculative markets in particular are psychological markets that can do things utterly at odds with the material reality around them. This is what fueled the tulip craze in seventeenth-century Holland, where the public value of tulips and tulip bulbs was caught in a competitive price tsunami that led to one type of tulip—arguably one of the easiest to grow—being sold for the price of four oxen. Later economists would use a metaphor for these sudden and inexplicable flows of hot money into a fictional space of overvaluation: “bubbles.”
Financial speculation bubbles keep growing as long as everyone is willing to pretend that this fictional value is real, until a few people stop and say, “Hey, this can’t last forever, so I’m selling this shit while I can,” or when the market is hit with a rogue wave of loan defaults. Then we see the psychological contagion, or that “cartoon law of gravity.” Everyone starts selling, the value of the tulips or stocks or bundled financial instruments (like “derivatives”) plummets, and before everyone can get out the doors with their parachutes, it spirals down out of control and mushrooms into flame and smoke against the unforgiving ground of material reality.
The New Deal’s Glass-Steagall Act prohibited savings and loan institutions from working under the same roof with speculators, because during the Great Depression, this speculative crash wiped out the banks and with them most people’s life savings. (During the Clinton administration, this Act was repealed, and we have had serial bubble crises since. Since then, hot money from Wall Street has run the government. More on this further along.)
With that short overview, let’s return again to the Nixon administration.
Bad Marriage
Long story short, the US dollar was the dominant currency, and if you were a nation other than the United States, you needed to protect your own currency from fluctuations and even speculative attacks. This new reality obliged central banks abroad to hold US dollars—in the form of US Treasury Bonds—in reserve, as a defense against speculative attacks on their own currencies. These nations then became US creditors—the US owed them money. Treasury Bonds are IOUs from the US Treasury Department saying you have loaned money to the United States. But these new creditors were then the banks who—as in the banker joke—had the problem. To this day, every holder of US Treasury Bonds knows that the total debt owed by the United States is categorically unpayable. So no one—including China, about which there is a great deal of financial fear-mongering—can afford simply to dump dollars and risk crashing the dollar’s purchasing power, nor can they demand their money back at once. Too many nations hold too many dollars to “sell the dollar down” without cutting off their noses to spite their faces. And yet, the US has neither the capacity nor the intention of paying back those loans.
In 2016 China held 1.3 trillion dollars in US Treasury Bonds. When Trump became belligerent with China, China sold its share down to about $821 billion—still a hell of a lot of money. Japan holds $1.112 trillion. The UK holds $662 billion. Ireland holds $271 billion. Brazil holds more than $224 billion. The list goes on. If China were to initiate—as some sinophobes suggest—a cash-out of its Treasury Bonds, and that cash-out caused a run on the dollar destroying half its value, China would see the purchasing power of its own $821 billion cut in half mid-sale. This is a game of chicken that the US has, so far, won every time.
The key to dominance in the world of the late twentieth and early twenty-first centuries has been pure dependency, in many cases, and inter-dependency, but of a very unequal nature, in others.
The latter is like a bad marriage. The husband depends on his wife for the management of the household, for a lot of unpaid labor, and for the care of children, and the wife depends on the husband for economic security. In the event of a divorce, however, we find that the wife comes off much worse than the husband, giving the husband a constant if unspoken threat to hold over the head of the wife. They depend on one another, but that interdependence is not synonymous with equal status or parity of power. It is mutual dependency on an axis of domination and subordination.
This is how US foreign policy is constructed, as dependencies and/or inter-dependencies in which the US is always the dominant partner. And there are few things that human beings depend on more urgently than food; which brings us to a subject that is mixed up with finance, but not the same as finance, and which also impinges heavily on the subject of neoliberal trade deals and immigration.
Before we attack this subject, though, we need another short historical excursion.
Food
Money is not biologically necessary for life. Human life sustained itself before modern money. Human life cannot be biologically sustained, however, without food. The topic of food requires just a bit of chemistry and biology. We’ll start with nitrogen.
Nitrogen is a chemical component of our world, necessary for most plant growth, therefore necessary for food, and therefore necessary for our survival. Oddly enough, after Timothy McVeigh blew up a federal office building in Oklahoma City, everyone—even non-farmers—came to know that fertilizer is made with nitrogen. But nitrogen is the most abundant element in the atmosphere, so why should anyone have to “produce” it industrially as a fertilizer? We live our entire lives literally swimming in the stuff.
As it turns out, atmospheric nitrogen, like atmospheric oxygen, is a conjoined twin: a molecule of two atoms bonded together, N2. Before plants can use nitrogen to turn sunlight into food, that bond has to be broken and the nitrogen combined with other elements into reactive compounds such as ammonia. The process is called biological nitrogen fixation. Prior to human intervention, it was accomplished by diazotrophs—nitrogen-fixing prokaryotes (non-nucleated microbes) that convert N2 into ammonia.
World War I saw the widespread introduction of a new technology: the machine gun. The adherence to pre–machine gun tactical doctrines led to huge armies being mowed down like grass, whereupon they hid from machine guns in pestilential trenches, where they became bogged down. One of the bright ideas for taking advantage of this horror-film stalemate was killing the enemy with poisonous gas.
Fritz Haber, a German-Jewish chemist, directed the Berlin-based Kaiser Wilhelm Institute for Physical Chemistry, and during the war one of his jobs became the development of chemical weapons. His institute’s pesticide research would later lead to Zyklon B, a cyanide derivative, which would be used to help exterminate millions of his fellow Jews; but during WWI he was preoccupied with chlorine and ammonia for the development of poisonous gases for the battlefield. His other preoccupation was artificial nitrogen fixation. He had learned how to do that by combining hydrogen and N2 under heat and pressure, over an iron catalyst promoted with aluminum oxide.
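The reaction Haber tamed can be written out; the temperature and pressure ranges below are standard textbook figures for the industrial process, not numbers from this text.

```latex
\mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3}
\qquad \Delta H \approx -92\ \mathrm{kJ\,mol^{-1}}
```

The reaction releases heat but is painfully slow at ordinary conditions, which is why the industrial process runs at roughly 400–500 °C and 150–300 atmospheres over the promoted iron catalyst: brute heat and pressure, plus the catalyst, substitute for what soil bacteria do at ambient temperature.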
He’d already patented this process before the war; but it would take Carl Bosch, the eventual co-founder of I. G. Farben (the conglomerate whose affiliate manufactured Zyklon B for the Nazis), to commercialize the process, which would eventually establish the basis for the population explosion from 1.6 billion in 1900 to around 8 billion today. How?
Well, what he’d made was chemical fertilizer, and it meant that even exhausted land might continue to be “productive.” The food that feeds those additional billions of people is largely produced with the assistance of chemical fertilizers and those chemical poisons which began their careers killing soldiers in the trenches.
“Heat and pressure” to split nitrogen are not some seemingly infinite essence like space, nor are they immediately available like atmospheric nitrogen. They are dependent phenomena. They have to be concentrated through some procedure.
Haber was looking at a crisis created by the depletion of guano—bird and bat droppings used as fertilizer—mostly collected from islands off the Pacific coast of South America; so he settled on a scheme that depended on another substance, one in far greater abundance than the remaining guano: fossil hydrocarbons, or “fossil fuels.”
It worked like a charm.
After WWII, American farmers were using prodigious quantities of chemical fertilizer across prodigious expanses of arable land, along with organophosphate insecticides—chemical cousins of nerve gas—expanding their harvests far beyond the American public’s capacity to consume.
Enclosure
The American manufacturing base had also expanded during the war; and because the United States never suffered the devastation visited on Europe and Asia, it emerged from the war as a uniquely powerful actor. The other variable in the expansion of food production was the mechanization of agriculture, another net consumer of fossil energy. The United States began to build farm machinery; and as part of its goal of maximizing profit for the farm machinery and agricultural chemical industries, it began to promote something called “developmentalism” for the so-called “underdeveloped” nations (markets for farm machinery and ag-chemicals).
In 1943, the Rockefeller Foundation, Ford Motor Company, and the Mexican government established a joint venture called—in English—the International Maize and Wheat Improvement Center. Standard Oil—a Rockefeller company—was manufacturing fertilizer, and Ford was building tractors. This was the beginning of the organized effort by first world corporations, with the active support of the US government, to push agricultural commodities into these so-called “underdeveloped” nations as part of what Illich would name as the “war on subsistence.” By 1959, Big Ag had opened rural development academies in Pakistan, and by 1963 in the Philippines. These academies were performing research and development on high-yielding cultivars of wheat, corn, and rice. By the time of the Nixon administration, 120 of the largest agribusiness multinationals had established a joint program with the United Nations Food and Agriculture Organization (UNFAO). The transformation in agriculture that followed was called “the Green Revolution,” a term coined in 1968 by US Agency for International Development Director William Gaud.
If ever there were a “revolution from above,” this was it. And it did accomplish a great deal. Caloric intake from cereal grains worldwide increased 30 percent per capita by 1990, and the prices of grains fell. The availability of more staple grains also supported a doubling of world population between 1960 and 2000. (This is the opposite cause-effect relation from what Malthus had theorized.)
These very general statistics do not tell the whole story though. Jason Moore has defined four “strategic commodities” necessary to keep modern economies afloat as the “Four Cheaps”: cheap labor, cheap raw materials, cheap energy, and cheap food. If some or all of these commodities suddenly rise in cost, profit margins are threatened. Dramatic price increases in any or all of them are sometimes called “signal crises,” because they signal a profit crisis ahead if business and the state can’t get those costs back down. For any fresh upwave of accumulation, big business needs more cheap labor, and more cheap labor requires more cheap food.
Ivan Illich pointed out (above) how Jesus suggests that if you worship money, you worship power. We are looking at how money operates in enormous concentrations in the citadels of power, in a world where money-dependency has captured most of the population, through a successful “war against subsistence” (called development) . . . in other words, through global monetary enclosure.
Araghi and Karides describe modern enclosure as having five characteristics: “(1) the transformation of a complex system of customary rights to land usage to legal and written titles to land ownership, (2) the transformation of the concept of property from jurisdiction over ambiguously defined areas to concretely defined (and enclosed) physical spaces, (3) the rationalization of the use of such demarcated landed property as a form of capital and at the service of ‘primitive’ and expanded capital accumulation, (4) the increasing privatization of the earth’s surface through dispossession and displacement of peasants and indigenous populations, and (5) destruction of non-market access to food and self-sustenance and creation of a (mobile) global working class that is massively concentrated at the urban centers of the world economy (and often living a life under a regime of ‘forced underconsumption’).”
A built-in “condition” of World Bank development loans was that recipient nations industrialize their agriculture. Smallholders were pushed off land (enclosure) to make way for large monoculture fields. Mechanization cut the number of necessary field workers to a fraction, and a process began whereby millions of formerly rural people—who were monetarily poor, but capable of self-reliant subsistence agriculture—were pushed into cities, where they came to rely more directly on the mass-produced staple cereals, which they now had to buy, and where they provided a windfall to urban manufactories of desperately cheap labor.
The agricultural production of peripheral nations was being exported in order to get precious US dollars for use in international markets and to service external debts. The agri-barons of the periphery were not feeding their own countries but engaging in monoculture for export—coffee, sugar, bananas (ergo the term “banana republic”).
Dumping
Urban hunger is a specter that most leaders understand only too well. I witnessed two food riots when I was in Haiti, and I can say they were among the most memorable experiences of my life. Political leaders know very well that mass urban hunger is a recipe for political destabilization, and they avoid it at all costs. Because many of these nations were exporting crops (to get the dollars to pay the US rentiers), they fell short in providing basic nutrition to their own growing urban populations.
The United States, however, was uniquely positioned to take advantage of this situation, because the agricultural subsidies of the New Deal, originally meant to rescue family farms, had been carried forward to the benefit of large agribusiness corporations that were eagerly bulldozing the American family farm into history’s landfill. Price supports for US grains, as well as cotton, peanuts, and tobacco, meant that agribusiness could produce as much grain as possible, and for every bushel produced the government would pay them enough of a subsidy to secure a tidy profit.
This, along with the arable landmass of the American Midwest, quickly led to massive overproduction of US cereal grains in the face of periodic shortages around the world, which gave US agribusiness unprecedented pricing power in global grain markets. As the United States sought a way out of its occupation of Vietnam in the early 1970s, the dominance of US grain production in the world was used as a foreign policy weapon that rewarded clients and twisted the arms of nations that appeared reluctant to follow the American diktat.
Grain was on a lot of political minds those days. Hubert Humphrey, the 1968 Democratic nominee for the presidency, had allegedly received an illegal campaign contribution of $100,000—a fact that came out during the Watergate hearings. The same contributor would also give the Nixon administration $25,000 to assist in its cover-up of the Watergate break-in. These sums were not as insubstantial then as they may seem now. Not many people had heard then of this fountain of largesse, whose name was Dwayne Andreas. Andreas pushed through a historic grain sale to the Soviet Union for the Nixon administration, worth $700 million ($8.94 billion in today’s dollars), with his company as the middleman. That company was named Archer Daniels Midland.
It was the next year, however, when Green Revolution food production ran headlong into the aforementioned Arab oil embargo. It’s here that we can see how the requirement for cheap food to sustain accumulation and the history of the Green Revolution as an instrument of US foreign policy combined with the emergence of neoliberal finance that gestated during the Nixon administration.
Moore explains how the loss of exploitable frontiers for “cheap nature” forced big capitalists—whose rates of profit were falling as cheap resources hit their limits—to turn inward from productive capitalism to the dominance of the rentiers, taking rent-returns on investment from existing wealth.
Neoliberalism’s financialized and coercive strategies of redistribution are now looking like a case of killing the goose that laid the golden eggs. There are, it seems, few golden eggs left to appropriate. This extractive strategy revived accumulation, but it did so by cannibalizing the accomplishments of the Fordist-Keynesian order. On the one hand, finance capital achieved its hegemony at a moment when the system’s capacity to restore the Four Cheaps was weaker than ever. On the other hand, the hegemony of finance capital has exhausted capitalism’s greatest source of dynamism, found in successive scientific-technological revolutions that have [raised] labor productivity, and subordinated extra-human nature in its pursuit. This double exhaustion of productivity and plunder strategies is not coincidental with the hegemony of finance capital, but the condition of its birth. Neoliberal capitalism, it seems, has been cooking goose for dinner. (Moore, “Cheap Food and Bad Money,” 231–32.)
By 1973, the US was running not a trade surplus but a deficit of $6.4 billion. Even more momentously and permanently, US domestic production of sweet (easily pumped and refined) crude oil had peaked in 1972 and was now in a decline that would increase US dependence on imports of this commodity into the foreseeable future.
Oil remained the principal feedstock of American domestic agriculture, and of the Green Revolution, which was forcing the more marginal and dependent nation-states into a new order of debt colonialism. At the same time, the US became increasingly dependent on fossil energy imported from abroad, not merely to power its machines and transport, but to eat and to maintain US power over global food markets. Even the Soviet Union had been pulled into the American grain-trade orbit by Nixon, proving the recently departed psychopath (may God have mercy upon his soul) Henry Kissinger’s thesis that the food-weapon was “more powerful than missiles.”
This peak was in “sweet crude,” and the decline was not reversed until hydraulic fracturing technology was widely employed during the Bush and Obama administrations. “Fracking,” as it is called, does not extract sweet crude, but oil trapped between rock layers. It created a fresh spike in US production, which waxed with high oil prices and waned with price slumps. However, these wells pollute water and destabilize local geology, even causing earthquakes. Moreover, the average “fracking” well itself peaks within eighteen months, then goes into a gradual and inevitable decline. The burst of “fracking” activity is certainly temporary, and once played out it will return the US to pre-fracking levels and foreign dependence. (See Magill, “Fracking Boom Leading to Fracking Bust.”) Beginning in June 2014, the Saudis flooded the market with oil to drop the price worldwide, in a double-strike against US domestic producers and Iran, which led to temporary but historic lows over three years in prices at the pump. That ended with the Ukraine war, to which oil companies added their own price gouging.
The increasing dependency of marginal nations on American agricultural goods, combined with American debt-coercion of those nations to adopt the industrial capitalist model for export agriculture, would lead to decreases in marginal nations’ per-capita food production as well as financial and ecological bankruptcy.
Nixon broke up the old order; but the new order was not firmly established until the Reagan administration. In the interim, after a period of three years’ stewardship of the White House by the eminently forgettable Gerald Ford, the next elected president would have a dual résumé—former naval officer and agribusiness CEO: Jimmy Carter.
Jimmy Carter—who later repented of much—was then a southern agribusiness plutocrat posing as a good ol’ boy (a peanut “farmer”). Under Carter, an interesting thing happened. Something Southern folk in my family used to call “white liquor” or “white lightning” became legal and began magnetizing massive cash flows from US taxpayers in the form of corn subsidies. Corn alcohol had been produced for many years by rural scofflaws. My own father did a short stretch in the hoosegow when he was discovered with a car trunk full of it in the 1930s.
But when Nixon was taking money from Dwayne Andreas, the CEO of the sugar and corn conglomerate Archer Daniels Midland, ADM was concocting a new scheme that would simultaneously justify more “farm” subsidies to agribusiness and claim to address the energy crisis of 1973, which was also such a windfall to Wall Street. The scheme was to make massive quantities of corn liquor, which is of course flammable, and re-christen it “ethanol.”
This was proposed as an “energy independence” measure for the United States. It is made, naturally, with sugar and corn.
ADM found a friend in Jimmy Carter. Carter called the energy crisis the “moral equivalent of war,” and his administration exempted ethanol-spiked gasoline from a federal fuel tax. Carter began a loan program to build ethanol plants, which was halted by the Reagan administration for a while, until “farm” lobbyists paid serial visits to Capitol Hill with their checkbooks, whereupon the Reagan administration recanted. Neither party would challenge agribusiness subsidies; and both parties finally became avid ethanol boosters.
It was this influence, in conjunction with neoliberal “free trade” policies that allowed US grain producers to begin a process called “agricultural dumping.” Dumping is introducing a surplus into a foreign market below market value, which results in local producers’ inability to compete. Taxpayer-subsidized US corn, for example, is still routinely dumped into foreign markets at prices as low as 30 percent of market value. This leads to bankrupted local markets, and a growing and increasingly poor, “enclosed” urban population that becomes hostage to money dependency (as opposed to subsistence) in an imperial food market (and a surplus population that leaves its home country en masse in search of survival elsewhere).
A Mexican farm family who grows traditional corn is wiped out by genetically modified, chemical-industrial corn that is subsidized by a foreign power. The family loses their land to debt, moves to the city, where they may or may not find work to get money to feed themselves, and barring that, they may take the risk of illegal migration to the north to find work in the United States. One seldom hears about neoliberalism or agricultural dumping when the subject of extralegal immigration comes up in the United States; but the connections are clear.
United States policies created the conditions that made mass migration inevitable. After the NAFTA provisions allowing US dumping in Mexico went into effect, between 1997 and 2004, taxpayer-subsidized US corn exports increased by 413 percent, while Mexican corn production fell by 50 percent on the back of a 66 percent fall in the price of Mexican corn. In the same period, US soybean production increased by 159 percent, and Mexican soybean production decreased by 83 percent alongside a 67 percent price fall. Mexican pork production fell by 40 percent, corresponding to a 707 percent increase in US exports. Pork itself is not directly subsidized, but the corn that feeds industrial pork is.
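For readers untangling those percentages, a small hypothetical helper (not from any cited source) shows how a percent change maps onto a multiplier:

```python
# Percent changes as multipliers: a "413 percent increase" means the final
# value is (1 + 4.13) times the initial; a "50 percent fall" is (1 - 0.50) times.

def apply_pct_change(initial: float, pct: float) -> float:
    """Apply a percentage change (positive or negative) to a starting value."""
    return initial * (1.0 + pct / 100.0)

print(apply_pct_change(100.0, 413.0))  # ~513.0 -> exports grew to roughly 5x baseline
print(apply_pct_change(100.0, -50.0))  # 50.0  -> production halved
```

The asymmetry matters: a 413 percent rise in subsidized exports against a 50 percent fall in domestic production means the imported grain did not merely fill a gap, it displaced the local market.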
When the North American Free Trade Agreement (a neoliberal project) was implemented in 1994, proponents had claimed that opening US markets to Mexican agricultural products would have the knock-on effect of raising Mexican wages; but what it did was accelerate enclosure by industrial monocrop operations in Mexico and cause an epidemic of landlessness. Land loss, ag-dumping, and structural adjustment programs (debt serviced by regressive taxation) worsened dramatically with NAFTA, and within one year of its implementation, Mexican illegal immigration into the United States quadrupled. It’s not a coincidence that NAFTA corresponds to the most massive wave of Mexican immigration to the United States in history. No one leaves behind family and familiarity without a very compelling reason. People who are worried about immigration need to turn their ire away from the immigrants themselves and onto US finance capital and agribusiness.
To review: the combination of developmental imperatives to mechanize and enclose agriculture for monocrop production, along with agricultural dumping by the United States, had created a situation in which most of the rapidly urbanizing world had become dependent on US grain, or US seeds and chemicals, in order to eat. US foreign policy pertaining to food had become that “war on subsistence.” The androcentric cliché for holding power over others, “having them by the balls,” might better be replaced by “having them by the bellies.”
Militarism
US international power politics still combines neoliberal debt traps with food monopolization as a mechanism of indirect control over a good deal of the globe. This is not, however, sufficient to exercise the kind of total dominance the US would require to halt the very real decay of its power—a decay that results from various kinds of imperial overstretch, and from the fact that the world-ecology of neoliberal capitalism inevitably runs up against its own physical limits. The debt regime is not sustainable. The energy regime upon which the current system depends is not sustainable. The material resources upon which economic expansion is based are finite. The destruction of the ecological stability upon which all of these rely is not sustainable. And the tolerance of others reaches limits.
Sometimes that end-of-patience thing comes out sideways. Sometimes it gets blended with international rivalries. Sometimes both at once.
The fallback position of any imperial power, when indirect controls are no longer effective, is direct control in the form of violence. That’s one of the reasons the United States—with some of the best naturally defensible borders in the world, an impossibly large land mass for any would-be invader, and a population unfortunately armed to the teeth—maintains a military force that’s more expensive than the combined military forces of Russia, China, Saudi Arabia, the United Kingdom, India, France, Japan, and Germany. If you include the nuclear weapons programs in the Department of Energy budget, as well as foreign intelligence budgets, our war spending exceeds that of the rest of the world combined.
Calling the War Department the Department of Defense is perhaps the most blatantly obvious example of PR-speak you might ever hope to encounter. The US military is almost exclusively dedicated to missions of occupation and aggression, or profiteering support of other militaries . . . abroad.
Moreover, the force component of US foreign policy is not merely the uniformed services; it includes a shadowy and well-financed covert operations component that allows military actions by US-directed surrogates to provide an element of plausible deniability to US actions which might undermine official ideological claims of commitment to principles like “freedom,” “human rights,” and “democracy.” This covert establishment employs (circa 2013, after which these numbers were hidden from public view) 83,675 civilians, 23,400 military, and 21,800 paramilitary contractors, at a cost of $99.6 billion annually.
Neoliberal theology asserts the primacy of the private and the value of small government; but neoliberal practice has been massively buttressed by both states and an interstate rules regime. The maintenance of the market economy—as Karl Polanyi pointed out almost 70 years ago—requires a network of financial and regulatory institutions as its overseers. Without the state’s affirmative actions on behalf of the international business class, including military force, this class and its power would collapse. Consider, to begin with, how those six US Navy carrier strike groups are required to ensure the flow of fossil hydrocarbons into the high technology centers.
The failed attempt to conquer Iraq in 2003, while it certainly involved oil, was also part of an effort to maintain a forward-deployed US military capable of strategic intervention far from home. The Cold War had ended, and the disposition of US military forces appeared to have become obsolete. Forward-based forces needed to be redeployed from positions that were calculated to contain the USSR into positions that would give the United States more capacity to intervene in energy-rich Southwest Asia, to put the imperial hand on the spigots of global energy. The failure to secure that political objective—given that military success is measured not by tactical but by political outcomes—meant that the US objectively lost the war in Iraq. The goal of the Iraq invasion was permanent bases (the few small installations that remain, mostly in Iraqi Kurdistan and with few solid defenses, are now under frequent attack); but instead the Bush administration managed to win the Iran-Iraq war . . . on behalf of its nemesis, Iran. And now the new war in Ukraine has brought the military focus back to Europe.
The Obama administration then decided that the next best thing was to establish forward bases near the Middle East and in the Asia-Pacific Theater to contain China (a policy continued under both Trump and Biden); and the Obama administration vastly expanded the role of covert operations forces, as well as armed mercenaries, in its expansion of the Afghanistan War to five additional countries, as well as increasing its covert operations against Iran. This evolved into an utterly incoherent set of actions under Trump, followed by the ignominious departure from Afghanistan under Biden (still a correct decision) in 2021.
Obama’s administration was instrumental in the execution and consolidation of the coup against the democratically elected president of Honduras in 2009, just as the Bush administration was in the failed coup against the democratically elected president of Venezuela in 2002, and the US-engineered successful coup against the democratically elected government of Haiti in 2004. In two cases, the offending parties—President Chavez of Venezuela and President Zelaya of Honduras—were guilty of defying the Washington Consensus, that is, of opposing neoliberalism. President Aristide had merely criticized neoliberalism.
More than strategic interests drive the reliance on military operations. Dollar hegemony had created a “strong dollar,” which inhibited the purchase of American goods abroad. The expansion of the military-industrial complex through huge defense contracts has served as a counter-balance to this trade imbalance by acting as a surrogate export market for manufacturing that remains inside the United States.
The reason the taxpayers are not bailing out Lockheed Martin, Northrop Grumman, Boeing, General Dynamics, Raytheon, KBR, SAIC, Dyncorp, Hewlett-Packard, and a host of other major American corporations, including General Electric, Motorola, Goodrich, and Westinghouse, is that the margin of earnings that ensures their continued viability as profit-taking enterprises comes from DOD “cost plus” contracts that guarantee the government will pay for all “overruns.” If war spending were ended tomorrow, the United States would experience a dramatic loss of jobs across many Congressional districts firmly hitched to the DOD pork wagon.
American foreign policy is amphibious. It operates through both the wet depths of public institutions and the dry lands of private institutions, and it includes a powerful and effective public-private perception-management apparatus.
Conspiracy
One of the key advantages of the public-private partnership is that foreign policy is insulated from accountability. The boundaries are blurred, via contracts and memoranda of understanding, between the US public sector—with its administrative apparatus, and its military and intelligence establishment with their vast budgets—and the private sector, composed of mercenaries, publicly funded “non-governmental organizations,” think tanks, foundations, and an army of horizontally-integrated perception-managers.
Those perception managers use mass media as a conformity-producing web of influence that reaches right into our living rooms and smart phone displays. The average American spends 6 hours and 59 minutes a day watching television or digital media. To appreciate the latent power of these media, realize that the average college class has a student in tow for three hours a week, approximately 45 hours for an entire course, excluding out-of-classroom study.
The limits of public discourse are established de facto by media which operate on the same liberal market principles as the people who own them, and which exercise dominance within the government and in those sectors sometimes called civil society. The media, the governing apparatus, and “civil society” are in fact three faces of the same class interests.
In saying this, I am obliged to clear up a common misunderstanding of what I mean to say. It is easy to jump from the very general outline I’ve presented of three aspects of US foreign policy—finance, food, and force—to the conclusion that these facts support the idea that there’s some conscious group of powerful conspirators who direct the world.
On the contrary, I need to emphasize that this way-of-the-world has evolved through a series of contingencies, and that its stability is maintained precisely because it’s a self-organizing whole held together by the mass aggregate of adaptive habits and social inertia. Its most powerful actors are in many ways as constrained, or more constrained, by the suprapersonal organization of “the regime” than most of us are. We each play our parts, and while some situational conspiracies have always been part of the terrain of politics, they are limited in scope, reactive, and far less determinative of large-scale outcomes than, say, changes in the built environment, climate, demographic shifts, institutional inertia, supra-personal power-structures, or the overdeterminative power of generalized money-dependency.
Remember that in our saga about the birth of neoliberalism there was no straight line, but a confluence of events and contingent decisions: the French buying US gold, Nixon dropping the gold standard, the Egyptian war for the Sinai, the American decision to airlift TOW missiles to the Israelis, the decision of Arab oil producers to embargo oil to the US, the US balance of payments deficit, Nixon dropping fixed currency exchange rates, rising oil prices creating petrodollars, the petrodollar tsunami being converted into opportunistic development loans, the Mexican threat of default, and so it goes. These were not grandiose plots, but actions and reactions, each producing a number of unintended or unanticipated consequences, which stimulated new actions and reactions.
The belief in a conspiratorial view of history seems to me to be a psychological security blanket against the fear of chaos . . . a defense mechanism, if you will. If the world is not as one would like it, at least a conspiratorial view of history suggests that history as a process is still subject to human control, and once some imaginary “we” can wrest control from the unjust conspirators, the world can be made right again.
If there was an overarching cause for these developments, it is not conspiracy, but the raw reality of a finite world facing the consequences of a regime based on the ceaseless expansion of monetized accumulation. Money, by its very nature (as I showed at some length in Mammon’s Ecology) chews up ecosystems and communities, but there are very real limits to how far this combined process can go before it undermines the basis of its own existence.
They are cooking the goose that once laid the golden eggs.
Debt & Religion
The seeming unpredictability, this sense of instability that compels some of us to impose order on chaos with a conspiratorial history, has been produced by the current political milieu, one wherein neoliberalism has disembedded economies from local control and re-embedded them in national and transnational institutions; and those institutions are themselves now experiencing a loss of control in the face of unanticipated changes.
Structural adjustment programs have become political lightning rods igniting mass unrest around the world. Green Revolution agriculture has spawned megacities that are entropic black holes, teeming with hopelessness and crime. The US military, long considered the guarantor of last instance for the world order, has proven to be both the least cost-effective institution on the planet, its biggest institutional oil-burner, and a perennial source of new resistance and unintended outcomes. In Iraq and Afghanistan, the myth of US military invincibility was shattered; and the costs of the Southwest Asia wars bled the US Treasury white. Offshoring of US industry and the political empowerment of rentier capitalists—Wall Street—both accomplished through foreign policy, have transformed much of the US domestic population not merely into wage workers marketing themselves as commodities, but into debt slaves.
Consumer debt in the United States is above $17.29 trillion, up from $12 trillion when I last checked these figures in 2016. The average American credit card (per card) balance is $5,733. US average household debt is $103,358—a combination of mortgages, student loans, personal loans, credit cards, medical debt, and auto loans. The average student loan debt for borrowers in 2023 is $37,338 in federal student loans and $54,921 in private student loans. People don’t understand why these things are happening, but they experience them nonetheless. And so the political response for many was to fall in step behind an obscenely wealthy, dim-bulb carnival-barker like Donald Trump.
When the crisis of fictional value, brought to you by Wall Street speculation, came home to roost, trillions in bailout money was printed and awarded to Wall Street. Main Street was left holding its debts. Wall Street, according to the experts who work the Wall Street-Washington nexus, was “too big to fail.” Its interpenetration with the rest of the economy meant that if Wall Street failed, the whole house would collapse. We’ve been “structurally adjusted,” which has always been a euphemism for privatizing the gains and socializing the losses. We are being transferred from the outer edges of the center to the inner edges of the margins. This has always been—given the nature of the global “growth” ecology—inevitable.
As an essential aside, let me say that “growth” is the deepest problem in the economy as it is, and yet “growth” is so irrevocably indispensable to our economy that we are all of us trapped on the same runaway train. The so-called “climate” crisis—which is far more than climate—is showing us this like a grinning clown standing alongside the tracks holding a sign saying, “You are all going to die.” Nonetheless, this is the economy within which our politics plays out, wherein most of us consider our ecologic situation to be just some footnote, or minor exacerbation . . . or another “growth” opportunity. But in the “growth” religion’s own terms, “growth” between World War II and around 1980 was led by wage increases. Since then, growth has increasingly been led by the expansion of debt. This is the essence of the rentier economy that’s now plundering the world’s people and places.
Crash 1
We’ve spoken more than once now about rentier repression, or financial repression, or speculative repression—choose your term—and about something called the Glass-Steagall Act. More properly named the Banking Act of 1933, it created the Federal Deposit Insurance Corporation (FDIC), a government insurance corporation, financed by mandatory premiums from the banks themselves, that guarantees depositors up to $250,000 in the event of bank failure. And the Act repressed the rentiers by forbidding a “commercial” banking operation to be under the same roof as an “investment” bank. Productive capital was separated by a firewall from speculative capital.
Only a few years earlier, savings and loan banks that held the money of everyday depositors had engaged in competitive speculation in the volatile “equity markets,” or stock-and-bond casinos for rich people. The biggest equity market in the US was, and is, the New York Stock Exchange. Savings banks were using depositor money to make their bets. Until 1929, these markets were experiencing a stock “bubble.” The speculative herd was bidding up stock prices, which meant you could buy a thousand dollars in stocks (regardless of their actual brick-and-mortar worth) and in a few weeks, or months, cash out at a higher price, stash some money away, and “re-invest” in a casino that appeared it would pay off indefinitely. If you held X dollars in these fictional-value assets, they could also be used as collateral for loans. In this competitive marketplace, which had thus stretched its tentacles into every cranny of the economy—real as well as fictional—business actors felt compelled to participate in this modern “tulip craze.” If you were a working stiff with your meager but precious savings in the bank, your money was mixed together with the total money that your bank was gambling with. People had accumulated enormous debts, and businesses had overstretched their markets with easy loans. In March of 1929, the “bubble popped,” which is to say that investors began to worry about the outrageously overpriced stocks, and they started selling off. In spite of attempts by big banks to stop the “slide,” creating a false spike, by October the market went into free-fall, beginning a series of short recoveries followed by worse slumps that bottomed out in 1932, an 89.2 percent drop from the 1929 peak.
Both big and small depositors went to their banks, only to find them closed, their deposit money lost forever in the sea of fictional value. More than 9,000 banks failed between 1929 and 1932.
So, Glass-Steagall was the law that (1) told savings and loan banks they could not gamble with depositors’ money, and (2) made banks pay an insurance premium (FDIC) to protect up to $250,000 in personal savings.
When the 2007-8 crisis hit, it was called a “housing bubble,” which might at first glance appear to contradict the claim that it was speculative, because—after all—houses (real estate) are real brick-and-mortar. But when you take a loan to buy your house, there are two ways to look at it: it might be a home, or it might be (for you) a financial investment you intend to sell later. For a time, people were “flipping” houses and making a lot of money. In the homeowner’s case, the house might be partly home, partly speculative—we all have to be prepared to move in search of disembedded survival or opportunity.
If you’re investing in the house, then fixing it up, you are selling on a productive market. If you are investing because housing prices generally are rising, and you just want to cash-in on the price change, you are speculating. (One can do both at once.)
Derivatives
But it wasn’t just housing speculation inflating the bubble, though house prices were gassing it up. The mortgage securities were being “bundled” with other “assets” into something called “derivatives.” (Financial types make this shit intentionally hard to understand because it helps them to get away with stuff.)
When Glass-Steagall was in full effect, commercial and investment banks were like Southern turkey and dressing, held in totally different containers. After its repeal, they became like Northern turkey and dressing, with the dressing cooked right inside the turkey. Even prior to Glass-Steagall’s clinical death at the hands of Bill Clinton, derivatives were a way of getting around the regulators. In the 1990s, while the Clinton Administration was running the often-deceptive financial indicator numbers up (inflating the bubble) by allowing Wall Street to run wild, derivatives came onto the scene with a vengeance.
A derivative is a “private contract,” and as such it escaped the oversight of earlier financial regulations (like Glass-Steagall). A derivative can be a simple “hedge,” by which I mean you and I agree that in one year you will buy 500 of the bird houses I build at a price of $50 each. Our contract doesn’t take effect for a year, but there are two mutual guarantees: (1) I will make the bird houses, and (2) you will pay the agreed price no matter how the market changes for bird houses. But derivatives can also be speculative, that is, a bet. I won’t go mad with the details and risk putting readers into a narcotic state, but, first, most derivatives are of the speculative (gambling) kind, and the four forms of derivatives are futures, forwards, options, and swaps. My bird house contract was a future. A forward is one or more futures traded “over-the-counter” (OTC), beyond the legal gaze of regulators. Options are bets on whether a particular asset will increase or decrease in value (yes, you can make money from an asset that falls, if you anticipate the fall and make the right bet). With swaps, the counter-parties exchange interest rates, cash flows, or currencies.
Wow, that sounds abstract! Yeah, because it is, too fucking abstract, which is how we got into the problem we’re about to describe.
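To make it a little less abstract, here is the bird-house future from above as a toy calculation (the numbers come from the example; the function is my own illustrative sketch, not anything from real trading systems):

```python
# Toy model of the bird-house futures contract described above.
# A future locks in a price today for delivery later, no matter
# where the market ("spot") price ends up.

def futures_settlement(contract_price, quantity, spot_price):
    """Return (seller_revenue, buyer_gain_vs_market) at settlement."""
    seller_revenue = contract_price * quantity
    # If the market price rose above the contract price, the buyer
    # comes out ahead; if it fell, the seller does.
    buyer_gain = (spot_price - contract_price) * quantity
    return seller_revenue, buyer_gain

# 500 bird houses at $50 each, as in the example.
revenue, gain = futures_settlement(50.0, 500, spot_price=60.0)
# The seller still collects $25,000; the buyer paid $5,000 less
# than the market would have charged. Reverse the spot price and
# the gain flips to a loss for the buyer.
```

The same structure, scaled up and traded on assets nobody intends to deliver, is what turns a hedge into a bet.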
In 2000, a year after the Gramm-Leach-Bliley Act had formally repealed Glass-Steagall, President Bill Clinton signed the Commodity Futures Modernization Act (CFMA), a bill with strong bipartisan support (207 Republicans for with 5 against [10 not voting] and 155 Democrats for with 51 against [5 not voting]), which decisively removed the last vestiges of regulation over derivatives, especially of the OTC variety. This effectively meant that billions in these “assets” could be held by institutions in the FIRE sector without the public’s knowledge. Banks thereupon became Wild West brokerage houses. The dressing was stuffed inside a rotten turkey. After the CFMA, derivatives speculation exploded.
This story starts with mortgages. You take out a loan to buy a house, and the bank gets a document for the loan, called a mortgage. You pay the principal and the interest of the loan to whomever “owns” that piece of paper. One institution can buy that document from another (buying debt, get it?), which means the debtor now owes someone else. My own mortgage payments have changed hands like four times in eleven years. Before this selling of debts (mortgages) to third parties, if your bank was going to loan you money for a house, they wanted some assurances, like a good credit history and a secure job.
In the 2000s, investors were looking for better returns on investment (ROI) than they could get from, say, treasury bonds (which paid low-and-slow interest). The US housing market was running hot, so that looked like a good place to go. But the big players didn’t want little individual mortgages, so they went for these bundled instruments—which were supported substantially by many bundled mortgages—called (oddly enough) mortgage-backed securities (MBS). Okay, it’s even more “derivative” (ahem) than that. Owners of MBSs were selling shares of these “securities,” which were now vast pools of mortgages (and some other stuff), often by the thousands. The theory was that even if a few home buyers defaulted on their loans, the desperate mortgage holders could still sell their (price-inflated) houses again. The fact that houses were being bid skyward like Dutch tulips was concealed by two things: investor greed and the fact that (bank-operated) credit rating agencies were giving these MBSs AAA ratings . . . a lie, basically, to reassure other investors.
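The logic of the MBS pool, and why the AAA story seemed plausible, can be sketched in a few lines (hypothetical numbers; real securitization sliced pools into tranches and was far messier):

```python
# Toy mortgage pool illustrating the MBS theory described above:
# a few defaults barely dent the pool, *if* houses can be resold
# at their inflated prices. (Hypothetical numbers throughout.)

def pool_value(mortgages, default_rate, recovery_rate):
    """Value of a pool when `default_rate` of loans default and each
    defaulted house is resold for `recovery_rate` of its balance."""
    total = sum(mortgages)
    performing = total * (1 - default_rate)
    recovered = total * default_rate * recovery_rate
    return performing + recovered

pool = [200_000] * 1_000   # 1,000 mortgages of $200,000 each

# The theory: 2% default, but inflated prices mean full recovery.
optimistic = pool_value(pool, 0.02, 1.0)   # no loss at all
# The reality: mass defaults into a falling market.
crash = pool_value(pool, 0.25, 0.5)
```

Under the optimistic assumptions the pool loses nothing, which is the arithmetic the AAA ratings leaned on; change the two inputs and an eighth of the pool’s value vanishes.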
This big buyers’ market for housing loans (and credit card and student loans, btw) created an incentive for first lenders (who could sell on the newly created housing debt in a heartbeat) to start granting housing loans to Lotty, Dotty, and everybody, even if they didn’t have the wherewithal to make good in the future.
“Hey, I’m selling this debt into a hot market. I’ve been paid. It’s someone else’s problem if it goes south.”
Crash 2
These riskier loans—which still received the triple-A ratings—were called “sub-prime mortgages” (you can’t make this shit up), and they were peddled not only to risky buyers but a lot of buyers who weren’t savvy with regard to finance and missed the bit about “adjustable rates.” That is to say, interest rates which can be arbitrarily changed (raised, of course) by the lenders later in the life of the loan. So now these sub-primes were filling the MBS pools, which were rapidly becoming (as derivatives) the predominant “instrument” in the portfolios of the world’s biggest FIRE sector institutions. Meanwhile, housing prices were going through the roof, bid upward with insane (and highly predatory) loans to a beguiled public.
First lenders would sell you a house, even if your credit was shit, at a four percent interest rate that would balloon into an unpayable nine percent rate later, after the risk of default was sold on (and into the grand castles of the FIRE sector).
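The arithmetic of that balloon is brutal. A minimal sketch with the standard amortization formula, using a hypothetical $200,000 loan and the four-to-nine percent reset just described:

```python
# What an adjustable-rate reset does to a monthly payment, using
# the standard amortized-loan formula. The $200,000 principal is
# a hypothetical figure for illustration.

def monthly_payment(principal, annual_rate, years=30):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

teaser = monthly_payment(200_000, 0.04)   # about $955 a month
reset = monthly_payment(200_000, 0.09)    # about $1,609 a month
```

A payment jump of roughly two-thirds, landing on exactly the borrowers least able to absorb it.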
We need to introduce another term here, collateralized debt obligations (CDOs), arcane financial “instruments” formed by buying up a bunch of different debts and bundling (“repackaging”) them into yet another form of “security.” (Calling them “secure” turned out to be a sick joke.) Traders who were all yeehawing on the MBSs then went on to gobble up CDOs that were packed together with these crazy loans. More investment drove up housing prices, which drove more investment, in a contagion spiral. Price inflation based on irrational investment? Tulips?
“Houston, we have a big-ass bubble.”
This lasted through Clinton and Bush and into the first days of the Obama administration. On Wall Street, it was a round-the-clock party. Bring on the coke and hookers!
But alas, adjustable rates reset, buyers started to default en masse, a jillion houses stood empty, demand nosedived, and voila! The shit hit the fan. All those securities were hemorrhaging value like a pig with its throat cut.
Meanwhile, no one wants a big investment without some big insurance. What had Wall Street used as insurance? Well, I’ll tell you (it’s been foreshadowed) . . . derivatives. To wit, a swap, specifically a credit default swap (CDS). Insurance agencies, like AIG (remember them?), had sold billions in these OTC derivatives as a “hedge.” Kind of like selling hurricane insurance for a hut on the Southeast coast of the Dominican Republic.
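The shape of a credit default swap can be sketched the same way (hypothetical notional and premium; real CDS contracts are bespoke OTC documents):

```python
# Toy credit default swap: the protection buyer pays a periodic
# premium; if the reference debt defaults, the protection seller
# owes the insured amount. Numbers are illustrative only.

def cds_cashflow(notional, annual_premium_rate, years_paid, defaulted):
    """Net cash flow to the protection *seller*."""
    premiums = notional * annual_premium_rate * years_paid
    payout = notional if defaulted else 0.0
    return premiums - payout

# Selling protection on $10 million of CDOs at 2% a year looks
# like free money while nothing defaults...
good_years = cds_cashflow(10_000_000, 0.02, 3, defaulted=False)
# ...until the reference assets fail and the whole notional is due.
the_crash = cds_cashflow(10_000_000, 0.02, 3, defaulted=True)
```

Which is why an insurer like AIG could book steady premium income for years and then face obligations dwarfing everything it had collected.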
I mean, it got even more complicated, because many investors were also betting on the CDOs themselves with that other OTC derivative, the option. This created an intricate and interlocking web of risks, assets, and liabilities, which also, just by the way, thanks to your bank/brokerage house, held things like your 401(k)s and pension funds. The entire US (and much of the global) economy had fallen into the predatory hands of the rentiers, rentiers who now had massive exposure to this unregulated “shadow banking system.”
In 2007, the economy juddered.
In September 2008, it crashed.
If averaged across all US households, the net loss was one third. The Dow Jones Industrial Average fell 777.68 points in a single day (September 29, 2008). Unemployment doubled in one month. Between 2008 and 2009, securities traders saw $7.4 trillion in fictional value evaporate.
Pumping Blood into a Dead Pig
The Treasury leapt in with TARP, the Troubled Asset Relief Program. TARP’s original authorization was for $700 billion just to stop the financial sector free-fall. What that meant was that the US Treasury created enough new money to buy up $700 billion in trash assets.
Who got paid?
Well, I’m glad you asked.
The very same people who had caused the crash in the first place. This is controversial, because some believe that without it, a social catastrophe would have ensued. Others (like me) think that the proper response would have been to bail out consumers and underfunded social programs, and take the entire banking and financial sectors into receivership, essentially nationalizing them (then turning them into a highly regulated public utility with severe curbs on speculation). Whatever the case, there was no political will for such a thing, and many of these big outfits’ bosses were close friends with and the financiers of much of the US political establishment (who are also speculative investors).
Physical property (the real estate) was still there, but it was transferred to the very people who gambled away the economy instead of to the victims of their predatory lending. Crashes in the past have been bad, but there were some upsides. They lower debt service costs (helping debtors) and bring prices back in line with production costs. This one gave Wall Street the gold mine, as the country song goes, and Main Street got the shaft.
What did Wall Street learn from this failure of their computer-velocity, cross-bet gambling fiasco?
I’ll have to answer using the highly technical language of an ex-SF Operations and Intelligence Sergeant: not a goddamn thing.
What bailing out the rentiers accomplished—even in spite of the nearly toothless Dodd-Frank Act—was to allow the rentiers to begin immediately re-inflating the bubble. All the government did was sweep away their debt ceiling. It didn’t let this network of fictional value disappear in order to let the economy recalibrate away from the rentiers. It took the fucking debt onto the government’s own balance sheets through something called “quantitative easing.” It bought $5.3 trillion (yes, with a T) of pure shit from Freddie Mac and Fannie Mae (mortgage lending guarantors). The entire GDP of Germany is $4.43 trillion. For its sins, the Fed gave Wall Street a Germany with a tip. That bill to the US public alone is more than $16,000 for every man, woman, and child in the US.
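The per-person figure is simple division, assuming a US population of roughly 330 million:

```python
# Back-of-envelope check on the QE purchases cited above,
# assuming a US population of roughly 330 million.
qe_purchases = 5.3e12            # $5.3 trillion in mortgage "assets"
us_population = 330e6
per_person = qe_purchases / us_population   # roughly $16,000 each
```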
The TARP money that went to the brokerage houses (who still run the US economy and its politics) was shelled out after the entirely disingenuous act of these companies reincorporating themselves as “banks.” Right now, the US holds $7.1 trillion in QE “assets.” Did they buy them up in the immediate wake of the crash? Nay! The biggest purchase was as recent as 2020. The state values these trash assets at pre-crash prices, meaning the US assets ledger is as phony as a three-dollar bill.
Ticks & Swans
Right now, there are approximately one quadrillion dollars in derivatives floating around the world. That is a one, followed by 15 zeroes. More than the value of all the real estate in the entire world!
J.P. Morgan, Goldman Sachs, and Citibank are the top three “banks” in the US. These so-called banks are again swimming in derivatives. The Schiller Institute recently estimated that J.P. Morgan is holding $54 trillion in derivatives and quite possibly that much again in unregulated OTCs. Morgan’s asset base is $3.3 trillion. So its derivatives-exposure to asset-base ratio is conservatively 16.4/1. Goldman Sachs is 99/1. Citibank is 27/1. Schiller uses the metaphor of a dog and a tick. The asset base is the dog, and the parasitic tick is the derivatives-exposure. The top four “banks” in the US are together an eight-pound dog with a 173-pound tick.
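The tick-to-dog ratio is just derivatives exposure divided by asset base; checking the J.P. Morgan figures cited above (in trillions of dollars):

```python
# Derivatives exposure divided by asset base, the "tick-to-dog"
# ratio described above. Figures are in trillions of dollars,
# as cited from the Schiller Institute estimate.

def tick_to_dog(derivatives_exposure, asset_base):
    return derivatives_exposure / asset_base

jpm = tick_to_dog(54.0, 3.3)   # roughly 16.4 to 1
```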
In China, where pundits are predicting collapse for the 5,297th time in the last six months, the tick-to-dog ratio is 7/19. It was China, by the way, that pulled the US and the world out of the 2008 collapse, not quantitative easing (which only bailed out the rentiers). By 2009, China had become the world’s biggest industrial commodities consumer in support of an unprecedented growth spurt. The Financial Times called China the world’s shock absorber. In 2008, China’s fraction of world demand for industrial commodities was 31 percent. By 2009, it was 46 percent. We’ll return to China momentarily.
On the 15th of March 2023 (oh, those Ides!), Credit Suisse, with massive derivatives exposure via the US (a 27/1 ratio) . . . collapsed. With the collapse of Credit Suisse, these derivatives were, like fictional value in any collapse, converted to trash almost overnight. This was fallout partly from the 2023 banking crisis (Silicon Valley Bank et al), which was caused not by derivatives, per se, but a different rentier exposure . . . to Fed interest rate hikes . . . and banks holding a high percentage of non-FDIC-protected deposits. (Of course, the government bailed them out.)
Not if, but when the next bubble pops, it will be thermonuclear. We don’t know yet what trigger will initiate the landslide.
From Quarterly Report on Bank Trading and Derivatives Activities (The Office of the Comptroller of the Currency):
a total of 1,186 insured U.S. national and state commercial banks and savings associations hold derivatives.
four large banks hold 87.0 percent of the total banking industry notional amount of derivatives.
credit exposure from derivatives increased in the second quarter of 2023 compared with the first quarter of 2023. Net current credit exposure increased $27.0 billion, or 10.8 percent, to $273.0 billion.
derivative notional amounts increased in the second quarter of 2023 by $4.3 trillion, or 2.0 percent, to $221.9 trillion.
derivative contracts remained concentrated in interest rate products [debts], which totaled $164.1 trillion or 73.9 percent of total derivative notional amounts.
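The OCC’s percentages can be checked against its own totals (all figures in trillions of dollars, from the quoted report):

```python
# Sanity check on the OCC figures quoted above; all amounts
# are notional values in trillions of dollars.
total_notional = 221.9       # Q2 2023 total
quarterly_increase = 4.3     # change from Q1 2023
prior_total = total_notional - quarterly_increase
pct_increase = quarterly_increase / prior_total * 100    # about 2.0%

interest_rate_products = 164.1
pct_interest = interest_rate_products / total_notional * 100  # about 73.9%
```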
Meanwhile, a new real estate bubble inflated. This one had two sides, residential and commercial.
Inflation is a problem. Easy inflation (like two percent) helps workers whose wages go up with inflation (a big if), but it hurts people like me who are on fixed incomes. High inflation hurts everyone. Partisans are blaming Joe Biden for our high inflation, even though it’s a worldwide problem. There are three big factors contributing to it. First, the pandemic created chronic supply chain problems. Then the Ukraine war jacked up energy prices. Finally, big corporations used these inflationary events as a way to further gouge the public with price hikes (which they could conveniently blame on #1 and #2).
The Fed adheres to an ideology that blames wage increases for (all) inflation. And the same ideology says that the way to beat inflation is to raise interest rates. Which they did, and which triggered the 2023 banking crisis, drove up unemployment and underemployment, and forced people to take on more debt, and which still hasn’t “tamed” inflation . . . because there’s still a supply chain issue, combined with a nascent trade war with China, combined with still high energy prices, combined with a European land war. None of this shit is susceptible to Fed interest rate hikes.
Some people are also blaming inflation for the rise in housing prices and rent. Nope, nope, nope.
In August 2019, the median residential house price in the US was $278,200. In August 2023, that figure was $407,100, a 46 percent increase in “value.” A four percent inflation rate does not account for this. Speculation accounts for a great deal, and a housing shortage accounts for the rest. The residential market may not (emphasis on may not) be on the cusp of a crash, because it is partly being bid upward by those with the means, but it is definitely increasing inequality. The difference is that in 2008 there was too much supply and not enough demand (ergo the predatory subprime lending). Now there is too much demand and not enough supply. If (when) the residential bubble does burst, millions of homeowners will be left holding the bag, alongside a fair number of speculators.
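To see why ordinary inflation can't explain the jump, compound a four percent rate over those four years and compare it with the growth implied by the two medians (a back-of-the-envelope sketch using only the figures above):

```python
median_aug_2019 = 278_200
median_aug_2023 = 407_100

actual_growth = median_aug_2023 / median_aug_2019 - 1   # ~0.46, i.e. ~46 percent
inflation_only = 1.04 ** 4 - 1                          # ~0.17: four years at 4 percent

# Inflation alone would have left the median near $325,000; speculation and
# the housing shortage account for the other ~$80,000.
inflation_adjusted = median_aug_2019 * 1.04 ** 4
```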
As to commercial real estate . . . wow. How exposed are US banks to ever more useless commercial real estate? This is a story that began years ago, then ran into a big, mean Black Swan called the pandemic.
Urban property markets have long been the plaything of speculators (ask Trump). Every time the numbers and optimism go up (while bubbles are inflating), developers go batshit crazy and start building things. Ronald Reagan’s Economic Recovery Tax Act of 1981 allowed for accelerated depreciation of these properties, making them far more attractive to investors. At the same time, the Reagan administration was loosening the rules on foreign trade, so lots of foreign investors joined the US commercial real estate party. Developers had access to crazy cheap loans, and city managers wanted development projects to pump up their tax bases. This build-up went on for four decades. The final result was that commercial properties came to account for 80 percent of downtown real estate.
Then a nasty little bug escaped from an NIH-funded lab doing some insane shit called “gain-of-function” research. (“Quantitative easing,” “gain-of-function,” just stop!) GOF research uses genetic science to build dangerous organisms in order to then study them. That bug was called SARS-CoV-2 . . . in the common tongue, “Covid-19.”
The virus went romper-stomper through the entire world, provoking all manner of lockdowns. In response, many people started using their computers to do their work at home. This had always been possible, but control-freak employers prefer to surveil their employees, and the pandemic kind of forced the issue. Long story short, it turned out that people liked avoiding the hours of “getting ready for work,” the commuter traffic jams, the office gossip, and the watchful eyes of bosses. They liked working in housecoats. After the worst of the lockdowns, a great many people continued to work from home. And so a lot of these commercial properties stood empty.
They still do.
The Emptiness
Former downtown workers also used to use public transportation, dine at local eateries, tank up on caffeine, and get a bit of shopping done on breaks; so downtown businesses suffered. One estimate for San Francisco was that the absence of 150,000 downtown workers resulted in 33,000 lost jobs in those dependent retail sectors. The other big nasty ramification, as buildings were shuttered and the lights went out, was the loss of tax revenue for municipalities.
Meanwhile, online shopping during the pandemic hit businesses outside the downtowns hard. Even more commercial real estate then stood unused. After the panic ended, kids returned to school, and little businesses re-opened, and yet . . . the office space remained vacant. At the beginning of 2023, downtown office space vacancies remained at around 23 percent. Office leasing volume did rebound after 2020, recovering 37 percent of the pandemic’s 40-plus percent loss on a wave of unfounded optimism and cheap money, and peaking in the second quarter. Then it nose-dived.
But what of those investors? Well, first of all, they didn’t dig money out of their mattresses to make those investments in commercial real estate. They took out loans, and they continued to do so after the pandemic, because the interest rates were damn near zero. Alas, a third of all office leases expire in 2026. $1.5 trillion in commercial real estate loans come due in 2025. Many of these loans are up for refinancing at the beginning of 2024, even as the Fed has doubled interest rates (ouch). A default trend is already gestating.
Okay, step back a second: commercial mortgage-backed securities are something called “non-recourse debts.” That means the creditors cannot go after the debtor’s other assets in the event of default. The defaulter just goes to the bank, hands them back the keys, and says, “Bye, y’all. Have a lovely fucking day.”
In February 2023, the Wall Street Journal was already sounding the alarm, as delinquency rates began to rise on commercial mortgage-backed securities and Leviathan asset-management outfits began to default. Hey, they can just walk away and leave the bank holding the bag, eh. (If I recall correctly, banks are holding around $3 trillion in CMBSs right now.)
That spring, again in response to interest rate hikes, three big regional banks collapsed (Silicon Valley and Signature in March, First Republic in May), forcing the government to guarantee deposits above the FDIC ceiling. Seventy percent of all commercial property loans were made by regional banks. Many pension funds, public and private, had also been bundled with commercial property portfolios. Pension funds are already underfunded to the tune of $1 trillion, and a crash of commercial real estate value would be disastrous for millions of old folks.
City governments are terrified. They’re faced with declining tax bases, which force a choice between raising taxes and cutting services; the first deters anyone from moving in, and the second drives residents out. This double-bind has been named “the doom loop.”
Private lenders are commercial real estate’s only option now, with interest rates up, and they’ve become understandably reluctant. Refinancing is facing what Reuters recently called “liquidity gridlock.” Defaults will definitely ensue. The only question is the size and duration of the wave. In 2022, CRE loan applications amounted to $20 billion. By November 2023, that number was $40 billion. Estimates are that around two percent (!) will actually get loans, because most of the applicants are already “over-leveraged.” We haven’t even talked about apartment landlords. They are already hiking rents through the roof, and they’ll do it more if they’re taking losses on the real estate . . . until they default. As this is written, the combination of inflation, higher insurance rates, and an increasing reliance on something called floating-rate bridge loans has aimed the so-called “multi-family housing” (apartment) industry at its own default tsunami.
In the background is that quadrillion in derivatives, waiting . . . waiting like a bomb for one of several forms of default to activate it. The reason nothing’s being done to stop it? In 2022, the US FIRE sector contributed $1,661,090,514 to political advocacy and campaigns. $655,529,238 of that went to parties and candidates (50 percent to Democrats and 49 percent to Republicans). The rest went to “independent outside groups” and lobbyists. No other interest group, as catalogued by Open Secrets, even came close to FIRE in political spending. They know they’ll get the bailouts. They own the politicians!
China
History doesn’t repeat itself, but sometimes it rhymes. We’ll stick a pin in this and leave it now to talk about everyone’s bête noire, China.
After 2008 would have been a perfect time to take the entire financial sector into receivership, essentially to nationalize it. The government needs to run all financial institutions, because only then can there be centralized and coordinated control over credit flows and accumulation of debt between various economic sectors based not on competition for personal gain but on the common good of the whole economy. Someone without a dog in the fight needs to be able to say, “Now, lend,” (“Now, build”) and “Now, stop lending.” Private actors simply cannot be trusted to pursue this common good.
Two overviews: (1) China is undergoing its own financialization crisis right now; (2) The US and China are experiencing growing hostility.
First, we need an appreciation of the differences between the US and China. China has a population of 1.425 billion people. We have around 331 million. Just as in the US, China’s population is concentrated by geographic features, especially water.
Both countries have a very similar land mass: the US has 3,618,783 square miles, and China has 3,747,877 square miles. But the population difference means that China has a far bigger problem managing the occupation of livable land (land with water) than does the US.
Outsiders can look up “Government of China” on Wikipedia to get a quite formal account of how the Chinese autocracy—led by the Chinese Communist Party (CCP)—is organized; but the internal deliberations of the CCP are held very close to the chest, and most commentators, many of whom disagree, resort to all manner of divination taken from data and statistics alongside enigmatically sweeping announcements from Chinese officials.
What’s more difficult for China to strategically conceal is its economy, because it is thoroughly articulated with the global economy. Lately, most of the tea-leavers tend to agree that the Chinese economy is undergoing some difficulties, but again—as I stated above—it’s best to take the perennial doomsayers with a big grain of salt. We’ll explain why momentarily.
Power shifts in China result in policy shifts. Because China and its economy are so incredibly huge, however, it takes a while to safely turn the ship.
Every economic policy change will also encounter friction.
Looking at our own imperial “back yard” in Latin America as an example: when the pink revolution swept Latin America in recent years, leaders wanted to shift away from the old imperial system that favored export agriculture. They sought greater autarky, but ran into three powerful forms of friction: (1) the landed class itself, which profits handsomely from the imperial system, (2) the United States government, acting on behalf of US economic interests in the region, and (3) the US government’s interest in maintaining cooperative Latin American leadership. That internal and external friction undermined the agendas of the “pink” leadership, which eventually destroyed their credibility among populations who, like our own, (1) don’t care about the big picture and (2) judge whichever government is in power mostly by whether their lives are improving or getting worse.
Back to China . . . Chinese leaders understand friction.
By capitalist standards, China has achieved a very great deal. First of all, it’s now the world’s second largest economy. Between 1948 and 2020, China’s foreign trade volume ballooned from (adjusted) $907 billion to $4.65 trillion. Their overall growth rate was 14 percent. In 1986, the average urban income was (adjusted) 1,000 yuan. By 2005, it was 10,000 yuan. Immediately after the Revolution, there were around 5,000 technical and scientific experts in China. In 2020, there were more than five million. After the Revolution, Chinese adult literacy stood at 20 percent. It is now 99.83 percent. Life expectancy was 35 years. It is now 77.3. In 40 years, China lifted 800 million people out of poverty. This was all a matter of political will, centralization of decision-making, and the ability to carry out long-term plans without frequent competitive electoral disruptions.
Now, as an aside (though it matters), the US also has centralized political control, but it is exercised obliquely by the FIRE sector. In the US, in China, or anywhere else, when decision-making is centralized, it has an advantage and a disadvantage. The advantage we’ve covered. The disadvantage is that leadership might be foolish, corrupt, self-interested . . . or simply arbitrary. In the US, the FIRE sector has to exercise its control from behind the throne, and the people still have an element of agency via elections, which is why the FIRE sector spends so much to control elections. But that control can leak, and you can end up during a political crisis with all manner of dissenting leaders—sometimes including scoundrels, morons, and maniacs—who channel the restive population’s dissatisfaction in unpredictable ways—the very subject with which we began this long discussion.
Obviously, China is centralized, but anyone who has ever managed anything bigger than a rifle squad knows that one person or one body cannot stay on top of a vast and complex population. China is administered through regional and local authorities, and there has been and remains some corruption at these levels. Human nature.
Economist Zongyuan Zoe Liu compared the Chinese economy before and after the current crisis to a Monet versus a Jackson Pollock. She says the Monet looks good from afar but like a mess when you get close, whereas the Jackson Pollock looks like a mess from afar and a mess up close.
Yeah, okay. We shall see.
China’s development model up to the early 2000s was high savings with high investment, market “animal enthusiasm,” and export-led growth. So far so good.
Because of friction, or inertia, a successful “model” that solves a particular set of problems can, once those problems are solved, become obsolete or even counterproductive. It can also mislead leaders into fighting the current war with the last war’s tactics. (That’s the conundrum faced by the Fed in the US.)
China’s “growth” since opening up in the 1970s has been astronomical.
The “animal enthusiasm,” combined with more-or-less decentralized (regional) planning produced a lot of remarkable results as well as a few boondoggles (like investing in the construction of cities that remain unoccupied, the “ghost cities”). Internally, there were two schools of thought about all this, the market-heavy advocates and the state-intervention advocates. The Chinese nouveau riche, who had become quite powerful during the long reform era, leaned heavily toward “free market” solutions, then three things happened.
First, the 2008 crash, to which China responded by rescuing the world economy with even higher investment (and massive imports of raw materials)—which was in China’s interest because the Chinese economy is interwoven with 120 different countries, the largest partners being the United States, Hong Kong, Japan, South Korea, Vietnam, India, the Netherlands, Germany, Malaysia, Taiwan, the United Kingdom, Singapore, Australia, Thailand, and Mexico—all of whom were exposed to the crash. Not only did the Chinese leadership rescue the world economy, it paid very close attention to what had happened and what caused it. Remember what we said above about dogs and ticks: the US top four “banks” have an eight-pound dog (asset base) carrying a 173-pound tick (derivatives exposure, at the very least, and probably a lot more); China’s dog weighs 19 pounds and carries a seven-pound derivatives tick.
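To translate the dog-and-tick metaphor back into numbers: the tick-to-dog weight ratio is just derivatives notional over asset base (the weights are illustrative; the ratios are what matter):

```python
# Tick weight / dog weight = derivatives notional / asset base.
us_ratio = 173 / 8    # ~21.6x: the top-four US banks' notional dwarfs their assets
cn_ratio = 7 / 19     # ~0.37x: Chinese banks' notional sits well under their assets
```

A ratio above 1 means the tick outweighs the dog.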
Second event: China’s own housing bubble had been developing. Decentralized “animal enthusiasm” investing ran out of productive investments (ergo, “ghost cities”) and shifted to debt-based investment (ding, red flag!), then debt ran past GDP. That’s the conundrum in which growth can only be maintained by generating more debt. Ouch. Investors went heavily into property, running up the prices (we’ve seen something like this before).
Demand for real estate, suddenly 30 percent of the domestic economy, then went into a slump. Note, though, that in Chinese residential real estate, the demand came from home buyers, and 80 percent of investment money was taken up as plain family mortgages. Home ownership in China is 89.68 percent, as opposed to 65.9 in the US. And Chinese mortgages are not multiply-securitized (bundled in with MBSs and CDOs, e.g.).
Third event: Covid-19. Draconian measures closed down non-essential consumer spending. And, as in the US, Covid-19 dramatically increased income inequality. It also happened concurrently with Trump’s anti-Chinese belligerence and a trade war. (Geo-strategically, Trump was—and remains—what we in the Army used to call a dickhead. Everything is about stupidly trying to convince everyone he has a big dick.)
Chips
Short excursus here, but necessary to get the whole picture.
Enter Taiwan and microchips.
It’s a long story that I’ll keep very short; microchips are really expensive to make, because they require environmentally hyper-controlled, multi-billion dollar semiconductor fabrication plants (FABs). The history is somewhat complicated, but Taiwan set itself up—with lower than US labor costs and huge Taiwanese government subsidies—to become US FAB Central (through an outfit called the Taiwan Semiconductor Manufacturing Company [TSMC]), the strategic calculus being that by becoming essential to the US they would incentivize the US to protect them from CCP designs for re-unification.
The high-tech US war industry relies absolutely on the provision of these chips.
Right now, chip production requires several big things: design, software, machines, machine components, wafers, chemicals, laser technology . . . and FABs. Japan contributes chemicals, wafers, and machine components. The Netherlands contributes laser technology. The US supplies design, machines and software.
The most irreplaceable monopolies in this supply chain are design and FABs—a US-Taiwan joint venture.
China, on the other hand, had also come to import 40 percent of Taiwan’s chip production. China, then, is a huge customer for US tech outfits as well as essential to the Taiwanese economy. China depends on this technology—for which their design and manufacturing capacity still lags well behind—just as the US totally depends upon Taiwanese FABs, while Taiwan relies on the mainland as a chip customer.
Xi Jinping became the CCP’s General Secretary in November 2012. Put a pin in that, because this all connects.
The Chinese state’s two overarching drives are (1) sustaining economic expansion and (2) modernizing its military (especially after Trump, whom Biden followed with a similar belligerence). China wants to upgrade its capacity, and so it needs designs and FABs (on which it’s now feverishly working). Without chips, they are up—ahem—chip creek. Until last year, China was getting all its chip upgrades, directly or indirectly, with the latest designs, from US companies (who were making a mint).
Welp . . . in 2018, the Chinese military tested its first “hypersonic missile system” (HMS), a weapon with short and medium range capabilities that flies at Mach 5 and can duck and dodge during flight to confuse air defense tracking systems. Short and medium range missiles are not a threat to the US, but Taiwan is well within range.
Then again, China isn’t going to kill the goose that lays the microchip eggs.
The Biden Administration declared a chip embargo on China, which sent several US companies into a tailspin . . . but never fear. Our government then authorized massive bailout subsidies to the same companies on the condition that they retool themselves to build . . . well, FABs. But that’s gonna take a while. (We’ll talk further along about the so-called “Inflation Reduction Act,” which ought to be called the FAB Act.)
When Nancy Pelosi did her provocative little publicity stunt in Taiwan last year, the place she visited was the TSMC.
The truth is, China has no interest in a war with the US. The US is still China’s biggest export target (18 percent of total Chinese exports). But given the new embargo and the tensions ratcheting up around Taiwan, China’s response is now military-economic; which is to say, they’ll play brinkmanship with their military to exact a price from the US and Taiwan. Total war is neither on their agenda nor in their interest. But they can use the military to threaten Taiwan—playing chicken right up to the last second—as a way of slowing down and disincentivizing trade with Taiwan. The military as a weapon in an economic war.
Means & Ends
So China is facing the combination of “long Covid,” a “housing crisis,” and US belligerence. Enter stage right all the US pundits with hyperventilating proclamations about China’s imminent collapse.
Economics is largely an ideology in the US. The problem with ideology is that it comes with its own set of blinders. Ideologues are blinded to important questions, because the questions themselves are unthinkable . . . as in they just don’t occur to the ideologue.
“Whatever happens in China will work out just as it’s expected to in the US, right?”
Yes, China’s “growth” (reified as a single number) has depended on property investment, financed by expanding debt. Ideologues claim that this is China repeating Japan’s problems in the 1990s (called “balance sheet recession”).
This is wrong, which we’ll explain, but let’s also note that Xi is a re-centralizer as well as a devoted Chinese nationalist. Unlike Trump, for instance, he’s not feeding some chronic, out-of-control, narcissistic insecurity, but has his eye fixed firmly on the long-term goal of making China a stable, secure, moderately prosperous, “great nation.” Growth, for Xi, is not the end, but a means to the end. Ideological economists can’t fathom this, because growth is their golden calf.
(One major impetus to growth—in all major economies, where otherwise there’s no reason the state couldn’t coordinate for a comfortable zero-growth or even de-growth . . . is “security,” that is, war. Given our compartmentalizing intellectual default, this is often overlooked.)
One of the pieces of “evidence” that China is caught in a terminal debt trap is that China didn’t get the same post-Covid “bounce” in consumer spending that we did. This was interpreted by the same pundits to mean Chinese consumers had lost all confidence in the system, a view reinforced by close contact between many American economists and the “animal enthusiast” market-advocates within China, to whom Xi is now applying the brakes (and creating friction).
In reality, this wasn’t part of some historic conjuncture (rumors of China’s impending death are greatly exaggerated), but a combination of (1) the fact that China didn’t print checks for the population, like the US did (which at least partly accounted for the US post-Covid bump), (2) that the Chinese government sort of flubbed the Covid response with on-again, off-again policies, and (3) that Chinese exports dived during the pandemic.
The fact is, China’s current property price decline is not nearly as precipitous as 1990s Japan’s, and the Chinese asset bubble is substantially smaller than that of 1990s Japan.
The really big difference, though, between 1990s Japan, as well as 2008 and the current US . . . and China . . . is that the former have all these problems interspersed with one another and are financially co-collateralized (if that’s a word) such that everything rises and falls together with little government control, whereas China’s economic sectors and regions are fragmented (compartmented). Debt in one region or sector does not mean debt in another. Moreover, the government has the power not merely to adjust interest rates (the US blunt instrument) but to tell the banks to lend, not lend, to which sector they are to lend and not lend, and the government can prohibit many kinds of “cross-share-holdings” that expose an entire system to a single shock.
This is a potential for economic coordination which doesn’t exist in the US, where banks essentially own the government—a government that changes every four to eight years from one incompetent boob of a puppet to the next. In short, the US does not have the inherent capacity—quite the contrary—to play a long game. China does.
If the US, for example, hadn’t been run by a China-containment warmonger like Barack Obama, followed by a simple-minded jackass like Donald Trump, followed by a Wall Street sycophant like Joe Biden, the US state (if it were playing the long game of weakening China) might have endeared itself to China, accelerating its capital flight (if weakening China is a desirable goal—I disagree, but) and preventing the newfound tactical (oil trade) compact between Russia and China (or the US could have stopped NATO expansion and prevented the Russian invasion in the first place).
But the Chinese play Go. The Russians play chess. The Americans play Monopoly. Financialized “democracy” really just builds a kind of neo-feudal oligarchy run by greedy sociopaths and narcissistic nincompoops. Again, I digress . . .
Chinese corporate debt has been falling for the last twelve years in every sector except real estate. One of Xi’s pressure points on the economy has been reducing debt and risk exposure, especially in the productive sector (Xi is very focused on tech production!). And, apart from the fact that China didn’t get a big post-Covid bump, consumer income and consumption have been steadily, if slowly, rising for over a year now, even after the second Covid lockdown in 2022.
Yes, real estate investment has fallen, but at around the same rate that manufacturing investment has risen (10 percent faster than it did between 2017 and 2019). So there has been no real “private sector stagnation.” China’s domestic car production just passed that of the US, and China is making half of the world’s new electric vehicles as well as around 80 percent of batteries and solar. (I think electric cars are stupid, but that’s another issue.)
Xi’s priorities are not growth per se, but (1) stability/security/autonomy (autarky), (2) the shift from investment in real estate to manufacturing (especially tech), (3) gradually and structurally reducing income inequality and repressing the nouveau riche (those animal-market guys), and (4) slowly and carefully slipping away from debt traps. If this means letting general growth fall to 3-4 percent, so be it.
Xi is pursuing autarky for China through a policy of what he’s called “military-civil fusion.” In practical terms, Xi’s government, in the wake of Trump/Biden, is embarking on a kind of Manhattan Project for the development of high technology, which has its applications in both sectors.
It should be noted that China is rapidly becoming the world leader in “green technology” (regardless of my own and others’ skepticism about some of it). In 2022, China invested $546 billion in green-tech. The US invested $141 billion. China’s plan is to hit carbon peak by 2030 and carbon neutrality by 2060. They’ve even set up a quasi-independent “green finance market” for it. The People’s Bank of China (PBOC) describes it as “financial services provided for economic activities that are supportive of environmental improvement, climate change mitigation, and more efficient resource utilization.” In 2021, the Xi government announced a five-year-plan to shift from coal to a mix of solar, wind, wave, and nuclear (I know, I know).
There are 4.3 million solar jobs worldwide. China has 2.7 million of them. This is both an export story and a domestic one. In 2022, China invested 90 percent ($79 billion) of the world total in low-carbon manufacturing! This year, in the US, the Inflation Reduction Act (which has jack shit to do with inflation, except with a few distant future supply-chain issues, but a lot to do with green tech and chip manufacture) will roll out $369 billion in “incentives,” not direct investment, for green tech development. China does 79 percent of global battery manufacture, 58 percent of global electric vehicles, and 80 percent of solar technology.
Many will point out that China is responsible for 30 percent of the world’s greenhouse gas emissions. While this is technically true, it fails to take into account that the products of these various processes are exports to other countries (so those countries are partly responsible for the carbon output). In 2006, exports accounted for 36 percent of GDP (a figure that has now dropped to around 21 percent). In other words, a good deal of Chinese manufacturing, and the energy to support it, is ultimately used on behalf of other countries (17 percent to the US!).
It’s also true that China, even as it builds away from coal-powered electricity, is still (as we all are) heavily dependent on oil. Militarization doesn’t help with this. You’ll never see electric tanks and fighter-bombers. Using the US’s bloated military as an example, our DOD uses 93 percent of the oil purchased by the government. Our military generates as much atmospheric carbon as the entire nation of Portugal. It’s the largest institutional oil burner in the world.
China’s re-militarization policy—in response to heightened US belligerence—will almost certainly raise the general numbers. Xi sees this as a contingency, but his eye is ultimately on turning industrial capacity inward while using technology to gradually address China’s demographic problem of a shrinking labor force as the population ages out in the wake of a decades-long “one-child” policy. Neither “growth,” per se, nor GDP are the main issues.
A quick point about GDP, the Amon-Ra of neoclassical and neoliberal economists. Gross Domestic Product is the sum of all on-the-record economic activity over a given period (usually per year). If we used a similar measure for basketball, we’d judge the health of the game by combining all scores and stats into a single number. It’s that fucking stupid.
Warren Buffett improved its utility a bit with the introduction of the “market capitalization to GDP ratio” or “the Buffett indicator.” It gives a (very) general comparative account of how exposed that GDP is to speculative mischief. The higher the number, the greater the exposure (danger). In the US, that number is 193.3 percent. In China, it is 65.1 percent, down from 79.7 percent last year. So much for the impending Chinese economic implosion.
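The indicator itself is just total stock market capitalization over GDP, expressed as a percentage. A minimal sketch (the trillion-dollar inputs in the comment are illustrative, not sourced figures):

```python
def buffett_indicator(total_market_cap, gdp):
    """Total stock market capitalization as a percentage of GDP."""
    return total_market_cap / gdp * 100

# Illustrative inputs only: a market cap of 52.2 and a GDP of 27.0 (both in
# trillions) reproduce the ~193.3 percent US reading cited above.
us_reading = buffett_indicator(52.2, 27.0)
```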
And why do you look at the straw in your brother’s eye, yet do not perceive the beam in your own eye?
—Matthew 7:3
None of this is to say that I advocate some kind of high-tech digital dictatorship, or that China has all the answers (they have plenty of problems), or that I somehow share the unstated technological optimism of conventional economists, unconventional economists, the world at large, or even the Chinese.
I’m emphasizing economics, not as a “science” (it’s not even close to a “science”), but as a practice that should be, at bottom, about solving problems—which means not imposing orthodoxies flowing out of academia and Wall Street think tanks, but by staying in close and constant contact with on-the-ground realities in all their granular diversity and addressing problems ethically and pragmatically.
I’m emphasizing China—in the face of all the stunned sinophobes’ dire prognostications—to contrast it with the US in one major respect: state control over the financial sector and state coordination of credit and debt. I could criticize China quite mercilessly, on several accounts, but for exactly the same reasons I’ll criticize the rest of the technologically-optimistic world.
State of the Union
Returning to the United States for a moment, let’s begin with the whole shtick of America Number One, Love It or Leave It, God, Guts, and Guns Made America Free. The US is now 12th in per capita GDP (which is dangerously over-capitalized at 193 percent . . . that we know of; OTCs are big, scary unknowns). The US is 31st in math education and 13th in reading education. We are 27th in social mobility, 49th in life expectancy, 50th in infant mortality, and we spend twice as much as other developed countries on health care that routinely bankrupts the families who use it. We have locked up more people in prison than any other nation in the world. Not per capita (we’re first, per capita, too), but in gross numbers. We are helplessly fat, drugged, deskilled, and drowning in debt.
I know this will seem like an excursion, but it’s not. We are not only a very confused, angry nation; we are physically sick and mentally disabled.
The US has very high rates of overweight (BMI 25-29), obesity (BMI 30+), and morbid obesity (BMI 40+). One in three US adults is in the overweight category. Two in five are obese. One in eleven is morbidly obese. One in six kids between the ages of two and 19 is overweight. One in five is obese. One in 16 is morbidly obese. Obesity has been correlated with at least 21 “non-overlapping cardiometabolic, digestive, respiratory, neurological, musculoskeletal, and infectious diseases.” In English: high blood pressure, heart disease and heart attacks, Type 2 diabetes (increasingly in children!), asthma, sleep apnea, osteoarthritis, musculoskeletal injuries, gallstones and gallbladder disease, liver disease, strokes, cancer, clinical depression, and anxiety.
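The BMI bands quoted above are the standard adult cutoffs (weight in kilograms divided by height in meters squared). A minimal sketch of the classification, for the record:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def adult_category(bmi_value: float) -> str:
    """Standard adult BMI bands, matching the ranges cited in the text."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"      # BMI 25-29.9
    if bmi_value < 40:
        return "obese"           # BMI 30-39.9
    return "morbidly obese"      # BMI 40+

print(adult_category(bmi(95, 1.75)))  # 95 kg at 1.75 m -> "obese"
```

Note that BMI is a crude population-level screen, not a diagnosis; the essay's point is about aggregate rates, not individual bodies.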
It’s estimated that this costs the US somewhere in the vicinity of $173 billion annually in (admittedly jacked up) health care costs. The only upside is that this disqualifies one out of every three applicants for military service. Which is not to crack on fat people themselves—I know and love a lot of big folks. It’s to get to another point, two actually. First, this is a class issue. Second, it’s a food system issue (I told you we’d be talking about industrial food again.)
The fact is, the wealthier you are, the greater access you have to high quality food and the leisure/money for organized exercise. Cheap processed food is what the poor and precarious eat. They are also typically less educated and more susceptible to marketing ploys and child-targeted addictions. If you live in many cities—especially the poorer sections—you are likely to live in a food desert, where the only available food is convenience store shit. This is not to dismiss factors that cross class lines, like automobile dependency, automated jobs where we sit on our asses all day (and nibble to break the boredom), and low-energy entertainment like video games and television.
In the US, taxpayers subsidize the very industries that are feeding us garbage with overloads of sugar and grease. The big subsidies go to corn, soy, wheat, rice, dairy, and sugar. There are only around 4,500 sugar farms in the US, but they receive more than $4 billion a year in Federal subsidies. Then businesses are allowed to advertise sugar products to our kids who become sugar addicts. The single biggest contributor to the US obesity epidemic is sugar-spiked drinks: soda, fruit juice, and “energy” drinks, etc.
We might as well be marketing cigarettes to our kids, but Big Ag has politicians locked in to secure both their subsidies and their monopoly rents.
Corn subsidies we spoke of above with regard to agricultural dumping policies. This is also a subsidy to sugar production (high fructose corn syrup) and meat for fast food outlets (as cheap feed).
If you grow kale or tomatoes or carrots or pears or green beans, you’re not on that big subsidy teat, so you have to charge more for your products . . . unless you’re a huge corporation that can, as a monopoly, grow and distribute these things on a huge scale using the most destructive and unsustainable methods.
In any sane society, “food” companies would not be allowed to market addiction to sickening shit like this, but late modernity does not produce sane societies. It produces “slaves to envy and addiction.”
On drugs. America has a drug problem. I live in Michigan where recreational marijuana is legal. I’m glad they aren’t throwing people into prison for it, but I have to say, the corporate pot operation about a mile upwind from here has our air smelling like someone ran over a skunk half the time. Okay, that’s a personal complaint.
The drug problem I’m referring to, however, is not the illegal or “abused” variety, but legal and “correctly” taken prescription and OTC drugs, which 70 percent of our population now uses. Some of those are antibiotics, which are not for sustained use; but a lot of them are drugs for chronic pain, drugs for all the maladies listed above and a growing list of others, and a hell of a lot of them are psychoactive drugs: antidepressants and the like.
We don’t ride out minor discomforts now. We medicate them. We don’t address our loneliness, alienation, isolation, and ennui at their roots. We medicate them. We don’t hold corporations accountable for feeding us shit that makes us sick; we peddle the drugs that treat the symptoms and the drugs that treat the symptoms of the other drugs. You can’t turn on the television or ad-heavy platforms like YouTube without being bombarded by drug ads for every damn thing. We drug our children who can’t sit still for six hours a day in a prisonesque classroom (an entirely natural response).
Two thirds of Congress cashes Big Pharma campaign checks, and when one company was caught red-handed peddling opioid addiction and mass death, it was charged with misleading advertising.
In 2020, Joe Biden received $6.3 million in pharmacash. That’s not only why these outfits can kill people and get away with it; it’s why taxpayers foot the bill for their research while private operators rake in the profits when the Next Big Thing (or the next vaccine for a man-made organism) hits the market. They are regulated by themselves; that is, the Food and Drug Administration is “administered” by future Big Ag and Big Pharma employees. The companies do it by promising high-dollar jobs to the agency’s regulators (and by paying 45 percent of the agency’s budget); the regulators secure those future positions by giving the companies favorable reviews for their products now. A study in 2016 showed that almost 60 percent of former FDA staffers went on to work for the biopharmaceutical industry. (Big Ag uses the same kind of revolving-door system in the Environmental Protection Agency.)
A snapshot from 2020: the typical 40-year-old in our little Midwestern city of 24,000 is taking blood pressure meds, anticoagulants, antidepressants, some kind of OTC fad diet shit, marijuana for “pain,” five cups of coffee, four beers, Tylenol, Tums, cigarettes, and anti-anxiety drugs for the side effects of smoking or eating 20 grams of cannabis a day.
This town is a casualty of offshoring. It was once prosperous with many factories and strong unions. Like most other places, and like all places to some extent, the people here have become deskilled.
This term requires a tad of explanation. As far as I know, it gained currency from Harry Braverman’s 1974 book, Labor and Monopoly Capital (I checked the pdf link, it’s safe). It was a systematic treatment, using Marxian categories, of how technology progressively takes skills away from laborers by hyper-specialization. Another book that covered this process without all the academic language was George Ritzer’s 2012 book, The McDonaldization of Society.
Short version: employers are after “efficiency, calculability, predictability, and control.” So they break what were formerly skills, like cooking, down into discrete mechanical tasks (the assembly line principle). The effect on labor generally is “de-skilling,” which renders these de-skilled laborers more pliable, because they can more easily be exchanged, like faulty parts, for other unskilled laborers.
In practice, with the emphasis on the idol of “efficiency,” this process was also called Taylorization, named after Frederick Taylor (1856–1915), who pioneered this process as “scientific management.” PS: The Chinese are also Taylorites. So was Lenin, for that matter.
Braverman’s and Ritzer’s theses can be applied more broadly, however, to the emergence of modernity, the “war on subsistence,” and urbanization. I’ll drop an Illich quote in here, on “shadow work,” because it’s related:
I designate as shadow work the time, toil, and effort that must be expended in order to add to any purchased commodity the value without which it is unfit for use. Therefore, shadow work names an activity in which people must engage to whatever degree they attempt to satisfy their needs by means of commodities . . . . When a modern housewife goes to the market, picks up eggs, then drives home in her car, takes the elevator to the seventh floor, turns on the stove, takes butter from the refrigerator, and fries the eggs, she adds value to the commodity with each one of these steps. This is not what her grandmother did. [Illich is writing in 1982.] The latter looked for eggs in the chicken coop, cut a piece from the lard she had rendered, lit some wood her kids had gathered on the commons, and added the salt she had bought. . . . The grandmother carries out woman’s gender specific tasks in creating subsistence; the new housewife must put up with the household burden of shadow work.
What he obliquely also describes is de-skilling. Grandma knew how to do a lot of things based on production that was very, very local. Her own and her family’s technological dependency was comparatively low, their subsistence skills high. (We are the opposite.) They were, in other words, closer to “subsistence” activity, which requires a host of skills that will someday, in other forms, and after modernity’s destruction is more complete, be required again. Few people now know how to raise chickens, render lard, operate a wood stove, or even build a simple cooking fire. This is not the de-skilling Braverman looked at in comparing industrial artisans with Taylorized factory labor, but something broader and with far greater historical scope.
Here in the US when my mom was a teen before WWII, a lot of people canned food. My mom (1925-2016) and dad (1906-1996) continued to can well into the 1980s. My brother, sister, and I grew up eating mom’s canned salsa, made from the veggies grown in dad’s garden. A few people—mostly those with some leisure who see it as a virtuous hobby, and a few smallholders—still can food. Most have no clue. And yet, this would be an essential skill were internationalized food systems to break down.
When a permaculturist talks about re-skilling, she means canning, cropping, splitting wood, using a scythe, surveying, pond-building, raising livestock, and so forth. When Joe Biden talks about re-skilling, he means sending millions of people back to school to adapt to newly emerging technologies, or “learning to code.” Sending us further out on the Taylor limb, that is, which we’re rapidly sawing off behind ourselves. We have grandchildren who are afraid of the woods and can’t go five waking minutes without checking their phones and tablets.
So, reviewing the bidding, we are fat, drugged, and de-skilled—which puts us in a very precariously dependent position in any emergency—and emergencies are becoming more and more routine. The final thing I said is that we are drowning in debt (which brings us full circle back to the rentiers).
This is not just an American story, because rentiers have taken over most of the world; but I’ll just focus on the US for now.
A Different Socialism
Now, at last, I’ll make it clear(er) why I still elect to call myself a socialist, even though I no longer identify as Marxist. First, let me clarify the term. Socialism, in the way I use it, means the state plays a strong role in coordinating the economy and preventing dangerous and/or counterproductive accumulations of wealth, for the common good. Notice that I said absolutely nothing about “growth.” This is an historically contingent political orientation, meaning the orientation is based on the need to solve problems, problems which are unique to our epoch.
This orientation begins not with principles (apart from the common good) or abstractions (like growth), but with some realistic account of things as they are. In my case, as a “subsistence” socialist, this is not a “progressive” orientation (God damn progress!).
Which is to say, like any approach to any issue based on solving problems (for the common good), it is not utopian but ethically pragmatic. It does not posit some imaginary ideal state of things, nor does it assume (as “progressives” do) that history is aimed teleologically along some final perfecting arc. Nor does this orientation assume, as anarchists often do, that “the state” is the source of all evil, and that its abolition will therefore usher in the Age of Aquarius.
The state is simply one form among many throughout history, and at various scales, of governance; and governance of one sort or another is not only a constant in human society, it is an absolute necessity. We need norms, rules, regulation, and coordination. Right now that form is the nation-state, and I see no indication—apart from a post-nuclear catastrophe—that the nation-state will be superseded by anything else. That, and not some enduring abstraction, is why, as a socialist—a contingent orientation for our particular time—I see the state as the only institution with the capacity, that is, power and authority (these are different btw) to coordinate economic practices for the common good. This is even more true (and more challenging) in broad-scale pluralistic societies, where various traditions form governance subsets via differing traditional norms.
Given that my “socialist” orientation is toward ethically solving concrete problems and not “constructing” a “new future,” and given that I believe that mistakes, and suffering, and death are inevitable constants in the lives of persons and societies, I tend toward Karl Polanyi’s rather practical notion of an economic “double movement,” wherein certain parts of economies, especially at smaller scales, can and should follow market motivations, while at the same time there must be constant vigilance and regulation of market actions to keep them in check. He also said, and I agree, that land, labor, and money should not be marketed as commodities, a position that is extremely radical in its implications. It entails the protection of certain forms of “embeddedness,” that is, of non-market constraints (like family, place, and tradition) on economic activity.
Henceforth, and based on what I’ve just said, I’ll indulge myself in some unlikely what-ifs; that is, I’ll lay out some unlikely political speculations, the very kind I’m often critical of in others. History is indeed “open-ended,” though this openness to possibilities is circumscribed, as opposed to “determined,” by the historically formed present (where we “are”), by the inherent limits of the physical world (physical constraints), and by the less quantifiable yet still forcefully real continuum running from the impossible through the improbable and the probable to the inevitable, which can be gleaned only by “looking through a glass darkly” at human action and its consequences. (I say this, by the way, as a Christian who expresses confidence in an eschaton that we cannot “see from here,” as it were, because it is both present in history and transcendent of it.)
In history’s circumscribed openness to several possibilities, which we can only estimate based on limited information, there are still real political choices which can be made. But at what scale, in what context, and within what limitations, both known and unknown? And with what consequences, intended and not? There’s the rub.
So, my socialist commitments, which I’ll soon further distinguish as “subsistence socialist,” are still grounded in pessimism of the intellect and a good deal of doubt and suspicion about the will. This doesn’t foreclose political practice, or make me a “doomer” (okay, maybe, kinda, but I call it realism). I still advocate for political practice as damage control; and I still maintain a fool’s hope that someday, somehow, people will have forms of governance more oriented toward some common good than those we currently have, that is, governance that is neither venal, nationalistic, nor Machiavellian.
Conceding, then, that I don’t have a crystal ball and that I could very well miss some possibility that is actually there, I humbly accept the necessity of speculative discussions about Big Issues, and not so humbly add my two cents. What’s left undiscussed has no chance, as opposed to some slim chance, of being tried and actualized. I could very well be wrong in my most dismal intuitions about systems exceeding the control of even their masters and about the too-synchronic sociopolitical inertia that’s hurtling us toward a meta-catastrophe of environmental and social collapse (or, God forbid, nuclear winter).
Facts, of which I’ve assembled a fair number here, aren’t nearly as important as narrative (a personal thank you to Alasdair MacIntyre and Stanley Hauerwas for making this clear to me). The fact is (ahem)—as anyone can attest who’s had the folly or misfortune of watching today’s network news stations—people can selectively use all sorts of actual facts to sustain entirely false narratives. By the same token, truths that are greater than individual facts can only be ascertained—even if “through a glass darkly”—via narratives.
So here’s mine.
To begin, we need an insight from modern monetary theory, which is correct in my view, on many accounts, but which still has a giant hole in its head where eco-catastrophe should be and many advocates who still valorize “growth.” These are, of course, the two faces of the same error. But what they get right is that all this “government deficit” bullshit is based on the nonsensical notion that a state’s budget is just a family budget writ large.
For starters, no one in the family prints the money (unless the head of the family is Frank Bourassa, who printed some $250 million in phony American bank notes in a shop he set up on a farm).
Deficit hawks pretend that money is still like gold: a preexisting substance of which there is a finite amount, carrying with it the assumption of scarcity.
So how, then, did states and interstate monetary systems (the Eurozone) manage to create in the vicinity of $30 trillion for quantitative easing (QE), only for it to be reinvested in fictional value like derivatives?
MMT tells us that we’ve got it backward. The state doesn’t compete for this scarce resource—money. It creates new money and puts it into circulation (or with QE, as more gambling money for rentiers).
Currency is the “legal tender” in which taxes are paid. Such has been the case since an emperor started paying soldiers (and others) in official coin, even as the emperor demanded that coin back in taxes from the population. In this way, the general economy was nudged into supporting the army, since vendors could sell to soldiers for the tax coin . . . and, presto, money (now in demand as legal tender, a form of payment in which any debt can be serviced) was circulated.
Governments can issue as much money as needed without generating inflation, and not just by paying soldiers (which we already do, along with other “public servants”): the key is to issue only enough to keep pace with the quantity of goods and services in circulation. There’s no reason, except a lack of political will, that the US government couldn’t “print” trillions of dollars to pay for some kind of Department of National Service, which would include not only the military and other federal servants but highly professional non-military organizations designed specifically for labor-intensive, low-technology environmental remediation and for rapid response to the increasing number of environmental catastrophes, like storms, floods, and fires. There’s no good reason the government couldn’t pay for twenty million small farmers who sell almost exclusively into local markets. The main unspoken objection to such measures is that they would drain off some of the overwhelming social power now enjoyed by those at the top.
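The non-inflationary constraint here is essentially the old quantity-of-money logic: if the money stock grows no faster than real output (velocity held roughly constant), the price level need not rise. A toy sketch of that arithmetic, with entirely hypothetical numbers, not the author's or MMT's official formula:

```python
def max_new_issuance(money_supply: float, real_output_growth: float) -> float:
    """New money that can be issued without inflationary pressure,
    assuming velocity of circulation stays constant: issuance tracks
    growth in real goods and services (quantity-theory logic)."""
    return money_supply * real_output_growth

# Hypothetical: a $20 trillion money supply, real output growing 2.5 percent
# a year, permits roughly $500 billion of new non-inflationary issuance.
print(max_new_issuance(20_000_000_000_000, 0.025))
```

This is a deliberately crude illustration; the essay's point is political (what the issuance is spent on), not monetarist.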
Let’s talk now about this thing we call “wealth inequality,” or the “wealth gap.”
To begin with, I think we’ve already established that for the past few decades, most wealth accumulation has been based on rent-seeking, in particular debt. The top one percent now controls about 27 percent of US wealth. The top 1/10 of one percent has an average of $38 million. The 9/10 of one percent just below them has an average of $10 million. The nine percent below the top one percent has an average of $1.8 million. The next 40 percent below the top ten percent averages $165,382, with most of that concentrated upward. The bottom 50 percent has zero, jack shit, often negative wealth . . . debt, that is.
Rent-seeking is almost wholly responsible for the transfer of wealth upward since the Reagan years. So, now let’s talk about the “problem solving” thing, the derivatives exposure thing, and the “state” thing.
One giant step forward in solving this particular problem, along with debt peonage and precarity, would be—as I’ve been harping on for a bit now—for the Federal government to nationalize banking and finance, re-construct the firewall between commercial and speculative capital, and massively forgive debt.
Subsequent to such a jubilee reset, compound interest should be abolished, capital gains taxes raised to 60 percent, and offshore tax havens criminalized. The latter is why “tax the rich” slogans strike little fear in the hearts of the uber-wealthy. Yes, tax the shit out of them, by all means, but first find all the money they’ve hoarded abroad.
But the real wealth transfer back downward, which would disrupt the social-relation aspect of capital, is debt forgiveness. In the Bible, it’s called Jubilee.
Debt forgiveness abroad, too. Debt is the principal reason most peripheral economies are incapable of investing in their own people, and neoliberal debt “conditionalities” (with dollar hegemony) are what prevent those governments from seeking independent economic solutions. The inherently impoverishing characteristics of the current debt order constitute one of three major drivers of out-migration from those indebted nations (the other two being war and “climate change”).
Much of that debt is ultimately held by Wall Street, which again means that the US cannot forgive those debts (which are now bundled as trading commodities) until it first seizes the financial sector and converts it into a Federal utility.
Likewise, we now have to take into account a more recent form of rent-seeking: what Yanis Varoufakis calls “techno-feudalism.” He overplays his hand with the claim that it’s “overthrown capitalism,” but what he describes, prior to his new categorization (or bookseller’s brand), is very real.
We can use Amazon as exemplar, but most internet “platforms” are designed to collect rents in somewhat similar ways. Amazon collects a fee from producers and more fees from consumers. Varoufakis claims Amazon produces nothing at all; but that’s not strictly true. Amazon “produces” convenience—which is a form of value-enhancement—and delivery (packaging and transportation). Yes, it’s still mostly rent.
Another form of “platform” rent-seeking can be found on social media, where the owners collect click-fees from advertisers in the new “attention economy.”
In either case, what we get is not some new tributary economy (Varoufakis), but good, old-fashioned monopoly rents. The solution, as it were, would—again—be to nationalize the “platforms” in order to seize the monopolies and run them under some publicly accountable form of governance.
These suggestions are, again, necessary but insufficient, because the internal contradictions of present-day capital—to revert to an old socialist idiom—are not the only or even the biggest challenge facing humanity right now. And here is where I need to sketch out the distinction between what I call subsistence socialism and one form of Promethean idiocy that now passes for socialism among popular online “influencers.” This is also where I have to again merge a number of empirical observations with a bit of history and philosophy.
When I refer to subsistence, I’m channeling four thinkers who’ve had a great deal of influence on me, three of them sometimes classified as feminists: Ivan Illich, Maria Mies, Vandana Shiva, and Silvia Federici. Illich and Mies, in particular, have emphasized the term “subsistence.” Obviously, everyone who lives without any immediate threat of starving is technically “subsisting,” but what we mean when we use the term “subsistence agriculture,” for example, is smallholding agriculture practiced primarily for the consumption of the producers themselves along with some very local exchanges.
This conjures up images of bare, brutish survival, images that have been implanted in us by “development” propaganda. But the emphases of Illich, Mies, Shiva, and Federici are on (1) the division of labor, (2) the relation of consumption to production, (3) the enclosure of “commons,” (4) reliance on technology and technocratic experts, and (5) the separation of two satisfactions (the satisfaction of having and the satisfaction of doing, to the detriment of the latter [alienation]).
The subsistence socialist is concerned with down-scaling the division of labor through (1) the co-location, to the extent possible, of production and consumption, (2) the reclamation of the commons, (3) the reduction of general reliance on technology and technological experts, and (4) the re-integration of “having” and “doing.” Promethean socialists are concerned with redistribution, technological “progress,” and the indoctrination of all into becoming rootless, a-metaphysical, materialist cosmopolitans like themselves.
Yes, there is a “climate” catastrophe unfolding, but it’s been preceded and formed by a metaphysical and spiritual catastrophe. Scientism’s victory over metaphysics was culturally accompanied not by sorrow or even sobriety, but by the frivolous arrogance of the aesthete. The Taylorist deskilling manifest in factory efficiencies and the subdivision of tasks in all work is reflected in an intellectual division of labor, in the university’s increasing specialization of inquiry and reflection. The metaphysical whole has been immersed in the solvent of hyper-specialization, its organs, tissues, and cells separated and decontextualized, its mise-en-scène disassembled, like a home that’s been abandoned, its bits and pieces scavenged and carried away. Likewise, scientism made off with the potential for transcendence, supplanted by its consumer counterfeits, or worse, by irony (and when that fails, invective). By our very nature, though, we cry out for transcendence and meaning. We are possessed by this restless urge which seeks some object no matter how fantastical. This imaginary new post-metaphysical home is “the future,” where our imagination can dwell in a perpetual childhood fantasy.
The Promethean socialists are not radicals. They can’t even acknowledge the radices, the roots, of our predicament. They have the implanted imagination of a Disney production, of Tomorrowland, with happy nuclear plants powering publicly-owned Walmarts and Starbucks for the new humanity—cheerfully atheistic non-binary Jetsons who will someday, perhaps, a la Elon, conquer death itself by uploading their thoughts into a nuclear-powered internet cloud.
They cannot connect their techno-future to, say, the mine operations required to fulfill it; and so they chant their mantra: redistribution. In fact, and this is increasingly important, some things not only should not be distributed to a few, they shouldn’t be produced at all: jet skis, enriched uranium, monster trucks, Froot Loops, false eyelashes, and internet porn.
Redistribution doesn’t solve the separation of the satisfaction-of-having from the satisfaction-of-doing, the ascendancy of the former and the disappearance of the latter. It prolongs and assures it, deepening and solidifying our dependence on technology and technocratic rule. Jose Antonio Viera Gallo sensed this when he said, “Socialism can only arrive on a bicycle.”
By the same token, those with a more eco-socialist propensity need to acknowledge the reality (and yes, necessity) of the state—but also of trans-state cooperation.
Because the complex crisis we face is overwhelmingly ecological, any form of governance will have to address these crises head-on; and it will have to aim all its efforts at ultimately restoring a right relation between humankind and non-human nature.
Because general-purpose money is an ecological phenomenon that dissolves traditions, communities, and the biosphere, any transition worth its salt will have to begin the long march to reduce our dependence on money, which inevitably means radical relocalization of all basic production, draconian control of “markets,” the gradual death by benign neglect of old transportation grids, and the reorganization of political subdivisions around watersheds (the only sensible eco-political boundaries) instead of arbitrary lines drawn on the map.
To this end, the state’s role would be crucial. Once key industries and infrastructure are placed under public control and price controls established, nonessential industries would need to be systematically shuttered as they are identified as such. Public works training and jobs programs (see above) would have to be established to guarantee uninterrupted full employment at living wages; and those jobs would need to be geared to the transitional projects of (1) small-scale agriculture, (2) city transformation, (3) responding to the emergencies entailed by climate change, (4) environmental restoration, and (5) setting the stage, with a different form of infrastructure, for thoroughgoing relocalization.
With price controls, the state can print money for this purpose (it has already printed trillions to bail out bond traders). Priority programs would remediate areas and communities where environmental injustices have been the worst. A maximum wage would need to be established for various professionals: doctors, lawyers, etc. Dramatic conservation measures would need to be taken and enforced, beginning with energy rationing and extending to the elimination of any nonessential production that relies on imports dependent upon postcolonial (neoliberal) unequal exchange relations abroad.
All subsidies and allowances in agriculture and forestry should be redirected away from industrial, high-input agriculture and toward both relocalization and sustainability. Any industry that exceeds a certain number of employees and is not directed wholly by the state would be reorganized as worker-owned. All industry oversight and management should be conducted by subsets of regional watershed authorities. All subsidies to fossil energy extraction and refinement would need to be ended, and a transition program established to move workers from those industries into public works.
In the United States, tens of millions of people would need to become new farmers: local, well-subsidized farmers. Public works projects would include not only regional smart grids but sub-regional storage and distribution infrastructure for these new farmers: storage facilities, refrigerated trucks, and public market buildings, for example. The state would need to train and hire thousands of teachers of sustainable agriculture in community colleges and support hands-on sustainable agriculture and low-impact forestry research in every public university. The state should likewise support a fifty-year transition program for large-scale agriculture, like the one proposed by Wendell Berry and Wes Jackson, which would move us from reliance on annual cereals to perennials.
More radical still, I’d suggest Alf Hornborg’s idea of a dual money system, or a “multi-centric economy.” The states (or watershed authorities, which should gradually supplant the states in power) would issue two forms of currency. One would be the existing national currency, for long-distance and international exchange. The other would be local scrip, exchangeable only within certain boundaries (like watersheds) and only for subsistence commodities produced within those boundaries: locally grown food, locally produced tools, re-used items (thrift shops), organic fuels, materials extracted from local land (wood, fibers, plants, mulch, compost, et al.), local transport assistance, and local services. This scrip would be issued as a substantial portion of public wages and as a guaranteed annual income for those who cannot work at regular jobs (the infirm, e.g., or full-time carers). Local scrip would be absolutely tax-free and could be used to hire temporary informal labor.
In the short term, this might actually increase exchanges using the national currency, because it would free more income for non-local commodities; but over the longer term, the advantages afforded by local scrip, in conjunction with policies that promote increased local production, would strengthen the scrip as well as stabilize the local economy. In particular, since local foods would be exchangeable for local scrip, this system would promote small-scale, local agriculture, which is an essential—if not the essential—component of any larger transition. It would likewise insulate local production against the solvent effect of the national general-purpose currency, and set the stage for the most important general change of all: a de-financialized, non-growth-centric economy.
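For readers who think better in mechanisms than in prose, the two constraints on local scrip described above (it spends only inside its own boundary, and only on subsistence goods) can be sketched as a toy model. This is an illustration only, not anything from Hornborg: the party names, watershed labels, and category list below are my own assumptions.

```python
from dataclasses import dataclass

# Toy model of a "multi-centric economy": local scrip is valid only when
# payer and payee share a watershed, and only for subsistence commodities.
# The categories below paraphrase the essay's list and are illustrative.
SUBSISTENCE_CATEGORIES = {
    "local food", "local tools", "re-used goods", "organic fuel",
    "local materials", "local transport", "local services",
}

@dataclass
class Party:
    name: str
    watershed: str

def scrip_payment_valid(payer: Party, payee: Party, category: str) -> bool:
    """Local scrip circulates only inside one watershed boundary,
    and only for locally produced subsistence commodities."""
    same_watershed = payer.watershed == payee.watershed
    is_subsistence = category in SUBSISTENCE_CATEGORIES
    return same_watershed and is_subsistence

if __name__ == "__main__":
    alice = Party("Alice", "Cumberland")
    bob = Party("Bob", "Cumberland")
    carol = Party("Carol", "Tennessee")
    print(scrip_payment_valid(alice, bob, "local food"))            # True
    print(scrip_payment_valid(alice, carol, "local food"))          # False: different watershed
    print(scrip_payment_valid(alice, bob, "imported electronics"))  # False: not subsistence
```

The point of the sketch is simply that the scrip’s “solvent-proofing” is structural: both checks must pass, so purchasing power cannot leak out of the watershed or into non-local commodities.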
The goal of short- and mid-term social control over the economy through a democratic state is not the stabilization of a social-democratic state, but the transition to a de-financialized economy no longer driven by “growth.”
Without this kind of emergency program, what we have now—crisis-wracked and headed for disaster—will stutter along and crash, leaving us even more vulnerable to authoritarian reactionaries than we already are, as evidenced by the Trump presidency and the endurance of his personality cult.
Long-term and intentional watershed-based relocalization is far more radical (and rational) than the nationalistic and nostalgic Keynesianism of Bernie Sanders’ stripe; but a real alternative needs to be articulated, with a vision upon which to build a valid counter to reaction. How that looks would depend on many things yet to be discovered in the process of repurposing and redesigning the built environment and restoring the natural one; and if we do not repurpose and redesign the built environment, that built environment itself will return us to our present practical and epistemological default positions on the runaway train.
None of this is even remotely possible, though, unless banks, utilities, key industries, and monopolies are nationalized; unless we unilaterally and immediately withdraw all US military forces from abroad, systematically ensure that everyone is housed and fed, guarantee a dignified living to all, make education, including crafts and trades, free through graduate school to all who qualify, forgive massive amounts of personal debt, and adopt some kind of basic and humanely limited single-payer health care system—limited in the sense of escaping the bare-life paradigm rather than prolonging life through ever more technological means. These are just lifesaving first steps, because people need to be reassured and supported before the real work of repurposing, redesign, and restoration begins.
Man Thing
Would such a trajectory solve the world’s problems? No. Would it usher in a utopia? No. Would it be painful? Absolutely. Is it unlikely? Very. Its likelihood is on par with my being struck on the head by a meteorite when I go out to walk the dog. God, I hope I’m wrong about that.
Let’s return to Carlo Lancellotti’s remark: “One of the hardest things is to convince old leftists that they cannot have FDR's left without FDR's country (in terms of religion, family structure, educational system etc).” We’ve established the context for this (accurate, in my view) claim, which is metaphysical and moral catastrophe—the loss of shared cultural certainties that underwrote the forms of solidarity which took advantage of the Keynesian superstructure. We’re in some kind of rentier-tributary, nuclear-armed, death-throe interregnum, against the baleful backdrop of biospheric collapse.
And so, in returning to the actually existing present, we return to John Milbank’s denunciation of the Tories and our own predicament in the US, where a pugnacious, irrational reaction continues to bubble to the surface of our general malaise. Professor Milbank lives in the UK, so he at least doesn’t have to contend with the cancerous gun culture that superintends American reaction.
Most would-be intellectuals, even of the spotty, feral-gonzo sort like moi-même, have certain theses to which they obsessively return, like a prospector digging deeper into his little claim in the hope of finding that next grain of gold. Mine has been, since the 1990s, the association between a species of dominator masculinity and violence/war. The danger with this kind of obsession is that it can create a kind of causal tunnel vision and pilot one into the mistaken belief that the One Big Thing has magical interpretive (and even curative) powers far exceeding its actual capacity. I hope I’ve guarded against that tendency, but my gendered preoccupation (not genderwang!) keeps coming back to me with greater force in the face of actual developments.
When speaking, then, of the wave of reaction that swept parts of the world (and still does, to a degree), and speaking particularly of Trumpism in the US, I believe it’s necessary (though, yes, insufficient) to point out the role of dominator masculinity (which I’ll just call masculinism) that enlivens this reaction here in the country of my birth.
This was the subject of Borderline, a book I wrote about masculinism and war; but it maps easily onto the affinity for violence that is evident in the admiration of Trump (or Bolsonaro, or Orban, or Netanyahu, or Putin). This tendency lies latent in the culture, or cultures, wherein war and the admiration of warriors (as hero figures) is a transhistorical feature, from The Odyssey to the Books of Samuel to Beowulf to The Lord of the Rings to The Eagle Has Landed to Leon Uris’s Exodus. (I enjoyed all of these, btw, except the last two.) What has changed, in my experience, since the occupation of Vietnam is the transition from the hero as a virtuous actor on behalf of some common good, only reluctantly violent in times of extremity, to the construction of the hero-in-himself as an icon of violent, get-things-done masculinity, wherein preemptive violence becomes synonymous with virtue—a propensity that the “masculine man” leans into and quite immodestly displays. (Modesty was once a masculine virtue; how I miss it!) Brute consequentialism, with violence as its celebration. Masculinity as a walking, talking threat (which bespeaks some terrible fear at its very center, an armoring to ward off a deep psychological terror).
War stories now don’t care whether the occupations of Afghanistan and Iraq were justified or even realistically effective, but whether the character himself was willing to risk combat and has the capacity for swift and decisive violence. There’s a push and a pull at work here: the push of anxiety in a world that seems to be losing control and the gendered pull of this pugnacious masculine archetype against a modern life that has become meaningless, safety-obsessed, and unchallenging (the loss of satisfaction in doing, instead of just having).
The Bully Ideology & Lost Status
The most dreadful and ugly aspect of this love of lean-in, consequentialist violence is that it has led us to equate bullies with heroes.
That is the attraction of Trump, Bolsonaro, Orban, and Netanyahu. Nothing so delighted Trump’s acolytes, or the alt-right, as his making fun of the disabled or encouraging his supporters to beat down dissenters.
They love the bully. They celebrate the bully. They want to be the bully. They want to follow the Great Father Bully, the blond beast, the “splendid predator.”
And as always, this contrarian streak manifests itself in a contempt for racial others and women. Steve Bannon and Jordan Peterson are separated by mere inches, which is why one can easily find people who admire both, and they trade psychologically in what Diana Mutz has called “status threat.”
Out of the miasma of an ever more financialized and precarious economy emerged a dual resentment: economic precarity and the loss of cultural status. I’m always reluctant to employ the over-used and misused term fascism, but one thing we can learn about present-day reaction from the historically specific form called fascism is that its popular support came from precarious middle classes. Economic status and cultural status are not the same thing, but they have a strong Venn overlap. The Trump cult was (and remains) majority suburban/exurban. (Rural right-wing support comes from a combination of exploitative dispensationalist religion and arrogant urban-liberal contempt.)
Trump 2024 supporters now in Congress are largely from the sub/exurbs. Andrew Clyde is from the Atlanta exurb of Gainesville. Jim Jordan, from the exurbs outside Columbus, Ohio. Lauren Boebert drew most of her votes from Grand Junction and Pueblo, Colorado. Anna Paulina Luna—exurbs west of Tampa, Florida. Marjorie Taylor Greene—Rome, Georgia (pop. 37,000).
Diana Mutz (herself an urban liberal) overworks her thesis a bit in “Status threat, not economic hardship, explains the 2016 presidential vote” (PNAS, 2018), because these are not mutually exclusive categories. She also employs some questionable claims, e.g., that Trump was elected during an “economic recovery”—using establishment criteria for said “recovery,” which we’ve shown have little to do with everyday people’s actual situations, especially debt precarity! And when she says that “conservatism surges along with a nostalgia for the stable hierarchies of the past,” she fails to define “conservatism” or differentiate it from reaction. Nor does she offer any explanation of why this nostalgia “surges.” Nonetheless, status threat, as she describes it, is a real thing.
Racial hierarchies have somewhat diminished, which has generated a measure of white anxiety among some (negrophobia has been an organizing principle in the Republican Party since Nixon; nothing new here). The four more salient forms of status threat that gave Trump his narrow victory were (1) threats to male status vis-à-vis females, (2) threats to economic status (which Mutz attempts to dismiss), (3) threats to the associative status of American supremacy in the world (national masculinity), and (4) threats to “white” status apart from Republican negrophobia, i.e., xenophobia (mostly directed against Latin Americans and Muslims) and “white replacement” narratives. It should be noted that Trump’s bully appeal drew in a fair number of black and Latino male voters.
Don’t get confused by zany female politicians in the US, like the round-heeled Boebert and Greene. They are decoys, women who’ve opted to become honorary men as the public faces of daft conspiracy theories combined with end-of-the-world fantasies cribbed from The Walking Dead. I suspect many of them have themselves been bullied, like the sexually abused who go on to become abusers.
Threats to male status and the narrative of American supremacy (national masculinity) correspond strongly with a general hostility toward an undifferentiated “feminism” (and women generally) and with our current “surge” of reaction and its American cohort of gun nuts. Returning to “fascism,” then, I think R. W. Connell gave a good psychological (at least) description of it several decades ago:
In gender terms, fascism was a naked reassertion of male supremacy . . . To accomplish this, fascism promoted new images of hegemonic masculinity, glorifying irrationality (the “triumph of the will,” thinking with “the blood”) and the unrestrained violence of the frontline soldier.
This is one hundred thousand times more convincing than Wilhelm Reich’s ridiculous claim that fascism is an outgrowth of “sexual repression” (a claim he then had to support by expanding the definition of fascism to include everything he didn’t like). Obviously, neither psychological explanation takes into account the two key characteristics of actually existing historical fascism: economic precarity and the emergence of a real threat to power from the left.
It’s the latter that’s missing in the US and elsewhere, and which is unlikely to return. Which doesn’t mean millions of paranoid, resentful, sexually-insecure fools with guns and heads full of cockamamie “theories” aren’t a threat to public order or the common good.
Unfortunately, the only alternative allowed by Wall Street, which still owns both the political process and the Democratic Party in the US, is a technocracy which is simultaneously neoconservative and neoliberal—that is, militarism abroad and rentier rule at home. It’s a little like everyone being forced to drive back and forth across an old bridge that we all know is on the verge of collapse.
Seen from above, what would anyone expect of a society where the culture tells people they should be in control, as people increasingly lose control of their own lives at the most granular level . . . where they’ve been disembedded, disenchanted, deskilled, indebted, and rendered terminally technologically dependent?
Last Words
Returning to Professor Lancellotti, he once said, “When I came to the US I was struck by the wealth of well-established professional organizations in all conceivable fields. I also noticed that they were staffed by a particular class of hybrid professional-bureaucratic types, a combination of not-too-successful practitioners.”
The utter failure of rentier liberalism has given birth to power struggles between competing sets of intellectually mediocre narcissists during a period when we most desperately need thoughtful, selfless leadership.
Republicans are fighting Democrats. Both parties are led by scoundrels and buffoons. From afar, it seems there’s a similar dynamic across the water in the UK between the Tories and Labour, and only God knows what the hell is going on in Canada. The “wokes” are in a pitched battle with the “woke-panicked.” In churches, the “trads” are at war with the “progressives.” In the background, behind all this sound and fury, is all of the above. Alas, we have abandoned all meaning, and so we stand disarmed before the beast.
You said in your heart, ‘I will ascend to heaven; above the stars of God I will set my throne on high; I will sit on the mount of assembly in the far reaches of the north; I will ascend above the heights of the clouds; I will make myself like the Most High.’
—Isaiah 14:13–14
Hi, I'm coming to this a little late. I really liked how you've pulled all these threads together. I'm familiar with many of the ideas, but to see them all in one place is very helpful.
A couple of random thoughts/responses: Even though at times it seems pointless, I find some meaning in pushing back against what seem like impossible odds. A small example of this is that I'm part of a Kernza (a perennial wheatgrass from The Land Institute) Growers Co-op. I don't hold out a whole lot of hope that it's going to pan out, but I agree with you that changing land use/holding patterns is very important. My work on the farm is a small effort in that direction.
You might find this video playlist of my first year's experience growing Kernza an interesting diversion.
https://www.youtube.com/playlist?list=PLj5UDFGP0BjfNLkhz36IiZJUJ5xw-VIlp
As an aside, you mentioned in the post that people are stuck watching ads on YouTube. While true for most/many, it is possible (as you may already know) to avoid ads completely, without paying, by installing the free AdBlock and uBlock extensions in your browser. I've had them on my computers for 10+ years without any (to the best of my knowledge) troubles.
Thanks for putting your writing online.
Here's a new development that's right in line with the modus operandi of Finance Capitalism outlined in this tour de force of a Substack post:
JPMorgan is about to spend $1 billion on hundreds of rental homes across the US on the way to becoming a megalandlord
Robert Davis
Nov 17, 2022, 9:01 AM EST
https://www.businessinsider.com/jp-morgan-to-acquire-1-billion-of-single-family-rentals-2022-11?op=1
Wall Street has purchased hundreds of thousands of single-family homes since the Great Recession. Here’s what that means for rental prices
Published Tue, Feb 21, 2023, 9:28 AM EST; updated Wed, Feb 22, 2023
https://www.cnbc.com/2023/02/21/how-wall-street-bought-single-family-homes-and-put-them-up-for-rent.html