Gilder’s solution to the problem, so conceived, was to restore the “male role” of breadwinner in the traditional family. His solution, in other words, was to turn back the clock, to sink roots in sand dunes long since washed away by the tides of historical change. For by the 1970s, the traditional family was already becoming an anachronism, and no amount of rhetoric would restore it. The rate of divorce tripled between 1960 and 1982; by the mid-1980s, 50 percent of marriages ended in divorce, and at least 25 percent of all children were growing up with no more than one parent. In 1967, half of all women in their thirties were married mothers who remained at home full time; by 1982, only a quarter were so occupied. More than half of all women, married or not, were in the workforce by the 1980s, and the American family had diversified accordingly. According to the 1990 census, most families did not fit into the “traditional” category because they were headed by single women, or were based on homosexual partnerships, or were otherwise anomalous from the “anthropological” standpoint Gilder adopted. And so on. Gilder may have been urging a worthy cause—although, come to think of it, The Simpsons does question the function of our fathers—but he might as well have been denouncing the law of gravity.
Or was he on firmer ground? What if the decline of the traditional family was not the inevitable result of social, economic, and cultural changes since 1960? What if this decline was instead the artificial result of misguided public policy? Gilder didn’t bother to ask—for him, the family was one of those “classical truths”—but Charles Murray, yet another quirky Harvard grad with backing from the American Enterprise Institute (AEI), did ask. His startling answers were contained in a book, Losing Ground, which was published in 1984 to the delight of the supply-siders. They had already read his AEI research papers of the late 1970s, where he argued that welfare—that is, Aid to Families with Dependent Children (AFDC) as sponsored by the federal government—had destroyed the traditional family by giving poor women the financial incentive to bear children but stay single (the recipients of AFDC could be neither married nor employed). In the absence of this transfer payment, Murray insisted, the family would not be at risk, and neither would the larger culture, because the benefits of marriage and work would no longer be obscured, even canceled, by government policy. Dependence on the dole would disappear, and with it the “urban crisis” exemplified by New York City.
Meanwhile, Murray claimed, a criminal “underclass” that was simply unable to take advantage of job training, food stamps, educational subsidies, or welfare would have to be taken off the street. In this scenario—which could have been borrowed from almost any Western movie from 1939 to 1962, from Stagecoach to Liberty Valance—a vigilant law and order cleared the way for self-reliance and stable families, and thus for civilization as such. Angry pimps with too many prostitutes and “welfare queens” with too many bastards would soon give way to husbands with jobs and wives with children, just as renegades and outlaws had given way, once upon a time in the West, to the superior force of armed settlers and hired guns. In effect, then, Murray was writing a political screenplay that spoke to the social disorder and ethical ambiguity on display in most films from the 1960s and 1970s. Moral exhortation had already replaced social science. In other words, the imperative mode of argument, where personal conviction (“authenticity”) and narrative form count for more than mutually agreeable exhibits of empirical evidence—where a new, opinionated relativism reigns—had already become the norm by 1980. But notice that it was not the handiwork of the academic Left.
This imperative mode of argument dominates Gilder’s Wealth and Poverty, the anthem of the Reagan Revolution. Almost every paragraph contains a sentence that begins or ends with the kind of certainty we associate with evangelical faith; phrases like “there can be no doubt,” “no one denies,” and “there is no question” sprout in this ideological hothouse so profusely that the reader has to wonder why the book had to be written. In the absence of controversy, after all, writers are unnecessary. But then Gilder was enlisting in the culture war of his moment, when the sources of poverty were becoming a new issue. He argued that poverty was, in fact, a structural problem, just as liberals claimed. But the problem in question was, again, cultural, not economic. It was again a result of the shortage of fathers, which in turn meant a surplus of feckless, useless males without attachments, commitments, and responsibilities—that is, without the moral anchor of familial obligation.
Gilder wanted to make the case for the supply-side revolution, however, so he also launched a theoretical assault on the Keynesian consensus. In this airless room of relentless abstraction, the eternal truth of Say’s Law—“there cannot be a glut of goods caused by inadequate total demand” because “the sum of the wages, profits, and rents paid in manufacturing a good is sufficient to buy it”—is repeatedly asserted, to the point where it becomes an explanation, or at least a label, for almost everything in human history. Indeed, Gilder invokes it to describe the gift economies of aboriginal cultures as well as the investment strategies of modern capitalists; by this historical accounting, everyone in every time and place has acted in accordance with its dictates. So he can earnestly announce that “Say’s Law, in general terms, is a rule of all organized human behavior,” and then say, equally earnestly, that “Say’s Law in all its variations is the essential enactment of supply-side theory.”
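In the accounting terms of Gilder’s own summary, Say’s Law can be sketched as follows (the notation here is ours, not his, and the sketch is only a gloss on the quoted definition):

\[
W + \Pi + R = Y,
\]

where W, \Pi, and R are the wages, profits, and rents paid out in producing total output Y. Since the income generated by production is by definition equal to the value of what is produced, aggregate purchasing power always suffices to buy aggregate supply, and a general glut caused by inadequate total demand is, on this accounting, impossible.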
The Disregard for Capital
But so what, you might ask, if Gilder insisted that Say’s Law is a transhistorical dimension of human nature? What’s the bottom line here? What was the “cash value” of these theoretical incantations? Good questions. The assault on the Keynesian consensus was designed to demote consumer demand and to promote private investment as the key to growth; it was designed, that is, to increase the incomes of the wealthy, on the assumption that such a shift in the distribution of income shares would enhance investment. This was not a hidden agenda; it was clearly stated by everyone involved. If supply creates its own demand, as Say’s Law stipulates, we should not worry about the scale of consumer demand, and we should not, by the same token, penalize private investment by taxing it heavily. Here is how Gilder explained the program: “But the essential point is fruitless to deny. Producers play a leading and initiatory role in eliciting, shaping, and creating demand. Investment decisions will be crucial in determining both the quantity and the essential [sic] pattern of consumer purchases.” Or again: “The crucial question in a capitalist country is the quality and quantity of investment by the rich.”
Now, if “investment by the rich” rather than consumer demand is the engine of economic growth, the policy-relevant question becomes, how do we redistribute national income away from consumption toward saving and investment? In short, how do we reward capitalists for doing their job? The indispensable first step, according to Gilder and his fellow supply-siders, was to cut taxes on investment income, for that move would persuade rich folks to stop buying larger mansions, faster cars, and bigger yachts—to stop consuming so conspicuously—
and start investing their money in goods production. Again, this redistribution of national income was not a hidden agenda. It was articulated as early as 1974 by Alan Greenspan, later of Federal Reserve fame, and it was the centerpiece of Reagan’s presidential campaign in 1980.
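The arithmetic behind this program can be sketched in textbook terms (again, the notation is ours, not the supply-siders’): with national income

\[
Y = C + S,
\]

and with saving S assumed, on Say’s-Law grounds, to flow into investment I, cutting taxes on investment income raises the after-tax return to saving; the intended effect is to shrink the consumption share C/Y and enlarge the investment share I/Y of a given national income, which is exactly the redistribution “away from consumption toward saving and investment” the supply-siders wanted.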
The supply-siders were fighting a rear-guard action against what Henry B. Kaufman, the Salomon Brothers partner who was the most influential stock market analyst of the 1970s, called “The Disregard for Capital.” In a speech delivered at the New York Economic Club in May 1980, then widely circulated as a photocopy, Kaufman declared that the American people were no longer satisfied with the merely formal equalities afforded by the modern liberal dispensation—most of them, he claimed, were now clearly committed to something that resembled socialism. “Fundamental change has been taking place in our society over the last five decades,” he noted, and insisted that such change had finally produced a “democracy oriented toward an unaffordable egalitarian sharing of production rather than equal opportunity.”
Kaufman’s counterpart on Wall Street, Felix Rohatyn—he was the investment banker who steered New York City away from bankruptcy—similarly lamented the advent of a “padded society” that had smothered market forces in the name of income security for all. They both agreed with the diagnosis offered a few years earlier by the Trilateral Commission, a high-profile group of accomplished intellectuals and business leaders from Japan, Western Europe, and the United States, which blamed a “welfare shift” in federal spending priorities—a shift away from defense and toward domestic social spending—for empowering new voting blocs and thus endangering both the future of economic growth and the “governability” of American democracy: “The political basis of the Welfare Shift was the expansion in political participation and the intensified commitment to democratic and egalitarian norms which existed in the 1960s.”
Notice that the dissatisfaction with business as usual, and the fear of impending alternatives to private enterprise, together created a consensus that included figures from every part of the political spectrum. The liberal Trilateral crowd upstaged Jimmy Carter by dominating his cabinet; then the “conservative” supply-side crowd chewed the scenery of Reagan’s presidency by slashing taxes and burning the budget. But they all agreed that capitalism was broken and needed some kind of fix. More specifically, they all agreed that “a new balance between social and economic objectives,” as Kaufman put it, had to be written into government policies. Otherwise the perceived requirements of social justice would continue to block the known requirements of economic growth, that is, personal saving and private investment. So the supply-siders were not a voice in the wilderness; they were just the loudest in a growing chorus of critics who doubted that capitalism would survive the 1970s unless it received the equivalent of shock treatment. In this sense, the intellectual origins of the Reagan Revolution were not exclusively conservative.
Even the die-hard Keynesians, and there were lots of them, understood that the stagflation of the 1970s represented a new challenge to their theoretical assumptions and policy prescriptions. Lester Thurow of the Massachusetts Institute of Technology, for example, wrote a book called The Zero-Sum Society (1980), which fed into Rohatyn’s almost apocalyptic writings of the early 1980s by arguing that the enfranchisement of new interest groups had created political deadlock in Washington. Economic policy had accordingly become an illogical, even idiotic patchwork of pork-barreled compromise. Like Rohatyn and Lloyd Cutler of the Trilateral Commission (he would become Carter’s White House counsel), Thurow proposed to solve this problem by streamlining and centralizing policy making in the executive branch. Robert Reich and Ira Magaziner, both later employed by the Clinton administration, meanwhile wrote books like Minding America’s Business (1982), which recommended Japanese-style “indicative planning”—low-cost government loans to rising industries, and higher interest rates and worker retraining for declining industries—as the cure for what ailed us. The Keynesian diagnosis of stagflation led, therefore, toward state capitalism.
So maybe David Stockman was right to suggest that he was a soldier in “the war between the statists and the anti-statists, between those who wanted government to dominate every aspect of American life and those who didn’t.” Surely Ronald Reagan’s campaign rhetoric tapped into the powerful antistatist tradition of American politics. He knew as well as anyone that the revolution which created the United States of America was animated by commitment to the sovereignty of the people, not the state or its agencies. The founders understood the appeal of classical republicanism, especially its reading of virtue as the product of political action; they nonetheless believed that the sphere of liberty was society, in market transactions among equals under law, and that the sphere of power was the state, where domination and corruption rather than mutuality ruled. Reagan restated and amplified this opposition between liberty and power by insisting that “government is the problem, not the solution.” His supply-side principles, however belated, were an integral part of his attachment to the antistatism of “original intent.”
By 1980, American voters were willing to try an alternative to the Keynesian consensus—the big-government liberalism—that had characterized the programs of every president since Calvin Coolidge left office in 1929, in part because they knew that public policy wasn’t solving their economic problems, in part because they hoped that increased private investment would. They were always skeptical of Reagan’s programs (every poll of the 1980s showed they liked the man better than the policies), but they rolled the dice and hoped for the best.
Conservatism Old and New
Meanwhile conservative opinion was becoming more visible, more respectable. William F. Buckley Jr., the founder of National Review who had opposed the civil rights movement and who had once proposed to limit the vote to property-holders, still appeared weekly on public television (the fourth channel in most households) as the “moderator” of a wonkish talk show. Then conservative icon Milton Friedman, the famed economist from the University of Chicago who won the Nobel Prize in 1976, was featured as the talking head of a PBS series that let him expound on his ideas about the causative relationship between markets and freedom. This was a breakthrough for conservative opinion, even though his monetarism—the rate of growth in the money supply determines everything, he argued—was inconsistent with the implied activism of supply-side doctrine. For the guileless Friedman indirectly introduced millions of viewers to Friedrich von Hayek, the Darth Vader of modern economics, as if he were just another theorist from the mainstream.
Hayek was the Austrian who insisted that socialism was preposterous as well as evil because it supposed that the market could be subordinated to reason. For Hayek, as for Friedman, market forces were the source of freedom precisely because they could not be fully known from any single standpoint—precisely because they could not be manipulated by individuals or companies or governments. As long as the rights of private property were inviolable, they assumed, and as long as everyone was subject to the same anonymous laws of supply and demand, liberty and equality were possible. From this premise, they argued that only capitalist societies could be free societies. They also argued that the citizens of a free society could not even try to create a just society; for to do so would be to modify the arbitrary results of market forces in the name of justice and thus to staunch the economic source of freedom.
As Irving Kristol, by all accounts the founding father of neoconservatism, noted at the time, this argument’s rigorous indifference to justice would be simply intolerable to most Americans. But he didn’t reject it because it wouldn’t play in Peoria. He rejected it because it denied modernity itself, the moment when consent—not force and not chance—became the principle of social order and political innovation. “But can men live in a free society,” Kristol asked, “if they have no reason to believe that it is a just society?” In a word, no. The “historical accidents of the marketplace,” as Hayek and Friedman portrayed them, could not be “the basis for an enduring and legitimate entitlement to power, privilege, and property,” not any more than the historical accidents of birth could make the claims of a hereditary aristocracy seem reasonable.
Kristol was trying to detach conservatism from its schizophrenic devotion to free markets on the one hand and tradition on the other. He knew that you can’t revere tradition if you admire the “creative destruction” that capitalism brings to life. He knew that you can’t insulate the nuclear family from the heartless logic of the market if you accept the dictates of free enterprise. He knew that conservatism had to become more liberal if it were to sound like something more than hidebound devotion to a phantom past. A “combination of the reforming spirit with the conservative ideal,” he declared, “is most desperately wanted,” and he cited Herbert Croly, the original big-government liberal from the Progressive Era, as his source of inspiration.
Kristol also knew that the competitive, entrepreneurial economy Friedman and Hayek posited as the source of freedom was a mere fantasy. Capitalism had long since become a system in which large corporations, not small producers, dominated the market—those anonymous and unknowable laws of supply and demand which once made all producers equally subject to the discipline of market forces had been supplanted by the visible hand of modern management: “There is little doubt that the idea of a ‘free market,’ in the era of large corporations, is not quite the original capitalist idea.” Some producers had more market power than others: some persons (and this is how corporations are legally designated) were more equal than others. So everyone was not “free to choose,” as Friedman would have it, simply because he or she inhabited a market society. Corporate capitalism remained a moral problem. For in “its concentration of assets and power—power to make economic decisions affecting the lives of tens of thousands of citizens—it seem[s] to create a dangerous disharmony between the economic system and the political.”
We might then say that neoconservatism was born when that problem was acknowledged and addressed by ex-liberals like Kristol. We might also say that Friedman’s PBS series was an “elegant tombstone” marking the death of an older conservatism in which the state was a passive spectator and in which the moral problem of market logic—if everything is for sale, how do we preserve public goods (like democracy) that have no cash value?—was happily ignored. Certainly George Gilder was trying to make an ethical case for capitalism by claiming that investing in the hope of private profit was the equivalent of gift-giving in the name of social solidarity. Kristol was trying, too, but he was more subtle and more skeptical than his colleagues on the supply side.
God, Family, Markets
One of his many protégés, Michael Novak, was at the same historical moment writing a book that exemplifies both this ambition to dress capitalism in new moral clothing and the institutional environment that enabled the ambition. Like Kristol and many other neoconservatives, Novak was a former radical disillusioned by the oppressive idiocies of actually existing socialism and disheartened by the political postures of the largely literary Left. As a genuine theologian and moral philosopher—before he joined the American Enterprise Institute as a senior fellow, he was the Watson-Ledden chair of Religious Studies at Syracuse University—he knew that the defense of capitalism required something more than jolly slogans and militant assertions. That knowledge propelled Novak onto the contested terrain of classical social theory, where Karl Marx and Max Weber were still the landlords, and into the fractious space of contemporary religious debate, where “liberation theology”—a distant echo of the early Christian concern with its original constituency among the poor and the outcast—was now the real alternative to received doctrine in the Catholic Church itself.