Another, more prosaic way to answer the question about an unspoken socialism passing for politics as usual in the late twentieth century is to measure the growth of transfer payments in this period. Transfer payments represent income received by households and individuals for which no contribution to current output of goods or services has been required. By supply-side standards, they are immoral at best and criminal at worst because they represent reward without effort, income without work—because they conform to the ancient Christian and the modern socialist criterion of need (“From each according to his abilities, to each according to his needs”). But they were the fastest-growing component of income in the late twentieth century, amounting, by 1999, to 20 percent of all labor income. From 1959 to 1999, transfer payments grew by 10 percent annually, faster than any other source of labor income, including wages and salaries; by the end of the twentieth century, one in every eight dollars earned by those who were contributing to the production of goods or services was transferred to others who were not making any such contribution.
But did the American electorate ever rise up against this unspoken socialism? Not really. Neither did their representatives in Congress. There were, of course, local rebellions against property taxes in the late 1970s, the most successful of which happened in California under the rubric of Proposition 13. But it took a notoriously neoliberal president, Bill Clinton, to “end welfare” as we knew it, and he couldn’t slow the growth of federal transfer payments any more than his predecessors could. Indeed, the “crisis” of Social Security which consumed so much newsprint, airtime, and blog space in the early twenty-first century was largely a matter of chart-mongering and hand-wringing about this seemingly irresistible urge to detach income received from work done.
The detachment of income from work, the essence of socialism, abides, then, just as unobtrusively, but just as steadfastly, as The Dude, who unwittingly foiled the venal designs of that outspoken neoconservative, the Big Lebowski. The specter of communism may have retired to the dustbin of history, but its first cousin still haunts the economic system of modern capitalism, even in the United States, even in the aftermath of the “Reagan Revolution.”
And the moral/cultural system? Did neoconservatism triumph where the supply-side revolution failed? According to its most ardent supporters, absolutely not. Here, too, they say, the liberals and the leftists and the lesbians won. They have a point. American culture was much more liberal, open, and electric in 1990 than in 1980, and then again in 2000, no matter how you frame the issues of gender, sexuality, and race, no matter how you characterize music, movies, and other performance arts. And that culture was increasingly popular, visible, and even politically significant outside the United States, as the sound tracks and dress codes of the Eastern European revolutions of the early 1990s will attest: from Poland to Czechoslovakia, rock and roll, blue jeans, and T-shirts identified the personnel of a post-Soviet order.
Whatever the issue—whether sexuality, gay rights, reproduction, education, women’s rights, racial equity, equal opportunity, affirmative action, or freedom of expression—the domestic debate was joined in the 1970s and 1980s, was intensified in the 1990s, and was always lost by the so-called cultural conservatives who kept citing “family values.” For the New Left of the 1960s grew up and got jobs in all the right places, especially, but not only, in higher education. Here is how Robert Bork, whom Reagan nominated to the Supreme Court in 1987, explained this process ten years after the Senate voted decisively against his confirmation: “It [the 1960s] was a malignant decade that, after a fifteen-year remission, returned in the 1980s to metastasize more devastatingly throughout our culture than it had in the Sixties, not with tumult, but quietly, in the moral and political assumptions of those who now control and guide our major cultural institutions. The Sixties radicals are still with us, but now they do not paralyze the universities; they run the universities.”
Again, the United States was a much more liberal place after Ronald Reagan. We will explore the evidence in more detail in chapter 4. In the next two chapters, meanwhile, we turn to the academic sources of this evident yet unknown result of the culture wars.
chapter two
“Tenured Radicals” in the Ivory Tower: The Great Transformation of Higher Education
The so-called culture wars of the late twentieth century were fought, for the most part, at the shifting borders of the American university, the new center of “knowledge production” in the postwar era. The goal of the combatants, from Left to Right, was to clarify and (re)define the proper uses of higher education in the United States. But in a broader sense, the culture wars were debates about the promise of American life; for, by the 1970s, the principal residence of that promise was widely assumed to be the new “meritocracy” enabled by universal access to higher education.
These conflicts often turned on the meaning and significance of obscure academic doctrines such as deconstruction, which apparently invalidated any possibility of a stable truth claim about written texts; they often degenerated into pointless arguments about the role of Western civilization—both the European past and the required course—in sponsoring the concepts of reason, justice, and representative government; they often got stuck at the rhetorical stage of “political correctness,” where Left and Right could each accuse the other of harboring inane ideas about the realities of research, teaching, and public service. These conflicts nonetheless illuminated different versions of the usable past and the impending future of America which reached well beyond the sacred groves of academe. For as Arthur Schlesinger Jr. put it in 1991, “the debate about the curriculum is a debate about what it means to be an American.”
We might even say that the “culture wars” were a symptom of the fourth great watershed in the history of higher education. By the same token, we might say that they were judgments on the historical moment—the 1960s—in which this educational watershed was both cause and effect.
The university as we know it, a secular city unified only by commitment to intellectual inquiry, is little more than a hundred years old. The university as such is about eight hundred years old; it was invented in the thirteenth century by the so-called scholastics, the learned priests who, with the help of ancient philosophy, tried to square Catholic doctrine on usury with modern uses of money at interest. Thereafter, higher education was mostly a matter of learning to reconcile God and Mammon, of keeping faith with purposes larger than your own, whether you learned the lesson in a seminary, a college, or a university. Certainly the curriculum of the private colleges founded in England and North America before 1800—Oxford, Cambridge, Harvard, and Yale are the blueprints—would suggest that both the religious archive and the effective conscience of modernity found their homes on campus.
The modern secular university emerged in nineteenth-century Germany as a nation-building device. Its inventors thought of it as a way of articulating a culture—not a religion—that would forge the citizens of a unitary, sovereign state from the raw materials of different classes, regions, and peoples. In such a setting, the humanities did not make up the “core curriculum” of the university; they constituted the university. The research program here was scholarship that demonstrated the cultural continuity of ancient Greece and modern Europe. Reputable scholars showed that each modern, civilized nation stood at a different remove, or at a different angle, from the Athenian origin, but that each had developed a unique identity in the form of a national literature by appropriating that origin. And so the idea of the canon was born.
By the end of the nineteenth century, American educators and legislators were building a new, hybrid system that borrowed from the German model—many of the young PhDs who invented and staffed the new American universities had been trained in Berlin—but that didn’t merely reproduce it. We can date the formal beginning of this system, the third great watershed in higher education, from 1884, when the Johns Hopkins University established the first graduate programs in philosophy and history. But its roots lie in the land grant college acts of 1862 and 1887, which used federal resources to lay the foundations of the state universities that came of age in the 1890s, from Michigan to California; and its effects are evident in the transformation of private institutions—among them, Stanford, Chicago, Cornell, Northwestern, and Columbia—which was meanwhile accomplished by imitation of the public universities. And so “the politics of liberal education,” as we now say, were present at the creation of the modern American university because the university itself was the product of political purposes.
The new model of higher education in the United States was, however, neither public nor private. And its immediate inclusion of science and business courses in the college curriculum meant that the humanities would have no monopoly on the future of the university or the larger culture—the canon was always in question, and it always included renegade texts from disreputable sources. As usual, the American university did not measure up, or down, to the more romantic and elitist standards of European education. Its curriculum and clientele never abstained from the commercial culture that surrounded and permeated it, and by the 1940s its managers, public and private, had begun to imagine the consequences of universal access.
And yet as late as 1960, only a third of high school graduates in the United States went on to college, and of these privileged students, only 38 percent were women, and only 7 percent came from ethnic minorities. In 1960, there were about 3.5 million students in all American colleges and universities. By 1970, however, their numbers had more than doubled to 7.5 million because, for the first time in the history of higher education, more than half of all high school graduates went on to college of some kind. The number of faculty members increased proportionately, by almost 70 percent, as the system sorted into the three tiers we now take for granted (universities, four-year colleges, community and county colleges). Most of this increase came in the five years after 1965, when the number of new positions created and filled by institutions of higher education exceeded the entire number of faculty positions filled in 1940. Many of these new hires were understandably moved by the “new social movements” of the 1960s, particularly civil rights, Black Power, environmentalism, and feminism, but, as we shall see, they didn’t always get tenure.
By 1980, enrollment in colleges and universities had grown by another 60 percent to twelve million students, a fourfold increase since 1960. Meanwhile, the social composition of the student body changed dramatically. Women were less than 40 percent of undergraduates in 1960; by 1980 they were a solid majority of a much larger cohort. Minorities were less than 7 percent of undergraduates in 1960 (about 210,000 students); by 1980, their numbers had grown sevenfold to 13 percent of all undergraduates (about 1.6 million students). The social composition of faculty and professional staff changed accordingly, but less dramatically, as the net number of new positions dwindled after 1970 and almost disappeared after 1980. For better or worse, the politics of liberal education had nevertheless changed irrevocably.
This almost incredible expansion of higher education—the fourth great watershed—was not an accident, and as with other such massive quantitative changes, it had qualitative effects both inside and outside academe. One way to explain it as the product of careful deliberation rather than a random development is to claim that the Soviet scientific achievement of 1957—the successful rocket launch of Sputnik, an artificial orbiting satellite—galvanized public opinion and government resources in the United States, pointing both toward greater investment in education as a vital dimension of national defense. There is much to be said for such an explanation, in part because it dissolves the distinction between domestic programs and foreign policy. Even so, it tends to reduce the great transformation of higher education in the 1960s and 1970s to a late episode in the Cold War. That explanatory tendency is a problem because this transformation began long before the Soviet Union and the United States squared off in the 1950s as contestants for the loyalties of the “Third World,” and it remains, in our own time, as evidence of a fundamental historical transition that has nothing to do with the Cold War.
The great transformation of higher education began, in this sense, as an intellectual upheaval that reached a verge in the 1940s and became a kind of common sense in the 1950s. The Keynesian Consensus discussed in chapter 1 was the policy-relevant economic theory that authorized this upheaval. The GI Bill, which gave veterans of World War II significant incentives to abstain from the labor market and go to college—you could reduce unemployment and get a free education, all at once—was one of the new public policies that expressed the same upheaval. So was the Master Plan for the University of California, which, as implemented in the 1950s and early 1960s, made education the basic industry of the most populous state. And so, too, were the federal policies of the 1960s, which sought to increase equality of opportunity, to reduce regional educational disparities—the South was still a backwater at best—and to create universal access to higher education through tuition grants and admission mandates.
Postindustrial Intellect
The intellectual upheaval that figures as the backstory of these policies and the great transformation they enforced—the most consequential “reform of institutions of higher education in the Western world in eight hundred years,” according to Clark Kerr, who wrote California’s Master Plan—was the discovery of a “postindustrial society.” Daniel Bell, whom we met in chapter 1, is the scholar most consistently associated with that discovery, but he has always insisted that he did no more than integrate the findings of many other scholars, from Raymond Aron to Amitai Etzioni, whose mid-century work was shaped by some sense of an ending. “The sense was present—and still is—that in Western society we are in the midst of a vast historical change,” Bell noted, “in which old social relations (which were property-bound), existing power structures (centered on narrow elites), and bourgeois culture (based on notions of restraint and delayed gratification) are being rapidly eroded.”
The eclipse of modern-industrial society and the concurrent emergence of a postindustrial society meant, according to Bell and his antecedents, that goods production and manufacturing would give way to a “service economy” in which occupational opportunity and standing would derive from open-ended access to knowledge (in effect, from universal access to education); that power derived from ownership of property would decline accordingly as social status would increasingly be determined by theoretical knowledge and technical skill; and finally that culture, having “achieved autonomy,” would replace technology as the fundamental source of change in Western society. So conceived—and nobody, from Left to Right, bothered to quibble with this new periodization of Western civilization—a postindustrial society required rather more nerds than had previously been thought usable or tolerable.
Intellectuals—“persons who are not performing essential services,” as Everett Ladd and S. M. Lipset, two influential academics, characterized their own kind—became crucial in such a society, where the production of strange new ideas is more important, perhaps even more profitable, than the production of tangible goods. And the university quickly became the place these nomads—these intellectuals, these ideas—could call home. So, from Left to Right, everyone agreed by the 1970s that professors, the intellectuals certified by their well-educated peers in the universities, held the key to the political future of postindustrial society. “Colleges and universities are the wellspring of the ideas around which we organize ourselves,” as Lynne V. Cheney put it in bitter retrospect, in 1995. Here is how Kerr himself saw the future twenty years earlier: “Once economic growth depended upon new land, more labor, more capital. Increasingly it now depends upon more skill and new knowledge. . . . [Thus] intellectuals have become a real force in society. In some ways they are taking the part once played by the peasants or the farmers, and later by the trade union movement in social change.”
Once upon a time in the 1930s, as this story goes, capital and labor had engaged in a great class struggle for control of the point of production. Now the great divide was between the intellectuals who congregated in academia, where an “adversary culture” flourished, and the rest of Americans, who frequented a conformist, consumer culture. At the historical moment when new knowledge, technical skill, and theoretical acumen became the rudiments of economic growth and national security, what the New Left of the 1960s called the “new class”—the white-collared “professional-managerial class,” or PMC—had apparently inherited the earth. At any rate it had taken over the universities, and by all accounts the better universities housed its most adversarial elements. The revenge of the nerds was complete by the time Ronald Reagan took office.
Well, almost. There was a remarkable shift in the politics of professors after 1969, but that shift reflected a larger public disavowal of the war in Vietnam, and even after the epochal election of 1972, the social sciences—the most outspoken and critical of academic fields—had not been seized by an intemperate radicalism that would become a wholesale “anti-Americanism.” Ladd and Lipset were emphatic on this point in their massive and authoritative study of professorial politics in the 1960s and 1970s: “Most academic social scientists are not radicals, and do not seek or support radical change in the polity or the academy. While far from being apologists for all or most policy initiatives of the ruling elites of the country, the vast majority of social scientists are also far from being advocates of a fundamentally new constitutional order. They are critics within the system.”