The Great Disruption

Francis Fukuyama





The shift to the information age has been accompanied by social disorder throughout the industrialized world. But new forms of stability may already be in the making

OVER the past half century the United States and other economically advanced countries have made the shift into what has been called an information society, the information age, or the post-industrial era. The futurist Alvin Toffler has labeled this transition the "Third Wave," suggesting that it will ultimately be as consequential as the two previous waves in human history: from hunter-gatherer to agricultural societies, and from agricultural to industrial ones.

A society built around information tends to produce more of the two things people value most in a modern democracy - freedom and equality. Freedom of choice has exploded, in everything from cable channels to low-cost shopping outlets to friends met on the Internet. Hierarchies of all sorts, political and corporate, have come under pressure and begun to crumble.

People associate the information age with the advent of the Internet in the 1990s, but the shift from the industrial era started more than a generation earlier, with the deindustrialization of the Rust Belt in the United States and comparable movements away from manufacturing in other industrialized countries. This period, roughly the mid-1960s to the early 1990s, was also marked by seriously deteriorating social conditions in most of the industrialized world. Crime and social disorder began to rise, making inner-city areas of the wealthiest societies on earth almost uninhabitable. The decline of kinship as a social institution, which has been going on for more than 200 years, accelerated sharply in the second half of the twentieth century. Marriages and births declined and divorce soared; one out of every three children in the United States and more than half of all children in Scandinavia were born out of wedlock. Finally, trust and confidence in institutions went into a forty-year decline. Although a majority of people in the United States and Europe expressed confidence in their governments and fellow citizens during the late 1950s, only a small minority did so by the early 1990s. The nature of people's involvement with one another changed as well - although there is no evidence that people associated with one another less, their ties tended to be less permanent, looser, and with smaller groups of people.

These changes were dramatic; they occurred over a wide range of similar countries; and they all appeared at roughly the same period in history. As such, they constituted a Great Disruption in the social values that had prevailed in the industrial-age society of the mid-twentieth century. It is very unusual for social indicators to move together so rapidly; even without knowing why they did so, we have cause to suspect that the reasons might be related. Although William J. Bennett and other conservatives are often attacked for harping on the theme of moral decline, they are essentially correct: the perceived breakdown of social order is not a matter of nostalgia, poor memory, or ignorance about the hypocrisies of earlier ages. The decline is readily measurable in statistics on crime, fatherless children, broken trust, reduced opportunities for and outcomes from education, and the like.

Was it simply an accident that these negative social trends, which together reflect a weakening of social bonds and common values in Western societies, occurred just as the economies of those societies were making the transition from the industrial to the information era? The hypothesis of this article is that the two were in fact intimately connected, and that although many blessings have flowed from a more complex, information-based economy, certain bad things also happened to our social and moral life. The connections were technological, economic, and cultural. The changing nature of work tended to substitute mental for physical labor, propelling millions of women into the workplace and undermining the traditional understandings on which the family had been based. Innovations in medical technology leading to the birth-control pill and increasing longevity diminished the role of reproduction and family in people's lives. And the culture of individualism, which in the laboratory and the marketplace leads to innovation and growth, spilled over into the realm of social norms, where it corroded virtually all forms of authority and weakened the bonds holding families, neighborhoods, and nations together. The complete story is, of course, much more complex than this, and differs from one country to another. But broadly speaking, the technological change that brought about what the economist Joseph Schumpeter called "creative destruction" in the marketplace caused similar disruption in the world of social relationships. Indeed, it would be surprising if this were not true.

But there is a bright side, too: social order, once disrupted, tends to get remade, and there are many indications that this is happening today. We can expect a new social order for a simple reason: human beings are by nature social creatures, whose most basic drives and instincts lead them to create moral rules that bind them together into communities. They are also by nature rational, and their rationality allows them to spontaneously create ways of cooperating with one another. Religion, though often helpful to this process, is not the sine qua non of social order, as many conservatives believe. Neither is a strong and expansive state, as many on the left argue. Man's natural condition is not the war of "every man against every man" envisioned by Thomas Hobbes but rather a civil society made orderly by the presence of a host of moral rules. These assertions, moreover, are empirically supported by a tremendous amount of research coming out of the life sciences in recent years, in fields as diverse as neurophysiology, behavioral genetics, evolutionary biology, ethology, and biologically informed approaches to psychology and anthropology. The study of how order arises - not as the result of a top-down mandate by hierarchical authority, whether political or religious, but as the result of self-organization on the part of decentralized individuals - is one of the most interesting and important intellectual developments of our time.

The idea that social order has to come from a centralized, rational, bureaucratic hierarchy was very much associated with the industrial age. The sociologist Max Weber, observing nineteenth-century industrial society, argued that rational bureaucracy was, in fact, the very essence of modern life. We know now, however, that in an information society neither governments nor corporations will rely exclusively on formal bureaucratic rules to organize people. Instead they will decentralize and devolve power, and rely on the people over whom they have nominal authority to be self-organizing. The precondition for such self-organization is internalized rules and norms of behavior, a fact that suggests that the world of the twenty-first century will depend heavily on such informal norms. Thus although the transition into an information society has disrupted social norms, a modern, high-tech society cannot get along without them and will face considerable incentives to produce them.

THE disruption of social order by the progress of technology is not a new phenomenon. Particularly since the beginning of the Industrial Revolution, human societies have been subject to a relentless process of modernization, as one new production process replaced another. The social disorder of the late eighteenth and early nineteenth centuries in America and Britain can be traced directly to the disruptive effects
of the so-called first Industrial Revolution, when steam power and mechanization created new industries in textiles, railroads, and the like. Agricultural societies were transformed into urban industrial societies within the space of perhaps a hundred years, and all the accumulated social norms, habits, and customs that had characterized rural or village life were replaced by the rhythms of the factory and the city.

This shift in norms engendered what is perhaps the most famous concept in modern sociology - the distinction drawn by Ferdinand Tönnies between what he called Gemeinschaft ("community") and Gesellschaft ("society"). According to Tönnies, the Gemeinschaft that characterized a typical pre-modern European peasant society consisted of a dense network of personal relationships based heavily on kinship and on the direct, face-to-face contact that occurs in a small, closed village. Norms were largely unwritten, and individuals were bound to one another in a web of mutual interdependence that touched all aspects of life, from family to work to the few leisure activities that such societies enjoyed. Gesellschaft, on the other hand, was the framework of laws and other formal regulations that characterized large, urban industrial societies. Social relationships were more formalized and impersonal; individuals did not depend on one another for support to nearly the same extent, and were therefore much less morally obligated to one another.

Many of the standard sociological texts written in the middle of the twentieth century treated the shift from Gemeinschaft to Gesellschaft as if it were a one-shot affair: societies were either "traditional" or "modern," and the modern ones somehow constituted the end of the road for social development. But social evolution did not culminate in middle-class American society of the 1950s; industrial societies soon began transforming themselves into what the sociologist Daniel Bell has characterized as post-industrial societies, or what we know as information societies. If this transformation is as momentous as the previous one, we should hardly be surprised that the impact on social values has proved equally great.

Whether information-age democracies can maintain social order in the face of technological and economic change is among their greatest challenges today. From the early 1970s to the early 1990s there was a sudden surge of new democracies in Latin America, Europe, Asia, and the former Communist world. As I argued in The End of History and the Last Man (1992), there is a strong logic behind the evolution of political institutions in the direction of modern liberal democracy, based on the correlation between economic development and stable democracy. Political and economic institutions have converged over time in the world's most economically advanced countries, and there are no obvious alternatives to the ones we see before us.

This progressive tendency is not necessarily evident in moral and social development, however. The tendency of contemporary liberal democracies to fall prey to excessive individualism is perhaps their greatest long-term vulnerability, and is particularly visible in the most individualistic of all democracies, the United States. The modern liberal state was premised on the notion that in the interests of political peace, government would not take sides among the differing moral claims made by religion and traditional culture. Church and State were to be kept separate; there would be pluralism in opinions about the most important moral and ethical questions, concerning ultimate ends or the nature of the good. Tolerance would become the cardinal virtue; in place of moral consensus would be a transparent framework of law and institutions that produced political order. Such a political system did not require that people be particularly virtuous; they need only be rational and follow the law in their own self-interest. Similarly, the market-based capitalist economic system that went hand in glove with political liberalism required only that people consult their long-term self-interest to achieve a socially optimal production and distribution of goods.

The societies created on these individualistic premises have worked extraordinarily well, and as the twentieth century comes to a close, there are few real alternatives to liberal democracy and market capitalism as fundamental organizing principles for modern societies. Individual self-interest is a lower but more stable ground than virtue on which to base society. The creation of a rule of law is among the proudest accomplishments of Western civilization - and its benefits become all too obvious when one deals with countries that lack one, such as Russia and China.

But although formal law and strong political and economic institutions are critical, they are not in themselves sufficient to guarantee a successful modern society. To work properly, liberal democracy has always been dependent on certain shared cultural values. This can be seen most clearly in the contrast between the United States and the countries of Latin America. When Argentina, Brazil, Chile, Mexico, and other Latin American countries gained their independence in the nineteenth century, many of them established formal democratic constitutions and legal systems patterned on the presidential system of the United States. Since then not one Latin American country has experienced the political stability, economic growth, or institutional efficacy enjoyed by the United States, though most, fortunately, had returned to democratic government by the end of the 1980s.

There are many complex historical reasons for this, but the most important is a cultural one: the United States was settled primarily by British people and inherited not just British law but British culture as well, whereas Latin America inherited various cultural traditions from the Iberian peninsula. Although the U.S. Constitution enforces a separation between Church and State, American culture was decisively shaped in its formative years by sectarian Protestantism. Sectarian Protestantism reinforced both American individualism and the tendency of the society to be self-organizing in a myriad of voluntary associations and communities. The vitality of American civil society was crucial both for the stability of the country's democratic institutions and for its vibrant economy. The imperial and Latin Catholic traditions of Spain and Portugal, in contrast, reinforced dependence on large, centralized institutions like the State and the Church, weakening an independent civil society. Similarly, the differing abilities of Northern and Southern Europe to make modern institutions work were influenced by religious heritage and cultural tradition. The problem with most modern liberal democracies is that they cannot take their cultural preconditions for granted. The most successful among them, including the United States, were lucky to have married strong formal institutions to a flexible and supportive informal culture. But nothing in the formal institutions themselves guarantees that the society in which they exist will continue to enjoy the right sort of cultural values and norms under the pressures of technological, economic, and social change. Just the opposite: the individualism, pluralism, and tolerance that are built into the formal institutions tend to encourage cultural diversity, and therefore have the potential to undermine moral values inherited from the past. 
And a dynamic, technologically innovative economy will by its very nature disrupt existing social relations.

It may be, then, that although large political and economic institutions have long been evolving along a secular path, social life is more cyclical. Social norms that work for one historical period are disrupted by the advances of technology and the economy, and society has to play catch-up in order to establish new norms.

SINCE the 1960s the West has experienced a series of liberation movements that have sought to free individuals from the constraints of traditional social norms and moral rules. The sexual revolution, the feminist movement, and the 1980s and 1990s movements in favor of gay and lesbian rights have exploded through the Western world. The liberation sought by each of these movements has concerned social rules, norms, and laws that unduly restricted the options and opportunities of individuals - whether they were young people choosing sexual partners, women seeking career opportunities, or gays seeking recognition of their rights. Pop psychology, from the human-potential movement of the 1960s to the self-esteem trend of the 1980s, sought to free individuals from stifling social expectations.

Both the left and the right participated in the effort to free the individual from restrictive rules, but their points of emphasis tended to be different. To put it simply, the left worried about lifestyles and the right worried about money. The left did not want traditional values to unduly constrain women, minorities, gays, the homeless, people accused of crimes, or any number of other groups marginalized by society. The right, on the other hand, did not want communities putting constraints on what people could do with their property - or, in the United States, what they could do with their guns. Left and right each denounced excessive individualism on the part of the other: those who supported reproductive choice tended to oppose choice in buying guns or gas-guzzling cars; those who wanted unlimited consumer choice were appalled when the restraints on criminals were loosened. But neither was willing to give up its preferred sphere of free choice for the sake of constraining the other.

As people soon discovered, there are serious problems with a culture of unbridled individualism, in which the breaking of rules becomes, in a sense, the only remaining rule. The first has to do with the fact that moral values and social rules are not simply arbitrary constraints on individual choice but the precondition for any kind of cooperative enterprise. Indeed, social scientists have recently begun to refer to a society's stock of shared values as "social capital." Like physical capital (land, buildings, machines) and human capital (the skills and knowledge we carry around in our heads), social capital produces wealth and is therefore of economic value to a national economy. But it is also the prerequisite for all forms of group endeavor that take place in a modern society, from running a corner grocery store to lobbying Congress to raising children. Individuals amplify their own power and abilities by following cooperative rules that constrain their freedom of choice, because these also allow them to communicate with others and to coordinate their actions. Social virtues such as honesty, reciprocity, and the keeping of commitments are not worthwhile just as ethical values; they also have a tangible dollar value and help the groups that practice them to achieve shared ends.

The second problem with a culture of individualism is that it ends up being bereft of community. A community is not formed every time a group of people happen to interact with one another; true communities are bound together by the values, norms, and experiences their members share. The deeper and more strongly held those common values, the stronger the sense of community. The trade-off between personal freedom and community, however, does not seem obvious or necessary to many. As people have been liberated from their traditional ties to spouses, families, neighborhoods, workplaces, and churches, they have expected to retain social connectedness. But they have begun to realize that their elective affinities, which they can slide into and out of at will, have left them feeling lonely and disoriented, longing for deeper and more permanent relationships. A society dedicated to the constant upending of norms and rules in the name of expanding individual freedom of choice will find itself increasingly disorganized, atomized, isolated, and incapable of carrying out common goals and tasks. The same society that wants no limits on its technological innovation also sees no limits on many forms of personal behavior, and the consequence is a rise in crime, broken families, parents' failure to fulfill obligations to children, neighbors' refusal to take responsibility for one another, and citizens' opting out of public life.


BEGINNING in about 1965 a large number of indicators that can serve as negative measures of social capital all started moving upward rapidly at the same time. These could be put under three broad headings: crime, family, and trust.

Americans are aware that crime rates began sometime in the 1960s to climb very rapidly - a dramatic change from the early post-Second World War period, when U.S. murder and robbery rates actually declined. After declining slightly in the mid-1980s, crime rates spurted upward again in the late 1980s and peaked around 1991-1992. The rates for both violent and property crimes have dropped substantially since then. Indeed, they have fallen most dramatically in the areas where they had risen most rapidly - that is, in big cities like New York, Chicago, Detroit, and Los Angeles.

Although the United States is exceptional among developed countries for its high crime rates, crime rose significantly in virtually all other non-Asian developed countries in approximately the same time period. Violent crime rose rapidly in Canada, Finland, Ireland, the Netherlands, New Zealand, Sweden, and the United Kingdom. With regard to crimes against property, a broader measure of disorder, the United States is no longer exceptional: Canada, Denmark, the Netherlands, New Zealand, and Sweden have ended up with theft rates higher than those in the United States over the past generation.

Of the shifts in social norms that constitute the Great Disruption, some of the most dramatic concern those related to reproduction, the family, and relations between the sexes. Divorce rates moved up sharply across the developed world (except in Italy, where divorce was illegal until 1970, and other Catholic countries); by the 1980s half of all American marriages could be expected to end in divorce, and the ratio of divorced to married people had increased fourfold in just thirty years. Births to unmarried women as a proportion of U.S. live births climbed from under 5 percent to 32 percent from 1940 to 1995. The figure was close to 60 percent in many Scandinavian countries; the United Kingdom, Canada, and France reached levels comparable to that in the United States. The combined probabilities of single-parent births, divorce, and the dissolution of cohabiting relationships between parents (a situation common in Europe) meant that in most developed countries ever smaller minorities of children would reach the age of eighteen with both parents remaining in the household. The core reproductive function of the family was threatened as well: fertility has dropped so dramatically in Italy, Spain, and Germany that they stand to lose up to 30 percent of their populations each generation, absent new net immigration.

Finally, anyone who has lived through the 1950s to the 1990s in the United States or another Western country can scarcely fail to recognize the widespread changes in values that have taken place over this period in the direction of increasing individualism. Survey data, along with common-sense observation, indicate that people are much less likely to defer to the authority of an ever-broader range of social institutions. Trust in institutions has consequently decreased markedly. In 1958, 73 percent of Americans surveyed said they trusted the federal government to do what is right either "most of the time" or "just about always"; by 1994 the figure had fallen as low as 15 percent. Europeans, although less anti-statist than Americans, have nonetheless seen similar declines in confidence in such traditional institutions as the Church, the police, and government. Americans trust one another less as well: although 10 percent more Americans evinced more trust than distrust in surveys done in the early 1960s, by the 1990s the distrusters had an almost 30 percent margin over those expressing trust. It is not clear that either the number of groups or group memberships in civil society declined overall in this period, as the political scientist Robert Putnam has suggested. What is clear, however, is that what I call the radius of trust has declined, and social ties have become less binding and long-lasting.


CHANGES as great as these will defy attempts to provide simple explanations. However, the fact that many different social indicators moved across a wide group of industrialized countries at roughly the same time simplifies the analytical task somewhat by pointing us toward a more general level of explanation. When the same phenomena occur in a broad range of countries, we can rule out explanations specific to a single country, such as the effects of the Vietnam War or of Watergate. Several arguments have been put forward to explain why the phenomena we associate with the Great Disruption occurred. Here are three: They were brought on by increasing poverty and income inequality. They were products of the modern welfare state. They were the result of a broad cultural shift that included the decline of religion and the promotion of individualistic self-gratification over community obligation.

The Great Disruption was caused by poverty and inequality. The idea that such large changes in social norms could be brought on by economic deprivation in countries that are wealthier than any others in human history might give one pause. Poor people in the United States have higher absolute standards of living than many Americans of past generations, and more per capita wealth than many people in contemporary Third World countries with more-intact family structures. Poverty rates, after coming down dramatically through the 1960s and rebounding slightly thereafter, have not increased in a way that would explain a huge increase in social disorder.

Those favoring the economic hypothesis argue, of course, that it is not absolute levels of poverty that are the source of the problem: modern societies, despite being richer overall, have become more unequal, or else have experienced economic turbulence and job loss that have led to social dysfunction. A casual glance at the comparative data on divorce and illegitimacy rates shows that this correlation cannot possibly be true in the case of family breakdown. If one looks across the Organization for Economic Cooperation and Development, there is no positive correlation between the level of welfare benefits aimed at increasing economic equality and stable families. Indeed, there is a weak positive correlation between high levels of welfare benefits and illegitimacy, tending to support the argument advanced by American conservatives that the welfare state is the cause of and not the cure for family breakdown. The highest rates of illegitimacy are found in Sweden and Denmark, egalitarian countries that cycle upwards of 50 percent of their gross domestic product through the state. The United States cycles less than 30 percent of GDP through the government and has higher levels of inequality, yet it has lower rates of illegitimacy. Japan and Korea, which have minimal welfare-state protections for poor people, also have two of the lowest rates of divorce and illegitimacy in the OECD.

The notion that poverty and inequality beget crime is a commonplace among politicians and voters in democratic societies who seek reasons for justifying welfare and poverty programs. But although there is plenty of evidence of a broad correlation between income inequality and crime, this hardly constitutes a plausible explanation for rapidly rising crime rates in the West. There was no depression in the period from the 1960s to the 1990s to explain the sudden rise in crime; in fact, the great American postwar crime wave began in a period of full employment and general prosperity. (Indeed, the Great Depression of the 1930s saw decreasing levels of violent crime in the United States.) Income inequality rose in the United States during the Great Disruption, but crime has also risen in Western developed countries that have remained more egalitarian than the United States. America's greater economic inequality may to some degree explain why its crime rates are higher than, say, Sweden's in any given year, but it does not explain why Swedish rates began to rise in the same period that America's did. Income inequality, moreover, has continued to increase in the United States in the 1990s while crime rates have fallen; hence the correlation between inequality and crime becomes negative for this period.

The Great Disruption was caused by mistaken government policies. The second general explanation for the increase in social disorder is one made by conservatives; it has been primarily associated with Charles Murray's book Losing Ground, and before that with the economist Gary Becker. The argument is the mirror image of the left's: it maintains that the perverse incentives created by the welfare state itself explain the rise in family breakdown and crime. The primary American welfare program targeted at poor women, the Depression-era Aid to Families with Dependent Children, provided welfare payments only to single mothers, and thereby penalized women who married the fathers of their children. The United States abolished AFDC in the welfare-reform act of 1996, in part because of arguments concerning its perverse incentives. There is little doubt that welfare benefits discourage work and create what economists call "moral hazard." What is less clear is their impact on family structure. Welfare benefits in real terms stabilized and then began to decline in the 1980s, whereas the rate of family breakdown continued unabated through the mid-1990s. One analyst suggests that no more than perhaps 15 percent of family breakdown in the United States can be attributed to AFDC and other welfare programs.

The more fundamental weakness of the conservative argument is that illegitimacy is only part of a much larger story of weakening family ties - a story that includes declining fertility, divorce, cohabitation in place of marriage, the separation of cohabiting couples, and the like. Illegitimacy is primarily though not exclusively associated with poverty in the United States and most other countries. Divorce and cohabitation, however, are much more prevalent among the middle and upper classes throughout the West. It is very difficult to lay soaring divorce rates and declining marriage rates at the government's doorstep, except to the extent that the state made divorce legally easier to obtain.

Similarly, rising crime is seen by many conservatives as a result of the weakening of criminal sanctions, which occurred in the same period. Gary Becker has argued that crime can be seen as another form of rational choice: when payoffs from crime go up or costs (in terms of punishment) go down, more crimes will be committed, and vice versa. Many conservatives have argued that crime began to rise in the 1960s because society had grown permissive and the legal system was "coddling criminals." By this reasoning, the tougher enforcement undertaken by communities across the United States in the 1980s - stiffer penalties, more jails, and in some cases more police officers on the streets - was one important reason for falling crime rates in the 1990s.

Although improved policing methods and stronger penalties may well have had a lot to do with declining crime rates in the 1990s, it is hard to argue that the great upsurge in crime in the 1960s was simply the product of police permissiveness. It is true that the United States constrained its police forces and prosecutors in the interests of the rights of criminal defendants through a series of Supreme Court decisions in the 1960s, most notably Miranda v. Arizona. But police departments quickly learned how to accommodate what were perfectly legitimate concerns over police procedure. A great deal of recent criminological theory ascribes crime to poor socialization and impulse control relatively early in life. It is not that potential criminals do not respond rationally to punishment; rather, the propensity to commit crimes or to respond to given levels of punishment is heavily influenced by upbringing. What may be more relevant to understanding a sudden upsurge in crime is changes in mediating social institutions such as families, neighborhoods, and schools that were taking place in the same period, and changes in the signals that the broader culture was sending to young people.

This brings us to cultural explanations, the most plausible of the three presented here: the Great Disruption was caused by a broad cultural shift. Increasing individualism and the loosening of communal controls clearly had a huge impact on family life, sexual behavior, and the willingness of people to obey the law. The problem with this line of explanation is not that culture was not a factor but rather that it gives no adequate account of timing: why did culture, which usually evolves extremely slowly, suddenly mutate with extraordinary rapidity after the mid-1960s?

In Britain and the United States the high point of communal social control was the last third of the nineteenth century, when the Victorian ideal of the patriarchal conjugal family was broadly accepted and adolescent sexuality was kept under tight control. The cultural shift that undermined Victorian morality may be thought of as layered: At the top was a realm of abstract ideas promulgated by philosophers, scientists, artists, academics, and the occasional huckster and fraud, who laid the intellectual groundwork for broad-based changes. The second level was one of popular culture, as simpler versions of complex abstract ideas were promulgated through books, newspapers, and other mass media. Finally, there was the layer of actual behavior, as the new norms implicit in the abstract or popularized ideas were embedded in the actions of large populations.

The decline in Victorian morality can be traced to a number of intellectual developments at the end of the nineteenth century and the beginning of the twentieth, and to a second wave that began in the 1940s. At the highest level of thought, Western rationalism began to undermine itself by concluding that no rational grounds supported universal norms of behavior. This was nowhere more evident than in the thought of Friedrich Nietzsche, the father of modern relativism. Nietzsche in effect argued that man, the "beast with red cheeks," was a value-creating animal, and that the manifold "languages of good and evil" spoken by different human cultures were products of the will, rooted nowhere in truth or reason. The Enlightenment had not led to self-evident truths about right or morality; rather, it had exposed the infinite variability of moral arrangements. Attempts to ground values in nature, or in God, were doomed to be exposed as willful acts on the part of the creators of those values. Nietzsche's aphorism "There are no facts, only interpretations" became the watchword for later generations of relativists under the banners of deconstructionism and postmodernism.

In the social sciences the undermining of Victorian values was first the work of psychologists. John Dewey, William James, and John Watson, the founder of the behavioralist school of psychology, for differing reasons all contested the Victorian and Christian notion that human nature was innately sinful, and argued that tight social controls over behavior were not necessary for social order. The behavioralists argued that the human mind was a Lockean tabula rasa waiting to be filled with cultural content; the implication was that human beings were far more malleable through social pressure and policy than people had heretofore believed. Sigmund Freud was, of course, enormously influential in promulgating the idea that neurosis originated in the excessive social repression of sexual behavior. Indeed, the spread of psychoanalysis accustomed an entire generation to talking about sex and seeing everyday psychological problems in terms of the libido and its repression.

The cultural historian James Lincoln Collier points to the years on either side of 1912 as critical to the breakdown of Victorian sexual norms in the United States. It was in this period that a series of new dances spread across the nation, along with the opinion that decent women could be seen in dance clubs; the rate of alcohol consumption increased; the feminist movement began in earnest; movies and the technology of modern mass entertainment appeared; literary modernism, whose core was the perpetual delegitimization of established cultural values, moved into high gear; and sexual mores (judging by what little empirical knowledge we have of this period) began to change. Collier argues that the intellectual and cultural grounds for the sexual revolution of the 1960s had already been laid among American elites by the 1920s. Their spread through the rest of the population was delayed, however, by the Depression and the Second World War, which led people to concentrate more on economic survival and domesticity than on self-expression and self-gratification - which most, in any event, could not afford.

The crucial question about the changes in social norms that occurred during the Great Disruption is therefore not whether they had cultural roots, which they obviously did, but how we can explain the timing and speed of the subsequent transformation. We know that culture tends to change very slowly in comparison with other factors, such as economic conditions, public policies, and ideology. In those cases where cultural norms have changed quickly, such as in rapidly modernizing Third World societies, cultural change is clearly being driven by socioeconomic change and is therefore not an autonomous factor.

So with the Great Disruption: the shift away from Victorian values had been occurring gradually for two or three generations by the time the disruption began; then all of a sudden the pace of change sped up enormously. It is hard to believe that people throughout the developed world simply decided to alter their attitudes toward such elemental issues as marriage, divorce, child-rearing, authority, and community so completely in the space of two or three decades without that shift in values being driven by other powerful forces. Those explanations that link changes in cultural variables to specific events in American history, such as Vietnam, Watergate, or the counterculture of the 1960s, betray an even greater provincialism: why were social norms also disrupted in other societies, from Sweden and Norway to New Zealand and Spain?

If these broad explanations for the Great Disruption are unsatisfactory, we need to look at its different elements more specifically.


ASSUMING that increases in the crime rate are not simply a statistical artifact of improved police reporting, we need to ask several questions. Why did crime rates increase so dramatically over a relatively short period and in such a wide range of countries? Why are rates beginning to level off or decline in the United States and several other Western countries?

The first and perhaps most straightforward explanation for rising crime rates from the late 1960s to the 1980s, and for declining rates thereafter, is a simple demographic one. Crime tends to be committed overwhelmingly by young males aged fifteen to twenty-four. There is doubtless a genetic reason for this, having to do with male propensities for violence and aggression, and it means that when birth rates go up, crime rates will rise fifteen to twenty-four years later. In the United States the number of young people aged fifteen to twenty-four increased by two million from 1950 to 1960, whereas the next decade added twelve million to this age group - an onslaught that has been compared to a barbarian invasion. Not only did greater numbers of young people increase the pool of potential criminals, but their concentration in a "youth culture" may have led to a more-than-proportional increase in efforts to defy authority.

The Baby Boom, however, is only part of the explanation for rising crime rates in the 1960s and 1970s. One criminologist has estimated that the increase in the U.S. murder rate was ten times as great as would be expected from shifts in the demographic structure alone. Other studies have shown that changes in age structure do not correlate well with increases in crime cross-nationally.

A second explanation links crime rates to modernization and related factors such as urbanization, population density, opportunities for crime, and so forth. It is a commonsense proposition that there will be more auto theft and burglary in large cities than in rural areas, because it is easier for criminals to find automobiles and empty homes in the former than in the latter. But urbanization and a changing physical environment are poor explanations for rising crime rates in developed countries after the 1960s. By 1960 the countries under consideration were already industrialized, urbanized societies; no sudden shift from countryside to city began in 1965. In the United States murder rates are much higher in the South than in the North, despite the fact that the latter tends to be more urban and densely populated. Indeed, violence in the South tends to be a rural phenomenon, and most observers who have looked closely into the matter believe that the explanation for high crime rates there is cultural. Japan, Korea, Hong Kong, and Singapore are among the most densely populated, overcrowded urban environments in the world, and yet they did not experience rising crime rates as that urbanization was occurring. This suggests that the human social environment is much more important than the physical one in determining levels of crime.

A third category of explanation is sometimes euphemistically labeled "social heterogeneity." That is, in many societies crime tends to be concentrated among racial or ethnic minorities; to the extent that societies become more ethnically diverse, as virtually all Western developed countries have over the past two generations, crime rates can be expected to rise. The reason that crime rates are frequently higher among minorities is very likely related, as the criminologists Richard Cloward and Lloyd Ohlin have argued, to the fact that minorities are kept from legitimate avenues of social mobility in ways that members of the majority community are not. In other cases the simple fact of heterogeneity may be to blame: neighborhoods that are too diverse culturally, linguistically, religiously, or ethnically never come together as communities to enforce informal norms on their members. But only part of the blame for rising crime rates in the United States can be placed on immigration.

A fourth explanation concerns the more or less contemporaneous changes in the family. The currently dominant school of American criminology holds that early-childhood socialization is one of the most important factors determining the level of subsequent criminality. That is, most people do not make day-to-day choices about whether or not to commit crimes based on the balance of rewards and risks, as the rational-choice school sometimes suggests. The vast majority of people obey the law, particularly with regard to serious offenses, out of habit that was learned relatively early in life. Most crimes are committed by repeat offenders who have failed to learn this basic self-control. In many cases they are acting not rationally but on impulse. Failing to anticipate consequences, they are undeterred by the expectation of punishment.


IN the realm of trust, values, and civil society, we need to explain two things: why there has been a broad-based decline in trust both in institutions and in other people, and how we can reconcile the shift toward fewer shared norms with an apparent growth in groups and in the density of civil society.

The reasons for the decline of trust in an American context have been debated extensively. Robert Putnam argued early on that it might be associated with the rise of television, since the first cohort that grew up watching television was the one that experienced the most precipitous decline in trust levels. Not only does the content of television breed cynicism in its attention to sex and violence, but the fact that Americans spend an average of more than four hours a day watching TV limits their opportunities for face-to-face social activities.

One suspects, however, that a broad phenomenon like the decline of trust has a number of causes, of which television is only one. Tom Smith, of the National Opinion Research Center, performed a statistical analysis of the survey data on trust and found that the lack of it correlates with low socioeconomic status, minority status, traumatic life events, religious polarization, and youth. Poor and uneducated people tend to be more distrustful than the well-to-do or those who have gone to college. Blacks are significantly more distrustful than whites, and there is some correlation between distrust and immigrant status. The traumatic life events affecting trust include, not surprisingly, being a victim of crime and being in poor health. Distrust is associated both with those who do not attend church and with fundamentalists. And younger people are less trusting than older ones.

Which of these factors has changed since the 1960s in a way that could explain the decrease in trust? Income inequality has increased somewhat, and Eric Uslaner, of the University of Maryland, has suggested that this may account for some of the increase in distrust. But poverty rates have fluctuated without increasing overall in this period, and for the vast majority of Americans the so-called "middle-class squeeze" did not represent a drop in real income so much as a stagnation of earnings.

Crime increased dramatically from the mid-sixties to the mid-nineties, and it makes a great deal of sense that someone who has been victimized by crime, or who watches the daily cavalcade of grisly crime stories on the local TV news, would feel distrust not for immediate friends and family but for the larger world. Hence crime would seem to be an important explanation for the increase in distrust after 1965, a conclusion well supported in more-detailed analyses.

The other major social change that has led to traumatic life experiences has been the rise of divorce and family breakdown. Commonsensically, one would think that children who have experienced the divorce of their parents, or have had to deal with a series of boyfriends in a single-parent household, would tend to become cynical about adults in general, and that this might go far toward explaining the increased levels of distrust that show up in survey data.

Despite the apparent decline in trust, there is evidence that groups and group membership are increasing. The most obvious way to reconcile lower levels of trust with greater levels of group membership is to note a reduction in the radius of trust. It is hard to interpret the data either on values or on civil society in any other way than to suggest that the radius of trust is diminishing, not just in the United States but across the developed world. That is, people continue to share norms and values in ways that constitute social capital, and they join groups and organizations in ever larger numbers. But the groups have shifted dramatically in kind. The authority of most large organizations has declined, and the importance in people's lives of a host of smaller associations has grown. Rather than taking pride in being a member of a powerful labor federation or working for a large corporation, or in having served in the military, people identify socially with a local aerobics class, a New Age sect, a co-dependent support group, or an Internet chat room. Rather than seeking authoritative values in a church that once shaped the society's culture, people are picking and choosing their values on an individual basis, in ways that link them with smaller communities of like-minded folk.

The shift to smaller-radius groups is mirrored politically in the almost universal rise of interest groups at the expense of broad-based political parties. Parties like the German Christian Democrats and the British Labour Party take a coherent ideological stand on the whole range of issues facing a society, from national defense to social welfare. Though usually based in a particular social class, these parties unite a broad coalition of interests and personalities. Interest groups, on the other hand, focus on single issues such as saving rain forests or promoting poultry farming in the upper Midwest; they may be transnational in scope, but they are much less authoritative, both in the range of issues they deal with and in the numbers of people they bring together.

Contemporary Americans, and contemporary Europeans as well, seek contradictory goals. They are increasingly distrustful of any authority, political or moral, that would constrain their freedom of choice, but they also want a sense of community and the good things that flow from community, such as mutual recognition, participation, belonging, and identity. Community therefore has to be found in smaller, more flexible groups and organizations whose loyalties and membership can overlap, and where entry and exit entail relatively low costs. People may thus be able to reconcile their contradictory desires for autonomy and community. But in this bargain the community they get is smaller and weaker than most of those that have existed in the past. Each community shares less with neighboring ones, and has relatively little hold on its members. The circle that people can trust is necessarily narrower. The essence of the shift in values at the center of the Great Disruption, then, is the rise of moral individualism and the consequent miniaturization of community.

These explanations go partway toward explaining why cultural values changed after the 1960s. But at the Great Disruption's core was a shift in values concerning sex and the family - a shift that deserves special emphasis.


ALTHOUGH the role of mother can safely be said to be grounded in biology, the role of father is to a great degree socially constructed. In the words of the anthropologist Margaret Mead, "Somewhere at the dawn of human history, some social invention was made under which males started nurturing females and their young." The male role was founded on the provision of resources; "among human beings everywhere [the male] helps provide food for women and children." Being a learned behavior, the male role in nurturing the family is subject to disruption. Mead wrote,

But the evidence suggests that we should phrase the matter differently for men and women - that men have to learn to want to provide for others, and this behaviour, being learned, is fragile and can disappear rather easily under social conditions that no longer teach it effectively.

The role of fathers, in other words, varies by culture and tradition, from intense involvement in the nurturing and education of children to a more distant presence as protector and disciplinarian to the near absence of a father who is merely a paycheck provider. It takes a great deal of effort to separate a mother from her newborn infant; in contrast, it often takes a fair amount of effort to involve a father with his.

When we put kinship and family in this context, it is easier to understand why nuclear families have started to break apart at such a rapid rate over the past two generations. The family bond was relatively fragile, based on an exchange of the woman's fertility for the man's resources. Prior to the Great Disruption, all Western societies had in place a complex series of formal and informal laws, rules, norms, and obligations to protect mothers and children by limiting the freedom of fathers to simply ditch one family and start another. Today many people have come to think of marriage as a kind of public celebration of a sexual and emotional union between two adults, which is why gay marriage has become a possibility in the United States and other developed countries. But it is clear that historically the institution of marriage existed to give legal protection to the mother-child unit, and to ensure that adequate economic resources were passed from the father to allow the children to grow up to be viable adults.

What accounts for the breakdown of these norms constraining male behavior, and of the bargain that rested on them? Two very important changes occurred sometime during the early postwar period. The first involved advances in medical technology - that is, birth control and abortion - that permitted women to better control their own reproduction. The second was the movement of women into the paid labor force in most industrialized countries and the steady rise in their incomes - hourly, median, and lifetime - relative to men's over the next thirty years.

The significance of birth control was not simply that it lowered fertility. Indeed, if the effect of birth control is to reduce the number of unwanted pregnancies, it is hard to explain why its advent should have been accompanied by an explosion of illegitimacy and a rise in the abortion rate, or why the use of birth control is positively correlated with illegitimacy across the OECD.

The main impact of the Pill and the sexual revolution that followed it was, as the economists Janet Yellen, George Akerlof, and Michael Katz have shown, to dramatically alter calculations about the risks of sex, and thereby to change male behavior. The reason that the rates of birth-control use, abortion, and illegitimacy went up in tandem is that a fourth rate - the number of shotgun marriages - declined substantially at the same time. By these economists' calculations, in the period 1965-1969 some 59 percent of white brides and 25 percent of black brides were pregnant at the altar. Young people were, evidently, having quite a lot of premarital sex in those years, but the social consequences of out-of-wedlock childbearing were mitigated by the norm of male responsibility for the children produced. By the period 1980-1984 the percentages had dropped to 42 and 11, respectively. Because birth control and abortion permitted women for the first time to have sex without worrying about the consequences, men felt liberated from norms requiring them to look after the women they got pregnant.

The second factor altering male behavior was the entry of women into the paid labor force. That female incomes should be related to family breakdown is an argument accepted by many economists, and elaborated most fully by Gary Becker in his work A Treatise on the Family (1981). The assumption behind this view is that many marriage contracts are entered into with imperfect information: once married, men and women discover that life is not a perpetual honeymoon, that their spouse's behavior has changed from what it was before marriage, or that their own expectations for partners have changed. Trading in a spouse for someone new, or getting rid of an abusive mate, had been restricted by the fact that many women lacking job skills or experience were dependent on husbands. As female earnings rose, women became better able to support themselves and to raise children without husbands. Rising female incomes also increase the opportunity costs of having children, and therefore lower fertility. Fewer children mean less of what Becker characterizes as the joint capital in the marriage, and hence make divorce more likely.

A subtler consequence of women's entering the labor force was that the norm of male responsibility was further weakened. In divorcing a dependent wife, a husband would have to face the prospect of either paying alimony or seeing his children slip into poverty. With many wives earning incomes that rivaled those of their husbands, this became less of an issue. The weakening norm of male responsibility, in turn, reinforced the need for women to arm themselves with job skills so as not to be dependent on increasingly unreliable husbands. With a substantial probability that a first marriage will end in divorce, contemporary women would be foolish not to prepare themselves for work.

The decline of nuclear families in the West had strongly negative effects on social capital and was related to an increase in poverty for people at the bottom of the social hierarchy, to increasing levels of crime, and finally to declining trust. But pointing to the negative consequences for social capital of changes in the family is in no way to blame women for these problems. The entry of women into the workplace, the steady narrowing of the earnings gap with men, and the greater ability of women to control fertility are by and large good things. The most important shift in norms was in the one that dictated male responsibility for wives and children. Even if the shift was triggered by birth control and rising female incomes, men were to blame for the consequences. And it is not as if men always behaved well prior to that: the stability of traditional families was often bought at a high price in terms of emotional and physical distress, and also in lost opportunities - costs that fell disproportionately on the shoulders of women.

On the other hand, these sweeping changes in gender roles have not been the unambiguously good thing that some feminists pretend. Losses have accompanied gains, and those losses have fallen disproportionately on the shoulders of children. This should not surprise anyone: given the fact that female roles have traditionally centered on reproduction and children, we could hardly expect that the movement of women out of the household and into the workplace would have no consequences for families.

Moreover, women themselves have often been the losers in this bargain. Most labor-market gains for women in the 1970s and 1980s were not in glamorous Murphy Brown kinds of jobs but in low-end service-sector jobs. In return for meager financial independence, many women found themselves abandoned by husbands who moved on to younger wives or girlfriends. Because older women are considered less sexually attractive than older men, they had much lower chances of remarrying than did the husbands who left them. The widening of the gap among men between rich and poor had its counterpart among women: educated, ambitious, and talented women broke down barriers, proved they could succeed at male occupations, and saw their incomes rise; but many of their less-educated, less-ambitious, and less-talented sisters saw the floor collapse under them, as they tried to raise children by themselves while in low-paying, dead-end jobs or on welfare. Our consciousness of this process has been distorted by the fact that the women who talk and write and shape the public debate about gender issues come almost exclusively from the former category. In contrast, men have on balance come out about even. Although many have lost substantial status and income, others (and sometimes the same ones) have quite happily been freed of burdensome responsibilities for wives and children. Hugh Hefner did not invent the Playboy lifestyle in the 1950s; casual access to multiple women has been enjoyed by powerful, wealthy, high-status men throughout history, and has been one of the chief motives for seeking power, wealth, and high status in the first place. What changed after the 1950s was that many rather ordinary men were allowed to live out the fantasy lives of hedonism and serial polygamy formerly reserved to a tiny group at the very top of society.
One of the greatest frauds perpetrated during the Great Disruption was the notion that the sexual revolution was gender-neutral, benefiting women and men equally, and that it somehow had a kinship with the feminist revolution. In fact the sexual revolution served the interests of men, and in the end put sharp limits on the gains that women might otherwise have expected from their liberation from traditional roles.


HOW can we rebuild social capital in the future? The fact that culture and public policy give societies some control over the pace and degree of disruption is not in the long run an answer to how social order will be established at the beginning of the twenty-first century. Japan and some Catholic countries have been able to hold on to traditional family values longer than Scandinavia or the English-speaking world, and this may have saved them some of the social costs experienced by the latter. But it is hard to imagine that they will be able to hold out over the coming generations, much less re-establish anything like the nuclear family of the industrial era, with the father working and the mother staying at home to raise children. Such an outcome would not be desirable, even if it were possible.

We appear to be caught, then, in unpleasant circumstances: going forward seems to promise ever-increasing levels of disorder and social atomization, at the same time that our line of retreat has been cut off. Does this mean that contemporary liberal societies are fated to descend into increasing moral decline and social anarchy, until they somehow implode? Were Edmund Burke and other critics of the Enlightenment right that anarchy was the inevitable product of the effort to replace tradition and religion with reason?

The answer, in my view, is no, for the very simple reason that we human beings are by nature designed to create moral rules and social order for ourselves. The situation of normlessness - what the sociologist Emile Durkheim labeled "anomie" - is intensely uncomfortable for us, and we will seek to create new rules to replace the ones that have been undercut. If technology makes certain old forms of community difficult to sustain, then we will seek out new ones, and we will use our reason to negotiate arrangements to suit our underlying interests, needs, and passions.

To understand why the present situation isn't as hopeless as it may seem, we need to consider the origins of social order per se, on a more abstract level. Many discussions of culture treat social order as if it were a static set of rules handed down from earlier generations. If one were stuck in a low-social-capital or low-trust country, one could do nothing about it. It is true, of course, that public policy is relatively limited in its ability to manipulate culture, and that the best public policies are those shaped by an awareness of cultural constraints. But culture is a dynamic force, one that is constantly being remade - if not by governments then by the interactions of the thousands of decentralized individuals who make up a society. Although culture tends to evolve more slowly than formal social and political institutions, it nonetheless adapts to changing circumstances.

What we find is that order and social capital have two broad bases of support. The first is biological, and emerges from human nature itself. There is an increasing body of evidence coming out of the life sciences that the standard social-science model is inadequate, and that human beings are born with pre-existing cognitive structures and age-specific capabilities for learning that lead them naturally into society. There is, in other words, such a thing as human nature. For the sociologists and anthropologists, the existence of human nature means that cultural relativism needs to be rethought, and that it is possible to discern cultural and moral universals that, if used judiciously, might help to evaluate particular cultural practices. Moreover, human behavior is not nearly as plastic and therefore manipulable as their disciplines have assumed for much of this century. For the economists, human nature implies that the sociological view of human beings as inherently social beings is more accurate than their own individualistic model. And for those who are neither sociologists nor economists, an essential humanity confirms a number of commonsense understandings about the way people think and act that have been resolutely denied by earlier generations of social scientists - for example, that men and women are different by nature, that we are political and social creatures with moral instincts, and the like. This insight is extremely important, because it means that social capital will tend to be generated by human beings as a matter of instinct.

The biological revolution that has been under way in the second half of the twentieth century has multiple sources. The most startling advances have been made at the level of molecular biology and biochemistry, where the discovery of the structure of DNA has led to the emergence of an entire industry devoted to genetic manipulation. In neurophysiology great advances have been made in understanding the chemical and physiological bases of psychological phenomena, including an emerging view that the brain is not a general-purpose calculating machine but a highly modular organ with specially adapted capabilities. And finally, on the level of macro behavior, a tremendous amount of new work has been done in animal ethology, behavioral genetics, primatology, and evolutionary psychology and anthropology, suggesting that certain behavioral patterns are much more general than previously believed. For instance, the generalization that females tend to be more selective than males in their choice of mates proves to be true not only across all known human cultures but across virtually all known species that reproduce sexually. It would seem to be only a matter of time before the micro and macro levels of research are connected: with the mapping of complete gene sequences for fruit flies, nematodes, rats, and eventually human beings, it will be possible to turn individual gene sequences on and off and directly observe their effects on behavior.

The second basis of support for social order is human reason, and reason's ability to spontaneously generate solutions to problems of social cooperation. Mankind's natural capabilities for creating social capital do not explain how social capital arises in specific circumstances. The creation of particular rules of behavior is the province of culture rather than nature, and in the cultural realm we find that order is frequently the result of a process of horizontal negotiation, argument, and dialogue among individuals. Order does not need to proceed from the top down - from a lawgiver (or, in contemporary terms, a state) handing down laws or a priest promulgating the word of God.

Neither natural nor spontaneous order is sufficient in itself to produce the totality of rules that constitutes social order per se. Either needs to be supplemented at crucial junctures by hierarchical authority. But when we look back in human history, we see that self-organizing individuals have continuously been creating social capital for themselves, and have managed to adapt to technological and economic changes greater than those faced by Western societies over the past two generations.

PERHAPS the easiest way to get a handle on the Great Disruption's future is to look briefly at great disruptions of the past. Indices of social order have increased and decreased over time, suggesting that although social capital may often seem to be in the process of depletion, its stock has increased in certain historical periods. The political scientist Ted Robert Gurr estimates that homicide rates in England were three times as high in the thirteenth century as in the seventeenth, and three times as high in the seventeenth as in the nineteenth; in London they were twice as high in the early nineteenth century as in the 1970s. Both conservatives decrying moral decline and liberals celebrating increased individual choice sometimes talk as if there had been since the early 1600s a steady movement away from Puritan values. But although a secular trend toward greater individualism has been evident over this long time period, many fluctuations in behavior have suggested that societies are perfectly capable of increasing the degree of constraint on individual choice through moral rules.

The Victorian period in Britain and America may seem to many to be the embodiment of traditional values, but Victorianism was in fact a radical movement that emerged in reaction to widespread social disorder at the beginning of the nineteenth century - a movement that deliberately sought to create new social rules and instill virtues in populations that were seen as wallowing in degeneracy.

It would be wrong to assert that the greater social order that came to prevail in Britain and America during the Victorian period was simply the result of changing moral norms. In this period both societies established modern police forces, which replaced the hodgepodge of local agencies and poorly trained deputies that had existed at the beginning of the nineteenth century. In the United States after the Civil War the police focused attention on such minor offenses against public order as public drinking, vagrancy, loitering, and the like, leading to a peak in arrests for this kind of behavior around 1870. Toward the end of the century many states had begun to establish systems of universal education, which sought to put all American children into free public schools - a process that began somewhat later in Britain. But the essential change that took place was a matter of values rather than institutions. At the core of Victorian morality was the inculcation of impulse control in young people - the shaping of what economists today would call their preferences - so that they would not indulge in pleasures like casual sex, alcohol, and gambling.

There are other examples from other cultures of moral renovation. The feudal Tokugawa period in Japan - when power was held by various daimyo, or warrior lords - was one of insecurity and frequent violence. The Meiji Restoration, which took place in 1868, established a single centralized state, and stamped out once and for all the kind of banditry that had taken place in feudal Japan. The country developed a new moral system as well. We think of a custom like the lifetime employment that is practiced by large Japanese firms as an ancient cultural tradition, but in fact it dates back only to the late nineteenth century, and was fully implemented among large companies only after the Second World War. Before then there was a high degree of labor mobility; skilled craftsmen in particular were in short supply and constantly on the move from one company to another. Large Japanese companies like Mitsui and Mitsubishi found that they could not attract the skilled labor they needed, and so, with the help of the government, they embarked on a successful campaign to elevate the virtue of loyalty above others.

COULD the pattern experienced in the second half of the nineteenth century in Britain and America, or in Japan, repeat itself in the next generation or two? There is growing evidence that the Great Disruption has run its course, and that the process of re-norming has already begun. Rates of increase in crime, divorce, illegitimacy, and distrust have slowed substantially, and in the 1990s have even reversed in many of the countries that experienced an explosion of disorder over the past two generations. This is particularly true in the United States, where levels of crime are down a good 15 percent from their peaks in the early 1990s. Divorce rates peaked in the early 1980s, and births to single mothers appear to have stopped increasing. Welfare rolls have diminished almost as dramatically as crime rates, in response both to the 1996 welfare-reform measures and to the opportunities provided by a nearly full-employment economy in the 1990s. Levels of trust in both institutions and individuals have also recovered significantly since the early 1990s.

How far might this re-norming of society go? We are much more likely to see dramatic changes in levels of crime and trust than in norms regarding sex, reproduction, and family life. Indeed, the process of re-norming in the first two spheres is already well under way. With regard to sex and reproduction, however, the technological and economic conditions of our age make it extremely doubtful that anything like a return to Victorian values will take place. Strict rules about sex make sense in a society in which unregulated sex has a high probability of leading to pregnancy and having a child out of wedlock is likely to lead to destitution, if not early death, for both mother and child. The first of these conditions disappeared with birth control; the second was greatly mitigated, though not eliminated, by a combination of female incomes and welfare subsidies. Although the United States has cut back sharply on welfare, no one is about to propose making birth control illegal or reversing the movement of women into the workplace. Nor will the individual pursuit of rational self-interest solve the problems posed by declining fertility: it is precisely the rational interest of parents in their children's long-term life chances that induces them to have fewer children. The importance of kinship as a source of social connectedness will probably continue to decline, and the stability of nuclear families is likely never to fully recover. Those societies, such as Japan and Korea, that have until now bucked this trend are more likely to shift toward Western practices than the reverse.

Some religious conservatives hope, and liberals fear, that the problem of moral decline will be resolved by a large-scale return to religious orthodoxy - a Western version of the Ayatollah Khomeini returning to Iran on a jetliner. For a variety of reasons this seems unlikely. Modern societies are so culturally diverse that it is not clear whose version of orthodoxy would prevail. Any true orthodoxy is likely to be seen as a threat to large and important groups in the society, and hence would neither get very far nor serve as a basis for a widening radius of trust. Rather than integrating society, a conservative religious revival might in fact accelerate the movement toward fragmentation and moral miniaturization: the various varieties of Protestant fundamentalism would argue among themselves over doctrine; orthodox Jews would become more orthodox; Muslims and Hindus might start to organize themselves as political-religious communities, and the like.

A return to religiosity is far more likely to take a more benign form, one that in some respects has already started to appear in many parts of the United States. Instead of community arising as a byproduct of rigid belief, people will come to religion because of their desire for community. In other words, people will return to religion not necessarily because they accept the truth of revelation but precisely because the absence of community and the transience of social ties in the secular world make them hungry for ritual and cultural tradition. They will help the poor or their neighbors not necessarily because doctrine tells them they must but rather because they want to serve their communities and find that faith-based organizations are the most effective means of doing so. They will repeat ancient prayers and re-enact age-old rituals not because they believe that they were handed down by God but rather because they want their children to have the proper values, and because they want to enjoy the comfort and the sense of shared experience that ritual brings. In this sense they will not be taking religion seriously on its own terms but will use religion as a language with which to express their moral beliefs. Religion becomes a source of ritual in a society that has been stripped bare of ceremony, and thus is a reasonable extension of the natural desire for social relatedness with which all human beings are born. It is something that modern, rational, skeptical people can take seriously in much the way that they celebrate national independence, dress up in traditional ethnic garb, or read the classics of their own cultural tradition. Understood in these terms, religion loses its hierarchical character and becomes a manifestation of spontaneous order.

Religion is one of the two main sources of an enlarged radius of trust. The other is politics. In the West, Christianity first established the principle of the universality of human dignity, a principle that was brought down from the heavens and turned into a secular doctrine of universal human equality by the Enlightenment. Today we ask politics to bear nearly the entire weight of this enterprise, and it has done a remarkably good job. Those nations built on universal liberal principles have been surprisingly resilient over the past 200 years, despite frequent setbacks and shortcomings. A political order based on Serb ethnic identity or Twelver Shi'ism will never grow beyond the boundaries of some corner of the Balkans or the Middle East, and could certainly never become the governing principle of large, diverse, dynamic, and complex modern societies like those that make up, for example, the Group of Seven.

There seem to be two parallel processes at work. In the political and economic sphere history appears to be progressive and directional, and at the end of the twentieth century has culminated in liberal democracy as the only viable choice for technologically advanced societies. In the social and moral sphere, however, history appears to be cyclical, with social order ebbing and flowing over the course of generations. There is nothing to guarantee upturns in the cycle; our only reason for hope is the very powerful innate human capacity for reconstituting social order. On the success of this process of reconstruction depends the upward direction of the arrow of History.


* Francis Fukuyama is the Hirst Professor of Public Policy and the director of the International Commerce and Policy Program at George Mason University, in Virginia. His article in this issue will appear, in somewhat different form, in his book The Great Disruption, to be published by the Free Press, a division of Simon & Schuster, next month.

** The Atlantic Monthly; May 1999; The Great Disruption; Volume 283, No. 5; pages 55-80




