Over the last year, we have come to know a good deal about Umar Farouk Abdul Mutallab. You'll remember him as the young Nigerian passenger, now known to the world as "the Underpants Bomber," who last Christmas attempted to blow up a Detroit-bound airliner. Luckily, he was an amateur. The only one he hurt was himself.

The more we learned about him, the odder it seemed that he should have become a terrorist. This fresh-faced twenty-three-year-old enjoyed a privileged upbringing. His father, a wealthy banker and former Nigerian economics minister, made sure his son had the best of everything.

His was "a gilded life," according to the Independent, which included the best schools and expensive homes. "With his wealth, privilege, and education," the newspaper declared, he "had the world at his feet—able to choose from a range of futures to make his mark on the world."

Yet the son went off to sojourn with Yemeni jihadis, and the father was so worried about him that he asked U.S. officials not to renew his visa. "My family system, our village system, broke down," the father explained.

New York Times columnist Thomas Friedman, that one-note Charlie of globalism, complained soon after that there weren't enough such fathers in Muslim societies: "Unless more Muslim parents, spiritual leaders, political leaders—the village—are ready to publicly denounce suicide bombing . . . this behavior will not stop."

Friedman didn't bother to ask why the son of such a man might have given himself over to radical Islam; much less did he bother to ask himself what the present state of this hypothetical "village" might be. He avoided more or less entirely the central question: Why would a privileged youth feel drawn to immerse himself in so deadly a cause?

We used to have a handy answer to that question. We used to call such young people "alienated youth."

Between the end of World War II and the 1970s, it was a truism that affluence bred alienation as the institutions of modernity replaced more traditional social relationships. Alienation, in turn, was presumed to generate either apathy or rage against the machine.

This same explanation still has merit for understanding young Mr. Mutallab. The reason "Muslim parents, spiritual leaders, political leaders—the village," as Friedman puts it, are unable to direct the lives of some of their young people is that the very processes of economic and technological change that Friedman and others never tire of promoting systematically undermine the power of those traditional forms of authority.

Alienated from traditional authority, privileged young people are apt to harbor contempt for the village. Their basic identities dissolved away in the transition to affluence, young people like Mr. Mutallab predictably gravitate toward extremists who promise a restoration of what presumably has been lost.

Such lost souls are the flotsam and jetsam of fundamental social change, and one never knows exactly where they will end up. But undoubtedly for some, suicide for a cause is an appealing alternative to a life shorn of meaning.

Alienation Arrives in America

For those of us who study the intellectual and cultural history of the post-World War II West, it is strange that the concept of alienation seems to have evaporated. It was the paradigmatic explanation for social behavior in the 1950s and 1960s.

In the 1950s, alienation explained the timidity of white-collar workers and the conformity of suburbanites. It explained the eruption of juvenile delinquency in middle-class communities. It accounted for high divorce rates and the heavy use of alcohol and barbiturates among the professional classes.

Both the decline in voter participation and the political irrationality of the Red Scare were chalked up to alienation. In The Feminine Mystique, Betty Friedan deployed the concept to give voice to the vague but powerful "problem that has no name."

The concept gained even more persuasive power in the 1960s, when a generation of young people who had enjoyed unprecedented affluence erupted in radical protests across the developed world. Some rode that wave of public activism to violent extremes.

The causes of alienation were well understood. Privileged young people rebelled against the Affluent Society (the title of one famous book from that era), because the institutions and processes that created material well-being did so at steep human costs. Homogenized anonymity left individuals adrift in lonely crowds (the title of another).

Powerful corporations and impersonal bureaucracies imposed rigid, inscrutable rules and turned people into IBM punch cards. Reduced to automatons, people felt powerless. Computerized technologies of production, whether on the assembly line or in the office, gutted whatever was left of the worker's control over the labor process and sucked away the pleasures of creative work.

American commercial culture offered the "choice" of three television networks that said pretty much the same thing at pretty much the same time about the same issue. The political class, united behind the Cold War "consensus," engineered the entire political apparatus to suit themselves.

Even the geography of middle- and upper-class America, de-centered into undifferentiated suburban non-communities, destroyed neighborly relationships and, as Holden Caulfield said over and over in The Catcher in the Rye, made people "phony."

These structures of affluence undermined people's control over their daily lives and removed the sources of meaningful living—creative work, human relationships not besmirched by calculation, and connections to usable and dignified community traditions of art and folkways.

At least to me, this narrative made sense in the 1950s and 1960s, and today it makes sense of jihadism's appeal to young people like Mr. Mutallab.

But when was the last time you heard someone say that alienation is the root cause of religious fanaticism? For that matter, why is it so rarely used to describe the social psychology of Americans? After all, none of those structures of power that emerged in the early postwar period have gone away, and several are arguably stronger.

No one, it seems, is alienated anymore.

In the midst of the worst economic downturn since the Great Depression, an MTV poll finds 75 percent of its young respondents reporting that they are happy, their economic worries offset by pleasing relationships with family and friends. Holden Caulfield is a has-been.

"Present-day students," a high-school literature teacher explained recently in the New York Times, "do not have much sympathy for alienated anti-heroes; they are more focused on distinguishing themselves in society . . . than in trying to change it."

Never ones to miss an obvious trend, Hollywood filmmakers have reinforced this more recent mood. The emblematic youth film of the early postwar period was James Dean's Rebel Without a Cause, the crucial scene of which had Dean screaming at his messed-up parents, "You're tearing me apart!"

By sharp contrast, the emblematic youth film of the last quarter of the twentieth century, Ferris Bueller's Day Off, was decidedly free of existential angst. Its protagonist is anything but alienated: Ferris, the free-spirited son swaddled in the affections of adoring suburban parents, has the world at his command.

Meanwhile, the intellectuals, who as a class not only disseminated the concept but, legend has it, were once alienated themselves, rarely employ it anymore. It no longer speaks to their condition. Secure in tenured university posts yet often given to the self-delusion that they are still radicals, content in their comfort and prone to toothless criticism, they enjoy their avocational prerogatives without the discomfort of bohemia's hand-to-mouth existence.

Alienation? Who needs it? It doesn't pay well, and there's no 401(k).

The Roots of the Idea

So where did the concept of alienation come from and why did it carry such power in the early postwar period?

As a formal philosophical proposition, it begins with Georg Hegel, whose phenomenology deployed alienation to describe the individual separated from Truth. Most people, though, connect the concept to Karl Marx, who applied it to the separation of the worker from the work process in the development of capitalism.

The great Continental sociologists, Georg Simmel, Max Weber, and, above all, Emile Durkheim, added elements to the concept. Simmel spoke of the "stranger" in urbanizing societies. Weber decried modernity as the "iron cage" of bureaucracy. And Durkheim's anomie challenged the quaint notion that human happiness would inevitably increase with material well-being. Anomie, he insisted, was more likely to set in because of modernization, not in spite of it.

If anomie and alienation were not exactly the same concepts, their close kinship and the quickly broadening use of each in the 1950s washed away their differences, so that by 1960 they were more or less synonymous in conventional usage.

Explaining why these ideas gained purchase in postwar America is an exercise in the sociology of knowledge. Each had its own appeal. Radical humanists warmed to Marx's alienation as a way of keeping faith with dialectical materialism without bowing to Communist dogma, a very useful thing in Cold War America.

Weber's critique of the bureaucratic regime obviously appealed in a society where conglomerates and the Cold War state dominated.

Because Durkheim warned that modernization had severe socio-psychological effects, he spoke powerfully to the condition of a society that had gone from its most severe depression, through the enormous sacrifice of world war, to the greatest prosperity in human history in less than a generation.

No wonder Americans felt discombobulated.

Those Damn Alienated Kids

By nearly every account, youth were just that: alienated. But they were hardly alone.

American intellectuals were never more secure than when they moved from their precarious existence as independent writers into tenured professorships. They were never more alienated than when they woke up to see that they had traded independence for the bureaucratized grind of university life.

Factory workers were never so well paid, thanks to their unions and general prosperity. Yet every study of blue-collar workers revealed deep dissatisfaction. "The job gets so sickening—day in and day out plugging in ignition wires," one autoworker told sociologists Charles Walker and Robert Guest in the early Fifties. "I get through with one motor, turn around, and there's another motor staring me in the face. It's sickening."

Meanwhile, the white-collar managerial class, supposedly the leading edge of the post-industrial society, was increasingly yoked to computers and quartered in cubicles. The professional class had the added burden of living up to new social expectations by moving to suburbia, where they pursued the autonomy in private life that they had surrendered in their work.

Many postwar Americans chose suburban living in search of an independent, rural idyll. This was, of course, an absurdity. Surrounded by the same people they knew at work, dependent on the automobile, cowering in the corners of their cul-de-sacs in hopes of keeping the rest of the world at bay, the white-collar suburbanites were, by most accounts in the early postwar years, psychologically crippled.

While alienation turned up practically everywhere someone bothered to look, by 1960 many observers had come to think that it fell hardest on "youth," as that ridiculously generic noun tagged them.