

The Authenticity Hoax

How We Get Lost Finding Ourselves

by Andrew Potter

Publisher: McClelland & Stewart
Initial publish date: May 2011
Category: Popular Culture, Social, Media Studies

  • Paperback / softback
    ISBN: 9780771071065
    Publish Date: May 2011
    List Price: $21.00



Description

One of Canada's hippest, smartest cultural critics takes on the West's defining value.
We live in a world increasingly dominated by the fake, the prepackaged, the artificial: fast food, scripted reality TV shows, Facebook "friends," and fraudulent memoirs. But people everywhere are demanding the exact opposite, heralding "authenticity" as the cure for isolated individualism and shallow consumerism. Restaurants promote the authenticity of their cuisine, while condo developers promote authentic loft living and book reviewers regularly praise the authenticity of a new writer's voice.

International bestselling author Andrew Potter brilliantly unpacks our modern obsession with authenticity. In this perceptive and thought-provoking blend of pop culture, history, and philosophy, he finds that far from serving as a refuge from modern living, the search for authenticity often creates the very problems it's meant to solve.

About the author

Andrew Potter is the coauthor of the international bestseller Nation of Rebels. A journalist, writer, and teacher, he lives in Toronto. Follow him on Twitter (@jandrewpotter).


Excerpt: The Authenticity Hoax: How We Get Lost Finding Ourselves (by Andrew Potter)

One
THE MALAISE OF MODERNITY

In one of the most famous road trips in history, an emaciated and notoriously untrustworthy Greek youth named Chaerophon trekked the 125 miles from Athens to the Temple of Apollo at Delphi to consult with the Oracle as to whether there was any man wiser than his friend Socrates. No, Chaerophon was told by the Oracle, there was no man wiser, and so he returned to Athens and informed Socrates of what the Oracle had said. At first Socrates was a bit skeptical, since it struck him that most of his fellow Athenians certainly acted as if they were wise about a great many things, while he, Socrates, didn’t really know much about anything at all. But after wandering about the city for a while, questioning his fellow citizens about a range of topics (such as truth, beauty, piety, and justice), Socrates eventually decided that most of them were indeed as ignorant as he, but they didn’t know it. He concluded that he was indeed the wisest Athenian, and that his wisdom consisted in the fact that he alone knew that he knew nothing.

Socrates should have known there was a bit of a trick to the Oracle’s pronouncement. Inscribed in golden letters above the entrance to the ancient temple were the words gnothi seauton – “know thyself.”

“Know thyself” thus became Socrates’ fundamental rule of intellectual engagement, which he continued to deploy in Athens’ public spaces, conversing at length with anyone who would indulge him and spending a great deal of downtime in the company of handsome young men. It cannot be said that his fellow Athenians appreciated this commitment to debate. In 399 BCE, Socrates was charged and tried for the crimes of teaching false gods, corrupting the youth, and “making the weaker argument the stronger” (a form of argumentative trickery called sophistry). At his trial, when the jury returned with a verdict of guilty on all counts, his accusers pressed for the death penalty. Given the chance to argue for an alternative punishment, Socrates started by goading the jury, going so far as to suggest that they reward him with free lunch for life. As for the other options – exile or imprisonment – he told the jury that he would not be able to keep his mouth shut on philosophical matters. Philosophy, he told them, is really the very best thing that a man can do, and life without this sort of examination is not worth living.

Socrates chose death rather than silence, and ever since he has been hailed for his integrity, a Christlike figure who was martyred for his refusal to sacrifice the ideals of intellectual independence, critical examination, and self-understanding. For many people, the Socratic injunction to “know thyself” forms the moral core of the Western intellectual tradition and its modern formulation – “to thine own self be true” – captures the fullness of our commitment to authenticity as a moral ideal.

For its part, the visit to the Oracle, with the cryptic pronouncement about Socrates having a special hidden characteristic, has become a stock motif of countless works of film and fiction, where the hero has to come to believe something about himself before he can help others. Perhaps the most hackneyed version of this is the scene in the film The Matrix, when Morpheus takes Neo to visit the Oracle. Morpheus believes that Neo is The One, the prophesied messiah destined to rescue humanity from the computer-generated dreamworld in which it has been enslaved. Neo is, understandably, a bit skeptical of his ability to serve as the savior of humanity. So Morpheus drags Neo off to see the Oracle, hoping that the good word from a maternal black woman who speaks in riddles while baking cookies will give Neo the boost of self-confidence he needs to get into the game and set about destroying the machines. Instead, the Oracle looks Neo in the eye and tells him he hasn’t got what it takes to be the messiah. On his way out, she hands him a cookie and says, somewhat oddly, “Make a believer out of you yet.” As Neo leaves, we see inscribed above the entrance to the kitchen the words temet nosce, which is Latin for “know thyself.”

As it turns out, Neo is (of course) the messiah. The Oracle could not just come out and say so though, because Neo had to believe it himself. He had to buy into the whole worldview that Morpheus and his gang had laid out, about the rise of the machines, the scorching of the earth, and the enslavement of humanity. As Trinity tells Neo later on, it doesn’t matter what Morpheus or even the Oracle believes; what matters is what Neo himself believes. The lesson is pretty clear. Before Neo can save humanity, he first has to believe in himself. The idea that self-knowledge and self-discovery are preconditions for social contribution is a thoroughly modern lesson, well steeped in the ethic of authenticity.

The Wachowski brothers were no doubt aware of the parallels they were drawing between Socrates and Neo (and, indirectly, between both of them and Jesus). Yet in Sincerity and Authenticity, Lionel Trilling makes it clear that this claim of continuity between the ancient world of Socrates and the modern world of Oprah Winfrey and Eckhart Tolle is an anachronism, and that the authentic ideal is actually something relatively new. According to Trilling, the necessary element of authenticity – a distinction between an inner true self and an outer false self – only emerged in Western culture a few hundred years ago, toward the end of the eighteenth century. So despite superficial similarities, there is no real continuity between the Socratic dictum to “know thyself” and the thoroughly modern quest of self-discovery and self-understanding as an end in itself. What separates them is a yawning chasm between us moderns on the one side and the premodern world on the other.

What does it mean to be modern? That is a big and difficult question, and it has been the subject of a great many big and difficult books. One problem is that we often use modern as a synonym for contemporary, as when we marvel at modern technology or fret about modern love. Furthermore, even when we are careful to use modern to refer to a specific historical period, just what that is depends on the context. For example, historians sometimes refer to as “modern” the whole period of European history since the Middle Ages ended and the Renaissance began. Modern architecture, however, typically refers to a highly functional and unornamental building style that arose around the beginning of the twentieth century.

Here, I am concerned with modernity less as a specific historical epoch than as a worldview. To be modern is to be part of a culture that has a distinctive outlook or attitude, and while an important task for historians involves understanding why this worldview emerged where and when it did, it is essential to the concept of modernity that it is not tied to a particular place and moment. Modernity is what Marshall Berman, in his 1982 book All That Is Solid Melts Into Air, calls “a mode of vital experience – experience of space and time, of the self and others, of life’s possibilities and perils – that is shared by men and women all over the world today.” More than anything, modernity is a way of being, a stance we adopt toward the world and our place in it.

The rise of the modern worldview is marked by three major developments: the disenchantment of the world, the rise of liberal individualism, and the emergence of the market economy, also known as capitalism. Between 1500 and 1800, these three developments ushered in profound changes in people’s attitudes toward everything from science, technology, and art to religion, politics, and personal identity. Put together, they gave rise to the idea of progress, which, as we shall see, does not necessarily mean “things are getting better all the time.” More than anything, progress means constant change, something that many people find unpleasant and even alienating. But we’re getting ahead of ourselves, so let’s begin with the disenchantment of the world.

In the first season of the television series Mad Men, set in the advertising world of Madison Avenue in the early 1960s, gray-suit-and-Brylcreemed advertising executive Don Draper finds himself caught up in an affair with a bohemian proto-hippie named Midge. She drags Draper to parties and performance art clubs in Greenwich Village where he jousts with her anti-establishment friends over marketing and the moral culpability of capitalism. (Typical exchange: “How do you sleep at night?” “On a big pile of money.”)

One night they end up back at an apartment, drinking and smoking pot and arguing once again. When one of the stoned beatniks informs Draper that television jingles don’t set a man free, Draper replies by telling him to get a job and make something of himself. At this point, Midge’s beatnik boyfriend chimes in with some classic countercultural paranoia: “You make the lie,” he tells the ad man. “You invent want. But for them, not us.” Draper has had enough, so he stands up, puts on his hat, and gives them some serious buzzkill: “I hate to break it to you, but there is no big lie. There is no system. The universe . . . is indifferent.”

“Man,” goes the extremely bummed reply. “Why’d you have to go and say that?”

If he’d bothered to stick around to continue the debate, Don Draper might have answered, Because it is true. For the most part, this exchange is nothing more than stereotyped bickering between hipsters and squares, of the sort that has been going on in dorm rooms and coffee shops for over half a century. But that last line of Draper’s, about the indifferent universe, speaks to a deeper existential realization at the heart of the modern condition.

Once upon a time, humans experienced the world as a “cosmos,” from a Greek word meaning “order” or “orderly arrangement.” The order in this world operated on three levels. First, all of creation was itself one big cosmos, at the center of which was Earth. In fixed orbits around Earth revolved the moon, the sun, and the visible planets, and farther out still were the fixed stars. Second, life on Earth was a sort of enchanted garden, a living whole in which each being or element had its proper place. And finally, human society was itself properly ordered, with people naturally slotted (by unchosen characteristics such as bloodline, birth order, gender, or skin color) into predetermined castes, classes, or social roles.

Whatever else it might have been, this was a place of meaning, value, and purpose, with each part getting its identity from knowing its place in the whole and performing its proper function within an organic unity. For both the ancient Greeks and, later, medieval thinkers, this fundamental order could be described by the notion of the “great chain of being,” a strict hierarchy of perfection stretching from the rocks and minerals, up through the plants and animals, to humans, angels, and God. In this geocentric and homocentric cosmos, humanity found itself trebly at home, comfortably nestled like a Russian doll within a series of hierarchies. Earth was the most important part of creation, and humans were the most important beings on Earth. Finally, human society was itself a “cosmos,” a functional and hierarchical system – of slaves, peasants, commoners, tradespeople, nobles, and so on – in which each person’s identity was entirely determined by their place in that structure.

This was the worldview in which Socrates (or at least, the man who is revealed through Plato’s writings) operated. For Socrates, self-discovery involved little more than coming to understand where you fit in the grand scheme of things. On this reading, the oracular injunction to “know thyself” would be better expressed as “know thy place.” This is not to say that people did not have ambitions, emotions, or deeply felt desires, just that these were not important to helping you discover your place in the world.

Our films and fiction are full of romantic stories about the sons of medieval cobblers who fall in love with, woo, and win the local nobleman’s daughter, or Georgian scullery girls who find themselves swept off their feet and up to the castle by the young prince who is sick of dating fake society girls, but these are nothing more than projections of our own assumptions and values onto a world in which such behavior was literally inconceivable. The work of Jane Austen is so important precisely because it marks the transition from that world to a more modern sensibility – most of her stories hinge on her characters’ nascent individualism straining against the given roles of the old social order.

Almost every society that has ever existed has seen its world as “enchanted” in one way or another, from the polytheism of the ancient Greeks and Romans to the strict social roles of Chinese Confucianism to the form many of us are most familiar with, the monotheistic religions of Judaism, Christianity, and Islam. What is characteristic of traditions of this sort is that they are what we can call “comprehensive doctrines,” in that they purport to explain and justify a great deal about life on Earth, how the world works, and why human society is structured as it is, all within a common metaphysical framework.

Consider Catholicism, which at its peak was a powerful comprehensive doctrine that began by providing an explanation for the origins of the universe (God made it in seven days) and life on Earth (God made Adam out of dust, and Eve out of one of Adam’s ribs). Additionally, it provided a moral code (the Ten Commandments), along with a justification (God commands it), backed up by a sanction for violations (you’ll burn in hell). Finally, it explained the meaning of life, which consists of spiritual salvation through communion with God, mediated by the priesthood. Science, politics, morality, spiritual succor – the Catholic church is a one-stop explanatory shop, serving needs existential, political, social, and scientific.

What a comprehensive religious tradition does is ensure that everything that happens on Earth and in human society makes sense. In the end, everything happens for a reason, as it must be interpreted in light of what God wills, or what He commands. This is a version of what philosophers call teleological explanation – explanation in light of ultimate purposes or goals (from the Greek telos, meaning “end”). The disenchantment of the world occurs when appeals to ultimate ends, purposes, or roles built into the very fabric of the universe come to be seen as illegitimate or nonsensical.

The big steamroller of Christianity was science, as a series of discoveries – from Copernican heliocentrism to Darwinian natural selection – played an important role in shaking up humankind’s sense of its place in the scheme of things. But even though science has progressively discredited any number of specific religious claims, there is no necessary antagonism between science and religion at the deepest level, and for many scientists, scientific inquiry is just a way of coming to understand the mind of God. As the great Renaissance polymath Sir Francis Bacon put it, “a little science estranges a man from God; a lot of science brings him back.”

So scientific discoveries alone are not enough to kick us out of the enchanted garden, and over the centuries religion, especially the Catholic Church, has shown itself to be very adept at accommodating the truths of divine revelation to those discovered through scientific inquiry. One standard move is to avoid resting the truth of the faith on any particular scientific fact and simply assert that however the world is, it is like that because God intended it that way, while another is to “protect” the divine origins of the human soul by simply declaring it off-limits to empirical inquiry. Pope John Paul II employed both of these methods in 1996, when he conceded that evolution was “more than a hypothesis.” Darwin might be right about evolution through natural selection, the Pope said, “but the experience of metaphysical knowledge, of self-awareness and self-reflection, of moral conscience, freedom, or again of aesthetic and religious experience, falls within the competence of philosophical analysis and reflection, while theology brings out its ultimate meaning according to the Creator’s plans.”

This is what the biologist Stephen Jay Gould called NOMA, short for non-overlapping magisteria. NOMA is a sort of explanatory federalism, where the scope of human experience is carved into watertight compartments in which each type of explanation – scientific, religious, aesthetic, metaphysical – is sovereign within its own competency. If we accept the NOMA gambit, then disenchantment cannot be the inevitable outcome of a string of increasingly devastating scientific discoveries. As the Pope’s address makes clear, religion can never be chased completely off the field as long as people are willing to accept the validity of certain noncompeting forms of explanation, even if that amounts to little more than accepting “God wills it” as an account of why some things happen. For true disenchantment to occur, the scientific method of inquiry must be accepted as the only legitimate form of explanation.

If your high-school experience was anything like mine, the phrase scientific method conjures up memories of science labs with Bunsen burners, test tubes, and wash stations. We were taught a rigid formula that supposedly captured the essence of scientific inquiry, consisting of a statement of the problem, a hypothesis, a method of experiment, data, and a conclusion. (Problem: Do the same types of mold grow on all types of bread? Hypothesis: Yes. Method of Experiment: Leave lunch in locker every day for a month.) But the scientific habit of mind is more abstract than this, and it actually has little to do with experimentation, data collection, and analysis. At the core of scientific thinking are two elements. First, the commitment to explanation in terms of general laws and principles. Second, the recognition of the open-ended and ultimately inconclusive character of scientific progress.

Back when people believed that gods roamed and ruled Earth, the explanation of natural phenomena was basically a soap opera. Why was last night’s thunderstorm so intense? Because Hera caught Zeus cheating on her again, and when she started throwing crockery he replied with thunderbolts. Why were all our ships lost at sea? Because the captains forgot to sacrifice a bull to Poseidon before they left port, and he showed his displeasure by whipping up a storm with his trident.

It doesn’t take a great amount of insight to appreciate that these little stories don’t actually explain anything at all. They are ad hoc accounts invented after the fact as a way of imposing some sort of order on a world that is often totally unpredictable. Worse, they give us no way of predicting the future. When will Zeus cheat again? Who knows. Will sacrifice to Poseidon keep our ships safe? Probably not, since he appears to get angry for the most arbitrary reasons, and plenty of captains who did remember the sacrifice have had their ships founder on a lee shore. Clearly, if we’re looking for some way of both understanding the past and giving some reasonable guidance to what will happen in the future, we need something better.

The person usually credited as the first to turn away from supernatural soap opera as a way of making sense of the world was an Ionian Greek named Thales of Miletus, who lived sometime between 620 and 546 BCE. Thales was interested in the nature of matter and how it is able to take the tremendously diverse forms that make up the furniture of the world, and he proposed that the primary organizing principle of all things is water. We don’t have a proper record of why Thales believed this, since most of what we know of his thought comes to us second-hand from Aristotle. According to Aristotle, Thales may have got the idea by observing the role of water in creating and preserving life: “That the nurture of all creatures is moist, and that warmth itself is generated from moisture and lives by it; and that from which all things come to be is their first principle. . .” As a theory of everything, “all is water” is not much of an advancement, but what gives Thales his well-deserved reputation as the first true philosopher is a conceptual innovation we can call the generality of reason.

Thales realized that stories about Zeus cheating on Hera or Poseidon being irritated by the lack of sacrifices don’t help us understand why they behaved the way they did – these stories are pseudo-explanations that give only the illusion of providing understanding. Real understanding requires some sort of formula that shows us how events of this type can be expected to have effects of that type, both looking back (as an explanation of what happened) and forward (as a prediction of what will happen). For this, we need to understand events in the light of laws or general properties of some sort.

Once we have the idea of the generality of reason, we are armed with a tremendously powerful cognitive tool, since the notion that the world operates according to predictable general laws is what gives us logic, science, and technology, as well as the principles of impartiality and equality in the ethical and moral realms. And so by suggesting that water was the originating principle of all things, Thales took a fateful step away from the random and usually bizarre plot twists of the supernatural soap opera and took a stab at explaining the world in terms of general principles that would enable us to understand why things are the way they are as well as predict how they might be in the future.

Yet this, too, is still a long way from the disenchantment of the world. The world can be a rational, ordered place, explicable in terms of general laws and principles, and still be a cosmos. Thales showed us how to leave soap opera behind, but there is still plenty of room for recourse to arguments underwritten by final explanations in terms of God’s will, plan, or intentions. What drives teleology from the field once and for all is the realization that science is a progressive endeavor that can in principle never come to an end.

It was the late nineteenth-century German sociologist Max Weber who showed us how the disenchantment of the world occurs once, and only once, we fully appreciate the relentlessly progressive nature of science. As he argued, scientific discoveries are by their very nature meant to be improved upon and superseded. We can never say we have achieved the final truth, because there is always the possibility of further explanation in terms of broader or deeper laws. For Weber, the commitment to science

means that principally there are no mysterious incalculable forces that come into play, but rather that one can, in principle, master all things by calculation. This means that the world is disenchanted. One need no longer have recourse to magical means in order to master or implore the spirits, as did the savage, for whom such mysterious powers existed. Technical means and calculations perform the service.

The term is overused, but Weber’s final repudiation of magical thinking represents a genuine paradigm shift in our outlook on the world, and is a giant step toward becoming fully modern. The most significant consequence of this is that the disenchanted world is no longer a cosmos but is now a universe. It is Don Draper’s morally indifferent realm, consisting of mere stuff – energy and matter in motion – that neither knows nor cares about humans and their worries. The philosophical significance of this is that the world can no longer serve as a source of meaning or value. That is, no statement of how the world is can, by itself, validate any conclusions about what ought to be the case. We are no longer entitled, for example, to the argument that just because some groups are slaves, slavery is their natural and therefore justified condition.

This gap between reasoning about what is and reasoning about what ought to be, and the illegitimacy of proceeding from the first to the second, was famously noticed by David Hume. Known as “Hume’s guillotine,” it cautions philosophers to be on their guard against unwarranted inferences.

Along with everything else, disenchantment transformed our understanding of the self, privileged a utilitarian philosophy that saw the maximizing of happiness as the ultimate goal, and encouraged an instrumental and exploitative approach to nature through the use of technology. A key effect of disenchantment, though, was its action as a social solvent, helping break up the old bonds in which individuals and groups found their place within larger class-based divisions or hierarchies. When people no longer have a proper place in the scheme of things – because there is no scheme of things – they are, in a sense, set free. They are free to make their own way, find their own path.

Thus, the disenchantment of the world leads directly to the second major characteristic of modernity, the rise of the individual as the relevant unit of political concern.

One effect of disenchantment is that pre-existing social relations come to be recognized not as being ordained by the structure of the cosmos, but as human constructs – the product of historical contingencies, evolved power relations, and raw injustices and discriminations. This realization wreaked a great deal of havoc with long-established forms of social organization, and it came at an auspicious time. At the end of the eighteenth century, even as people were coming to appreciate the conventional and arbitrary origins of the traditional hierarchies in which they found themselves embedded, the nascent industrialization of parts of Europe was pushing them into greater proximity, thanks to large-scale migrations from the country into the cities and town centers.

This double contraction of distance (of social class and of geography) inspired a root-and-branch rethink of most elements of political authority. For the first time, people began asking themselves questions such as: Who should rule? On what basis? Over whom should power be exercised, and what are its scope and limits? What inspired these sorts of questions was the relative decline of groups, castes, nations, and other collectivities, and the emergence of the individual as the central unit of political concern.

A recurring theme in the literature on the birth of modernity is the idea that political individualism was itself the result of the prior rise of religious individualism in the sixteenth century. The decisive event here is the Protestant Reformation, which Tennyson called “the dawning of a new age; for after the era of priestly domination comes the era of the freedom of the individual.” Unlike Catholics, whose faith is mediated through the supreme authority of the Church, Protestants regarded each person’s interpretation of the Bible as authoritative. The Protestant relationship to God is personal and unmediated, with piety residing not in good acts or confession of sins, but in the purity of will. As Martin Luther put it, “it is not by works but by faith alone that man is saved.” By abolishing the need for a separate caste of priests, Protestantism effectively turned religion into a private matter, between the individual and God. Furthermore, the emphasis on salvation through faith fueled a psychologically inward turn, in which the examination of one’s conscience took center stage.

The Reformation was indeed an important step in the emergence of political individualism. In particular, it fed a stream of modernity that influenced the rise of authenticity as a moral ideal. Yet the religious individualism of the Reformation was itself enabled by a much more profound development, the emergence of the centralized state. The deep connection between the rise of the modern state and the emergence of the individual is not always fully appreciated. They are actually just two aspects of the same process, and it is no coincidence that the individual became the focus of political concern just as the centralized state was beginning to consolidate its power in the sixteenth century.

The state is such a distinctively modern institution that most of us find it difficult to imagine any other way of carving up the world, so much so that we habitually describe territories that employ other forms of government – such as Afghanistan or Sudan – as “failed states.” It was not always so. There are many ways of governing a territory, and the state is only one of them. As it emerged out of the Middle Ages, the state vied with a handful of other, sometimes overlapping, forms of political organization, including those based on kinship, tribal affiliation, feudal ties, religion, and loose confederal arrangements such as the Holy Roman Empire. With the idea of the state comes the notion of sovereignty, which in your standard Politics 101 textbook is usually defined as something like “the monopoly over the use of force in a territory” or “the exercise of supreme legislative, executive, and judicial authority” over a well-defined geographical area. The paired ideas here are supremacy and territoriality; together, they embody the form of government we know as the sovereign state. Instead of governing along with or indirectly through local tribesmen or warlords, members of the nobility, various estates, or the Church, the sovereign asserts his unlimited and unrestricted authority over each and every member of society.

What does this have to do with political individualism? After all, what does it matter to a serf, to a woman, or to a Jew whether they are oppressed by the whims of a local lord, husband, or cleric, or by the decrees of a distant king? Eliminating the middle man may be a useful step toward more efficient tyranny, but it is hard to see how it involves a decisive concession to the principle of individual liberty. Nevertheless, that was the precise effect of the rise of the modern state. As Oxford political scientist Larry Siedentop puts it, the modern state is a Trojan horse, carrying with it an implicit promise of equality before the law:

The very idea of the state involves equal subjection to a supreme law-making authority or power – the sovereign. To speak of a “state” is to assume an underlying equality of status for its subjects . . . once there is a sovereign agency or state, inherited practices no longer have the status of law unless they are sanctioned by the sovereign.

And so the emergence of the state was the political sidecar to the process of disenchantment that was already underway. Just as disenchantment stripped away the metaphysical foundations that justified inherited social classes as fixed roles, the state cut through the various overlapping layers of political authority that had accumulated over the centuries like so many coats of paint by establishing a direct and unmediated relationship between the sovereign and the individual subject. In contrast with every other form of political organization, the state is first and foremost a collection of individuals. They may have any number of social roles – tradesperson, wife, baron, priest – but these are merely secondary add-ons. Individuality is now the primary social role, shared equally by all.

This distinction between primary and secondary social roles is only possible within a sovereign state. To see this, try to imagine a collection of historical types that includes an ancient Egyptian slave, a medieval serf, an Untouchable from India in the eighteenth century, and a nineteenth-century Frenchman. Now imagine you could go back in time and ask each one of them to describe the underlying bedrock of their identity, perhaps by simply asking each of them, “What is your status?” All the first three could say in response is that they are, respectively, a slave, a serf, and an Untouchable. Their identity is entirely constituted by their inherited social roles. In contrast, the Frenchman could reply that, while he occupies a number of secondary social roles (such as courtier, landowner, husband), he is first and foremost a freestanding citoyen, an individual whose formal equality with all other Frenchmen is guaranteed by the very nature of the French republic.

A second, related distinction is between law and custom. In a society where people are born into a social role – such as an Egyptian slave or a Confucian wife – in which they have an obligation to obey another in virtue of that social role, all social rules have the same status. In Confucianism, for example, there is no distinction between temporal laws and religious decrees – both carry equal weight because there is no essential difference between the principles that govern the Kingdom of Heaven and those that govern the Kingdom of Man. Similarly, in a theocratic state there is no essential difference between religious decrees and temporal laws, and in a feudal system there is no difference between the expressed preferences of the lord and his actual decrees.

But when there is an established state claiming a monopoly over political authority, we see the arrival of a distinction between laws, which are the explicit and obligatory commands of the sovereign backed up by a sanction, and customs, which might be enforced through social pressure but which have no legitimate legal backing. Of course, some systems retain a certain hybrid character, and a state such as India remains imperfectly sovereign thanks to the ongoing influence of the caste system, despite the fact that it has been outlawed. But that’s the entire point: the claim about the tight relationship between individualism and the state is not that once a state is formed, people no longer try to contest the state’s legitimacy by enforcing religious decrees or assuming certain caste privileges; it is that these attempts are illegitimate by the state’s own self-understanding. Note that much of what we call nation-building (which is actually state-building) in a place such as Afghanistan involves trying to subordinate the influence of local warlords or religious leaders to that of the central government in Kabul.

The two distinctions, between primary and secondary social roles and between positive law and mere custom, are creatures of that quintessential institution of modernity, the sovereign state. And once they are in place, they are able to evolve into their fully developed form, the liberal distinction between the public realm, which is within the law’s reach, and the private realm, which is a sphere of personal conscience, worship, choice, and pursuit. But for this to emerge, the concept of sovereignty has to be drained of its unlimited reach, through the notion of the limited state.

Probably the best essay on the tremendous gap between social roles under feudal forms of government and in the modern state is the famous scene in Monty Python and the Holy Grail where King Arthur approaches a peasant digging in the dirt to ask for directions. Arthur begins by introducing himself as “King of the Britons,” a claim greeted with a wink of incredulity. Arthur persists in asking the peasant for directions, but the peasant protests that Arthur has not shown him proper respect. After Arthur replies that he is, after all, the king, an argument ensues over the nature of legitimate government and the scope of the state’s writ, which ends with the peasant insisting that supreme executive authority derives “from a mandate from the masses, not from some farcical aquatic ceremony” – a reference to the Lady of the Lake who gave Arthur the sword Excalibur and anointed him King.

The scene has grown stale over the years, thanks largely to bores who insist on quoting it verbatim at parties (complete with accents), but what made it work in the first place is the cognitive dissonance involved in a medieval peasant rattling on to his king about a modern concept such as responsible government. It might seem obvious to us moderns that a ruler the people didn’t vote for has no legitimacy, but the Python scene reminds us what an aberration that notion is, considered against the sweep of human history.

Virtually all premodern societies, including those within the Christian tradition, are built around what philosophers call a “perfectionist” value system. They conceive of society first and foremost as a community, organized around a shared conception of human perfection, of what promotes human flourishing and what matters to the good life. The main role of the established authority is to promote the moral and spiritual perfection of the people. It does this by using political power to exercise moral authority and to declare which beliefs and behaviors are virtues and which are vices, condoning the former and condemning the latter. The full coercive apparatus of the state is used to enforce these sanctions.

The perfectionist attitude toward the state was captured by the traditional French saying Une foi, une loi, un roi (one faith, one law, one king), and it was only after the devastating French Wars of Religion in the sixteenth century that Europeans started to come around to the idea that the continent’s divisions were probably here to stay. The basket of virtues we take to be characteristic of Western individualism began with religious toleration, and it was something not so much chosen as forced upon an economically shattered and morally exhausted Europe as a second-best alternative to perfectionism.

It was through a similar process that the liberal commitment to individual rights arose, through widespread revulsion in the face of the horrors that could be visited upon the population by an increasingly powerful and centralized state. Sure, monarchs had always fought one another, marching out in the spring, fighting a few skirmishes on a suitable battlefield, then marching home in the fall, but the people were largely left out of things. Only when the state began to turn its power of coercion against its own citizens – that is, when the dictatorship of absolute monarchy evolved into the tyranny of the Inquisition – did people start to consider that maybe the business of the state shouldn’t be the moral perfection of the people, and maybe its authority ought to be limited in certain important ways.

This shift in thinking is marked by the transition in thought between two seventeenth-century philosophers, Thomas Hobbes and John Locke. Hobbes was quite certain that the citizens of a commonwealth would prefer an absolute sovereign to the nasty and brutish condition of the state of nature (the “war of all against all”), but Locke – writing in the aftermath of the Glorious Revolution – found this ridiculous. He proposed that the state be divided into separate branches, where the citizens have a right to appeal to one branch against another. This is the beginning of constitutionalism, or the idea of the limited state. The main principles of constitutionalism are that the state is governed according to the rule of law; that everyone is equally subject to the law; and that the scope of what is a legitimate law is limited by a charter of individual rights and liberties. A constitutional state is one that foregrounds its respect for the autonomy of the individual and his or her free exercise of reason, choice, and responsibility.

To a large extent, respect for individual autonomy follows directly from the rejection of natural hierarchies and assigned social roles. When people possess formal equality under the state, and when they no longer owe any natural and enforceable obligations to one another, then a great many decisions – about how to worship, what to do for fun, whom and how to love – become a matter of personal choice. As Pierre Elliott Trudeau, the future prime minister of Canada, put it in 1967 when, as a young Justice Minister, he introduced a bill decriminalizing homosexual acts between consenting adults: “There’s no place for the state in the bedrooms of the nation.” But this sort of autonomy is valuable only to the extent that the decisions people actually make are respected and protected. It would be a hollow form of autonomy indeed if the state told people that they were free to make various choices but then turned around and allowed them to be persecuted for those decisions.

That is why any suitably robust commitment to autonomy must be accompanied by a firm commitment to preserving the legal distinction between the public and the private realm. Just where the line is to be drawn will differ from one state to the next, but a liberal society must carve out some minimal space for the protection of individual conscience and the pursuit of private goals. This puts the question of individual rights onto the agenda: as philosopher Ronald Dworkin has argued, “Rights are best understood as trumps over some background justification for political decisions that states a goal for the community as a whole.” That is, a right is a trump card that preserves a sphere of private individual action against decisions taken by the state in the name of the common good. Locke summarized things with the declaration that everyone had the right to life, liberty, and property, the ultimate consequence of which was a ground-up rethink of the appropriate relationship between the state, morality, and the good life.

An essential part of this system of individual liberty that emerges from the Hobbes-to-Locke trajectory is a species of economic individualism, also known as a market economy, also known, to its critics anyway, as capitalism. The emergence of the global market economy marks the third step in the development of the modern world, a step that was to prove the most transformative yet also the most controversial and socially disruptive.

Mention capitalism in the context of the last years of the eighteenth century and what immediately clouds the mind are proto-Dickensian images of urban squalor, belching factories, and overworked and threadbare six-year-olds being beaten by rapacious landlords in top hats. While this image is not completely inaccurate, the use of the term capitalism puts a misleading emphasis on material forces, while neglecting the powerful ideals motivating this new economic individualism. In particular, focusing on material relations (and even the class struggle) obscures the role of individual autonomy, the rise of the private sphere, and the importance of contract in conceiving a fundamentally new approach to the morality of economic production and consumption.

On the economic side of things, the most important consequence of Locke’s liberalism was the idea that the public good could be served by individuals pursuing their private interest. With the “privatization of virtue,” ostensible vices such as greed, lust, ambition, and vanity were held to be morally praiseworthy as long as their consequences were socially beneficial. In his Fable of the Bees (subtitled Private Vices, Publick Benefits), Dutch-born physician Bernard de Mandeville argued that luxury, pride, and vanity were beneficial because they stimulated enterprise. Mandeville was an obvious influence on Adam Smith’s Wealth of Nations, particularly on Smith’s infamous metaphor of the invisible hand of the market that describes a positive, yet unintended, consequence of self-interested behavior. Smith argued that each individual, seeking only his own gain, “is led by an invisible hand to promote an end which was no part of his intention,” that end being the public interest. “It is not from the benevolence of the butcher, or the baker, that we expect our dinner,” Smith wrote, “but from regard to their own self interest.”

This is clearly at odds with almost all popular moralities, including Christianity, which emphasize the importance to public order and to the common good of individual sentiments of benevolence and public-mindedness. But it is no great leap from Locke’s economic individualism to the idea that what matters to morality are not intentions, but outcomes. What does it matter why people behave the way they do, as long as society is better off for it? If we can harness the pursuit of private interest to positive social outcomes, it is hard to see what anyone could have to complain about. The theory that best served this new morality was utilitarianism, summarized by philosopher and social reformer Jeremy Bentham as the principle of the greatest good for the greatest number.

What ultimately validated the utilitarian pursuit of happiness – that is, hedonism – as a morally acceptable end in itself was the first great consumer revolution, which began in the second half of the eighteenth century, in which both leisure and consumption ceased to be purely aristocratic indulgences. Not only did consumerism become accessible to the middle classes, it became an acceptable pursuit; buying stuff, and even buying into the spiritual promise of goods, came to be seen as a virtue. Precisely what spurred the dramatic increase in consumption in the 1770s and 1780s is a matter of considerable dispute in the sociological literature, but everyone agrees that the crucial development was the emergence of fashion consciousness amongst the masses. Sure, there had always been fashion trends of some sort, insofar as what was considered appropriate style, material, and color changed over time. But this was almost always the result of social emulation amongst the aristocracy, and shifts happened slowly, sometimes over the course of decades. But in the 1770s, the “fashion craze” made its appearance, characterized by the distinctly modern feature of rapid shifts in accepted popular taste. In 1776, for example, the “in” color in the streets of London was something called couleur de noisette; a year later, everyone who was anyone was wearing dove gray.

In order for there to be a consumer revolution, there had to be a corresponding revolution in production. That is because consumption and production are just two ways of looking at the same unit of economic activity; one person’s consumption is another person’s production. That is why any significant change on the demand side of the economy must be accompanied by an equal shift on the production side. And indeed it was, via the great convulsion we call the Industrial Revolution. Historians argue over the precise dating of the Industrial Revolution, but most agree that it began with a number of mechanical innovations in the British textile industry in the 1760s. These innovations quickly spread to France, Holland, and the rest of Western Europe, and across the Atlantic to Canada and the United States, along with similar and almost simultaneous gains in other industries, especially iron smelting, mining, and transportation.

The Industrial Revolution affected almost every aspect of the economy, but there were two main aspects to the growth in innovation: first, the substitution of work done by machines for skilled human labor, and second, the replacement of work done by unskilled humans and animals with inanimate sources of work, especially steam power. Put these together – machines replacing men, machines replacing animals – and you get a tireless supply of energy driving increasingly sophisticated machines. This marked the death of the cottage industry and the birth of the factory, where power, machines, and relatively unskilled workers were brought together under common managerial supervision.

As David Landes points out in his book The Wealth and Poverty of Nations, what made eighteenth-century industrial progress so revolutionary was that it was contagious. “Innovation was catching,” writes Landes, “because the principles that underlay a given technique could take many forms, find many uses. If one could bore cannon, one could bore the cylinders of steam engines. If one could print fabrics by means of cylinders . . . one could also print wallpaper that way.”

The same idea held for printing, textile manufacturing, tool machining, and countless other industrial processes, where a discovery made in one area reinforced those made in others, driving innovation along an ever-widening front. As ever-higher incomes chased ever-cheaper consumer goods, the economy became locked into a virtuous cycle of growth and development, with each new innovation serving as a stepping stone to another. After centuries of relative stagnation, for the first time in the history of the world people had to get used to the fact that the future would not resemble the past, that there would be progress. That is, people had become modern.

When it comes to the Industrial Revolution, the two longstanding questions are: Why Britain? Why then? These are fantastically difficult questions and are virtually impossible to distinguish from the larger question of why some countries are rich and some are poor. Is it a matter of race or climate? Culture or environment? Religion or politics? You can take your pick of answers, depending on your taste and intellectual temperament. Regardless, it is almost certainly no accident that the Industrial Revolution happened where and when it did.

If you were given the opportunity to design a society optimized for a “growth and innovation” economy, there are a number of characteristics you would be sure to include. You would want it to be a society of intellectual freethinkers, in which people had the right to challenge conventional thinking without having to worry about persecution or censorship. This freedom of thought would include a commitment to the virtues of empiricism and the willingness to accept the reports of inquiry and observation at face value and to pursue these inquiries wherever they may lead. Your ideal society would have a legal mechanism for translating the fruits of research and invention into commercial enterprise by respecting property rights and the freedom of contract. And there would be few social or legal taboos against consumption, allowing for the flourishing of a market for new goods, in the form of a mass consumer society. Freedom, science, individualism, and consumerism: that’s a difficult agenda for any society, one that few countries in the world today have managed to get their heads around. But as a first draft of modernity, it is not a bad description of late-Enlightenment Britain.

When we think about the deployment of power, technology, and human resources, what usually comes to mind are architectural marvels such as pyramids or cathedrals, or grand public enterprises like the Manhattan Project or the space program. That is, we think of public works along the lines of Kublai Khan’s Xanadu, stately pleasure domes brought into existence by the decrees of grand rulers or the ambitions of great states. But the most remarkable aspect of the Industrial Revolution is that it was powered almost entirely by the private, household consumption desires of the middle classes. In their pursuit of personal happiness and self-fulfillment through economic development and consumption, the British nation of shoppers and shopkeepers unleashed a force unlike anything the world had ever seen.

No one understood this better than Karl Marx. The Communist Manifesto begins with what seems to be an extended and genuine appreciation of the bourgeoisie, a group that has managed to conquer the modern state and, in contrast with the “slothful indolence” of the Middle Ages, has unleashed the active forces that lie within. It has done this by hitching the latent power of technology to the forces of private competition, and in so doing has remade the world.

The bourgeoisie, during its rule of scarce one hundred years, has created more massive and more colossal productive forces than have all preceding generations together. Subjection of nature’s forces to man, machinery, application of chemistry to industry and agriculture, steam navigation, railways, electric telegraphs, clearing of whole continents for cultivation, canalization of rivers, whole populations conjured out of the ground – what earlier century had even a presentiment that such productive forces slumbered in the lap of social labor?

Even Marx concedes that this was not a wholly bad occurrence. Once it had the upper hand, the bourgeoisie “put an end to all feudal, patriarchal, idyllic relations.” It shrugged off all the old, arbitrary ties that bound men to one another in virtue of who their ancestors were or what color their skin was. The bourgeoisie cleared a social and political space for the emergence of new visions of the good life, new possibilities for what people can do and who they might become.

Yet capitalism proved to be a universal solvent, eating away at the social bonds between people in a given society as well as the cultural barriers that formerly served to separate one society from another. In place of the family or feudal ties, of religiosity, of codes of conduct like chivalry and honor, there is now nothing left but the pitiless demands of the cash nexus, the rest having been drowned “in the icy water of egotistical calculation.” Meanwhile, all that is local and particular succumbs before the relentless cosmopolitanism and consumerism of the world market. National industries falter before the relentless scouring of the globe by industry for resources and manpower, while every nation’s cultural heritage – its poetry, literature, and science – are turned into homogenized commodities. In sum: read Marx on capitalism and you realize how little was added to his original critique by the critics of globalization of the late twentieth century.

Capitalism is able to do this by exploiting the almost limitless capacity that humans have for enduring perpetual upheaval and change, in both their personal and public affairs. A capitalist society puts tremendous pressure on people to constantly innovate and upgrade, to keep on their toes. They must be willing to move anywhere and do anything, and “anyone who does not actively change on his own will become a passive victim of changes draconically imposed by those who dominate the market.” For the bourgeoisie, life comes to resemble the world of the Red Queen in Through the Looking-Glass, where it takes all of the running one can do just to stay in the same place.

Modernity, then, is the offspring of three interlocking and mutually supporting developments: the disenchantment of the world and the rise of science; the emergence of political individualism and a taste for liberal freedoms; and the technology-driven gale of creative economic destruction known as capitalism. These gave us a new kind of society and, inevitably, a new kind of person, one who has learned to thrive in a milieu in which freedom is equated with progress, and where progress is nothing more than constant competition, mobility, renewal, and change. In what is probably the most succinct and evocative one-paragraph description of modernity you will find anywhere, a passage that perfectly captures the almost delirious light-headedness of the modern worldview, Marx writes,

Constant revolutionizing of production, uninterrupted disturbance of all social conditions, everlasting uncertainty and agitation distinguish the bourgeois epoch from all earlier ones. All fixed, fast-frozen relations, with their train of ancient and venerable prejudices and opinions, are swept away, all new-formed ones become antiquated before they can ossify. All that is solid melts into air, all that is holy is profaned, and man is at last compelled to face with sober senses his real condition of life and his relations with his kind.

But no sooner had people learned to enjoy the fruits of modernity (all that stuff, all that freedom) than they started complaining of the bitter pit in the center; even as some were tallying up the gains, others were keeping track of the inevitable losses.

And losses there certainly were. First, each person lost his or her given place in society, through the breakdown of the old order, the destruction of pre-existing hierarchies, and the demise of ancient ways of living. Second, humanity as a whole lost its place in the grand scheme of things. The world was no longer an ordered cosmos but instead a chaotic universe, a cold and indifferent realm of mere matter in motion.

Not everyone finds this loss of meaning acceptable, just as not everyone is happy with the related demise of nature as a source of intrinsic value. As for individualism, freedom is all well and good, but is it worth the price? Is it worth it if the result is a narrowing of vision, a shallowness of concern, a narcissistic emphasis on the self? The consumer culture that arose thanks to the privatization of the good life may be good for the economy, but is it good for the soul? For a great many people, the answer to questions of this sort was – and is – a clear “no.” As they see it, what is lacking in modernity is an appreciation for the heroic dimension of life, higher purpose having been sacrificed to the pursuit of what Alexis de Tocqueville called “small and vulgar pleasures.” As for the pleasant-sounding principle of “economic individualism,” it quickly showed its stripes as the hard realities of laissez-faire capitalism hit home. Adam Smith’s vaunted division of labor may have made for great efficiency gains at the pin factory, but it didn’t make working there any more enjoyable. If anything, work became rigid, mechanical, and boring, and somewhere between the cottage and the factory, workers stopped identifying themselves with the products of their labor. They became what today we call wage slaves, trading their effort for money, but not necessarily for meaning.

And so the victories of modernity look from another perspective like defeats, and purported gains appear as losses. The main problem is not that modernity has eliminated unwanted hierarchies and stripped away unwarranted privileges. It is that it has dissolved every social relation, drained the magic from every halo. We have replaced the injustices of fixed social relations with the consumer-driven obsession with status and the esteem of others, and where we once saw intrinsic meaning and value we now find only the nihilism of market exchange. Critics have found it useful to gather all of these problems and objections to modernity under the term alienation.

If you have been paying any attention at all to the world around us, you will have noticed that there seems to be a lot of alienation going around. Husbands are alienated from their wives, students are alienated from their teachers, voters are alienated from their politicians, and patients are alienated from their doctors. Everyone thinks the mass media are alienating, especially thanks to all the advertising. Religious people find the permissiveness of our secular society alienating, and some believe that alienation is what motivates terrorists. People who live downtown find the suburbs alienating, while suburbanites feel the same way about life in the big anonymous city. The world of work seems to have tapped a rich vein of alienation, if the success of comics such as Dilbert and television shows such as The Office is any indication of popular sentiment. In modern society, we are all alienated from nature and from one another, although perhaps that is only because ultimately we are all alienated from ourselves.

If modernity has a lot of explaining to do, the concept of alienation is doing most of the explaining. It is a substantial burden to lay on a single word, particularly one that is routinely left undefined. For many people, alienation is like victimhood: if you feel alienated, then you are, which is why the term commonly serves as a synonym for bored, powerless, ignorant, unhappy, disgusted, aimless, and just about every other negative descriptor for one aspect of our lives or another.

To say that something is “alien” just means that it is in some sense different, foreign, or “other,” hence the use of terms such as alien life form or illegal alien. To alienate something is to make it foreign or separate, and someone is alienated when they feel disconnected or detached from their friends, their families, or their jobs. Trade or commerce is a form of alienation: property is alienated when ownership is transferred from one person to another. Abstract concepts can also be alienated; social contract theorists talk of citizens alienating (that is, transferring) certain of their natural rights to the state. In its strictest sense, then, alienation is just a separation or lack of unity between pairs of people, groups, and things. But the dictionary can only take us so far, since what matters is not what the word means, but the various uses to which it is put.

When social critics talk about alienation, they tend to veer between treating it as a psychological phenomenon and treating it as a social one. The two are often related, but they are logically distinct. Psychological alienation refers to your attitude, emotions, or feelings toward whatever situation you find yourself in, whether it is your work, your marriage, or your living environment, and it typically manifests itself as dissatisfaction, resentment, unhappiness, or depression.

Social alienation, on the other hand, is not concerned with whether you are unhappy or resentful. Instead, it looks at the social, political, or economic structures and institutions in which people find themselves embedded. The discordant effects of social alienation are caused by a “lack of fit” between people’s actual behavior and the norms or expectations of their environment, and they manifest themselves in various ways. For example, a high crime rate in an inner-city neighborhood might be attributed to the fact that the youth want to skateboard in the church parking lot but the cops keep chasing them away. Thus, these youth might find themselves alienated from their urban environment. Similarly, a high absentee rate amongst employees in a large corporation might be the result of an alienating corporate culture that squeezes unique individuals into cookie-cutter cubicles that are hostile to free, creative thought.

Regardless of which type of alienation we are talking about, it is vital to keep in mind the following: Just because you are alienated, it does not mean that there is a problem and that something ought to be done about it. Both of the types of alienation, the psychological and the social, are just descriptions of certain states of affairs. The first describes someone’s state of mind, the second is an account of certain social relationships among persons, groups, and institutions.

Consider again the world of work. The contemporary office, with its long rows of cubicles (what novelist Douglas Coupland dubbed “veal-fattening pens”) manned by cut-and-paste worker drones simmering with barely restrained rage, is a thoroughly worked-over metaphor for contemporary alienation. The anonymous nature of bureaucratic organizations, and the routine and mechanical nature of the work, seem completely at odds with any reasonable understanding of what might make someone happy. Anyone who doesn’t feel alienated in such an environment, we think, must be drugged, insane, or lobotomized. But so what? Nobody says you are supposed to like your job, and nothing says it is supposed to be fulfilling. To put it bluntly: there’s a reason why they call it work, and there is a reason why they pay you.

This line of reasoning should be ringing a few bells. It is nothing more than a restatement of Hume’s guillotine, which forbids us from drawing moral conclusions from strictly empirical or descriptive premises. Invoking the is-ought problem might seem strange in a discussion about alienation: nobody calls an institution alienating unless they are expressing moral or political disapproval, and to describe someone as alienated is to make it very clear that you think something needs to be done about it. In fact, for many of its proponents, the great virtue of thinking about society in terms of alienation is that it provides a way of doing an end-run around Hume’s guillotine and bridging the is-ought gap.

To see how this is supposed to work, consider the way we think about disease. When a doctor diagnoses you with an illness, you might say that she is simply describing a state of affairs. If you have cancer, it means that you have uncontrolled cell division occurring in some tissues of your body. To say you have malaria is to say that a certain protozoan parasite is multiplying within your red blood cells. But of course we want to say more than that. Illness isn’t just a description of the state of the body like any other; the difference between being healthy and being sick is not the same as the difference between having brown or blond hair, or between standing up and lying down. Yes, sometimes we would like to stand and sometimes we prefer to lie down, but that preference depends on other desires and purposes we have at the time. There is nothing intrinsically wrong with standing up or with lying down. In contrast, the very idea of what it means to be sick (an idea lurking in the etymology of the word dis-ease) is that something is not right, that the body has been disturbed from a normal or natural state to which it ought to return or be restored. In its everyday usage, disease is more than a description of how things are – it implies a way things ought to be.

Alienation theory tries to bridge the is-ought gap by treating alienation like a disease: it not only describes a state of affairs, it also considers that state of affairs as abnormal or unnatural. It carries with it an implicit normative judgment, a preference for a natural, nonalienated state that ought to be restored. To do this, alienation theory needs something to play the role of analogue to health. Just as medicine has an account of what constitutes normal or natural health, alienation theory needs an account of what constitutes normal human life. It needs a theory of human nature or of self-fulfillment that is not just relative to a given place or culture, or relative to an individual’s desires at a certain time. It needs an account of human flourishing that is in some sense natural or essential. If the alienation we moderns feel is in fact a malaise, a sort of illness, then what is needed is an account of what it would mean to end the discord, to restore the absent sense of coherence or unity. That is, for a theory of alienation to do any work, it needs a corresponding theory of authenticity.

This, in a nutshell, was the burden of Romanticism. The Romantic response to modernity was an attempt to transcend or mitigate the alienating effects of the modern world and recoup what is good and valuable in human life. The key figure here is the Geneva-born philosopher Jean-Jacques Rousseau.

Editorial Reviews

“It’s a fascinating approach to a fascinating subject . . . Written in a lively style that invites the reader to argue with the author, the book, at the very least, will turn the reader’s eye inward, and make us take a good, long look at the way we present ourselves to the world.”
—Booklist

“Unique insights on every page and breathtaking in scope, The Authenticity Hoax is a useful guide to understanding what we humans are all about.”
—John Zogby, Chairman of Zogby International and author of The Way We'll Be

“A totally real, genuine, authentic book about why you shouldn't believe any of those words. And it's genuinely good.”
—Gregg Easterbrook, author, Sonic Boom

“In The Authenticity Hoax, Andrew Potter masters two of the trickiest balancing acts in contemporary social criticism. He takes on a wide range of highbrow sources — from John Locke and Jean-Jacques Rousseau to Walter Benjamin and Lionel Trilling — and he makes them accessible without reducing them to cartoons. And he comments on an even wider range of pop culture items — from The Matrix to skateboarding to locally grown produce and the YouTube aesthetic — in a tone that’s pitched just right, each mordant insight framed in terms that show he understands the appeal of the quest for authenticity, even as he unmasks it. That’s the kind of criticism that changes minds.”
—Thomas de Zengotita, author of Mediated

“A provocative meditation on the way we live now.”
—Kirkus Reviews

“A shrewd and lively discussion peppered with pop culture references and a stimulating reappraisal of the romantic strain in modern life.”
—Publishers Weekly
