Sunday, October 13, 2024

Confessions of a Republican Exile
The Atlantic - Politics / by David Brooks / Oct 12, 2024 at 10:21 PM
Politically, I’m a bit of a wanderer. I grew up in a progressive family and was a proud democratic socialist through college. Then, in the Reagan-Thatcher era of the 1980s, after watching the wretched effects some progressive social policies had on poor neighborhoods in Chicago, I switched over to the right—and then remained a happy member of Team Red for decades. During the era of social thinkers like James Q. Wilson, Allan Bloom, Thomas Sowell, Jeane Kirkpatrick, and Irving Kristol, the right was just more intellectually alive. But over time I’ve become gradually more repulsed by the GOP—first by Newt Gingrich and Tom DeLay, then by the Tea Party and the Freedom Caucus, and now, of course, by Donald Trump.

So these days I find myself rooting for the Democrats about 70 percent of the time. I’ve taken up residence on what I like to call the rightward edge of the leftward tendency, and I think of myself as a moderate or conservative Democrat. But moving from Red World to Blue World is like moving to a different country. The norms, fashions, and values are all different. Whenever you move to a new place or community or faith, you love some things about it but find others off-putting. So the other 30 percent of the time a cranky inner voice says, “Screw the Democrats, I’m voting for the GOP.”

For context, let me explain a little more about my political peregrinations. I think of myself as a Whig, part of a tradition that begins with Alexander Hamilton’s Federalist Party in the 18th century, continues through the Whig Party of Henry Clay and then the early Republican Party of Abraham Lincoln in the 19th, and then extends to the Republican Party of Theodore Roosevelt in the 20th. Whigs put social mobility at the center of our politics. If liberals prioritize equality and libertarians prioritize individual freedom, Whigs ask: Which party is doing the most to expand opportunity, to help young people rise and succeed in our society? Which party is doing the most to cultivate energy, ambition, creativity, and daring in the citizenry?

Today, Whigs don’t have a permanent home. During the Reagan-Thatcher years, Republicans were the party of dynamism, but now they have become backward looking and reactionary. At the Democratic National Convention, I watched Michelle Obama talk about the generations of mothers who sacrificed so their children could rise and realize their full potential. Those are the people that Whigs like me want the American government to support. So here I find myself, almost all the way to joining Team Blue.

[Read: The Democrats aren’t on the high road anymore]

But my new suit is ill-fitting. I’m still not fully comfortable as a Democrat. And given that there are many other former Republicans who have become politically homeless in the Age of MAGA, I thought it might be useful to explain, first, what it is about the left that can make a wannabe convert like me want to flee in disgust—and then to explain why, ultimately, I’ve migrated in that direction despite sometimes having to suppress my gag reflex.

The first trait that gives me pause is the rise of a progressive aristocracy: the highly educated elites who now run most of America’s prestigious institutions. These progressive aristocrats could accept the reality of their privilege and act like a ruling class that has responsibilities to all of society. But the more they dominate the commanding heights of society, the more aggressively they posture as marginalized victims of oppression. Much of what has come to be called “wokeness” consists of highly educated white people who went to fantastically expensive colleges trying to show the world, and themselves, that they are victims, or at least allied with the victims. Watching Ivy League students complain about how poorly society treats them is not good for my digestion.

Elites then use progressivism as a mechanism to exclude the less privileged. To be a good progressive, you have to speak the language: intersectionality, problematic, Latinx, cisgender. But the way you learn that language is by attending some expensive school. A survey of the Harvard class of 2023 found that 65 percent of students call themselves “progressive” or “very progressive.” Kids smart enough to get into Harvard are smart enough to know that to thrive at the super-elite universities, it helps to garb yourself in designer social-justice ideology. Last spring, when the Washington Monthly surveyed American colleges to see which had encampments of Gaza protesters, it found them “almost exclusively at schools where poorer students are scarce and the listed tuitions and fees are exorbitantly high.” Schools serving primarily the middle and working classes, in contrast, had almost no encampments.

This privilege-progressivism loop is self-reinforcing. A central irony of the progressive aristocracy is that the most culturally progressive institutions in society are elite universities—but the institutions that do the most to reinforce social and economic inequality are … those same elite universities. Sure, they may assign Foucault and Fanon in their humanities classes, but their main function is to educate kids who grew up in the richest, most privileged households in America and launch them into rich and privileged adult lives.  

After college, members of the progressive aristocracy tend to cluster in insular places like Brooklyn or Berkeley where almost everybody thinks like them. If you go to the right private school, the right elite college, and live in the right urban neighborhood, you might never encounter anyone who challenges your worldview. To assure that this insularity is complete, progressives have done a very good job of purging Republicans from the sectors they dominate, like the media and the academy.

[Read: The campus-left occupation that broke higher education]

The progressive aristocracy’s assumption that all sophisticated people think like them, its tendency to opine about the right without ever having seriously engaged with a single member of that group, the general attitude of moral and intellectual superiority—in my weaker moments, all of it makes me want to go home and watch a bunch of Ben Shapiro videos.

A second trait that’s making it hard for me to fully embrace the Democratic Party is its tendency toward categorical thinking. People in Blue World are much more conscious of categories than people in Red World are. Among the Democrats, the existence of groups like White Dudes for Harris, or Asians for Harris, is considered natural and normal.

This kind of identity-politics thinking rests on a few assumptions: that a person’s gender, racial, or ethnic identity is the most important thing about them; that we should emphasize not what unites all people but what divides them; that history consists principally of the struggle between oppressor and oppressed; that a member of one group can never really understand the lived experience of someone in another group; and that the supposedly neutral institutions and practices of society—things like free speech, academic standards, and the justice system—are really just tools the dominant groups use to maintain their hegemony.

[Read: Kamala Harris and the Black elite]

These assumptions may or may not be correct (some of them are, at least to a degree), but they produce a boring way of thinking. When I’m around people with the identitarian mindset, I usually know what they are going to say next. Blue World panel discussions put less emphasis on having a true diversity of views represented than on having the correct range of the approved identity categories.

But the real problem is that categorical thinking makes it harder to see people as individuals. Better to see a person first as a unique individual, with their own distinctive way of observing and being in the world, and then to see them also as a member of historic groups, and then to understand the way they fit into existing status and social structures. To see a person well, you’ve got to see them in all three ways.

At its worst, identitarian thinking encourages the kind of destructive us-versus-them thinking—the demonization and division—human beings are so prone to. Identitarianism undermines pluralism, the key value that diverse societies need if they are to thrive. Pluralism is based on a very different set of assumptions: Human beings can’t be reduced to their categories; people’s identities are complex and shifting; what we have in common matters more than what we don’t; politics is less often a battle between good and evil than it is a competition among partial truths; societies cannot always be neatly divided into oppressor and oppressed; and politics need not always be a Manichaean death struggle between groups but sometimes can consist of seeking the best balance among competing goods.

I find it more pleasant to live in a culture built on pluralistic assumptions than on identitarian ones—which is why I sometimes have to grit my teeth when I visit an elite-university campus or the offices of one of the giant foundations.

The final quality keeping me from fully casting my lot with Blue World is, to borrow from the title of the classic book by the late historian and social critic Christopher Lasch, its Culture of Narcissism. In Red World, people tend to take a biblical view of the human person: We are gloriously endowed and made in the image of God—and we are deeply broken, sinful, and egotistical.

According to this way of thinking, people are most likely to thrive and act wisely when they are formed by a moral and social order. In the absence of one, they are likely to act selfishly and shortsightedly. This is why conservatives spend a lot of time worrying about the cohesion of families, the health of the social order, and the coherence of the moral community; we need these primeval commitments and moral guardrails to help us lead good lives.

In 2021, the conservative Christian writer Alan Noble published a book called You Are Not Your Own—a title that nicely sums up these traditional conservative beliefs. You belong to God; to your family; and to the town, nation, and civilization you call home. Your ultimate authority in life is outside the self—in God, or in the wisdom contained within our shared social and moral order.

In Blue World, by contrast, people are more likely to believe that far from being broken sinners, each of us has something beautiful and pure at our core. As the philosopher Charles Taylor put it in The Ethics of Authenticity, “Our moral salvation comes from recovering authentic moral contact with ourselves.” In this culture you want to self-actualize, listen to your own truth, be true to who you are. The ultimate authority is inside you.

But unless your name is Aristotle, it’s hard to come up with an entire moral cosmology on your own. Too often, people in a “culture of authenticity” fall into emotivism—doing whatever feels right. If you live in the world of autonomy and authenticity, you have the freedom to do what you want, but you might struggle to enjoy a sense of metaphysical belonging, a sense that your life fits into a broader scheme of meaning and eternal values.

If you lack metaphysical belonging, you have to rely on social belonging for all your belonging needs, which requires you to see your glorious self reflected in the attentions and affirmations of others. This leads to the fragile narcissism that Lasch saw coming back in 1979: “The narcissist depends on others to validate his self-esteem. He cannot live without an admiring audience. His apparent freedom from family ties and institutional constraints does not free him to stand alone or to glory in his individuality. On the contrary, it contributes to his insecurity.”

This might be why mental-health problems are so much worse in Blue World than in Red World. In one recent study, 34 percent of conservative students reported feeling in poor mental health at least half the time. That’s pretty bad. But among very liberal students, 57 percent reported poor mental health. That’s terrible.

Spending time in Blue World makes me realize how socially conservative I am. I don’t mean socially conservative in the way that term gets used to describe certain stances on hot-button cultural matters like gay marriage or trans issues. (On those topics, I hold what would be considered progressive positions.) Rather, I am a social conservative in believing that the universe has a moral order to it, that absolute right and wrong exist, and that we are either degrading our souls or elevating our souls with every little thing we do. I also believe that the strength of our society is based on the strength of our shared moral and social foundation. And I believe that any nation’s moral culture comes before politics and economics, and when the moral culture frays everything else falls apart. This places me in a conservative tradition that goes back to Edmund Burke and David Hume.

At this point you might be wondering why I don’t just stay in Red World. After all, maybe once Donald Trump’s desecration of the Republican Party ends, the GOP can once again be reconstituted as the most congenial home for a wandering Whig like me. But in the meantime, despite everything that sometimes drives me away from Blue World, there’s more that’s drawing me toward it.

For starters, it has a greater commitment to the truth. This may sound weird, but I became a conservative because of its relationship to knowledge and truth. In the 1980s, I looked around at all those progressive social-engineering projects, like urban renewal, that failed because they were designed by technocratic planners who didn’t realize that the world is more complicated than their tidy schemes could encompass. Back then, the right seemed more epistemologically humble, more able to appreciate the wisdom of tradition and the many varied ways of knowing.

But today the Republican relationship to truth and knowledge has gone to hell. MAGA is a fever swamp of lies, conspiracy theories, and scorn for expertise. The Blue World, in contrast, is a place more amenable to disagreement, debate, and the energetic pursuit of truth. As Jonathan Rauch has written, “We let alt-truth talk, but we don’t let it write textbooks, receive tenure, bypass peer review, set the research agenda, dominate the front pages, give expert testimony or dictate the flow of public dollars.” The people who perform those roles and populate the epistemic regime are mostly Democrats these days, and they’re the ones more likely to nurture a better, fairer, more fact-based and less conspiracy-deranged society.

Second, I’ve come to appreciate the Democrats’ long-standing tradition of using a pragmatic imagination. I like being around people who know that it’s really hard to design policies that will help others but who have devoted their lives to doing it well. During the Great Depression, FDR recognized that bold experimentation was called for, which led to the New Deal. During the financial crisis of the late 2000s, I watched the Obama administration display pragmatic imagination to stave off a second depression and lift the economy again. Over the past four years, I’ve watched the Biden administration use pragmatic imagination to funnel money to parts of America that have long been left behind.

Recently, I watched a current Democratic mayor and a former one talk about how to design programs to help homeless people. The current mayor had learned that moving just one homeless person into a shelter doesn’t always work well. It’s better to move an entire encampment into a well-run shelter, so people can preserve the social-support systems they’d built there. Listening to the mayors’ conversation was like listening to craftspeople talk about their trades. The discussion was substantive, hopeful, and practical. You don’t hear much of this kind of creative problem-solving from Republicans—because they don’t believe in government action.

Another set of qualities now drawing me toward the Democrats: patriotism and regular Americanness. This one has surprised me. Until recently, these qualities have been more associated with flag-waving conservatives than cosmopolitan members of the progressive aristocracy. And I confess that I went to the Democratic convention in August with a lot of skepticism: If Democrats need to win the industrial Midwest, why are they nominating a progressive from San Francisco with a history of left-wing cultural and policy positions? But the surging displays of patriotism; the string of cops, veterans, and blue-collar workers up onstage; the speeches by disaffected former Republicans; Kamala Harris’s own soaring rhetoric about America’s role in the world—all of this stood in happy contrast to the isolationist American-carnage rhetoric that has characterized the GOP in the Trump era. I’ve always felt more comfortable with the “Happy Warrior” Democratic Party of Al Smith, Hubert Humphrey, and Barbara Jordan than the Democratic Party of the Squad, and at the convention that old lineage seemed to be shining through.

But ultimately what’s pulling me away from the Republican Party and toward the Democrats is one final quality of Blue World: its greater ability to self-correct. Democrats, I’ve concluded, are better at scrutinizing, and conquering, their own shortcomings than Republicans are.

Red World suffers today from an unfortunate combination of a spiritual-superiority complex and an intellectual-inferiority complex. It’s not intellectually self-confident enough to argue with itself; absent this self-scrutiny, it’s susceptible to demagogues who tell it what to think. Blue World is now home to a greater tradition of and respect for debate. Despite what I said earlier about the rigid orthodoxy of the progressive aristocracy, the party is bigger than that, and for every Blue World person who practices identity politics, there is another who criticizes it. For every Blue World person who succumbs to the culture of narcissism, another argues that it’s shallow and destructive. For every Blue World person who thinks we should have universal basic income, another adduces evidence suggesting that UBI saps people’s incentives to work and steers them toward playing video games on the couch.

In Blue World, I find plenty of people who are fighting against all the things I don’t like about Blue World. In Red World, however, far fewer people are fighting against what’s gone wrong with the party. (There’s a doughty band of Never Trump Republicans, but they get no hearing inside today’s GOP.) A culture or organization is only as strong as its capacity to correct its mistakes.

All of this leaves me on the periphery of Team Blue, just on the edge of the inside, which is where I believe the healthiest and most productive part of American politics now lives.

I’m mostly happy here. My advice to other conservatives disaffected by MAGA is this: If you’re under 45, stay in the Republican Party and work to make it a healthy, multiracial working-class party. If you’re over 45, acknowledge that the GOP is not going to be saved in your lifetime and join me on the other side. I don’t deny that it takes some adjustment; I find it weird being in a political culture in which Sunday brunch holds higher status than church. But Blue World is where the better angels of our nature seem lately to have migrated, and where the best hope for the future of the country now lies.



Friday, October 11, 2024

I’m Running Out of Ways to Explain How Bad This Is

What’s happening in America today is something darker than a misinformation crisis.


By Charlie Warzel
October 10, 2024, 7:45 PM ET

The truth is, it’s getting harder to describe the extent to which a meaningful percentage of Americans have dissociated from reality. As Hurricane Milton churned across the Gulf of Mexico last night, I saw an onslaught of outright conspiracy theorizing and utter nonsense racking up millions of views across the internet. The posts would be laughable if they weren’t taken by many people as gospel. Among them: Infowars’ Alex Jones, who claimed that Hurricanes Milton and Helene were “weather weapons” unleashed on the East Coast by the U.S. government, and “truth seeker” accounts on X that posted photos of condensation trails in the sky to baselessly allege that the government was “spraying Florida ahead of Hurricane Milton” in order to ensure maximum rainfall, “just like they did over Asheville!”

As Milton made landfall, causing a series of tornadoes, a verified account on X reposted a TikTok video of a massive funnel cloud with the caption “WHAT IS HAPPENING TO FLORIDA?!” The clip, which was eventually removed but had been viewed 662,000 times as of yesterday evening, turned out to be from a video of a CGI tornado that was originally published months ago. Scrolling through these platforms, watching them fill with false information, harebrained theories, and doctored images—all while panicked residents boarded up their houses, struggled to evacuate, and prayed that their worldly possessions wouldn’t be obliterated overnight—offered a portrait of American discourse almost too bleak to reckon with head-on.

Even in a decade marred by online grifters, shameless politicians, and an alternative right-wing-media complex pushing anti-science fringe theories, the events of the past few weeks stand out for their depravity and nihilism. As two catastrophic storms upended American cities, a patchwork network of influencers and fake-news peddlers have done their best to sow distrust, stoke resentment, and interfere with relief efforts. But this is more than just a misinformation crisis. To watch as real information is overwhelmed by crank theories and public servants battle death threats is to confront two alarming facts: first, that a durable ecosystem exists to ensconce citizens in an alternate reality, and second, that the people consuming and amplifying those lies are not helpless dupes but willing participants.

Some of the lies and obfuscation are politically motivated, such as the claim that FEMA is offering only $750 in total to hurricane victims who have lost their homes. (In reality, FEMA offers $750 as immediate “Serious Needs Assistance” to help people get basic supplies such as food and water.) Donald Trump, J. D. Vance, and Fox News have all repeated that lie. Trump also posted (and later deleted) on Truth Social that FEMA money was given to undocumented migrants, which is untrue. Elon Musk, who owns X, claimed—without evidence—that FEMA was “actively blocking shipments and seizing goods and services locally and locking them away to state they are their own. It’s very real and scary how much they have taken control to stop people helping.” That post has been viewed more than 40 million times. Other influencers, such as the Trump sycophant Laura Loomer, have urged their followers to disrupt the disaster agency’s efforts to help hurricane victims. “Do not comply with FEMA,” she posted on X. “This is a matter of survival.”

The result of this fearmongering is what you might expect. Angry, embittered citizens have been harassing government officials in North Carolina, as well as FEMA employees. According to an analysis by the Institute for Strategic Dialogue, an extremism-research group, “Falsehoods around hurricane response have spawned credible threats and incitement to violence directed at the federal government,” including “calls to send militias to face down FEMA.” The study also found that 30 percent of the X posts analyzed by ISD “contained overt antisemitic hate, including abuse directed at public officials such as the Mayor of Asheville, North Carolina; the FEMA Director of Public Affairs; and the Secretary of the Department of Homeland Security.” The posts received a collective 17.1 million views as of October 7.

Online, first responders are pleading with residents, asking for their help to combat the flood of lies and conspiracy theories. FEMA Administrator Deanne Criswell said that the volume of misinformation could hamper relief efforts. “If it creates so much fear that my staff doesn’t want to go out in the field, then we’re not going to be in a position where we can help people,” she said in a news conference on Tuesday. In Pensacola, Florida, Assistant Fire Chief Bradley Boone vented his frustrations on Facebook ahead of Milton’s arrival: “I’m trying to rescue my community,” he said in a livestream. “I ain’t got time. I ain’t got time to chase down every Facebook rumor … We’ve been through enough.”

It is difficult to capture the nihilism of the current moment. The pandemic saw Americans, distrustful of authority, trying to discredit effective vaccines, spreading conspiracy theories, and attacking public-health officials. But what feels novel in the aftermath of this month’s hurricanes is how the people doing the lying aren’t even trying to hide the provenance of their bullshit. Similarly, those sharing the lies are happy to admit that they do not care whether what they’re pushing is real or not. Such was the case last week, when Republican politicians shared an AI-generated viral image of a little girl holding a puppy while supposedly fleeing Helene. Though the image was clearly fake and quickly debunked, some politicians remained defiant. “Y’all, I don’t know where this photo came from and honestly, it doesn’t matter,” Amy Kremer, who represents Georgia on the Republican National Committee, wrote after sharing the fake image. “I’m leaving it because it is emblematic of the trauma and pain people are living through right now.”

Kremer wasn’t alone. The journalist Parker Molloy compiled screenshots of people “acknowledging that this image is AI but still insisting that it’s real on some deeper level”—proof, Molloy noted, that we’re “living in the post-reality.” The technology writer Jason Koebler argued that we’ve entered the “‘Fuck It’ Era” of AI slop and political messaging, with AI-generated images being used to convey whatever partisan message suits the moment, regardless of truth.

This has all been building for more than a decade. On The Colbert Report, back in 2005, Stephen Colbert coined the word truthiness, which he defined as “the belief in what you feel to be true rather than what the facts will support.” This reality-fracturing is the result of an information ecosystem that is dominated by platforms that offer financial and attentional incentives to lie and enrage, and to turn every tragedy and large event into a shameless content-creation opportunity. This collides with a swath of people who would rather live in an alternate reality built on distrust and grievance than change their fundamental beliefs about the world. But the misinformation crisis is not always what we think it is.

So much of the conversation around misinformation suggests that its primary job is to persuade. But as Michael Caulfield, an information researcher at the University of Washington, has argued, “The primary use of ‘misinformation’ is not to change the beliefs of other people at all. Instead, the vast majority of misinformation is offered as a service for people to maintain their beliefs in face of overwhelming evidence to the contrary.” This distinction is important, in part because it assigns agency to those who consume and share obviously fake information. What is clear from comments such as Kremer’s is that she is not a dupe; although she may come off as deeply incurious and shameless, she is publicly admitting to being an active participant in the far right’s world-building project, where feel is always greater than real.

What we’re witnessing online during and in the aftermath of these hurricanes is a group of people desperate to protect the dark, fictitious world they’ve built. Rather than deal with the realities of a warming planet hurling once-in-a-generation storms at them every few weeks, they’d rather malign and threaten meteorologists, who, in their minds, are “nothing but a trained subversive liar programmed to spew stupid shit to support the global warming bullshit,” as one X user put it. It is a strategy designed to silence voices of reason, because those voices threaten to expose the cracks in their current worldview. But their efforts are doomed, futile. As one dispirited meteorologist wrote on X this week, “Murdering meteorologists won’t stop hurricanes.” She followed with: “I can’t believe I just had to type that.”

What is clear is that a new framework is needed to describe this fracturing. Misinformation is too technical, too freighted, and, after almost a decade of Trump, too political. Nor does it explain what is really happening, which is nothing less than a cultural assault on any person or institution that operates in reality. If you are a weatherperson, you’re a target. The same goes for journalists, election workers, scientists, doctors, and first responders. These jobs are different, but the thing they share is that they all must attend to and describe the world as it is. This makes them dangerous to people who cannot abide by the agonizing constraints of reality, as well as those who have financial and political interests in keeping up the charade.


In one sense, these attacks—and their increased desperation—make sense. The world feels dark; for many people, it’s tempting to meet that with a retreat into the delusion that they’ve got everything figured out, that the powers that be have conspired against them directly. But in turning away, they exacerbate a crisis that has characterized the Trump era, one that will reverberate to Election Day and beyond. Americans are divided not just by political beliefs but by whether they believe in a shared reality—or desire one at all.

About the Author
Charlie Warzel is a staff writer at The Atlantic and the author of its newsletter Galaxy Brain, about technology, media, and big ideas. He can be reached via email.

Sunday, October 6, 2024

How odd Christian beliefs about sex shape the world

Despite their shaky grounding in scripture


A painting of Adam and Eve, obscured and censored by pixelation. Illustration: Carl Godfrey

Sep 13th 2024

Lower than the Angels: A History of Sex and Christianity. By Diarmaid MacCulloch. Allen Lane; 688 pages; £35. To be published in America by Viking in April 2025; $40.


The worry was the Virgin Mary’s vagina. Early Christians were very clear on some things. They knew that the Holy Spirit had made the Virgin Mary pregnant but that she was still a virgin. What they were not quite sure about was how those two things could both be true. How, in short, had God got in?


Theologians set about solving this riddle with great debate—and a healthy disregard for biology. Almost no orifice was off limits. God had entered Mary through her eyes, suggested one text. Another scholar thought He had come in through her ear. A third suggested that He had impregnated Mary through her nose—which was inventive, if hard to imagine being incorporated into the annual school nativity play.


God is odd about sex. The Bible and Christian writings are odder yet. If all this weirdness affected only believers, it would be important enough. With more than 2bn adherents, Christianity is the world’s largest religion and—though it might not always feel like it in the smugly secularising West—is still growing in many regions.


But Christianity’s sexual hang-ups—on everything from celibacy to contraception, homosexuality and more—carry consequences for more than the faithful. In America abortion could sway the election. In Russia Vladimir Putin signed legislation against “non-traditional sexual relations”. In Britain a fight over ending restrictions on abortion is brewing. This is a good time to try to understand sex and Christianity.


Modern Christians often look to the Bible for clear answers to sexual questions. But clear answers are impossible to find, argues a compendious new book on sex and Christianity. Its author, Diarmaid MacCulloch, is an Oxford academic whose big, fat books on Christianity are almost always a big deal, winning him awards and starring roles in television series.


The problem is that the Bible, which comprises 60-odd books composed over a period of a millennium and more, is less a book than a library—and displays a correspondingly broad range of sexual attitudes. Its pages offer monogamous marriages, polygamous ones, rape, racy poetry, fulminations about homosexuality and tender descriptions of a man’s passion for his male lover. There is, Mr MacCulloch writes, “no such thing as a single Christian theology of sex”.


Not that such an inconvenient truth has ever stopped Christians from claiming that there is—or getting cross with those they see as deviating from it. From those who burned “sodomites” at the stake in the 12th century to those who flame “deviants” on social media today, Christians have a habit of getting angry about this stuff. Where once they argued about transubstantiation, now they are far more likely to argue about trans issues, notes Mr MacCulloch.


He has a point: the entire Anglican Communion, the third-largest club of Christian churches (after Roman Catholic and Eastern Orthodox), has for years been in danger of a schism. Its members are sparring about whether or not to allow gay marriages in churches. Add the horror over the scale of Catholic priests’ sexual abuse of children, as well as arguments over contraception, abortion and the ordination of women, and it is possible to see why Mr MacCulloch writes that sex and gender are currently causing more arguments within the church than “at virtually any time over the last two millennia of Christian life”.


Any religion is as much a product of almost random accretion as of actual doctrine. Christianity’s sexual obsessions are no different. Much of what people “know” about Christianity is, to put it mildly, hard to find in the Bible. There was, for example, no apple in Eden (it reputedly grew out of a translator’s pun: the words for “apple” and “evil” are almost identical in Latin). As a fiery place of torture, hell is similarly almost entirely absent from the pages of the New Testament. And the word “daily” in the Lord’s prayer—often the only Christian prayer that many know—is pure bunkum. (No one has a clue what the Greek word that appears before the word “bread” actually means.)


Christians may have banged on about sex, celibacy and homosexuality for centuries, but, in truth, Jesus had precious little to say about any of them. Though he was fiery in his condemnation of greedy people, he had absolutely nothing to say about gay ones; yet, as one modern theologian pithily pointed out, “No medieval states burned the greedy at the stake.” There is, similarly, little in the way of Christian “family values” to be spotted in the life of this man who was rude to his mother and who himself never married.


Christianity’s oddness about sex and families can be traced, in part, to Christ’s odd start in life. The Mary-Joseph-God ménage à trois was unusual enough for Mary—and was not much fun for Joseph either. While all that was going on between his betrothed and God, St Joseph had to sit on the sidelines—sometimes sanguine, occasionally annoyed, eventually sanctified. Rarely has a man deserved his sainthood more. There were, as Mr MacCulloch puts it, “three of them in that marriage, so it was a bit theologically crowded”.


To understand where the various Christian sexual hang-ups come from, Mr MacCulloch goes on a quick tour of the heroes and villains of two millennia of Christian theology, from St Paul (whose angry epistles inspired centuries of homophobia), via St Jerome (who championed celibacy), and on to St Augustine (who, having screwed around in the fleshpots of Carthage, then helped screw up the ensuing 16 centuries of Christians with his doctrine of original sin). Things finally brighten up a bit with the humanist scholar Erasmus, who in 1518 published a pamphlet championing the pleasures of marriage, dedicated to a patron with the improbable if unimprovable name of “Lord Mountjoy”.


Mr MacCulloch offers other similarly pleasing titbits. It is, for example, interesting to learn that the word “buggery” is a corruption of the word “Bulgarian”, because medieval Christians accused heretics who were thought to come from Bulgaria of it. But far too much of this book is heavy going. Mr MacCulloch’s great strength is that he knows a vast amount. His great weakness is that he has written it all down, over 497 pages, in a tiny font. Doubtless there are some who will thrill to discover that in 451AD, at the Council of Chalcedon, a non-Chalcedonian church “proudly adhered to the ‘Dyophysite’ theology of the displaced Patriarch of Constantinople Nestorios”. Many more will be left scratching their heads.


Does it matter that many will buy Mr MacCulloch’s book, but perhaps not finish it? It does, because Christian attitudes to sex are so important in world politics at the moment. But it feels like a mistake to take this oddness towards sex too much on its own terms. Why are American conservatives currently crushing women’s reproductive rights? Why is the Russian Orthodox church inveighing against homosexuality? The writings of St Augustine and St Paul offer one answer. Perhaps a simpler answer is provided by the old saying that everything in the world is about sex, except for sex, which is about power. The Christian church, which has been described as the most powerful persecuting force that the world has ever seen, knows this well. ■


This article appeared in the Culture section of the print edition under the headline “Christianity’s sex addiction”

Saturday, October 5, 2024

It’s Time to Stop Taking Sam Altman at His Word


Understand AI for what it is, not what it might become.

By David Karpf

Photograph of Sam Altman. SeongJoon Cho / Bloomberg / Getty

October 4, 2024, 12:57 PM ET

OpenAI announced this week that it has raised $6.6 billion in new funding and that the company is now valued at $157 billion overall. This is quite a feat for an organization that reportedly burns through $7 billion a year—far more cash than it brings in—but it makes sense when you realize that OpenAI’s primary product isn’t technology. It’s stories.

Case in point: Last week, CEO Sam Altman published an online manifesto titled “The Intelligence Age.” In it, he declares that the AI revolution is on the verge of unleashing boundless prosperity and radically improving human life. “We’ll soon be able to work with AI that helps us accomplish much more than we ever could without AI,” he writes. Altman expects that his technology will fix the climate, help humankind establish space colonies, and discover all of physics. He predicts that we may have an all-powerful superintelligence “in a few thousand days.” All we have to do is feed his technology enough energy, enough data, and enough chips.

Maybe someday Altman’s ideas about AI will prove out, but for now, his approach is textbook Silicon Valley mythmaking. In these narratives, humankind is forever on the cusp of a technological breakthrough that will transform society for the better. The hard technical problems have basically been solved—all that’s left now are the details, which will surely be worked out through market competition and old-fashioned entrepreneurship. Spend billions now; make trillions later! This was the story of the dot-com boom in the 1990s, and of nanotechnology in the 2000s. It was the story of cryptocurrency and robotics in the 2010s. The technologies never quite work out like the Altmans of the world promise, but the stories keep regulators and regular people sidelined while the entrepreneurs, engineers, and investors build empires. (The Atlantic recently entered a corporate partnership with OpenAI.)

Read: AI doomerism is a decoy

Despite the rhetoric, Altman’s products currently feel less like a glimpse of the future and more like the mundane, buggy present. ChatGPT and DALL-E were cutting-edge technology in 2022. People tried the chatbot and image generator for the first time and were astonished. Altman and his ilk spent the following year speaking in stage whispers about the awesome technological force that had just been unleashed upon the world. Prominent AI figures were among the thousands of people who signed an open letter in March 2023 to urge a six-month pause in the development of large language models (LLMs) so that humanity would have time to address the social consequences of the impending revolution. Those six months came and went. OpenAI and its competitors have released other models since then, and although tech wonks have dug into their purported advancements, for most people, the technology appears to have plateaued. GPT-4 now looks less like the precursor to an all-powerful superintelligence and more like … well, any other chatbot.

The technology itself seems much smaller once the novelty wears off. You can use a large language model to compose an email or a story—but not a particularly original one. The tools still hallucinate (meaning they confidently assert false information). They still fail in embarrassing and unexpected ways. Meanwhile, the web is filling up with useless “AI slop,” LLM-generated trash that costs practically nothing to produce and generates pennies of advertising revenue for the creator. We’re in a race to the bottom that everyone saw coming and no one is happy with. Meanwhile, the search for product-market fit at a scale that would justify all the inflated tech-company valuations keeps coming up short. Even OpenAI’s latest release, o1, was accompanied by a caveat from Altman that “it still seems more impressive on first use than it does after you spend more time with it.”

In Altman’s rendering, this moment in time is just a waypoint, “the doorstep of the next leap in prosperity.” He still argues that the deep-learning technique that powers ChatGPT will effectively be able to solve any problem, at any scale, so long as it has enough energy, enough computational power, and enough data. Many computer scientists are skeptical of this claim, maintaining that multiple significant scientific breakthroughs stand between us and artificial general intelligence. But Altman projects confidence that his company has it all well in hand, that science fiction will soon become reality. He may need $7 trillion or so to realize his ultimate vision—not to mention unproven fusion-energy technology—but that’s peanuts when compared with all the advances he is promising.

There’s just one tiny problem, though: Altman is no physicist. He is a serial entrepreneur, and quite clearly a talented one. He is one of Silicon Valley’s most revered talent scouts. If you look at Altman’s breakthrough successes, they all pretty much revolve around connecting early start-ups with piles of investor cash, not any particular technical innovation.

Read: OpenAI takes its mask off

It’s remarkable how similar Altman’s rhetoric sounds to that of his fellow billionaire techno-optimists. The project of techno-optimism, for decades now, has been to insist that if we just have faith in technological progress and free the inventors and investors from pesky regulations such as copyright law and deceptive marketing, then the marketplace will work its magic and everyone will be better off. Altman has made nice with lawmakers, insisting that artificial intelligence requires responsible regulation. But the company’s response to proposed regulation seems to be “no, not like that.” Lord, grant us regulatory clarity—but not just yet.

At a high enough level of abstraction, Altman’s entire job is to keep us all fixated on an imagined AI future so we don’t get too caught up in the underwhelming details of the present. Why focus on how AI is being used to harass and exploit children when you can imagine the ways it will make your life easier? It’s much more pleasant fantasizing about a benevolent future AI, one that fixes the problems wrought by climate change, than dwelling upon the phenomenal energy and water consumption of actually existing AI today.

Remember, these technologies already have a track record. The world can and should evaluate them, and the people building them, based on their results and their effects, not solely on their supposed potential.

About the Author

David Karpf is an associate professor in the School of Media and Public Affairs at the George Washington University.



Sunday, September 29, 2024

The Republican Freak Show
The Atlantic - Politics / by Peter Wehner / Sep 28, 2024 at 11:21 PM
The GOP is a moral freak show, and freak shows attract freaks. Which is why Mark Robinson fits in so well in today’s Republican Party.

Robinson, the Republican candidate for governor in North Carolina, has described himself as a “devout Christian.” But a recent CNN story reported that several years ago, he was a porn-site user who enjoyed watching transgender pornography (despite a history of anti-transgender rhetoric), referred to himself as a “Black Nazi,” and supported the return of slavery. According to CNN, commenters on the website discussed whether to believe the story of a woman who said she was raped by her taxi driver while intoxicated. Robinson wrote in response, “And the moral of this story….. Don’t f**k a white b*tch!” Politico reports that Robinson’s email address was also registered on Ashley Madison, a website for married people seeking affairs. (Robinson, the current lieutenant governor of North Carolina, has denied all of the claims.)

These allegations aren’t entirely shocking, because Robinson—a self-described “MAGA Republican”—has shown signs in the past of being a deeply troubled person. (My Atlantic colleague David Graham wrote a superb profile of Robinson in May.)

[David A. Graham: Mark Robinson is testing the bounds of GOP extremism]

Regarding the dedication of the Martin Luther King Jr. Memorial, in 2011, Robinson wrote, “Get that fucking commie bastard off the National Mall!” Robinson also has referred to the slain civil-rights champion as “worse than a maggot,” a “ho fucking, phony,” and a “huckster.” During the Obama presidency, Robinson wrote, “I’d take Hitler over any of the shit that’s in Washington right now!” He promoted the conspiracy theory claiming that Obama was born in Kenya. He referred to Michelle Obama as a man and Hillary Clinton as a “heifer.” He compared Nancy Pelosi to Hitler, Mao, Stalin, and Castro and mocked the near-fatal assault on her husband, Paul Pelosi. He is also an election denier, claiming that Joe Biden “stole the election.”

In 2017, Robinson wrote, “There is a REASON the liberal media fills the airwaves with programs about the NAZI and the ‘6 million Jews’ they murdered.” He has used demeaning language against Jews and gay people. He has cruelly mocked school-shooting survivors (“media prosti-tots”). And he supported a total ban on abortion, without exceptions for rape or incest, even though he admitted that he’d paid for an abortion in the past.

Much of this was known before he ran for governor. No matter. Republicans in North Carolina nominated him anyway, and Donald Trump has lavished praise on the man he calls his “friend,” offering Robinson his “full and total endorsement” and dubbing him “one of the hottest politicians” in the country.

SOME REPUBLICANS ARE distancing themselves from Robinson partly because they are worried he’ll be defeated, but also because they’re even more concerned that he will drag down other Republicans, including Trump. But the truth is that Robinson is a perfect addition to the Republican ensemble.

The GOP vice-presidential candidate, J. D. Vance, has been relentlessly promoting the lie that Haitians in Springfield, Ohio, were abducting and eating pets. In 2021, he said that the United States was being run by Democrats, corporate oligarchs, and “a bunch of childless cat ladies who are miserable at their own lives and the choices that they’ve made and so they want to make the rest of the country miserable, too.”

Representative Marjorie Taylor Greene has blamed wildfires on a Jewish space laser, promoted a conspiracy alleging that some Democratic Party leaders were running a human-trafficking and pedophilia ring, and agreed with commenters who suggested that the 2018 shooting at Marjory Stoneman Douglas High School, in Florida, was a “massive false flag.” Another House Republican, Paul Gosar, has promoted fluoride conspiracy theories and posted an animated video depicting him slashing the throat of a Democratic congresswoman and attacking President Biden. Yet another Republican member of Congress, Lauren Boebert, was ejected from a family-friendly musical for vaping, being disruptive, and groping her date (and vice versa). She also falsely claimed that school authorities “are putting litter boxes in schools for people who identify as cats.”

The Atlantic’s Elaine Godfrey reported that Republican Representative Matt Gaetz, who is under House investigation for having sex with an underage girl, “used to walk around the cloakroom showing people porno of him and his latest girlfriend,” according to a source Godfrey spoke with.  

[Read: Matt Gaetz is winning]

This is not normal.

The GOP is home to a Republican governor, Kristi Noem, who describes in her book shooting her 14-month-old dog, Cricket, in a gravel pit, as well as killing an unnamed goat. A Republican senator, Ron Johnson, claimed that COVID was “pre-planned” by a secret group of “elites” even while he promoted disinformation claiming that ivermectin, which is commonly used to deworm livestock, was an effective treatment for COVID. (Because people were hospitalized for taking the drug, the FDA tweeted, “You are not a horse. You are not a cow.”)

Earlier this month, Trump attended a 9/11 memorial event in New York City. He took as his guest a right-wing conspiracy theorist, Laura Loomer, who has claimed that 9/11 was an inside job, referred to Kamala Harris as a “drug using prostitute,” and said that Democrats should be tried for treason and executed. (Trump has called Loomer a “woman with courage” and a “free spirit.”)

Trump’s first national security adviser, Michael Flynn, floated the idea of having Trump declare martial law so that he could “rerun” the 2020 election. He suggested that the president should seize voting machines. He predicted that a governor would soon declare war. He has also warned about the dangers of a “new world order” in which people such as Bill Gates, George Soros, and World Economic Forum Executive Chairman Klaus Schwab “have an intent to track every single one of us, and they use it under the skin. They use a means by which it’s under the skin.”

Tucker Carlson, a keynote speaker at the Republican National Convention and an unofficial Trump adviser, recently hosted a Holocaust revisionist on his podcast. He praised the conspiracy theorist Alex Jones as having been “vindicated on everything” and described Jones as “the most extraordinary person” he has ever met. (Two years ago, Sandy Hook families won nearly $1.5 billion in defamation and emotional-distress lawsuits against Jones for his repeatedly calling the 2012 school shooting, in which 20 first graders and six educators were killed, a hoax staged by “crisis actors” to get more gun-control legislation passed. As The New York Times reports, “The families suffered online abuse, personal confrontations and death threats from people who believed the conspiracy theory.”)

Carlson, one of the most influential figures on the American right, has also peddled the claim that the violence on January 6, 2021, was a “false flag” operation involving the FBI and used to discredit Trump supporters; alleged that former Attorney General Bill Barr covered up the murder of Jeffrey Epstein; and promoted testicle tanning.

Then there’s Robert F. Kennedy Jr., a former Democrat who recently endorsed Trump. The former president has asked Kennedy to be on his transition team should Trump win the election and “help pick the people who will be running the government and I am looking forward to that.” Trump told CNN’s Kristen Holmes, “I like him, and I respect him. He’s a brilliant guy. He’s a very smart guy.”

Sara Dorn of Forbes listed some of the conspiracy theories that Kennedy has promoted—vaccines can cause autism; COVID was genetically engineered and is targeted to attack Caucasian and Black people (and Ashkenazi Jews and Chinese people are mostly immune); mass shootings are linked to Prozac; the 2004 presidential election was stolen from John Kerry; the CIA was involved in the death of his uncle John F. Kennedy; and Sirhan Sirhan was wrongly convicted of murdering his father.

In addition, Kennedy, who has revealed that he had a parasitic brain worm, told the podcaster Joe Rogan that Wi-Fi causes cancer and “leaky brain.” He believes that chemicals in the water supply could turn children transgender. He claims that 5G networks are being used for mass surveillance. He’s said that Katherine Maher, the president and CEO of NPR, is a CIA agent. “Even journals like Smithsonian and National Geographic … appear to be compromised by the CIA,” according to Kennedy.

[Read: Why RFK Jr. endorsed Trump]

According to Kennedy’s daughter Kick Kennedy, her father chain-sawed the head off a dead whale on a beach in Hyannis Port, Massachusetts, bungee-corded it to the roof of their car, and drove it five hours to the family home in Mount Kisco, New York. (The severed head streamed “whale juice” down the side of the family minivan on the trip home. “It was the rankest thing on the planet,” Kick told Town & Country magazine in 2012. “We all had plastic bags over our heads with mouth holes cut out, and people on the highway were giving us the finger, but that was just normal day-to-day stuff for us.”) Kennedy has also recently admitted to leaving the carcass of a bear cub in Central Park a decade ago, as a joke.

Donald Trump Jr. has said that he could see Kennedy being given some sort of oversight role in any number of government agencies if his father is reelected, including the FDA and the Department of Health and Human Services. “I can see a dozen roles I’d love to see him in,” he said.

Like Mark Robinson, RFK Jr. fits right in.

THE REPUBLICAN PARTY today isn’t incidentally grotesque; like the man who leads it, Donald Trump, it is grotesque at its core. It is the Island of Misfit Toys, though in this case there’s a maliciousness to the misfits, starting with Trump, that makes them uniquely dangerous to the republic. Since 2016, they have been at war with reality, delighting in their dime-store nihilism, creating “alternative facts” and tortured explanations to justify the lawlessness and moral depravity and derangement of their leader.

None of this is hidden; it is on display in neon lights, almost every hour of every day. No one who supports the Republican Party, who casts a vote for Trump and for his MAGA acolytes, can say they don’t know.

They know.

Aleksandr Solzhenitsyn, in an essay titled “As Breathing and Consciousness Return,” warned that no one who “voluntarily runs with the hounds of falsehood” will be able to justify himself to the living, or to posterity, or to his friends, or to his children. Don’t surrender to corruption, the great Russian writer and dissident said; strive for the liberation of our souls by not participating in the lie. Don’t consent to the lies. The challenges facing Solzhenitsyn were quite different from, and certainly far more difficult than, anything we face, but his fundamental point still holds.

The Trump movement is built on layers of lies. It’s late, but it’s never too late to liberate yourself from them. One word of truth outweighs the world.



Friday, September 27, 2024

The Problems with Polls | Samuel Earle | The New York Review of Books


October 17, 2024, issue.

https://archive.md/Y7AmQ



The twenty-first century was supposed to be a new golden age for political polling. In 2008 Nate Silver, a thirty-year-old sports journalist, became an overnight celebrity after predicting Barack Obama’s election victory with uncanny accuracy, calling forty-nine of fifty states correctly on his personal website, FiveThirtyEight. His method was to aggregate multiple polls, weight them based on various factors, and then subject them to the kind of forensic statistical analysis used to evaluate the performance of baseball players.
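Silver’s actual model is proprietary and far more elaborate, but the core aggregation idea can be sketched in a few lines. A minimal illustration in Python, with invented poll numbers, weighting each poll by its sample size and its recency before averaging:

# A toy sketch of poll aggregation, with invented numbers.
# An illustration of the general idea only, not Silver's model.
# Each poll: (candidate's share, sample size, days before "today").
polls = [
    (0.52, 1200, 3),
    (0.49, 800, 7),
    (0.51, 1500, 1),
]

def aggregate(polls, half_life_days=7.0):
    """Weight each poll by sample size and by recency (exponential
    decay with the given half-life), then average the shares."""
    num = den = 0.0
    for share, n, age in polls:
        weight = n * 0.5 ** (age / half_life_days)
        num += weight * share
        den += weight
    return num / den

print(f"aggregate share: {aggregate(polls):.1%}")   # about 51.0%

Down-weighting older and smaller polls is the uncontroversial part; the “various factors” (pollster ratings, house effects, and the like, in FiveThirtyEight’s public descriptions of its method) are where the modeling judgment, and the disagreement, come in.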

In the 2012 presidential election, Silver went from celebrity to sage. He picked the winner in all fifty states while traditional pollsters delivered mixed results. “You know who won the election tonight?” asked Rachel Maddow. “Nate Silver.” According to Marie Davidian, the president of the American Statistical Association, the reason Silver “could predict the election perfectly” was simple: “dispassionate use of the data.” The New Republic declared that it was “1936 all over again”—a reference to the year that launched modern polling, when pollsters like George Gallup and Elmo Roper predicted Franklin D. Roosevelt’s victory, upstaging more old-fashioned election forecasts that he would lose to Alfred Landon. Their innovation was the sample survey—gathering responses from a group of people deemed representative of the entire population, according to characteristics such as age, gender, and race, rather than gathering as many responses as possible through much larger but untargeted opt-in surveys or straw polls. Silver’s innovation was to bring the sample survey into the age of big data.
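The mechanics of “representative” sampling are worth pausing on. In modern practice, matching a sample to population characteristics is typically done by weighting responses after the fact. A minimal sketch, again in Python and with invented figures (real survey weighting, such as raking across several variables at once, is considerably more involved):

# A toy post-stratification sketch with invented figures.
population = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # assumed census shares
sample     = {"18-34": 0.15, "35-54": 0.35, "55+": 0.50}  # who actually answered
says_yes   = {"18-34": 0.60, "35-54": 0.50, "55+": 0.40}  # made-up responses

# Unweighted, the estimate is skewed toward the groups that answer the phone.
unweighted = sum(sample[g] * says_yes[g] for g in sample)

# Weighted, each group counts according to its share of the population.
weighted = sum(population[g] * says_yes[g] for g in population)

print(f"unweighted: {unweighted:.1%}, weighted: {weighted:.1%}")
# unweighted: 46.5%, weighted: 49.5%

The three-point swing in this toy example shows how much a published topline can depend on weighting choices that readers never see.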

The excitement of 2012 proved short-lived. In the 2016 election, polls were ubiquitous—by one count, television networks discussed election forecasts around sixteen times a day—but Donald Trump defied almost all their predictions and won the presidency. Worse than that, the polls were accused of enabling his victory by creating a fog of complacency that inadvertently sank Hillary Clinton’s candidacy. In his book A Higher Loyalty (2018), for example, former FBI director James Comey expressed regret for publicizing the bureau’s resumed investigation into her e-mails mere days before the election. “I had assumed from media polling that Hillary Clinton was going to win,” he wrote.

After Trump won, the polling industry joined journalists—many of whom were lulled into similar complacency by misleading polling numbers—in a period of soul-searching. How had their supposedly objective methods underestimated Trump’s support so starkly? Their British colleagues’ failure to foresee the Brexit vote months earlier enhanced the mood of doubt and introspection. Then, in 2020, after concerted efforts by polling companies and their aggregators to correct previous mistakes, the polls ended up being more inaccurate than at any time since 1980. The polling industry plunged into a reputational crisis from which it has yet to recover fully.

In Strength in Numbers: How Polls Work and Why We Need Them, the journalist and data scientist G. Elliott Morris sets out to defend the polling industry against its detractors and restore some self-confidence to his peers. “The rush to declare polling dead is misguided,” he writes. Morris understands the challenges polls face today: plummeting response rates, rising costs, erratic voting behavior, and public suspicion of pollsters (particularly among Republicans). But he argues that the real problem is not so much the polls as the public’s and the press’s misunderstandings of how they work. For Morris, the answer is not fewer polls but more of them, with audiences better educated to interpret and—most importantly—appreciate them. After all, he asks, “would we want to go back to sending out newspaper reporters to trawl the streets for enough willing participants to release straw polls before voting day?”

Morris’s bullishness is typical of the polling industry, a reflex that shields it from facing knottier questions about polling’s political and social usefulness. To many, the point of it seems self-evident: political polls measure public opinion, and every democracy should want its leaders to know more about what the public thinks than the broad results that elections can provide. “Good polls can reveal the will of the people,” Morris writes. “Condemning them as worthless is dangerous to this cause.” But that obscures their greatest achievement and larger influence, which lies not in any particular prediction or service to democracy but in the industry’s complete co-option of our understanding of public opinion, a concept that predates polling but that we can no longer imagine without it. The nature of this conquest now seems so natural, so self-evident, that it passes without remark—even in a book on the achievements of polling.

Public opinion has always been an elusive concept. “How does this vague, fluctuating complex thing we call public opinion—omnipotent yet indeterminate—a sovereign to whose voice everyone listens, yet whose words, because he speaks with as many tongues as the waves of a boisterous sea, it is so hard to catch—how does public opinion express itself in America?” the British jurist, historian, and Liberal politician James Bryce asked in The American Commonwealth (1888). A half-century later Gallup invoked Bryce and announced that he had found the answer: polling with sample surveys. It was as if polls would do for public opinion in the twentieth century what Eadweard Muybridge’s photographs had done for animal motion in the nineteenth century: reveal to a wide audience what was previously imperceptible to the naked eye. But like those of photography, polling’s claims of accuracy—one early pollster called the sample survey a “psychological X-Ray”—veiled an intrinsic deception: it was all too easy to forget how reality is framed and flattened by the medium’s design. (Pollsters have cultivated comparisons with photography, describing their polls as “snapshots”—ironically as a way to prove both their accuracy and their partialness.)

Following their success in 1936, Gallup and his fellow pollsters—the term was coined in 1939—promised that polling would revolutionize not only our understanding of public opinion but democracy itself. No longer would voters need to rely on elections to make their voices heard. No longer would politicians need to gauge public opinion by the size and volume of the crowds that cheered or jeered them or by the ventriloquism of journalists. Thanks to their “new instrument,” Gallup wrote, “the will of the majority of citizens can be ascertained at all times,” realizing a “truer democracy” and ensuring—“with little probability of error”—that dictatorships will “become mere bogey stories to frighten our great-grandchildren.” Such optimism was shared by Roper, who claimed that the public opinion survey represented “the greatest contribution to democracy since the introduction of the secret ballot.”

Gallup and Roper did not invent the sample survey. They imported it from the increasingly professionalized field of market research, where both their careers began. It is hard to determine whether advancing democracy was an honest goal or simply part of their marketing spiel. But it’s clear they thought that political polling would make them rich. “If it works for toothpaste, why not for politics?” Gallup reasoned. “I saw [it] as a veritable gold mine if we could learn fast enough how to use it in all of its ramifications,” Roper said.

These early pollsters preferred to ground the industry’s origin story in the scientific method rather than the profit motive. To this end, journals, institutions, and complex terminology proliferated in the field’s first decades, giving polling the aura of scientific inquiry. Gallup played the role of scientist, comparing his craft to that of a meteorologist. He made sure his name was always prefixed by “Dr.”—he had received his Ph.D. in applied psychology in 1928—and he made a great performance of not voting in elections, which supposedly proved that he was separate from “the new science of public opinion” he studied. “We have not the slightest interest in who wins an election,” Gallup said. “All we want to do is be right.” Roper agreed, describing the field as an “infant science.”

Some of polling’s problems in measuring public opinion are indeed typical of the natural sciences: supposedly “objective” methods were, and still are, suffused with the prejudices of their day, creating blind spots and distortions that only become clear in hindsight. In the early decades of polling, for instance, college-educated white men were widely assumed to be more interested in politics than anyone else, and so survey research drastically underrepresented black people, women, and low-income households in pursuit of accuracy. (Surveyors also preferred spending time in more affluent areas and households, while poorer neighborhoods were sometimes avoided out of fear.) Such problems persist: one explanation for polling’s failure to predict Trump’s win in 2016 is that college graduates, who were more likely to favor Clinton, were overrepresented among respondents.

Other problems with polling are typical of the social sciences: every attempt to study how people think and act has the potential to influence how they think and act, thus changing what is being recorded, either in self-fulfilling or self-negating ways. The results of any poll on a particular issue are liable to change how people think about that issue, just as any poll showing a candidate’s popularity is liable to influence that candidate’s popularity. The effects are unpredictable: some social scientists record a bandwagon effect, when people rally behind a candidate who is ahead in the polls, while other studies point to an underdog effect, when the opposite happens. Add to this respondents’ hypersensitivity toward the wording and ordering of questions—Roper once quipped that “you can ask a question in such a way as to get any answer you want”—and any analogy between opinion polls and “a weather forecast,” which Morris makes at least twice, collapses. (Like “snapshot,” the weather forecast analogy suggests both accuracy and unreliability.)

But the most fundamental problem with polling is that the phenomenon it claims to record—public opinion—has no coherent meaning or existence. The polling industry resolves this conundrum by simply making “public opinion” synonymous with its methods: polls record public opinion; public opinion is what polls record. Skeptics could see this sleight of hand from the start. “Dr. Gallup does not make the public more articulate,” Lindsay Rogers, a political scientist at Columbia University, wrote in an early polemic against polling in 1949. “He only estimates how in replying to certain questions, it would say ‘yes’ or ‘no’ or ‘don’t know.’ Instead of feeling the pulse of democracy, Dr. Gallup listens to its baby talk.”

Polling, in this analysis, was not so much an infant science as an infantilizing one: political matters were reduced to facile either/or stances, with little concern for how lightly or intensely one held an opinion or whether the opinion even existed before the survey. One of the oldest and most ambiguous concepts in the social sciences—a survey of the literature in 1965 quoted almost fifty conflicting attempts at a definition of “public opinion”—was reduced to a simple percentage: “60% think this, 40% think that.”

The conceits of such a percentage—its mirage of an equally informed, equally engaged citizenry, its impression of a country that has spoken—have been criticized by figures as varied as Martin Luther King Jr. and Pierre Bourdieu, for whom public opinion was too amorphous and impressionable to be fixed in the form of a number. Those conceits have also been exposed by many researchers. In an experiment conducted in 1980, people were asked whether they thought “the 1975 Public Affairs Act” should be repealed: a third gave an opinion, even though the act does not exist. In 1995 The Washington Post replicated the study with similar results, but found that another tenth could be goaded into an opinion with a follow-up question. (“Which [stance] comes closest to the way you feel?”) When people were told that either President Clinton or the Republicans wanted to repeal the act, more than half of respondents had a view. More recently, a UK poll found that nearly half of respondents claimed an opinion on a nonexistent politician, who actually proved relatively popular. (Anyone who has knowingly nodded along to a name they’ve never heard, hoping to avoid embarrassment, can relate to this.)

No poll can ever be sure what portion of answers are similarly offered off the cuff or to what extent respondents hold their positions outside the survey setting. The sociologist Leo Bogart said in 1972, “The first question a pollster should ask is: ‘Have you thought about this at all? Do you have an opinion?’” But usually polling companies don’t want to know: adding questions costs time and money, and ideally they want everyone to have an opinion on everything.

Morris has strong opinions about polling and a wealth of experience beyond his years. Born in 1996, he rose to prominence while still an undergraduate at the University of Texas at Austin by accurately predicting that the Democrats would regain the House in the 2018 midterms. After graduating he joined The Economist as a data analyst and journalist. He published Strength in Numbers in July 2022. In May 2023 he was announced as Nate Silver’s successor at FiveThirtyEight. (Silver left amid a round of job cuts at FiveThirtyEight, now under the stewardship of Disney, with about two thirds of its staff reportedly laid off.)

Morris’s book is filled with fighting talk: whatever the doubters say, polls remain “one of the most democratizing forces in American political history”; they can “reveal the will of the people”; they serve “as a pipeline from the governed to the government and as a bulwark against despots”; they are “the key to social knowledge”; they “hand a megaphone to the voice of the people, causing it to reverberate through the halls of government.” In one of his most strident moments, Morris even suggests that critics of polling are enemies of democracy: “In many cases, the denigration of polls is made by elites, elected officials, and ideological activists who have a stake in the public’s voice not being heard”—a claim that would be easier to take seriously if he engaged with the critical scholarship on the polling industry. Rogers’s 1949 polemic and Gallup’s combative response receive a few paragraphs. Susan Herbst and Sarah Igo are referenced in the acknowledgements, but any influence of their important work on how polling hollows out understanding of political participation and on the foundational ties between the polling industry and market research is hard to find in the main text of Morris’s book.

The reasons Morris gives for his fervent faith in polling are underwhelming and overwrought. In the introduction he celebrates how Republicans and Democrats now use polling to determine which presidential primary candidates participate in debates, hailing this as proof that “you don’t have to look far to find concrete examples of polls serving meaningful functions in our electoral, judicial, and governing systems.” He omits the fact that even major polling companies have criticized this use of their findings. (“I just don’t think polling is really up to the task of deciding the field for the headliner debate,” Scott Keeter, then Pew’s director of survey research, said in 2016.) Later we learn how in 1960 John F. Kennedy’s pioneering pollsters “advised…a strategy for his upcoming debate [with Richard Nixon], telling him to come off strong, competent and understandable to the average American”—leaving us to wonder how any candidate could ever have fared without such scintillating guidance.

But Morris also knows that polls are not the “crystal balls” that their most avid cheerleaders sometimes claim them to be, and he vacillates between championing their indispensable place in democracy and admitting their fallibility. Morris the populist revels in nebulous expressions like “the will of the people,” “the power of the people,” and “the voice of the people.” Morris the scientist takes every opportunity to plead for caution and emphasize plurality. Morris the populist prevails: by the book’s conclusion, he is still insisting that “the will of the people is now quantified and easily accessible by any reformer, legislator or interested citizen”—despite beginning the same paragraph with a nod to “what we have learned about the uncertainty in polling and the varying quality of public opinion across issues.”

Caught between the seriousness of its science and the need to market its product, the entire polling industry is trapped in a version of this double act. Gallup was no different. As Igo noted in The Averaged American (2007), he wrote that “the American people are as various as their land” and in the same article repeatedly invoked the mythical “average man” discovered by polls.

Morris concedes that, overall, polling has yet to live up to its lofty promises. But his reasons for why polls don’t work are even less convincing than his reasons for why they do. His main targets for blame are not pollsters or their methods but the public and, above all, the press. As I read his defense of polling, the words of Oscar Wilde came repeatedly to mind: “The play was a great success. The audience was a failure.”

According to Morris, the public has failed to appreciate that every poll comes with a margin of error, so really no poll can be wrong: “Consumers of polling and election models should not trick themselves into mistaking polls and projections for a science they’re not—and will likely never be.” While more polls—particularly in the very close 2024 presidential election—have started to include the margin of error in their results, Morris’s mixed messages will hardly help confused consumers: he advises resisting total faith in polls but also says that “they are scientific” and that “informed readers” should turn to “RealClearPolitics and Pollster to know who’s ahead, and to FiveThirtyEight to know whether they’ll win.”
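That margin of error is a concrete quantity, not a rhetorical shrug. For a simple random sample it follows a textbook formula; a brief Python sketch (ignoring the design effects and non-sampling errors that complicate real polls):

import math

def margin_of_error(p, n, z=1.96):
    """Textbook 95 percent margin of error for a proportion p estimated
    from a simple random sample of size n. Real polls also carry design
    effects and non-sampling error that this formula ignores."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person poll showing a 50-50 race:
print(f"+/- {margin_of_error(0.50, 1000):.1%}")   # about +/- 3.1%

By this arithmetic, a two-point lead in a 1,000-person poll sits well within the noise, which is partly why treating such results as predictions invites the confusion Morris describes.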

But Morris saves his harshest words for the media, decrying “the damage done to the polling industry by an overconfident and naïve press.” The polling industry and the media have always had a difficult, if also mutually dependent, relationship. While many journalists initially resisted polls as an encroachment on their craft and authority—“Today, unless you can say ‘According to the Poop-A-Doop survey, Umpty-ump percent of the people chew gum while they read Hot Shot News!’ you fail to make an impression,” one journalist lamented in 1950—it’s also true that from the start, the pollsters’ most important client was the press, and the two quickly established symbiotic ties. The press commissioned polls to generate news stories and bolstered its reporting with persuasive statistics, while polls relied on the press for funding and, crucially, publicity. By the end of the century, most major news organizations had their own in-house polling operations or formal partnerships with polling companies.

This partnership inevitably affected the nature and purpose of polls: newspapers didn’t want to pay for boring findings; they wanted engaging, dramatic stories, tales of conflict and controversy. The polling industry obliged, with varying degrees of reluctance and enthusiasm, and received not just money and publicity but an alibi: the media could now be blamed for its worst traits—exaggerating social conflict, simplifying issues, overstating accuracy.

In the same vein, Morris insists that whereas the media “want attention-grabbing, confident predictions,” pollsters understand “all the nuance and uncertainty that are inherent in their data.” Elsewhere, Morris concedes that “pollsters systematically overestimate their own accuracy,” but the nature and gravity of this contradiction—that pollsters understand and systematically ignore inconvenient truths—elude him.

The fact is that polling companies need engaging, dramatic results, not only because such results keep their patrons in the press happy but also because interesting poll results travel further and faster, spreading the name of the company and thus attracting more clients. Morris laments that the pollsters and the press both do a “poor job” of conveying polling’s limitations, but because he gives no account—no mention—of the business side of polling, and no sense of how much polls need publicity, he misses how pollsters can become invested in their own simplifications and misinterpretations. In pursuit of both accuracy and profit, compromises are made.

It’s hard to believe, given the number of polls being conducted in 2024, but Gallup and Roper were always skeptical of election forecasts. “All of us in the field of public opinion research regard election forecasting as one of our least important contributions,” Gallup said; Roper thought they were “socially useless” and might “do very much more harm than good.” But election forecasts are the only verifiable “theory” that this “science” puts forward: their accuracy is fact-checked by the final ballot in a way that other opinion polls never can be. For polling companies, election campaigns are thus marketing campaigns. The results are twofold: an inordinate number of polling companies participating in the game of predicting elections, on the one hand (in 2020, there were at least 1,572 state-level preelection polls, including 438 in the final two weeks alone, by over 200 different polling companies—all eager, in Morris’s telling, to strengthen American democracy); and on the other, a huge investment in election-forecasting over opinion-measuring methods.

Polls may have once promised to make politics about more than elections, but in practice they have surely done the opposite, with each vote presaged by months, sometimes years, of obsessively dissected forecasts and horse-race coverage. No one embodies this trend more than the politically indifferent, election-obsessed Nate Silver. “With the politics stuff, I just like the elections part,” he told The New Yorker as he was leaving FiveThirtyEight.

Perhaps the polling industry’s standing in society today is most analogous to that of the advertising industry that spawned it: polling organizations are similarly ubiquitous, profitable, and treated cynically by members of the public, who suspect an ulterior motive. Like advertising, political polls are increasingly associated with attempts to manipulate public opinion, tailor messaging in superficial ways, and inform public relations strategies. Politicians of all stripes denigrate polls in public and obsess over them in private. “I don’t have a pollster,” Trump declared on the campaign trail in 2015, before soon hiring one. “No one tells me what to say.” In the months before Joe Biden’s decision to withdraw his candidacy, his advisers frequently attacked the polls in the press for underestimating his support. Despite polls that increasingly indicated he was unlikely to defeat Trump, Biden refused to leave the race until the political fallout from his disastrous debate performance forced his hand.

In the 1990s a new technology replaced polling as the tool destined to transform democracy: the Internet. In a fuller realization of polling’s potential, people would be able to speak up and share their opinions at all times, leading to a better-informed public, more responsive governments, and a truer version of democracy for all. Just as Gallup promised to bring the “town meeting” ideal into the twentieth century—“This time, the whole nation is within the doors,” he wrote—the Internet promised to bring it into the twenty-first. “The function of the Net, in this conception, is to facilitate a running national poll of public opinion, with immediate electronic feedback from citizens to government and vice-versa,” the political scientist Bruce Bimber explained in 1998.

Soon a specific kind of website became the medium for these hopes and dreams: the social media platform, Facebook and Twitter in particular, which launched in 2004 and 2006 respectively. Twitter pitched itself, in a way reminiscent of Gallup’s early polls, as “The Town Hall Meeting… In Your Pocket” and a “real-time measure of public opinion.” It also seems relevant that as a nineteen-year-old sophomore at Harvard, Mark Zuckerberg first made a name for himself by designing an online poll: FaceMash had users choose the more attractive of two female students, using photos taken from the Internet, and built a university-wide ranking. “I almost want to put some of these faces next to pictures of farm animals and have people vote on which is more attractive,” the young Zuck wrote on his blog before launching the short-lived site. Facebook followed a year later, and soon he was celebrated as a champion of democracy.

As with the polling industry, social media platforms’ ties to the advertising industry were either downplayed or ignored: the aim was to give people a voice to enrich democracy (and then use what they said and did to sell them stuff in increasingly sophisticated ways). Social media platforms arrived at the same convenient conclusion as the polling industry: healthy markets and healthy democracies needed the same thing—to know what the public thinks. But surveys were no longer necessary: through social media, users’ thoughts and actions could be tracked at all times. By 2008, advertising gurus excitedly announced, the Internet had already overtaken all other market research methods—“postal, face-to-face and telephone”—to become “the leading global modality for quantitative data collection.” “No longer is recruitment an issue; no longer is the phrasing of the question an issue; no longer is the duration of the interview an issue; and no longer is respondent fatigue an issue,” Finn Raben, director general of ESOMAR, one of Europe’s largest conglomerates of market researchers, enthused in 2010. “If the topic is of interest, then the material is already there…thus is born the ‘Age of Listening’ as opposed to the ‘Age of Questioning.’”

What the advertising industry celebrated as “listening,” however, others saw as something more sinister. The digital economy, premised on the invasion of privacy, was soon denounced as “surveillance capitalism.” Information became its lifeblood, and digital companies developed insatiable appetites for more and more information on users, however trivial. This created a double dynamic: a desire not just to record information but to generate more information.

This is one of social media’s most significant resonances with the polling industry. Just as polls want respondents to have an opinion on everything, cuing views through specific questions and portraying an opinionated public while claiming a neutral detachment, social media platforms like Facebook and Twitter, now X, repeat the same trick on an even greater scale. “You don’t have to have an opinion on everything” has become a refrain online, reflecting how much pressure is applied in the opposite direction. Facebook asks its users, “What’s on your mind?” X prompts its users, “What is happening!?” (The panicked exclamation mark is a new addition, neatly symbolizing both the platform’s neediness toward its users’ information and our disorienting present moment.) Shortly after purchasing the platform in October 2022 for $44 billion, Elon Musk implored users, “If I may beg your indulgence, please add your voice to the public dialogue!”

Social media companies assure the public that their ravenous hunger for opinions simply stems from their deeply felt desire to give people “a voice.” (Speaking at Georgetown University in 2019, Zuckerberg used the word “voice” over thirty times in his thirty-five-minute address.) But what they really want is a reaction: a “like,” a “share,” an emoji, a short comment, or some other form of quantifiable communication that, following Lindsay Rogers, we might call twenty-first-century “baby-talk”—information that can then be packaged, analyzed, and sold. In this monetized vision of the “town square,” more talk means more profit, and users are ideally both perpetual pollsters—always courting reactions to their thoughts and experiences—and obsessive respondents, offering simplified views on a huge number of issues, from politics to brands of toothpaste.

The affinity between social media and polling is perfectly captured by the polling function on many social media sites, which brings the straw poll into the age of big data. Twitter launched one in 2015, and Facebook followed two years later. As Twitter’s new CEO, Musk was initially fond of using the feature. In November 2022 he announced that his decision to reinstate Trump—who was banned from the platform after the storming of the Capitol on January 6—would be determined by a Twitter poll. More than 15 million users voted, and 51.8 percent voted “yes.” “The people have spoken,” Musk tweeted. “Trump will be reinstated. Vox Populi, Vox Dei.” (Trump’s account was restored, but it wasn’t until August of this year that he added his voice to the platform’s public dialogue once again.) In December 2022, facing mounting criticism over his leadership of Twitter, Musk held another Twitter poll on whether he should continue as CEO. The online survey lasted twelve hours and 17.5 million users responded, with 57.5 percent wanting him out. Musk remains in his post—and his penchant for hosting polls seems to have passed. But he continues to defend their integrity. In March, spreading the conspiracy theory that the polling industry uses fake interviews, he posted: “The vast majority of polls are bs. Polls on this platform at least reach some real users.”

In 1921, as the editor of the student newspaper at the University of Iowa, the nineteen-year-old George Gallup wanted to attract new readers. He published a notorious article titled “The Unattractive Women,” which took the form of an ostensibly overheard conversation between two male students and declared that it was women’s “duty to…make themselves as attractive as they can”—a duty that, like Zuckerberg some eight decades later, Gallup seemed to think many women on his campus were failing at. The article led to a spike in circulation and on-campus misogyny. “All of the girls were angry,” Gallup later recalled, but “from that day on, the newspaper was eagerly read.”

Gallup’s interest in getting attention and his desire to discover “what the public wants” were two sides of the same coin. His Ph.D. dissertation sought to pioneer an “objective” way of measuring what parts of a newspaper readers spent time on. Gallup found that they really enjoyed looking at comic strips and pictures, not the hard news they liked to claim in surveys, and he called on “the modern newspaper” to offer more of both “to get itself read” and become more appealing “from an advertising point of view.” In Gallup’s crowd-pleasing quest, polls were doubly useful: they were both a means to discover what people wanted (respondents’ dishonesty notwithstanding) and a product that people wanted—a form of journalism that, like cartoons and pictures, could make politics light and accessible.  

Today that product remains overwhelmingly popular: polls saturate election coverage, turn politics into a spectator sport, and provide an illusion of control over complex, unpredictable, and fundamentally fickle social forces. That isn’t to say that polls don’t have uses beyond entertainment: they can be a great asset to campaigns, helping candidates refine their messages and target their resources; they can provide breakdowns of election results that are far more illuminating than the overall vote count; and they can give us a sense—a vague and sometimes misleading sense—of what 300 million people or more think about an issue. But, pace Morris, the time for celebrating polls as a bastion of democracy or as a means of bringing elites closer to voters is surely over. The polling industry continues to boom. Democracy isn’t faring quite so well.

Silicon Valley ultimately peddled the same feel-good story about democracy as the polling industry: that the powerful are unresponsive to the wider public because they cannot hear their voices, and if only they could hear them, then of course they would listen and act. The virtue of this diagnosis is that structural inequalities in wealth and power are left intact—all that matters in democracy is that everyone has a voice, regardless of background. In a very narrow, technical sense, their innovations have made this a reality. But the result is a loud, opinionated, and impotent public sphere, coarsened by social and economic divisions and made all the more disillusioned by the discovery that, in politics, it takes more than a voice to be heard.

—September 18, 2024