The Death of Truth, page 2
For instance, the State Department has been hollowed out as a result of Steve Bannon’s vow to fight for the “deconstruction of the administrative state” and the White House’s suspicion of “deep state” professionals. The president’s son-in-law, Jared Kushner, a thirty-six-year-old real-estate developer with no government experience, was handed the Middle East portfolio, while the shrinking State Department was increasingly sidelined. Many important positions stood unfilled at the end of Trump’s first year in office. This was partly because of downsizing and dereliction of duty, partly because of a reluctance to appoint diplomats who expressed reservations about the president’s policies (as in the case of the crucial role of ambassador to South Korea), and partly because of the exodus of foreign service talent from an agency that, under new management, no longer valued their skills at diplomacy, policy knowledge, or experience in far-flung regions of the world. Combined with Trump’s subversion of longtime alliances and trade accords and his steady undermining of democratic ideals, the carelessness with which his administration treated foreign policy led to world confidence in U.S. leadership plummeting in 2017 to a new low of 30 percent (below China and just above Russia), according to a Gallup poll.

In some respects, the Trump White House’s disdain for expertise and experience reflected larger attitudes percolating through American society. In his 2007 book, The Cult of the Amateur, the Silicon Valley entrepreneur Andrew Keen warned that the internet not only had democratized information beyond people’s wildest imaginings but also was replacing genuine knowledge with “the wisdom of the crowd,” dangerously blurring the lines between fact and opinion, informed argument and blustering speculation.

A decade later, the scholar Tom Nichols wrote in The Death of Expertise that a willful hostility toward established knowledge had emerged on both the right and the left, with people aggressively arguing that “every opinion on any matter is as good as every other.” Ignorance now was fashionable.

“If citizens do not bother to gain basic literacy in the issues that affect their lives,” Nichols wrote, “they abdicate control over those issues whether they like it or not. And when voters lose control of these important decisions, they risk the hijacking of their democracy by ignorant demagogues, or the more quiet and gradual decay of their democratic institutions into authoritarian technocracy.”

THE TRUMP White House’s preference for loyalty and ideological lockstep over knowledge was on display throughout the administration. Unqualified judges and agency heads were appointed because of cronyism, political connections, or a determination to undercut agencies that stood in the way of Trump’s massive deregulatory plans benefiting the fossil fuel industry and wealthy corporate donors. Rick Perry, who was famous for wanting to abolish the Department of Energy, was named to head it, presiding over cutbacks to renewable energy programs; and the new EPA head, Scott Pruitt, who had repeatedly sued the Environmental Protection Agency over the years, swiftly began dismantling and slow-walking legislation designed to protect the environment.

The public—which opposed the GOP tax bill and worried that its health care would be taken away—was high-handedly ignored when its views failed to accord with Trump administration objectives or those of the Republican Congress. And when experts in a given field—like climate change, fiscal policy, or national security—raised inconvenient questions, they were sidelined, or worse. This, for instance, is what happened to the Congressional Budget Office (created decades ago as an independent, nonpartisan provider of cost estimates for legislation) when it reported that a proposed GOP health-care bill would leave millions more uninsured. Republicans began attacking the agency—not just its report, but its very existence. Trump’s director of the Office of Management and Budget, Mick Mulvaney, asked whether the CBO’s time had “come and gone,” and other Republicans proposed slashing its budget and cutting its staff of 235 by 89 employees.

For that matter, the normal machinery of policy making—and the normal process of analysis and review—were routinely circumvented by the Trump administration, which violated such norms with knee-jerk predictability. Many moves were the irrational result of a kind of reverse engineering: deciding on an outcome the White House or the Republican Congress wanted, then trying to come up with rationales or selling points afterward. This was the very opposite of the scientific method, whereby data is systematically gathered and assessed to formulate and test hypotheses—a method the administration clearly had contempt for, given its orders to CDC analysts to avoid using the terms “science-based” and “evidence-based.” And it was a reminder that in Orwell’s dystopia in 1984 there is no word for “science,” because “the empirical method of thought, on which all the scientific achievements of the past were founded,” represents an objective reality that threatens the power of Big Brother to determine what truth is.

In addition to announcing that it was withdrawing from the Paris climate accord (after Syria signed on, the United States was left as the lone country repudiating the global agreement), the Trump administration vowed to terminate President Obama’s Clean Power Plan and reverse a ban on offshore oil and gas drilling. Scientists were dismissed from government advisory boards, and plans were made to cut funding for an array of research programs in such fields as biomedicine, environmental science, engineering, and data analysis. The EPA alone was facing proposed cuts from the White House of $2.5 billion from its annual budget—a reduction of more than 23 percent.

IN APRIL 2017, the March for Science, organized in Washington to protest the Trump administration’s antiscience policies, grew into more than four hundred marches in more than thirty-five nations, participants marching out of solidarity with colleagues in the United States and also out of concern for the status of science and reason in their own countries. Decisions made by the U.S. government about climate change and other global problems, after all, have a domino effect around the world—affecting joint enterprises and collaborative research, as well as efforts to find international solutions to crises affecting the planet.

British scientists worry about how Brexit will affect universities and research institutions in the U.K. and the ability of British students to study in Europe. Scientists in countries from Australia to Germany to Mexico worry about the spread of attitudes devaluing science, evidence, and peer review. And doctors in Latin America and Africa worry that fake news about Zika and Ebola is spreading misinformation and fear.

Mike MacFerrin, a graduate student in glaciology working in Kangerlussuaq, a town of five hundred in Greenland, told Science magazine that the residents there had practical reasons to worry about climate change because runoff from the ice sheet had partially washed out a local bridge. “I liken the attacks on science to turning off the headlights,” he said. “We’re driving fast and people don’t want to see what’s coming up. Scientists—we’re the headlights.”

ONE OF THE most harrowing accounts of just how quickly “the rule of raison”—faith in science, humanism, progress, and liberty—can give way to “its very opposite, terror and mass emotion,” was laid out by the Austrian writer Stefan Zweig in his 1942 memoir, The World of Yesterday. Zweig witnessed two globe-shaking calamities in his life—World War I, followed by a brief respite, and then the cataclysmic rise of Hitler and descent into World War II. His memoir is an act of bearing witness to how Europe tore itself apart suicidally twice within decades—the story of the terrible “defeat of reason” and “the wildest triumph of brutality,” and a lesson, he hoped, for future generations.

Zweig wrote about growing up in a place and time when the miracles of science—the conquest of diseases, “the transmission of the human word in a second around the globe”—made progress seem inevitable, and even dire problems like poverty “no longer seemed insurmountable.” An optimism (which may remind some readers of the hopes that surged through the Western world after the fall of the Berlin Wall in 1989) informed his father’s generation, Zweig recalled: “They honestly believed that the divergencies and the boundaries between nations and sects would gradually melt away into a common humanity and that peace and security, the highest of treasures, would be shared by all mankind.”

When he was young, Zweig and his friends spent hours hanging out at coffeehouses, talking about art and personal concerns: “We had a passion to be the first to discover the latest, the newest, the most extravagant, the unusual.” There was a sense of security in those years for the upper and middle classes: “One’s house was insured against fire and theft, one’s field against hail and storm, one’s person against accident and sickness.”

People were slow to recognize the danger Hitler represented. “The few among writers who had taken the trouble to read Hitler’s book,” Zweig writes, “ridiculed the bombast of his stilted prose instead of occupying themselves with his program.” Newspapers reassured readers that the Nazi movement would “collapse in no time.” And many assumed that if “an anti-semitic agitator” actually did become chancellor, he “would as a matter of course throw off such vulgarities.”

Ominous signs were piling up. Groups of menacing young men near the German border “preached their gospel to the accompaniment of threats that whoever did not join promptly, would have to pay for it later.” And “the underground cracks and crevices between the classes and races, which the age of conciliation had so laboriously patched up,” were breaking open again and soon “widened into abysses and chasms.”

But the Nazis were careful, Zweig remembers, not to disclose the full extent of their aims right away. “They practiced their method carefully: only a small dose to begin with, then a brief pause. Only a single pill at a time and then a moment of waiting to observe the effect of its strength”—to see whether the public and the “world conscience would still digest this dose.”

And because they were reluctant to abandon their accustomed lives, their daily routines and habits, Zweig wrote, people did not want to believe how rapidly their freedoms were being stolen. People asked what Germany’s new leader could possibly “put through by force in a State where law was securely anchored, where the majority in parliament was against him, and where every citizen believed his liberty and equal rights secured by the solemnly affirmed constitution”—this eruption of madness, they told themselves, “could not last in the twentieth century.”

2
THE NEW CULTURE WARS

The death of objectivity “relieves me of the obligation to be right.” It “demands only that I be interesting.”

—STANLEY FISH

IN A PRESCIENT 2005 ARTICLE, DAVID FOSTER Wallace wrote that the proliferation of news outlets—in print, on TV, and online—had created “a kaleidoscope of information options.” Wallace observed that one of the ironies of this strange media landscape that had given birth to a proliferation of ideological news outlets (including so many on the right, like Fox News and The Rush Limbaugh Show) was that it created “precisely the kind of relativism that cultural conservatives decry, a kind of epistemic free-for-all in which ‘the truth’ is wholly a matter of perspective and agenda.”

Those words were written more than a decade before the election of 2016, and they uncannily predict the post-Trump cultural landscape, where truth increasingly seems to be in the eye of the beholder, facts are fungible and socially constructed, and we often feel as if we’ve been transported to an upside-down world where assumptions and alignments in place for decades have suddenly been turned inside out.

The Republican Party, once a bastion of Cold War warriors, and Trump, who ran on a law-and-order platform, shrug off the dangers of Russia’s meddling in American elections, and GOP members of Congress talk about secret cabals within the FBI and the Department of Justice. Like some members of the 1960s counterculture, many of these new Republicans reject rationality and science. During the first round of the culture wars, many on the new left rejected Enlightenment ideals as vestiges of old patriarchal and imperialist thinking. Today, such ideals of reason and progress are assailed on the right as part of a liberal plot to undercut traditional values or suspicious signs of egghead, eastern-corridor elitism. For that matter, paranoia about the government has increasingly migrated from the Left—which blamed the military-industrial complex for Vietnam—to the Right, with alt-right trolls and Republican members of Congress now blaming the so-called deep state for plotting against the president.

The Trump campaign depicted itself as an insurgent, revolutionary force, battling on behalf of its marginalized constituency and disingenuously using language which strangely echoed that used by radicals in the 1960s. “We’re trying to disrupt the collusion between the wealthy donors, the large corporations, and the media executives,” Trump declared at one rally. And in another he called for replacing this “failed and corrupt political establishment.”

More ironic still is the populist Right’s appropriation of postmodernist arguments and its embrace of the philosophical repudiation of objectivity—schools of thought affiliated for decades with the Left and with the very elite academic circles that Trump and company scorn. Why should we care about these often arcane-sounding arguments from academia? It’s safe to say that Trump has never plowed through the works of Derrida, Baudrillard, or Lyotard (if he’s even heard of them), and postmodernists are hardly to blame for all the free-floating nihilism abroad in the land. But some dumbed-down corollaries of their thinking have seeped into popular culture and been hijacked by the president’s defenders, who want to use its relativistic arguments to excuse his lies, and by right-wingers who want to question evolution or deny the reality of climate change or promote alternative facts. Even Mike Cernovich, the notorious alt-right troll and conspiracy theorist, invoked postmodernism in a 2016 interview with The New Yorker. “Look, I read postmodernist theory in college. If everything is a narrative, then we need alternatives to the dominant narrative,” he said, adding, “I don’t seem like a guy who reads Lacan, do I?”

SINCE THE 1960s, there has been a snowballing loss of faith in institutions and official narratives. Some of this skepticism has been a necessary corrective—a rational response to the calamities of Vietnam and Iraq, to Watergate and the financial crisis of 2008, and to the cultural biases that had long infected everything from the teaching of history in elementary schools to the injustices of the justice system. But the liberating democratization of information made possible by the internet not only spurred breathtaking innovation and entrepreneurship; it also led to a cascade of misinformation and relativism, as evidenced by today’s fake news epidemic.

Central to the breakdown of official narratives in academia was the constellation of ideas falling under the broad umbrella of postmodernism, which arrived at American universities in the second half of the twentieth century via such French theorists as Foucault and Derrida (whose ideas, in turn, were indebted to the German philosophers Heidegger and Nietzsche). In literature, film, architecture, music, and painting, postmodernist concepts (exploding storytelling traditions and breaking down boundaries between genres, and between popular culture and high art) would prove emancipating and in some cases transformative, resulting in a wide range of innovative works from artists like Thomas Pynchon, David Bowie, the Coen brothers, Quentin Tarantino, David Lynch, Paul Thomas Anderson, and Frank Gehry. When postmodernist theories were applied to the social sciences and history, however, all sorts of philosophical implications, both intended and unintended, would result and eventually pinball through our culture.

There are many different strands of postmodernism and many different interpretations, but very broadly speaking, postmodernist arguments deny an objective reality existing independently from human perception, contending that knowledge is filtered through the prisms of class, race, gender, and other variables. In rejecting the possibility of an objective reality and substituting the notions of perspective and positioning for the idea of truth, postmodernism enshrined the principle of subjectivity. Language is seen as unreliable and unstable (part of the unbridgeable gap between what is said and what is meant), and even the notion of people acting as fully rational, autonomous individuals is discounted, as each of us is shaped, consciously or unconsciously, by a particular time and culture.

Out with the idea of consensus. Out with the view of history as a linear narrative. Out with big universal or transcendent meta-narratives. The Enlightenment, for instance, is dismissed by many postmodernists on the left as a hegemonic or Eurocentric reading of history, aimed at promoting colonialist or capitalistic notions of reason and progress. The Christian narrative of redemption is rejected, too, as is the Marxist road to a Communist utopia. To some postmodernists, the scholar Christopher Butler observes, even the arguments of scientists can be “seen as no more than quasi narratives which compete with all the others for acceptance. They have no unique or reliable fit to the world, no certain correspondence with reality. They are just another form of fiction.”

THE MIGRATION OF postmodern ideas from academia to the political mainstream is a reminder of how the culture wars—as the vociferous debates over race, religion, gender, and school curricula were called during the 1980s and 1990s—have mutated in unexpected ways. The terrorist attacks of 9/11 and the financial crisis of 2008, it was thought, had marginalized those debates, and there was hope, during the second term of President Barack Obama, that the culture wars in their most virulent form might be winding down. Health-care legislation, the Paris climate accord, a stabilizing economy after the crash of 2008, same-sex marriage, efforts to address the inequities of the criminal justice system—although a lot of essential reforms remained to be done, many Americans believed that the country was at least set on a progressive path.

In his 2015 book, A War for the Soul of America, the historian Andrew Hartman wrote that the traditionalists who “resisted the cultural changes set into motion during the sixties” and “identified with the normative Americanism of the 1950s” seemed to have lost the culture wars of the 1980s and 1990s. By the twenty-first century, Hartman wrote, “a growing majority of Americans now accept and even embrace what at the time seemed like a new nation. In this light, the late-twentieth-century culture wars should be understood as an adjustment period. The nation struggled over cultural change in order to adjust to it. The culture wars compelled Americans, even conservatives, to acknowledge transformations to American life. And although acknowledgment often came in the form of rejection, it was also the first step to resignation, if not outright acceptance.”

As it turns out, this optimistic assessment was radically premature, much the way that Francis Fukuyama’s 1989 essay “The End of History?” (arguing that with the implosion of Soviet Communism liberal democracy had triumphed and would become “the final form of human government”) was premature. A Freedom House report concluded that “with populist and nationalist forces making significant gains in democratic states, 2016 marked the eleventh consecutive year of decline in global freedom.” And in 2017, Fukuyama said he was concerned about “a slow erosion of institutions” and democratic norms under President Trump; twenty-five years earlier, he said, he “didn’t have a sense or a theory about how democracies can go backward” but now realized “they clearly can.”

As for the culture wars, they quickly came roaring back. Hard-core segments of the Republican base—the Tea Party, birthers, right-wing evangelicals, white nationalists—had mobilized against President Obama and his policies. And Trump, as both candidate and president, would pour gasoline on these social and political fractures—as a way to both gin up his base and distract attention from his policy failures and many scandals. He exploited the partisan divides in American society, appealing to the fears of white working-class voters worried about a changing world, while giving them scapegoats he selected—immigrants, African Americans, women, Muslims—as targets for their anger. It’s no coincidence that Russian trolls—working to get Trump elected while trying to undermine faith in the U.S. democratic system—were, at the same time, using fake social media accounts in efforts to further amplify divisions among Americans. For instance, it turned out that Russian trolls used an impostor Facebook account called “Heart of Texas” to organize a protest called “Stop the Islamization of Texas” in May 2016 and another impostor Facebook account called “United Muslims of America” to organize a counterprotest at the same time and place.

Some of the most eloquent critics of Trump’s politics of fear and division have been conservatives like Steve Schmidt, Nicolle Wallace, Joe Scarborough, Jennifer Rubin, Max Boot, David Frum, Bill Kristol, Michael Gerson, and the Republican senators John McCain and Jeff Flake. But most of the GOP rallied behind Trump, rationalizing his lies, his disdain for expertise, his contempt for many of the very ideals America was founded upon. For such Trump enablers, party trumped everything—morality, national security, fiscal responsibility, common sense, and common decency. In the wake of stories about Trump’s alleged affair with the porn star Stormy Daniels, evangelicals came to his defense: Jerry Falwell Jr. said “all these things were years ago,” and Tony Perkins, president of the Family Research Council, said he and his supporters were willing to give Trump a pass for his personal behavior.

It’s an ironic development, given where conservatives stood during the first wave of the culture wars in the 1980s and 1990s. Back then, it was conservatives who promoted themselves as guardians of tradition, expertise, and the rule of law, standing in opposition to what they saw as the decline of reason and a repudiation of Western values. In his 1987 book, The Closing of the American Mind, the political philosophy professor Allan Bloom railed against relativism and condemned 1960s campus protests in which, he said, “commitment was understood to be profounder than science, passion than reason.” And the scholar Gertrude Himmelfarb warned that the writing and teaching of history had been politicized by a new generation of postmodernists: in viewing the past through the lenses of variables like gender and race, she argued, postmodernists were implying not just that all truths are contingent but that “it is not only futile but positively baneful to aspire to them.”

Some critics unfairly tried to lump the pluralistic impulses of multiculturalism together with the arguments of radical postmodernists who mocked the very possibility of teaching (or writing) history fairly. The former offered a crucial antidote to traditional narratives of American exceptionalism and Western triumphalism by opening the once narrow gates of history to the voices of women, African Americans, Native Americans, immigrants, and other heretofore marginalized points of view. Multiculturalism underscored the incompleteness of much history writing, as Joyce Appleby, Lynn Hunt, and Margaret Jacob argued in their incisive and common-sense-filled book, Telling the Truth About History, and offered the possibility of a more inclusive, more choral perspective. But they also warned that extreme views could lead to the dangerously reductive belief that “knowledge about the past is simply an ideological construction intended to serve particular interests, making history a series of myths establishing or reinforcing group identities.”

Science, too, came under attack by radical postmodernists, who argued that scientific theories are socially constructed: they are informed by the identity of the person positing the theory and the values of the culture in which they are formed; therefore, science cannot possibly make claims to neutrality or universal truths.

“The postmodern view fit well with the ambivalence toward science that developed after the bomb and during the Cold War,” Shawn Otto wrote in The War on Science. Among left-leaning academics in the humanities departments of universities, he went on, “science came to be seen as the province of a hawkish, pro-business, right-wing power structure—polluting, uncaring, greedy, mechanistic, sexist, racist, imperialist, homophobic, oppressive, intolerant. A heartless ideology that cared little for the spiritual or holistic wellness of our souls, our bodies, or our Mother Earth.”

It was ridiculous, of course, to argue that a researcher’s cultural background could affect verifiable scientific facts; as Otto succinctly put it, “Atmospheric CO2 is the same whether the scientist measuring it is a Somali woman or an Argentine man.” But such postmodernist arguments would clear the way for today’s anti-vaxxers and global warming deniers, who refuse to accept the consensus opinion of the overwhelming majority of scientists.

As on so many other subjects, Orwell saw the perils of this sort of thinking decades ago. In a 1943 essay, he wrote, “What is peculiar to our own age is the abandonment of the idea that history could be truthfully written. In the past people deliberately lied, or they unconsciously coloured what they wrote, or they struggled after the truth, well knowing that they must make many mistakes; but in each case they believed that ‘facts’ existed and were more or less discoverable.”

“It is just this common basis of agreement,” he went on, “with its implication that human beings are all one species of animal, that totalitarianism destroys. Nazi theory indeed specifically denies that such a thing as ‘the truth’ exists. There is, for instance, no such thing as ‘Science.’ There is only ‘German Science,’ ‘Jewish Science,’ etc.” When truth is so fragmented, so relative, Orwell noted, a path is opened for some “Leader, or some ruling clique” to dictate what is to be believed: “If the Leader says of such and such an event, ‘It never happened’—well, it never happened.”

People trying to win respectability for clearly discredited theories—or, in the case of Holocaust revisionists, trying to whitewash entire chapters of history—exploited the postmodernist argument that all truths are partial. Deconstructionist history, the scholar Deborah E. Lipstadt observed in Denying the Holocaust, has “the potential to alter dramatically the way established truth is transmitted from generation to generation.” And it can foster an intellectual climate in which “no fact, no event, and no aspect of history has any fixed meaning or content. Any truth can be retold. Any fact can be recast. There is no ultimate historical reality.”

POSTMODERNISM NOT ONLY rejected all meta-narratives but also emphasized the instability of language. One of postmodernism’s founding fathers, Jacques Derrida—who would achieve celebrity status on American campuses in the 1970s and 1980s thanks in large part to such disciples as Paul de Man and J. Hillis Miller—used the word “deconstruction” to describe the sort of textual analysis he pioneered that would be applied not just to literature but to history, architecture, and the social sciences as well.

Deconstruction posited that all texts are unstable and irreducibly complex and that ever variable meanings are imputed by readers and observers. In focusing on the possible contradictions and ambiguities of a text (and articulating such arguments in deliberately tangled and pretentious prose), it promulgated an extreme relativism that was ultimately nihilistic in its implications: anything could mean anything; an author’s intent did not matter, could not in fact be discerned; there was no such thing as an obvious or commonsense reading, because everything had an infinitude of meanings. In short, there was no such thing as truth.

As David Lehman recounted in his astute book Signs of the Times, the worst suspicions of critics of deconstruction were confirmed when the Paul de Man scandal exploded in 1987 and deconstructionist rationales were advanced to defend the indefensible.

De Man, a professor at Yale and one of deconstruction’s brightest stars, had achieved an almost cultlike following in academic circles. Students and colleagues described him as a brilliant, charismatic, and charming scholar who had fled Nazi Europe, where, he implied, he had been a member of the Belgian Resistance. A very different portrait would emerge from Evelyn Barish’s biography The Double Life of Paul de Man: an unrepentant con man—an opportunist, bigamist, and toxic narcissist who’d been convicted in Belgium of fraud, forgery, and falsifying records.

Age restriction: 0+
Release date on Litres: 30 June 2019
Length: 162 pp., 4 illustrations
ISBN: 9780008312794
Rights holder: HarperCollins
