By Michelle Dean
When I was at university, some 20 years ago, learning about the world was a tactile, toilsome sort of experience. I got up in the morning, put on my boots
and, yes, trudged through snow to the library. I went through the stacks and pulled down texts older than I was. I don’t remember ever wondering how authoritative those texts might have been. After all, here they were, in a library — and a library, we were taught, was the High Church of “credible” authority. People who published books were assumed to be correct. They had proved themselves worthy of it, had cleared hurdles to put their words and ideas into wide circulation.
Using the internet as an authority for any claim — in a paper, even in conversation — would have gotten you laughed out of the room back then. The internet, in 1998, in 2000, even in 2005, was not real life; it carried little weight in conversations about politics or economics. The starry-eyed prognosticators of the time, figures like John Perry Barlow and Lawrence Lessig, insisted that one day it would. “We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity,” Barlow wrote, stirringly, in his 1996 Declaration of the Independence of Cyberspace.
Barlow was not exactly wrong. We do live in a world where anyone, anywhere, can express his or her beliefs, including some alarmingly singular ones. If there is one thing the last year of public American life has taught us, it is that the public sphere is now definitively a free-for-all. Authority has become something diffuse and flammable, like spray paint. Type out an opinion, lob it into the network and you might be able to plaster yourself all over the world. You may also be on fire, but at least you’ll have an audience.
In 2012, the reality-TV star Donald Trump tweeted that “An ‘extremely credible source’ has called my office and told me that @BarackObama’s birth certificate is a fraud.” We didn’t know it at the time, but this was a flag flown from our new state of normal, where the incredible would suddenly become — for some — credible. Now, years later, the Steele dossier, with its raw intelligence on President Donald Trump and Russia, is alternately deemed a document that is “more credible than ever” (Washington Monthly, last fall) or one whose “credibility is collapsing” (The New York Post, last month). In corners of the right, some of the nation’s top law-enforcement mechanisms are said to suffer from a “credibility crisis” (Daily Caller on the FBI) or a “credibility problem” (The Wall Street Journal on Robert Mueller). Allegations about the misconduct of notable men — Al Franken, Roy Moore — are, former allies announce with disappointment, “credible.” The stories of the early Trump administration in Michael Wolff’s “Fire and Fury” struck some people as obviously believable; Wolff himself struck some as the opposite. When Trump dismissed the book as full of lies, the author responded that “my credibility is being questioned by a man who has less credibility than, perhaps, anyone who has ever walked on earth.”
Everything is an argument in a way it didn't use to be, every article your friends and relatives link to a declaration of rhetorical war. Day after day, comment after comment, we live in a state of fractious, troubling doubt. The internet has always been a cozy home for partisans and pedants, conspiracists and crusaders, but gradually, their spirit has crept into the rest of our lives. The president conducts affairs of state over Twitter. More important, he has somehow taken up the chaos of the internet as a defining ethos of his administration, delighting in his ability to blithely shift course and keep people guessing. One day he is said to have described several foreign nations in foul-mouthed terms; the next day, people who were present deny it, or insist he used a different vulgarity to do it. The business of the country is now conducted like an argument on an unmoderated internet message board — an unceasing thread of squabbles, reversals and revisions.
A recent editorial in Foreign Affairs articulated the “credibility gap” this habit has created. In the prim terms of international relations, this is no small thing: Being seen as likely to act on your word is the basis of every threat, every diplomatic agreement between nations, and for America to abandon that credibility is a grave loss. During the Cold War, “credible” was a mainstay of nuclear-defense chatter, where it could be used to describe any weapon that actually worked. (If your weapon is operational — and you have the capacity to defend yourself against retaliation — you are said to have a “credible first-strike capability,” a point of leverage possibly more important than the missiles themselves.)
Credibility matters at home too. In the midst of bedlam over the three-day government shutdown, Sen. Lindsey Graham complained of the administration’s moving-target negotiations for a border wall, which seemed to cost $18 billion one day and $33 billion the next. Graham, addressing a group of reporters, sputtered in disappointment: “That’s just not credible.”
Credibility really is kind of metaphysical: We have to take a little leap of faith to get there. The main root of “credible,” after all, is the Latin word for “belief.” Aristotle thought the best way to inspire such faith was to seem like a good person: “Persuasion is achieved by the speaker’s personal character,” he wrote in “Rhetoric,” “when the speech is so spoken as to make us think him credible.” The audience is inclined to trust someone it already thinks well of.
People do not think particularly well of politicians — at least not anymore. Blame for this state of affairs is often placed on the Watergate scandal, but it’s not exactly clear that America trusted its politicians before that. A decade earlier, the phrase “credibility gap” was used to describe the difference between the Johnson administration’s assurances about Vietnam (the end was in view) and the reality on the ground (it was very much not). Before that, many Americans were prepared to believe Sen. Joseph McCarthy’s claims that the government had been overrun with secret Communists.
What Watergate did was provide a blueprint of just how craven real-life political intrigue could get — how credible it was that, say, a president might tape-record himself planning to interfere with an FBI investigation. But discovering the details required questioning dozens of people. Was the testimony of former White House counsel John Dean credible? In May 1974, Sen. Lowell P. Weicker, a Republican, answered that question in the affirmative, telling The New York Times that the real question was “the credibility of the president of the United States.” But even today, the website of the Nixon Foundation features a long, somewhat hyperbolic argument attacking Dean’s “credibility.”
Now, each seismic cultural event sends information flying around the internet with such force and velocity that no individual could possibly sort through it all. Mass-casualty events in particular are flash points of informational chaos, with reporters both professional and amateur sorting through rumors and snippets of dialogue from traumatized people and shaky video clips and alarmed tweets and claims of a second gunman, trying to figure out what to believe. When a killer leaves behind little trace of his motives — as in Las Vegas last year — anyone can step up and try to fill the void with a theory. Just last month, Rep. Scott Perry, a Pennsylvania Republican, said he had received “credible evidence, credible information, regarding terrorist infiltration through the Southern border” connected with the shooting. The fact that few news outlets gave this claim any credence is as remarkable as the claim itself: We can now look at a stunning statement from an elected lawmaker, immediately discount its credibility and move casually along.
Right now, lawmakers are doing the same among themselves, batting around claims about a private memo, compiled by the Republican chairman of the House Intelligence Committee, containing either shocking revelations about abuses of government surveillance or misleading, politically motivated nonsense. And yet so many ideas that once seemed to live on the fringes of credibility can now take center stage. There has indeed been something liberating in this new informational anarchy: It is easier than ever to address the world and say something true. But it is equally easy to tell the world something false. I remember what it was like before — being in a brightly lit library, feet on the floor, books in front of me, the High Church of authority very much intact. There seemed to be such clear limits on what was worth believing and what wasn’t. This is precisely the opposite of how it feels now, scrolling through the news each morning, the incredibleness of things screaming at you before you’ve even had coffee.
You might, for instance, be sitting at breakfast when your phone shakes and announces, via the most credible of official channels: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.” Thirty-eight minutes later, it will emerge that it was a mistake; someone just clicked the wrong thing. Misinformation can come from anywhere, even the authorities we considered the most credible of all.
Michelle Dean is the author of “Sharp: The Women Who Made an Art of Having an Opinion,” which will be published in April.