Barton Swaim has a beaut in the Wall Street Journal. He offers a perspective on our current nutty culture that nails it.
Here is his piece: "How Disagreement Became 'Disinformation'."
----------------------------------------
The preoccupation with “misinformation” and “disinformation” on the part of America’s enlightened influencers last month reached the level of comedy. The Department of Homeland Security chose a partisan scold, Nina Jankowicz, to head its new Disinformation Governance Board despite her history of promoting false stories and repudiating valid ones—the sort of scenario only a team of bumblers or a gifted satirist could produce.
Less funny but similarly paradoxical was Barack Obama’s April 21 address lamenting online disinformation, in which he propounded at least one easily disprovable assertion. Tech companies, the former president said, “should be working with, not always contrary to, those groups that are trying to prevent voter suppression [that] specifically has targeted black and brown communities.” There is no evidence of voter suppression in “black and brown communities” and plenty of evidence to the contrary, inasmuch as black and Latino voter participation reached record levels in the 2020 election.
One of the great ironies of American political life in the 2020s is that the people most exercised about the spread of false information are frequently peddlers of it. Their lack of self-understanding arises from the belief that the primary factor separating their side from the other side isn’t ideology, principle or moral vision but information—raw data requiring no interpretation and no argument over its importance. It is a hopelessly simpleminded worldview—no one apprehends reality without the aid of interpretive lenses. And it is a dangerous one.
The roots of this self-deceiving outlook are complicated but worth a brief look.
The animating doctrine of early-20th-century Progressivism, with its faith in the perfectibility of man, held that social ills could be corrected by means of education. People do bad things, in this view, because they don’t know any better; they harm themselves and others because they have bad information. That view is almost totally false, as a moment’s reflection on the many monstrous acts perpetrated by highly educated and well-informed criminals and tyrants should indicate. But it is an attractive doctrine for a certain kind of credentialed and self-assured rationalist. It places power, including the power to define what counts as “good” information, in the hands of people like himself.
There was also, among a host of intellectuals in the middle of the last century, the expectation of a “postpartisan” future of technocratic centrism in which the large ideological questions are mostly settled. What is mainly needed from the political process, the thinking went, isn’t visionary leadership but skillful management. Arthur M. Schlesinger Jr.’s “The Vital Center” (1949) is an expression of that outlook, as are John Kenneth Galbraith’s “The Affluent Society” (1958) and Daniel Bell’s “The End of Ideology” (1960). These writers wanted the cool control of experts, not the messy brawling of democracy, which they felt lent itself too easily to revolution. “The tendency to convert concrete issues into ideological problems, to invest them with moral color and high emotional charge,” Bell wrote, “is to invite conflicts which can only damage a society.”
The technocratic impulse is now an integral part of our politics. Those most given to it tend to view themselves not as adherents of any conception of political life but simply as people who acknowledge the world as it is. They regard differing outlooks as deviations from reality that can only cause trouble for no good reason. They believe their critics, who look at the same facts but draw different conclusions, aren’t simply mistaken but irrational, corrupt or both.
No politician deployed the rhetoric of technocratic postpartisanship more openly than Mr. Obama. In a 2007 speech to Google employees, early in his campaign for president, he expressed it concisely. “The American people at their core are a decent people,” he allowed. “There’s a generosity of spirit there, and there’s common sense there.” You could hear the “but” coming. “But,” he said, “it’s not tapped.”
He continued: “Mainly people—they’re just misinformed, or they are too busy, they’re trying to get their kids to school, they’re working, they just don’t have enough information, or they’re not professionals at sorting out all the information that’s out there, and so our political process gets skewed. But if you guys give them good information, their instincts are good and they will make good decisions. And the president has the bully pulpit to give them good information.”
The self-regard implicit in that observation is astounding. More important is its naiveté. The prevalence of bad information is nothing new. Lies, half-truths, wild exaggerations and farcical inventions are part of democratic politics and always have been. Mr. Obama’s remarks reveal a failure to understand that large, complex arguments always involve assumptions and philosophical commitments arising from background, experience and personality.
For him—and he shows no signs of change since he made those remarks 15 years ago—politics is a simple Manichaean struggle in which the righteous and well-intentioned use good data, and the malign and ignorant use bad. Mr. Obama’s most ardent admirers, accordingly—I think of the founders of the “explainer” site Vox.com—view themselves not as proponents of a particular ideological conviction but as disseminators of good data.
It was during the Obama years, not coincidentally, that “fact checking” took firm hold in American journalism. This doesn’t refer to the old-fashioned practice of scrubbing an article for errors before publication. Instead, media organizations issue “fact checks” of statements by public officials, candidates and pundits. Websites such as Snopes.com, PolitiFact.com, FactCheck.org and the Washington Post’s Pinocchio-issuing Fact Checker consider themselves America’s arbiters of truth.
But what looked like a renewed appreciation for factual accuracy quickly became, as this newspaper’s James Taranto pointed out relentlessly, an easy way to lend peremptory authority to badly argued opinion pieces and to undermine defensible arguments as “false” or “mostly false” or “lacking context.” In many instances these allegedly scrupulous fact-checkers would count true statements “false” even as they conceded the statements were true.
During a 2020 presidential debate, to take one memorable instance among hundreds, Joe Biden claimed the Obama administration hadn’t separated children from parents caught illegally crossing the border from Mexico. CBS’s fact-checking unit then published a piece claiming Mr. Biden’s statement was “true” on the grounds that “the Obama administration only separated migrant children from families under certain limited circumstances.”
The fact checkers’ prestige had begun to wane years before the 2020 election, but the belief that our direst social and political ills stem from the circulation of false information hasn’t lost its appeal among opinion makers. A report published in 2021 by New York University’s Stern Center for Business and Human Rights recommended a federal “Digital Regulatory Agency” to police online content. A New York Times technology columnist wrote favorably of recommendations by “experts” that “the Biden administration put together a cross-agency task force to tackle disinformation and domestic extremism, which would be led by something like a ‘reality czar.’ ” Meet Czarina Jankowicz.
Censorship, to adapt a phrase Mr. Obama is fond of, is an idea whose time has returned. A quarter-century ago the word “censorship” was almost a profanity in American politics. By the mid-2010s it was permitted, even praised, so long as it targeted heterodox thought. Speakers on college campuses were shouted down without a word of protest from people who in the 1980s had defended the public funding of sacrilegious photographs. Commentators in mainstream journals of opinion advocated the reinstatement of the Fairness Doctrine, which required broadcasters to present both sides of controversial issues and had the effect of chilling debate on every contentious question. A large number of respected academics and intellectuals suddenly believed the U.S. government had a duty to stop people from saying things those same academics and intellectuals held to be factually inaccurate.
Skeptics mostly attribute this new support for censorship to bad faith. I prefer a more charitable explanation. The new censors sincerely mistake their own interpretations of the facts for the facts themselves. Their opinions, filtered unconsciously through biases and experience, are, to them, simply information. Their views aren’t “views” at all but raw data. Competing interpretations of the facts can be only one thing: misinformation. Or, if it’s deliberate, disinformation.
It is in many ways a strange outcome. From the 1970s to the early 2000s, academic philosophies associated with “postmodernism” coursed through American higher education. They held that there was no objectively knowable truth, only subjective interpretation. As if to demonstrate postmodernism’s total impracticality, yesterday’s straight-A college students have now retreated into a risibly facile non-philosophy in which there is no interpretation, only objective “fact.”
Such was the mental disposition of America’s enlightened politicos and media sophisticates when the pandemic hit in early 2020. The challenge of public policy, as they saw it, was not to find practical, broadly acceptable solutions. The challenge, rather, was to find and implement the scientifically “correct” solution, the one endorsed by experts. Sound policy, for them, was a matter of gathering enough data and “following” it.
But of course you can’t follow data. Data just sits there and waits to be interpreted.
When Covid-19 came ashore, the country’s political class, in thrall to the authority of public-health experts and the journalists who listen to them, was singularly ill-equipped to lead in a sensible way. What the pandemic required was not the gathering and mastery of information and the quick implementation of “data driven” policy. The data was wildly elusive, changing shape from day to day and yielding no obvious interpretation. No one understood the spread of this astoundingly resilient virus, least of all the experts confidently purporting to understand it. There was, in fact, no clinically correct response.
The situation called for the acknowledgment of risk, the weighing of costs against benefits, the clear declaration of reasonable compromises between competing interests. What happened was an exercise in societal self-ruin—in the U.S. and elsewhere in the developed world. Politicians, especially those most inclined to see themselves as objective, pro-science data-followers, ducked accountability and deferred to experts who pretended to have empirically proven answers to every question put to them. They gave us a series of policies—business shutdowns, school closures, mask mandates—that achieved at best minor slowdowns in the disease’s spread at the cost of tremendous economic destruction and social embitterment.
With the two-year pandemic response now all but over, what stands out most is the absence of any acknowledgment of error on the part of anyone who advocated these disastrous policies. There is a reason for that absence other than pride. In the technocratic, data-following worldview of our hypereducated decision makers, credentials and consensus are sure guides to truth, wisdom is nothing next to intelligence, and intelligence consists mainly in the ability to absorb facts. That mindset yielded a narrow array of prescriptions, which they dutifully embraced, careful to disdain alternative suggestions. They can hardly be expected to apologize for following the data.