Down the Rabbit Hole
UMass experts cast a clear eye on fake news and our disorienting media landscape.
A man storms into a pizza parlor carrying a gun. He is convinced that there is a Satanic child-trafficking ring in the basement—but the restaurant doesn’t have a basement.
“Pizzagate” is just one example of what has come to be known as fake news—disinformation, often politically motivated, that goes viral through the charged medium of the internet and, however far-fetched, can have real-life consequences.
Fake news has taken hold at a time when people have actually become more savvy in their evaluation of mainstream media—sensitive to bias, aware of how advertising money might influence editorial content. Such critical awareness is a good thing—but in their skepticism, many media users go so far as to doubt any mainstream source and embrace alternative sources simply because they are outside the mainstream, without subjecting them to the same scrutiny.
At the moment, the very concept of reliable sources is in question. Where does the buck stop? Whom do you ultimately believe when they tell you what is real? And when some suspect even the go-to fact-checking site Snopes of being compromised by a hidden agenda, how can the public agree on anything enough to move forward? When everything is suspect, how do you know what facts to trust?
UMass Amherst’s communication, research, and news literacy specialists address the big picture of why this rapidly shifting media landscape has come into being and how to get one’s bearings within it.
Times We Are In
The democratization of news platforms is positive in many ways. People are more individually empowered, and there is no longer a dominant perspective in the form of a fatherlike Walter Cronkite dispensing information to the populace. A blog you create at your desk is as shareable as CNN. Writers and photographers do not need to belong to a news organization or even have the approval of an authority to “get published.” “In the digital era, everything has been flattened,” says Steve Fox, senior lecturer in journalism who has many years of experience as an editor and reporter, including 10 years at the Washington Post. “There is no barrier to entry anymore.”
“The new landscape allows for new individual voices to be heard, and brings to the surface previously swept-under-the-rug views and issues that had been marginalized,” reflects undergraduate education librarian and doctoral student Kate Freedman ’10G. “And you have things like Twitter, where hashtags allow people to find each other.” Freedman, who teaches a course on information literacy, puts the fake news phenomenon in the larger context of media history. She compares the rise of new media to the advent of cheaper printing in the 18th and 19th centuries, which allowed for pamphleteering and also brought its own social convulsions—as well as plenty of disinformation and calumny.
Blocking accounts and blacklisting fake news sites does not treat the underlying causes.
Jonathan Corpus Ong
The Human Element
While it’s easy to blame social media, much of what allows disinformation to go viral lies in the myriad intricacies and impulses of human psychology behind clicking “Share.”
“People ask, ‘What is the motivation behind fake news?’ ” muses Lisa Di Valentino, law and public policy librarian at the W.E.B. Du Bois Library. Di Valentino created the library’s “fake news” resource page, an aggregation of tools that support news literacy. “There are so many different motivations, from the person who writes it, the person who shares it, and the person who believes it. So there is no one way to stop fake news.”
There is the desire simply to pass on information you feel might be beneficial to your friends in the moment: “People think they’re helping to inform,” says Di Valentino, “but they haven’t looked into what [the item] is or done any critical thinking about it.”
Freedman sheds even more light on this phenomenon: we are likely to propagate disinformation because of trust in our own network—both a benefit and vulnerability in social animals. “People are less likely to be critical of something shared by a friend,” she relates. “You believe the things that people you know tell you to believe. That’s just how human brains work. When somebody you know shares a news article, says something political, or generally gives an opinion, people are more likely to not be critical of it as a source, because they want to trust their friends. So that creates an echo.”
Another motivator, one that particularly sparks the sharing of conspiracy theories, observes Di Valentino, is “that people want to be Neo,” the hero of the film The Matrix. “They want to have taken the red pill, to be one of the elite, not a ‘sheeple.’ They think, ‘I want to be one of the people that knows the Truth.’”
The sharing of disinformation becomes a problem when it starts to influence behavior on a wider scale and carries over from the theoretical into the real. “If you repeat something enough times, people will believe it,” says Fox, who teaches a news literacy course in the journalism department. “What is scary is when things get repeated that have no basis in fact”—such as accusations that survivors are crisis actors, or claims that real-world events never happened. Such innuendo can incite aggressive people to harass others named in a story. “It’s not just for entertainment anymore. It becomes a serious thing,” says Di Valentino.
Reporting on the “troll factories” in Russia—offices where digital workers spend hours posting and boosting content online—has revealed an actual political strategy to create chaos by amplifying stories that create negative, polarized feelings in a populace. And that’s where fake news gets even more intricate: how many of the personae you see commenting on stories are actual people? How many of the “likes” and shares that create a “trending” status are bots or paid operatives?
Disinformation Society
Jonathan Corpus Ong, associate professor of communication, has documented the “everyday, ordinary work of disinformation” in the Philippines, where freelance PR executives and advertisers lead disinformation campaigns. Ong’s case study sheds light on the dynamics behind the seeding of fake news around the world.
In a report coauthored with Jason Cabañes from the University of Leeds, Ong documented interviews with public relations strategists who revealed a morally shadowy digital underground. Using the same methods with their political clients as they use with their corporate accounts, strategists make hashtags trend and release divisive stories—such as innuendo about politicians—into the social media bloodstream. Such digital campaigns begin innocuously, with fake personae pushing content about something as neutral as a beauty pageant or a pop culture fan account. Eventually, operators begin slipping in paid political content. To amplify the numbers, clients pay “click armies” to make a hashtag surge or boost the signal of a story, sometimes gathering teams of influencers for weekend hackathons in five-star suites or rented mansions inside gated villages. Because of these luxe surroundings, the operatives “never feel like seedy workers,” explains Ong. “They’re in a fancy place, so it legitimizes their operations.”
Ong is careful in his report to identify hierarchies in fake news dissemination in the Philippines. First come the chief architects of disinformation: elite ad and PR strategists who apply their commercial techniques and report directly to political clients. After them are the digital influencers, both opinion leaders and anonymous accounts that post things like inspirational quotes. Beneath them are the lower-level workers hired to push the numbers and make articles trend, creating a false image of public opinion that then goes on to shape actual public opinion, and then reality—such as electoral outcomes.
Is there concern in the Philippines about the effects of these campaigns? “There is, from strong-willed journalists and previous political leaders—what we would consider the liberal elite,” replies Ong. “But younger people are less critical of influencer culture.” And, he adds, the campaigns are only fanning the flames of pre-existing economic and social discontent.
Look Past the Symptom to the Cause
The rabbit hole didn’t dig itself: truly stemming the tide of disinformation means trying to understand the social factors that give rise to fake-news creators. A disinformation industry is a sign that something is already out of balance. “Blocking accounts and blacklisting fake news sites does not treat the underlying causes,” insists Ong. In the Philippines, he points out, digital work is celebrated as an opportunity to get oneself out of poverty. “The work of disinformation is just a sideline. People do it on top of a main job, without a real awareness of professional ethics,” says Ong. “We need to understand the motivations of information disorder agents and the systems in place that professionalize their work. Before we talk about them as trolls, let’s talk about them as workers. What are the financial incentives?”
Once the vulnerabilities within particular industries are identified, they can be addressed. Rather than letting creative workers slip into the digital underground, for example, societies need an economy and communities that support them. Digital disinformation is an unexpected symptom of a weak economic structure that makes people willing to do work that compromises their integrity—and, like a disease gone awry, the symptom further attacks and weakens the organism.
Mastering the Media
Countering disinformation involves knowing how information is created. It also demands awareness of one’s own mind-set and deepening responsibility for our actions and reactions. The UMass Amherst experts consulted for this article are all careful to distinguish between healthy skepticism and utter, unbridled cynicism. “I think you can be skeptical but still positive about the state of the world,” reasons Fox. “To say, it’s not that I don’t believe this, but I want to be sure before I share it with somebody.”
On a practical level, media users can avail themselves of resources such as the Du Bois Library’s fake news fact-checking flow chart. “Librarians have done a lot of thinking about these information structures,” explains Freedman. “What we can tell you is not what to believe, but how this information ecosystem is constructed and how to think about it critically.”
And, beyond relying on fact-checking sources, participants can hone their own intuition. “I ask, what’s most likely?” says Di Valentino. If an item seems too fanciful, wish-fulfilling, or apocalyptic to be true—physicians recommending chocolate cake for breakfast, sharks swimming the streets of Lower Manhattan after Hurricane Sandy—it probably is. Items that seek to elicit an emotional response are also suspect. Is it news or is it opinion? Where is the news item truly coming from, and what does it want of you? Is it trying to get you to share it by stoking emotions such as outrage and alarm? Slow down, just for a moment, and take a breath before you share.
In an atmosphere in which a healthy mode of critical engagement with media content is being amplified into paranoia and twisted toward dismissing mainstream media altogether, Ong urges teachers of media literacy to find new, supple, and imaginative ways of engagement that strike a balance between skepticism and credulity. “Criticism also involves appreciation,” states Ong. “Instead of dismissing media altogether, we also should recognize the powerful outcomes of media, such as advancing new forms of cultural understanding and representation, instead of saying that just because everything is owned by capitalist entities, it therefore serves hegemonic interests.”
Fox is not quite as measured: “Who is ‘the media’ anyway?” he asks impatiently—and asks us to ask ourselves. “The phrase has been so overused as to mean almost nothing.” In this new information landscape, we are all the media, each one of us an information propagator. Users are more than just consumers; we are now creators. And with our new sophistication and sphere of influence comes responsibility: in this case, for making sure the information we share is accurate.
And, in closing, keep in mind Ong’s advice: “I don’t trust any hashtag that goes number one in the Twittersphere.”
Illustrations by Brian Stauffer