We hear a lot about fake news – it’s widely recognized as a problem not just for individual consumers of media but for democratic societies as a whole. What is fake news, and how should we respond to it? It is helpful to understand fake news as part of a bigger problem we can call media manipulation. Laws and policies may help to counter this and, in some cases, will be necessary. But, at best, they offer incomplete solutions. More important is to understand the sort of people we need to become to inoculate ourselves against this form of manipulation.
Questions like these – about the sort of people we are and need to become – are questions about moral character. To respond well to the problem of fake news, we need a sharper understanding of moral character (and, as it turns out, the problem of fake news can help us understand better what moral character is). But before approaching these questions, it will first help to understand what fake news and media manipulation are. It will also help to look at several proposed solutions to the problem and why they are incomplete and, in some cases, worrying. This will be the focus of what follows. The next essay in this series will focus on questions of moral character, providing a philosophical framework on which to build a solution.
If we understand fake news as “false stories that appear to be news” and news as “verifiable information in the public interest,” then fake news is an oxymoron. What we often refer to as fake news is really just disinformation, as a UNESCO handbook for journalists points out. This disinformation is also typically intended to influence political views. Part of what makes fake news so pernicious is that it is disinformation meant to control public opinion, and it does this by controlling the beliefs of its individual consumers. In other words, its intention is not to inform the public but to manipulate it.
Russian interference in America’s 2016 presidential election is now a classic example of this. A more recent example is Beijing’s attempts to undermine the credibility of protesters and sway public opinion throughout the Hong Kong protests of summer 2019. In addition to the standard tactics of the political propagandist, this included spreading false or distorted information through as many as 200,000 Twitter accounts that masqueraded as individual users but were in fact controlled and coordinated in a state-backed operation. This kind of manipulation and intent to control damages democracy, and it disrespects citizens. It prevents citizens from making reasonably informed decisions; it bypasses their consent and uses them as instruments for broadly political purposes.
However, information need not directly misinform to be manipulative. There is a large gap between straightforward disinformation and responsible journalism. When we talk about fake news, we sometimes mean to include reporting that does not meet the standards of responsible journalism – reporting that is heavily biased, withholds important information, or otherwise distorts the truth, again often with the intention to influence political views. Partial truths can be as harmful as lies, and news that doesn’t directly disinform can still be manipulative.
Nowadays, this manipulation often comes in the form of emotionally charged content and headlines designed to make us share that content. The emotion communicates a sense of urgency and importance, causing us to feel as though the world must know what we know (or what we believe we know). All the better if we can have a ‘moral experience’ along the way, by publicly identifying with what we take to be a just cause. These shares increase web traffic and ad revenue for the companies that create that content and run the platforms we share it on.
When we do this, though we might feel like we are in control, we are simultaneously being used as instruments for financial gain. Sharing a list like “Ten Reasons Everyone Should be Furious About Trayvon Martin’s Murder” may make you feel as though you’re doing something good, but the reason you’re reading it in the first place is that it makes someone else (and not Trayvon Martin’s family) money. As Jonah Peretti, cofounder and CEO of Buzzfeed, once explained in a pitch to venture capitalists: “raw buzz is automatically published the moment it is detected by our algorithm” and “the future of the industry is advertising as content.” This, along with other broader forms of media manipulation, is the general problem of which fake news is a part.