The Data Conspiracy
November 2020
Social-media algorithms and message boards are fuelling a rise in misinformation and conspiracy theories. Although technology may eventually solve the problem, real societal harm may occur in the meantime. Popular conspiracy theories such as QAnon appear to be affecting consumer decision-making, including voting choices, health-care decisions (for example, whether or not to have vaccinations), and technology use (for example, some people have sabotaged 5G‑network equipment, believing it to be harmful).
Stakeholders have blamed data-driven social-media algorithms for feeding people conspiracy-theory-related content. YouTube's content-recommendation algorithm has come under particular criticism, and it certainly played a key role in spreading the flat-Earth conspiracy theory. In response, in 2019, the YouTube team reengineered its recommendation algorithm, adding a machine-learning system that learned to recognize (and demote) conspiratorial content. Researchers at the University of California, Berkeley, say that the release of the updated algorithm corresponded with an overall decrease in conspiracy content but that the platform still recommends such content to people who have previously shown an interest in it. Similarly, although Facebook has various measures in place to combat misinformation, a recent investigation by the Guardian found that Facebook's algorithms are still recommending QAnon content to people who indicate pro-Trump, anti-vaccine, and anti-lockdown preferences.
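The recognize-and-demote approach can be illustrated in miniature. The sketch below is purely hypothetical, assuming a recommender that scores candidate items for relevance and a separate classifier that estimates the probability an item is conspiratorial; the function names, scores, and penalty scheme are illustrative assumptions, not YouTube's actual system.

```python
# Hypothetical sketch: demoting likely-conspiratorial items in a
# recommendation ranking. The classifier here is a stand-in; a real
# system would use a trained model with human-reviewed labels.

def demote(candidates, conspiracy_prob, penalty=0.1):
    """Re-rank candidates, scaling down items the classifier flags.

    candidates: list of (item_id, relevance_score) pairs
    conspiracy_prob: callable item_id -> probability item is conspiratorial
    penalty: multiplier applied to the flagged share of an item's score
    """
    adjusted = []
    for item_id, score in candidates:
        p = conspiracy_prob(item_id)
        # Blend: full score when p == 0, heavily demoted when p == 1.
        adjusted.append((item_id, score * ((1 - p) + p * penalty)))
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)

# Toy usage with a stubbed classifier.
probs = {"flat_earth_vid": 0.95, "cooking_vid": 0.02, "news_vid": 0.10}
ranked = demote(
    [("flat_earth_vid", 0.9), ("cooking_vid", 0.7), ("news_vid", 0.8)],
    lambda item_id: probs[item_id],
)
# The flagged video drops to the bottom of the ranking even though it
# had the highest raw relevance score.
```

Note the design trade-off the Berkeley finding highlights: demotion shrinks a flagged item's score but does not zero it, so users whose relevance scores for such content are very high can still see it recommended.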
The charge that social-media algorithms are the root cause of the rise in conspiracy theories is too simplistic. Recent misinformation spread has often been via active sharing of content on message boards rather than via content views from recommendation systems. Factors such as online anonymity and the echo chambers that result from collections of like-minded people are clearly at play, as are many political and social factors that are well beyond the confines of social media and technology.
Conspiracy theories will likely become increasingly mainstream in the near future, as misinformation shifts from individual fake-news stories to broader narratives. Through algorithms and other features, social media will accelerate the trend as vendors' mitigation efforts fail to keep pace. Eventually, the algorithms that control information sharing may become smart enough to sort truth from falsehood (including within message boards), though perhaps not before misinformation causes real societal damage.
A rise in misinformation is a substantial threat to governments, corporations, and society in general. Already, many governments consider the rise of conspiracy theories and fake news to be a major security threat, and state-sponsored misinformation campaigns on social media may have already influenced the outcomes of elections. Conspiracy theories are also playing a role in societal health and may well influence product and brand decisions among consumers. Although many factors have contributed to the rise of fake news and conspiracies, big-data-enabled algorithms have clearly been part of the problem; however, algorithms that weed out conspiratorial content could also form part of the solution.