Trolls' and Bots' Online Unreality
Featured Pattern P0856, December 2015

Author: Guy Garrud

Propaganda, algorithmic rankings, and artificial-intelligence bots can create a biased and fake reality on the internet.

Abstracts in this Pattern:

Open-source intelligence (information gathered from publicly available sources) is widely used by government intelligence agencies. However, according to intelligence experts, relying on public data sources such as social-media websites and blog posts is becoming increasingly difficult. Russia's so-called troll army (see SC-2015-06-03-089), a group of pro-Russian commentators, and similar groups in other countries post opinions and biased information on media sites. These efforts diminish the reliability of open-source intelligence.

Some systems are susceptible to bias even when no one is attempting deliberate manipulation; bias can emerge simply as a by-product of how the systems are used. For example, researchers at the American Institute for Behavioral Research and Technology (Vista, California) found that biases in internet-search rankings could substantially influence the views of undecided voters during elections. Most people tend to use only high-ranking results or to assume that high-ranking results are more reliable than low-ranking ones. The study found that deliberately biasing search results could shift the voting preferences of undecided voters by as much as 20%, which means that search-engine providers could wield significant political influence. The researchers argue that this form of manipulation is especially effective because it is not apparent to the people it affects.

Extramarital-dating-service provider Ashley Madison (Avid Life Media; Toronto, Canada) created a completely different type of unreality. In August 2015, hackers published large quantities of user data stolen from the Ashley Madison website. Researchers investigating the data leak found that Ashley Madison had deployed a large number of algorithmic bots that posed as human women and conversed with customers, encouraging them to buy premium services. According to technology journalist Annalee Newitz, company emails indicate that "80 percent of first purchases on Ashley Madison were a result of a man trying to contact a bot, or reading a message from one." If an interaction with a user progressed, a real person assumed control of the automated account; the company relied on algorithmic conversations for the entire initial-contact phase to reduce costs and effort.
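
The investigation did not publish the company's code, but the handoff pattern described above (scripted automation for first contact, with a human operator taking over once a user engages) can be sketched roughly as follows. Every name, message, and threshold in this Python sketch is a hypothetical illustration, not anything recovered from Ashley Madison's systems.

import random

# Assumed for illustration: number of user replies before a human takes over.
HANDOFF_THRESHOLD = 3

# Hypothetical canned lines a bot might send during the initial-contact phase.
SCRIPTED_OPENERS = [
    "Hi there, I liked your profile!",
    "Are you online right now?",
    "Upgrade to premium so we can keep chatting.",
]

class ChatAccount:
    """A fake profile that starts out fully automated."""

    def __init__(self, profile_name: str):
        self.profile_name = profile_name
        self.replies_received = 0
        self.human_operator = None  # None means the bot is still in control

    def next_message(self) -> str:
        """Send a scripted line until the conversation has 'progressed'."""
        if self.human_operator is not None:
            return f"[{self.human_operator} is typing a manual reply]"
        return random.choice(SCRIPTED_OPENERS)

    def record_user_reply(self) -> None:
        """Count replies; hand the account to a person once the user is engaged."""
        self.replies_received += 1
        if self.replies_received >= HANDOFF_THRESHOLD and self.human_operator is None:
            self.human_operator = "operator_42"  # placeholder for a real agent

if __name__ == "__main__":
    account = ChatAccount("Sandra_1987")
    for _ in range(4):
        print(account.next_message())
        account.record_user_reply()

In this sketch the first few messages come from the scripted list; once the reply count passes the assumed threshold, subsequent messages are flagged as coming from a human operator, mirroring the bot-to-person handoff the abstract describes.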