Trolls' and Bots' Online Unreality
Featured Pattern P0856, December 2015
Abstracts in this Pattern:
Open-source intelligence, information gathered from publicly available sources, is widely used by government intelligence agencies. However, according to intelligence experts, drawing on public data sources such as social-media websites and blog posts is becoming increasingly difficult. Russia's so-called troll army (see SC-2015-06-03-089), a group of pro-Russian commentators, and similar groups in other countries seed media sites with opinions and biased information. These efforts diminish the reliability of open-source intelligence.
Some systems are susceptible to bias even when no one is attempting deliberate manipulation; in such cases, bias emerges as a by-product of the systems' ordinary use. For example, researchers at the American Institute for Behavioral Research and Technology (Vista, California) found that biases in internet-search rankings could substantially influence the views of undecided voters during elections. Most people use only high-ranking results or assume that high-ranking results are more reliable than low-ranking ones. The study found that deliberately biasing search results could shift voting preferences among undecided voters by as much as 20%, meaning that search-engine providers could wield significant political influence. The researchers argue that this form of manipulation is especially effective because it is invisible to the people it affects.
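The mechanism behind this finding, position bias, can be illustrated with a minimal sketch. The model below is purely hypothetical (it is not the researchers' methodology): it assumes each successive result receives a fixed fraction of the attention given to the result above it, then shows how reordering the same ten results changes which candidate an undecided voter mostly reads about.

```python
# Hypothetical illustration of position bias in search rankings.
# Assumption: attention decays geometrically with rank (decay factor 0.5).

def attention_share(ranking, decay=0.5):
    """Weight each result by an assumed geometric position bias and
    return the fraction of total attention each candidate receives."""
    weights = [decay ** i for i in range(len(ranking))]
    total = sum(weights)
    share = {}
    for weight, candidate in zip(weights, ranking):
        share[candidate] = share.get(candidate, 0.0) + weight / total
    return share

# The same five-versus-five result set, ordered two different ways.
alternating = ["A", "B", "A", "B", "A", "B", "A", "B", "A", "B"]
biased      = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

print(attention_share(alternating)["A"])  # about two-thirds of attention
print(attention_share(biased)["A"])       # nearly all attention
```

Under these assumed weights, simply moving one candidate's pages to the top, without adding or removing any results, raises that candidate's share of attention from roughly two-thirds to nearly all of it, which is why such biasing is invisible to users: the content itself never changes.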
Extramarital-dating-service provider Ashley Madison (Avid Life Media; Toronto, Canada) employed a very different type of unreality. In August 2015, hackers published large quantities of user data stolen from the Ashley Madison website. Researchers investigating the leak found that Ashley Madison had deployed a large number of algorithmic bots that posed as human women and conversed with customers, encouraging them to buy premium services. According to technology journalist Annalee Newitz, company emails indicate that "80 percent of first purchases on Ashley Madison were a result of a man trying to contact a bot, or reading a message from one." If an interaction with a user progressed, a real person assumed control of the automated account; the company relied on algorithmic conversation throughout the initial-contact phase to reduce costs and effort.
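The bot-to-human handoff pattern described above can be sketched as a simple state machine. Everything in this sketch is an assumption for illustration (the class, the threshold, and the routing rule are invented here, not Ashley Madison's actual system): the automated account handles the cheap initial-contact phase, and the conversation is escalated to a human operator only once the user engages.

```python
# Hypothetical sketch of a bot-to-human handoff: a bot handles first
# contact, and the conversation escalates to a human operator once the
# user has replied enough times. Threshold and names are assumptions.

HANDOFF_THRESHOLD = 2  # assumed: escalate after this many user replies

class Conversation:
    def __init__(self):
        self.user_replies = 0
        self.handler = "bot"

    def on_user_message(self, text):
        self.user_replies += 1
        if self.handler == "bot" and self.user_replies >= HANDOFF_THRESHOLD:
            self.handler = "human"  # a real person assumes control
        if self.handler == "bot":
            return "scripted opener encouraging a premium upgrade"
        return None  # no automated reply; route to the human operator

convo = Conversation()
convo.on_user_message("hi")            # bot still replying
convo.on_user_message("tell me more")  # escalates: handler is now "human"
```

The economic logic of the design is visible in the sketch: automated replies cost nothing until a user demonstrates engagement, so human labor is spent only on the small fraction of conversations that progress.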
The Development of this Pattern
Russia's so-called troll army, which comprises a group of pro-Russian commentators, and similar groups in other countries leave opinions and biased information on media sites.
Researchers at the American Institute for Behavioral Research and Technology found that biases in internet-search rankings could have a substantial influence on the views of undecided voters during elections.
Extramarital-dating-service provider Ashley Madison set up a large number of algorithmic bots that acted like human women and conversed with customers, encouraging them to buy premium services.
Trolls' and Bots' Online Unreality
Propaganda, algorithmic rankings, and artificial-intelligence bots can create a biased and fake reality on the internet.
- SoC102 — Spin, PR, and the Evolving Media Landscape (April 2005)
Public-relations professionals are using new media and new methods to advance their causes. Consumers are using the same media to engage some PR folks in a battle of wits for the high ground and for an advantageous position in the evolving information landscape.
- SoC167 — Media Manipulators (April 2006)
Cyberspace is experiencing an increasing number of battles as vested parties manipulate, to their advantage, the open environment for which the internet is famous. Many of these developments go beyond marketing or advertising because they involve the manipulation of forums that depend on objectivity for their value to the consumer.
- SoC222 — Authenticity on the Internet (February 2007)
One of the original developers of the World Wide Web warned in a recent talk that the Web is in danger of becoming "a place where untruths start to spread more than truths" and spoke of "the risks associated with inaccurate, defamatory and uncheckable information." He believes that "technology must help us express much more complicated feelings about who we'll trust with what."
- P0021 — Focusing the Point of Decision (February 2010)
As busy consumers of information rely increasingly on electronic connections to make decisions, providers potentially could become powerful sources of decision-making manipulation.
- P0437 — Network-Information Veracity (January 2013)
New applications could provide methods of ensuring the accuracy and reliability of information on the internet.
- SoC696 — Hive Mindfulness (December 2013)
Many websites that invite users to post content and commentary have features that allow users to rank one another's contributions in some way, resulting in a digital hive mind.
- P0639 — The Truth Is Out There (June 2014)
Novel approaches have the potential to separate fact from fiction in the online world.