Reflections on Cambridge Analytica and Facebook (May 2018)
The now-defunct Cambridge Analytica used the data of at least 50 million Facebook users without their permission and in violation of the laws of some countries. Following media coverage of the incident, Facebook CEO Mark Zuckerberg faced two days of questioning by US politicians. Senator Lindsey Graham asked Zuckerberg about Facebook's competitive position. Zuckerberg responded by saying that the average US citizen uses eight apps to communicate (but did not say that Facebook owns several of them). Graham then asked, "You don't feel you have a monopoly?"
An online campaign, #DeleteFacebook, encouraged users to delete their Facebook accounts. A US survey by Creative Strategies found that, of the respondents who were aware of the Cambridge Analytica story (76%), 9% had deleted their Facebook accounts and many others had cut back on their Facebook usage.
For some people, the Cambridge Analytica incident has provided a wake-up call about privacy. Even if users do not follow through on their claims that they will cut back on Facebook usage (or, in some cases, stay off Facebook altogether), the incident has increased public awareness about tech firms' use of personal data.
By itself, the Cambridge Analytica incident will probably not be enough to change consumers' overall privacy-related behavior, which, despite any claims people make, still typically suggests that people are happy to trade their data for free services. But the development might contribute to a behavioral shift over time, as privacy incidents continue to mount (for example, Facebook is already facing a separate class-action lawsuit over face recognition) and as the downsides of sharing personal data become clearer to average consumers.
The Cambridge Analytica incident is also significant because it raises questions about the competitive position of some data-centric firms. In particular, are some companies' data holdings becoming so large and valuable that they are creating systemic risks for businesses and society? Is too much valuable data under the control of too few organizations—a case of too many eggs in too few baskets? Perhaps the data vaults of certain firms are becoming such attractive targets for wrongdoers that problems like the Cambridge Analytica incident are inevitable.
A key issue is whether major data firms can provide data governance on a massive scale. Some of Facebook's failings seem to be the result of governance practices that have been unable to keep pace with the company's growth. Facebook apparently learned of the Cambridge Analytica problems through news sources rather than through its own procedures. In 2017, a Guardian investigation revealed that Facebook's content moderators (who monitor and censor violence, hate speech, terrorism, pornography, racism, and self-harm on the site) often have "just 10 seconds" to evaluate content against complex guidelines. Although future artificial intelligence may automate some aspects of data governance, such technologies are still years away from being foolproof. In the meantime, the trust that individuals, businesses, and lawmakers place in major data firms could decline. The March 2018 Explorer Viewpoints on Big Data mentions the possibility that the balance of power over personal data will shift away from corporations and governments and back to consumers, perhaps enabled by changing regulations such as the European Union's General Data Protection Regulation and by new data technologies, including blockchain.