Sites like Facebook will have more and more influence over our elections in the future.
America's favorite uncorroborated news story of the moment is that the Russian government masterminded Trump's rise to power. It's easy to understand why. Introspection after a loss is difficult, and rather than face themselves, the DNC decided to hold a seance, invoking a Cold War ghost to explain their defeat. It's somewhat comforting to assume an international conspiracy was behind Hillary Clinton's failure in the 2016 election; it absolves the DNC of any responsibility to change its conduct or adjust its political strategy. That said, there is no hard evidence of collusion, only a string of awkward encounters by Trump's largely inexperienced, and frankly stupid, staff. The meat of Russia's "interference" came in the form of social media bots: fake accounts that automatically reposted sensationalist headlines to drum up support for Trump. These accounts are pretty easy to spot, however, as they don't come close to passing a Turing test.
Blaming Russia is too easy
Still, the creation of Russia's bot army had to be predicated on some form of information, and many have accused Putin's government of mining users' Facebook data in an attempt to gain a psychological understanding of the average American voter. This is where Aleksandr Kogan comes into play. Kogan sold the data of some 87 million Facebook users (collected via a quiz app) to Cambridge Analytica, a political consulting firm hired by the Trump campaign. Cambridge Analytica's goal was to create psychographic voter profiles. While there's no definitive connection between Cambridge Analytica and Russia, the precedent set by CA and its illicit exploitation of Facebook is a frightening one. If a private company is collecting data on citizens, it's a safe bet that governments around the world are doing the same. The Democratic Party's Russophobia is definitely a reaction to losing in 2016 more than anything else, but it accidentally shed light on an important issue: our data isn't safe, and with recent improvements in AI and voice recognition software, we'll soon have the technology not only to create comprehensive individual psych profiles, but to tailor campaigns to individual voters.

Obviously companies like Google and Facebook have large stores of internal data, and they've certainly been amenable to selling it, but academic researchers (like Kogan) also have large data caches. Behavioral psychologists use Facebook in studies all the time, and the academic world isn't particularly well known for its cybersecurity. Even if these databases aren't hacked, there's nothing to prevent a researcher from selling their findings after a study is complete. The quick fix is to let Facebook block third parties from collecting data on its users, and for its part, Facebook has done just that. It has begun blocking apps from collecting information, and has limited the number of researchers allowed to look at data on the site.
Only academics researching political elections through the lens of social media are permitted to apply for access to Facebook's database.
At a glance, these robust safety measures are a breath of fresh air. It isn't often that a tech company is so committed to its customers' privacy. That said, when things look too good to be true, they usually are. If Facebook continues down this path toward prohibition, "only Facebook will really know very much about how Facebook actually operates and how people act on Facebook," warns Dr. Rasmus Kleis Nielsen of Oxford University. Sure, measures like these could protect data from outsiders, but they would also give a private company sole proprietorship over the most comprehensive database of human behaviors and tendencies ever created. Facebook would have even more sway over our local and national elections than it already does, and would gain a monopoly over 2 billion people's personal data. Essentially, Facebook could name its price. Because of the way the Internet works, there's no way to effectively protect our Facebook data without severely compromising our freedom. And even if we were to let Zuckerberg shut everyone out of Facebook's data vaults, that wouldn't prevent other websites or services from collecting information on us. It wouldn't make us any safer. Our sensitive information is freely available to anyone who knows how to access it.
As technology improves, it will become more and more difficult to tell what is and isn't fake news, or whether that article you just read was an advertisement for Tide or for some political campaign you weren't aware of. For better or worse, we've set out to map the entire spectrum of human behavior. Eventually, marketing campaigns will be so advanced, so accurate in their mapping of our desires, that we may forget we ever had the capacity to think for ourselves. Somewhere, the ghost of B.F. Skinner is smiling.