Facebook shows how to lose friends and infuriate people
Facebook’s disregard for the privacy of its users is costing it dearly.
“For those I hurt this year, I ask forgiveness and I will try to be better,” Mark Zuckerberg wrote in a post last year. “For the ways my work was used to divide people rather than bring us together, I ask for forgiveness and I will work to do better.”
Until this week, Facebook’s founder probably couldn’t have imagined a worse year for the social network than 2017. In that year alone the tech giant was accused of failing to prevent the spread of disinformation and violent imagery, of political bias against conservative causes, of distorting news and advertising and of allowing Russian intelligence to exploit its sites to interfere with the 2016 US presidential election.
Zuckerberg is now grappling with the biggest and most damaging controversy in Facebook’s 14-year history as he seeks to explain how the personal data of 50 million users was harvested and exploited to help elect Donald Trump as President.
Such a spectacular breach of privacy wiped more than $US50 billion ($65bn) off Facebook’s value as Wall Street tried to gauge the extent of the damage to it and other tech giants.
All of a sudden, Facebook faces a multi-pronged battle to save its reputation and to persuade the world that it can be trusted to protect the personal privacy of its 2.1 billion users. Of these, 1.4 billion people, including 12 million Australians, use the site daily. Facebook has become the world’s eighth-largest listed company.
“It’s clear these platforms can’t police themselves … they say ‘trust us’,” US senator Amy Klobuchar tweeted this week, echoing the fears of many.
The US Federal Trade Commission and four state attorneys-general are investigating Facebook for possible regulatory breaches over its failure to protect user data. The scandal could see fines of up to an astonishing $US2 trillion levied on the company if it is found to have breached its 2011 consent decree with the FTC.
But the larger concern is one that also affects other companies that belong to the Big Data club.
Along with Facebook, Google, Apple and Amazon have been lauded for the way they have changed our lives, but can they be made accountable for the darker civil and social consequences of the tech revolution? Or are they runaway behemoths whose leaders barely understand the nature of the online universe they have created? In other words, are they making it up as they go along?
Tech giants are losing friends. They face accusations of monopolistic behaviour, lax privacy provisions and a lack of transparency. They behave as if they live in a parallel universe where the rules do not apply to them.
Facebook has been slow to react to this latest crisis. The company initially tried to pass blame for the misuse of data from its users to the third parties that harvested and used the data — British-based voter-profiling company Cambridge Analytica and Russian-American academic Aleksandr Kogan.
“This was a scam — and a fraud,” Facebook vice-president Paul Grewal said this week.
“Protecting people’s information is at the heart of everything we do. No systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked. It’s a serious abuse of our rules.”
After a long silence during which Wall Street sliced 10 per cent off the value of his company, Zuckerberg finally admitted this week that Facebook ultimately bore responsibility.
“I started Facebook, and at the end of the day I’m responsible for what happens on our platform,” he said. “While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn’t change what happened in the past. We will learn from this experience to secure our platform further and make our community safer for everyone going forward.
“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.”
The seeds of the scandal were sown when Facebook had even weaker rules about what personal data could be harvested from its users.
In 2007 the company opened its platform to developers who took the data of users to create dating, job-search and gaming apps. Facebook made big money from this.
In 2013 Kogan, a psychologist at Cambridge University, created a personality-testing app designed to try to obtain a more comprehensive picture of people’s lives. Kogan told Facebook users that the app was for “research”.
For a small payment he persuaded 270,000 Facebook users to download the app. They did not know this also gave Kogan access to the data of their Facebook friends, meaning that he soon had access to the data of about 50 million Facebook users.
Kogan’s work was overseen by a new company called Cambridge Analytica.
Trump’s former chief strategist Steven Bannon and billionaire Republican donor Robert Mercer helped to launch CA in 2013 as part of an ambitious plan to build detailed profiles of American voters.
With Kogan’s app they received not only the names, home towns and dates of birth of Facebook users but also their employment details and work history, their education, their religious affiliations and their interests. What’s more, they could see what news stories and other online posts those users had liked.
When this information was married with US voter registries, CA suddenly found itself with a devastatingly accurate portrait of the habits, interests and beliefs of 50 million US voters.
In 2014, Facebook belatedly introduced new rules to limit the amount of user data that could be harvested by third-party apps. After it was publicly reported in December 2015 that Kogan and CA had collected a trove of user data, Facebook waited eight months to respond. Even then it merely wrote a letter banning Kogan’s app and demanding that Kogan and CA delete the data.
Kogan and CA assured Facebook they had deleted the data, but this was not true — a fact Zuckerberg this week described as a “breach of trust”.
At this point Facebook made two mistakes, both of which betrayed a lack of rigour in protecting its users. It took at face value the claims by Kogan and CA that the data had been destroyed, without verifying those claims. Facebook also chose not to inform the 50 million users whose data had been harvested from their accounts without their permission.
“(This was) a mistake in retrospect,” Zuckerberg admitted this week. “This was a breach of trust between Kogan, CA and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.”
Meanwhile, CA, under Bannon’s guidance, used the voter data to help hone Trump’s anti-establishment campaign messages, testing slogans such as “Drain the Swamp” and “Deep State”.
During secretly filmed footage aired by Britain’s Channel 4 this week, CA chief executive Alexander Nix boasted that he had met Trump “many times” in 2016 and that his firm’s research played a key role in his victory.
“We did all the research, all the analytics, all the targeting. We ran all the digital campaign, the television campaign and our data informed all the strategy,” Nix said.
CA’s chief data officer, Alex Tayler, went further: “When you think about the fact that Donald Trump lost the popular vote by three million votes but won the electoral college, that’s down to the data and the research,” he said. “That’s how he won the election.”
Trump campaign officials this week tried to play down these claims, saying they were greatly exaggerated.
“Another day of people taking credit for @realDonaldTrump’s victory … just an overblown sales pitch,” tweeted Brad Parscale, the digital director of Trump’s 2016 campaign.
But Trump’s son-in-law and top campaign aide Jared Kushner admitted in an interview after Trump’s victory that CA played an important role in the campaign.
“We found that Facebook and digital targeting were the most effective ways to reach the audiences,” Kushner said.
“After the primary, we started ramping up because we knew that doing a national campaign is different from doing a primary campaign. That was when we formalised the system because we had to ramp up for digital fundraising. We brought in Cambridge Analytica.”
Data research on voters is hardly a new concept. Trump’s predecessor, Barack Obama, did it with his 2012 campaign, although the Obama camp claims it used its own apps, which complied with Facebook’s terms, and that it received permission from each of the one million users who gave the campaign access to their data.
“Cambridge Analytica obtained their data fraudulently, laundered through a researcher,” Obama’s 2012 campaign manager Jim Messina tweeted this week.
The fact that the CA data was used to help elect a Republican president has compounded Zuckerberg’s problems in Washington. Democrats are especially hostile to the network, and Facebook was already on the nose with both parties for allowing Russian intelligence to exploit its sites during the 2016 US presidential election.
Zuckerberg now says he will “step up” to fix privacy issues and that he will review “thousands of apps” that have used information from Facebook. “This isn’t rocket science,” he said. “We can get in front of this.”
The problem is that much of the damage may have been done already.
“Are there other Cambridge Analyticas out there?” Zuckerberg asked this week. “Were there apps which could have gotten access to more information and potentially sold it without us knowing or done something that violated people’s trust? We also need to make sure we get that under control.”
Zuckerberg says he will change Facebook policies to make it much tougher for third-party developers by, for example, requiring them to sign contracts before they are allowed to ask users for access to their posts.
He says he also will empower users by giving them greater transparency over how their data is shared and more control over it.
The trouble for Facebook is that its essential method of business is to share the data of its users, a model that has seen it grow and thrive. Last year, Facebook’s advertising sales were a whopping $US40bn, which it earns by selling access to users’ identities and behaviour online.
This may seem harmless when Facebook’s algorithms serve users advertisements that match their interests, such as hiking gear or fishing tackle. But when users are subjected to propaganda on politics, race and other divisive social issues, the algorithms take on a more sinister hue.
Jonathan Albright of the Tow Centre for Digital Journalism at Columbia University says the problem is that data-sharing is fundamental to Facebook.
“(Zuckerberg) avoided the big issues, which is that for many years Facebook was basically giving away user data like it was handing out candy,” Albright was quoted as saying this week.
“There is no question that handing out that data made Facebook the success it is as a company. This has to be recognised as part of their business model and not just a one-off problem.”
Given its troubled record so far, can Facebook convince people it has the ability to share its user data in a way that is transparent, consensual and safe from abuse?
Perhaps the scandals are already having an impact. The Economist reports that globally Facebook users spent about 50 million fewer hours a day on the platform in the fourth quarter of last year, a 15 per cent drop in time spent compared with 2016.
Regulators also are hovering like never before. In May, Europe will introduce new regulations on user consent and data privacy. The US has been slower to act in this area, but Congress is now determined to get answers from Zuckerberg. Multiple congressional committees are calling on him to testify before them.
The astronomical growth of Facebook and the key role it occupies in the lives of its daily users give it a power that makes it a prize target for those who would exploit it.
Zuckerberg this week all but admitted that Facebook finds itself in a place that he never expected when he founded the social network.
“If you had asked me, when I got started with Facebook, if one of the central things I’d need to work on is preventing governments from interfering in each other’s elections, there’s no way I thought that’s what I’d be doing if we talked in 2004 in my dorm room.”