Breaking five days of silence, Facebook CEO Mark Zuckerberg apologised for a “major breach of trust,” admitted mistakes and outlined steps to protect user data in light of a privacy scandal involving a Trump-connected data-mining firm.
“I am really sorry that happened,” Zuckerberg said of the scandal involving data mining firm Cambridge Analytica. Facebook has a “responsibility” to protect its users’ data, he said in a Wednesday interview on CNN. If it fails, he said, “we don’t deserve to have the opportunity to serve people.”
His mea culpa on cable television came a few hours after he acknowledged his company’s mistakes in a Facebook post, but without saying he was sorry.
Zuckerberg and Facebook’s No. 2 executive, Sheryl Sandberg, had been quiet since news broke Friday that Cambridge may have used data improperly obtained from roughly 50 million Facebook users to try to sway elections. Cambridge’s clients included Donald Trump’s general-election campaign.
Facebook shares have dropped some 8 percent, lopping about $46 billion off the company’s market value, since the revelations were first published.
Even before the scandal broke, Facebook had already taken the most important steps to prevent a recurrence, Zuckerberg said. For example, in 2014, it reduced the access outside apps had to user data. However, some of the measures didn’t take effect until a year later, allowing Cambridge to access the data in the intervening months.
Zuckerberg acknowledged that there is more to do.
In a Facebook post on Wednesday, Zuckerberg said Facebook will ban developers who don’t agree to an audit. An app’s developer will no longer have access to data from people who haven’t used that app in three months. Data will also be generally limited to user names, profile photos and email, unless the developer signs a contract with Facebook and gets user approval.
In a separate post, Facebook said it will inform people whose data was misused by apps. Facebook first learned of this breach of privacy more than two years ago, but hadn’t mentioned it publicly until Friday.
The company said it was “building a way” for people to know if their data was accessed by “This Is Your Digital Life,” the psychological-profiling quiz app that researcher Aleksandr Kogan created and paid about 270,000 people to take part in. Cambridge Analytica later obtained information from the app for about 50 million Facebook users, as the app also vacuumed up data on people’s friends — including those who never downloaded the app or gave explicit consent.
Chris Wylie, a Cambridge co-founder who left in 2014, has said one of the firm’s goals was to influence people’s perceptions by injecting content, some misleading or false, all around them. It’s not clear whether Facebook would be able to tell users whether they had seen such content.
Cambridge has shifted the blame to Kogan, whom the firm described as a contractor. Kogan described himself as a scapegoat.
Kogan, a psychology researcher at Cambridge University, told the BBC that both Facebook and Cambridge Analytica have tried to place the blame on him, even though the firm assured him that everything he did was legal.
“One of the great mistakes I did here was I just didn’t ask enough questions,” he said. “I had never done a commercial project. I didn’t really have any reason to doubt their sincerity. That’s certainly something I strongly regret now.”
He said the firm paid some $800,000 for the work, but that the money went to participants in the survey.
“My motivation was to get a dataset I could do research on,” he said. “I have never profited from this in any way personally.”
Authorities in Britain and the United States are investigating.
David Carroll, a professor at Parsons School of Design in New York who sued Cambridge Analytica in the U.K., said he was not satisfied with Zuckerberg’s response, but acknowledged that “this is just the beginning.”
He said it was “insane” that Facebook had yet to take legal action against Cambridge parent SCL Group over the inappropriate data use. Carroll himself sued Cambridge Friday to recover data on him that the firm had obtained.
Sandy Parakilas, who worked in data protection for Facebook in 2011 and 2012, told a U.K. parliamentary committee Wednesday that the company was vigilant about its network security but lax when it came to protecting users’ data.
He said personal data, including email addresses and in some cases private messages, was allowed to leave Facebook servers with no real controls on how the data was used after that.
“The real challenge here is that Facebook was allowing developers to access the data of people who hadn’t explicitly authorized that,” he said, adding that the company had “lost sight” of what developers did with the data.