Facebook CEO Mark Zuckerberg has apologized to Congress in a written statement ahead of his testimony before two committee hearings today and tomorrow.
The statement suggests that the company didn’t do enough to prevent misuse because it is ‘idealistic and optimistic,’ but that it now recognizes it made a ‘big mistake’ in failing to put sufficient safeguards in place …
The seven-page statement was published in the U.S. House of Representatives Document Repository.
Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring. As Facebook has grown, people everywhere have gotten a powerful new tool to stay connected to the people they love, make their voices heard, and build communities and businesses. Just recently, we’ve seen the #metoo movement and the March for Our Lives, organized, at least in part, on Facebook. After Hurricane Harvey, people raised more than $20 million for relief. And more than 70 million small businesses now use Facebook to grow and create jobs.
But it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.
Zuckerberg explains what happened with Cambridge Analytica, and outlines the steps being taken in response. He says that data from a personality quiz app built by Cambridge University researcher Aleksandr Kogan was shared with Cambridge Analytica in breach of Facebook policies.
In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and other entities he gave the data to, including Cambridge Analytica, formally certify that they had deleted all improperly acquired data — which they ultimately did.
Last month, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services.
However, the political consultancy denies that it used the data to assist the Trump campaign.
Cambridge Analytica did provide polling, data analytics and digital marketing for the Trump campaign. The claims that we used GSR data for the Trump campaign are simply untrue.
Zuckerberg’s written statement also addresses Russian interference in the U.S. presidential election, acknowledging that the company unwittingly allowed itself to be used to push disinformation to a staggering 126 million people.
We also learned about a disinformation campaign run by the Internet Research Agency (IRA) — a Russian agency that has repeatedly acted deceptively and tried to manipulate people in the US, Europe, and Russia. We found about 470 accounts and pages linked to the IRA, which generated around 80,000 Facebook posts over about a two-year period.
Our best estimate is that approximately 126 million people may have been served content from a Facebook Page associated with the IRA at some point during that period.
While Facebook’s CEO will face some searching questions during the hearings, the consensus view in the media appears to be that there is little prospect of legislation as a result. Reuters says that new laws are ‘extremely unlikely.’
The WSJ agrees that the political will appears lacking.
Re/code suggests that the hearings are ‘theater.’
Which is all to say that this week will be more about political theater than it will be about political regulation.
Part of that is on lawmakers. The idea of a bipartisan bill passing through Congress right now doesn’t seem likely, especially considering that the Honest Ads Act — a bill proposed late last year that would require more transparency around online political ads — hasn’t been put to a vote in either the House or Senate in almost six months.
But the other part is by Facebook’s design. The company claims that it’s open to certain regulations, including the Honest Ads Act, and has already pushed to self-regulate. Facebook is also preparing to comply with strict GDPR privacy regulations in the EU next month, and has promised to apply the same policies to all of its users globally.
Essentially, Facebook is giving Congress less incentive to regulate it because it’s promising to regulate itself.
Zuckerberg previously said that users don’t seem unduly concerned, the #DeleteFacebook campaign having no visible impact, and the same appears to be true of employees. The WSJ reports that the company is keeping a close watch on staff morale, but so far most employees seem to view the coverage as overblown.
Many within Facebook said they continue to believe the company is being unfairly picked on. Several said they viewed the privacy failure as incompetent, but not malicious. As Facebook CEO Mark Zuckerberg has done, they argue that the company’s ability to connect people is overall a good thing for society, and that Facebook will emerge from this episode stronger for having learned from its mistakes.
You can watch today’s Senate hearing here (a message will show no video until the stream begins), and tomorrow’s testimony here.
Photo: NBC