In March, it was revealed that the political consulting firm Cambridge Analytica had obtained and exploited personal data from 50 million Facebook users (the figure was later revised to 87 million). After five days of silence, Facebook CEO Mark Zuckerberg went on an apology tour, appearing on television and speaking with newspapers. He even took out full-page print ads in the New York Times, Washington Post and Wall Street Journal, along with several other papers in the U.K., to express how sorry he was about the situation.
“This was a breach of trust,” he said in the ads, “and I’m sorry we didn’t do more at the time. We’re now taking steps to ensure this doesn’t happen again.”
But it was too little too late for many people. A #DeleteFacebook campaign was created, and everyone from Elon Musk to Will Ferrell announced they were getting rid of their Facebook accounts. Although Zuckerberg didn’t seem too worried about the deletions, he told the Times, “I think it’s a clear signal that this is a major trust issue for people, and I understand that. And whether people delete their app over it or just don’t feel good about using Facebook, that’s a big issue that I think we have a responsibility to rectify.”
In a previous editorial, I wrote about those “trust issues” when it was revealed that Facebook, Twitter and Google had all been targeted by Russian agencies to influence their users during the 2016 U.S. presidential election. Now, with this latest breach of trust, we are watching these mighty tech giants fall from grace.
Facebook has a long history of trying to make amends to its now 2 billion users. But should it get all the blame? A recent Reuters/Ipsos poll revealed that 41 percent of Americans trust Facebook to obey laws that protect personal information, compared with 66 percent who said they trust Amazon, 62 percent who trust Google, 60 percent who trust Microsoft and 47 percent who trust Yahoo. The same poll also revealed that 46 percent of adults want more government regulation of how the tech industry handles information, while 17 percent want less. Another 20 percent said they wanted no change, and the remaining 18 percent said they did not know.
Franklin Foer wrote in the Atlantic that it was time to regulate the internet.
“As we privatized the net, releasing it from the hands of the government agencies that cultivated it, we suspended our inherited civic instincts,” he said. “Instead of treating the web like the financial system or aviation or agriculture, we refrained from creating the robust rules that would ensure safety and enforce our constitutional values.”
He added, “The fact that Facebook seems unwilling to fully own up to its role casts further suspicion on its motives and methods. And in the course of watching the horrific reports, the public may soon arrive at the realization that it is the weakness of our laws that has provided the basis for Facebook’s tremendous success.”
Foer has a point. We’ve all been compliant. I look at our industry and see how we handed over our content to Facebook and Google, and our audiences to Twitter and Instagram. We can no longer be “frenemies” with these companies. We have to pick sides now.
On April 10 and 11, Zuckerberg testified before Congress. Over the course of 10 hours, he faced questions from senators and representatives on topics ranging from the mishandling of data to hate speech. I followed the testimony both days and felt that Zuckerberg left more questions unanswered than answered. But he did admit one thing: that federal regulation of Facebook and other internet companies is inevitable.
Until that happens, let the countdown to Zuckerberg’s next apology statement begin.