Recently, Mark Zuckerberg testified in front of Congress regarding Facebook’s mishandling of user data. If you had the opportunity, what questions would you ask him, and what solutions would you want to see?
Cresencio Rodriguez-Delgado, 22, senior, California State University, Fresno
Rodriguez-Delgado is editor-in-chief of the Collegian, the student-run newspaper at CSU Fresno, and is a print journalism major.
Facebook CEO Mark Zuckerberg needs to ensure Facebook’s user data is heavily guarded and kept out of the hands of those who would exploit it.
Zuckerberg admitted to Congress during an April hearing that his company had allowed Cambridge Analytica, a private political research firm, to obtain information on tens of millions of Facebook users beginning in 2014, according to Reuters. That firm, now under investigation by the Justice Department and the FBI and soon to shut down, was then hired by the 2016 Donald Trump presidential campaign, giving it the potential to provide useful information about American voters without their knowledge.
Facing criticism for allowing the data to be misused, Zuckerberg apologized to Congress. But Facebook’s lack of accountability is extremely worrisome. Consider that Facebook was also unaware that a Russian troll farm had engaged in online influence operations to divide Americans during an already heated 2016 presidential election.
Mr. Zuckerberg, what will change? Was Facebook meant to grow as fast and as widespread as it has? Did you anticipate privacy issues of this scale? If so, did you decide that company profits outweighed the security and privacy of its users? What reassurance can you give users that their data will not be used for nefarious purposes?
Millions of people across the world willingly give intimate data to Facebook. It’s anyone’s guess whether they truly understand where it goes. But Facebook needs to keep user information private, and the recent mishandling of data proves the company lacks the will to do so. It is frightening that a media company so big and powerful can invite users to share their most sensitive information yet appears to have no tougher safeguards in place to protect that data. Or, Mr. Zuckerberg, would you place the blame on the users for willingly surrendering their private information to Facebook?
Cambridge Analytica misled Facebook, and the motive may have been to target voters politically. But what if, next time, a foreign enemy or power breaches Facebook’s security mechanisms with more harm in mind? The experts at Facebook should be aware of these possibilities. Can we trust Zuckerberg to do the right thing moving forward?
Levi Sumagaysay, 43, tech writer and editor, San Jose (Calif.) Mercury News
Sumagaysay is in charge of the SiliconBeat technology blog and Good Morning Silicon Valley newsletter. She has also served as assistant business editor, business copy chief and copy editor for the Mercury News.
In his testimony before Congress and in media interviews, Mark Zuckerberg has said Facebook is in an “arms race” when it comes to protecting users’ privacy and security, which seems like an admission that it’s really beyond the company’s control.
But I’d love to ask him whether his profile is more secure than other profiles on the platform—after all, the company admitted recently that it retracted Zuckerberg’s messages from some users’ inboxes. Also, does he use any third-party apps on Facebook? That would get at whether he trusts app developers with his information. The Cambridge Analytica mess involved users handing over their information to a developer—Cambridge University researcher Aleksandr Kogan—and that developer turning around and selling that data to a political consulting firm. Now Facebook says it has stricter rules about what data developers can collect. So does this mean he would feel perfectly fine letting his daughters use Facebook if they were a little older?
I also wonder why Joseph Chancellor, who was Kogan’s business partner, continues to work for Facebook. The company says it’s investigating his work with Kogan, but won’t confirm that Chancellor has been placed on leave. Has the company fired anyone over all this? Zuckerberg suggested during his testimony that no one at the company has been held accountable.
Facebook recently suspended 200 apps and is looking into whether they misused data. How many employees are devoted to this effort? How long will it take? How often is Facebook finding security gaps on its own vs. finding out through media reports?
On another note, Facebook is reportedly exploring a paid version of its service. How would this work? Would such a version be ad-free and come with a greater guarantee of privacy and security? Even paid services collect user data, and they get hacked. Given Facebook’s track record, how would the company convince people it can be trusted to be a better steward of their information, to the point that they would pay for its services?