In October, executives from Facebook, Google and Twitter appeared before the U.S. Congress to face the music: they were there to answer for the roles their companies played in Russia’s meddling in the 2016 presidential election. According to the New York Times, Facebook revealed that more than 126 million users were exposed to political ads bought by Russian agents during the election. Meanwhile, Reuters reported that Google had discovered Russian operatives spent tens of thousands of dollars on ads across YouTube, Gmail and Google Search, and BuzzFeed News reported that Twitter had offered a Russian state-owned television network up to 15 percent of its total share of U.S. elections advertising.
As a result, lawmakers introduced a new bill called the Honest Ads Act. “(The) measure, in short, would require tech giants for the first time to make copies of political ads—and information about the audience those ads targeted—available for public inspection,” according to Tony Romm of recode.
In addition, “the new online ad disclosure rules would cover everything from promoted tweets and sponsored content to search and display advertising. And it includes ads on behalf of a candidate as well as those focused on legislative issues of national importance.”
I welcome this new requirement. For too long, these tech companies have operated without a wrangler, and it’s about time they took responsibility for the misinformation being spread on their platforms. If the public expects newspapers to tell the truth, the same standard should be applied to tech companies.
But is the bill too harsh? Not according to Renee Diresta, a social media disinformation researcher with Data for Democracy, and Tristan Harris, a former Google design ethicist, who wrote in Politico Magazine, “We think technology platforms have a responsibility to shield their users from manipulation and propaganda. So far, they have done a terrible job doing that. Even worse, they have repeatedly covered up how much propaganda actually infiltrates their platforms.”
They continued, “Design ethics entails understanding and taking responsibility for the actions that users take because of product design. Facebook, Twitter and YouTube are not neutral tools. Their products actively harvest human attention and sell it to advertisers, which is both immensely profitable and vulnerable to manipulation by actors like Russia.”
It’s hard to imagine that you’re being manipulated while scrolling through cute baby pictures on Facebook or memes on Twitter, but look closer next time and you’ll notice ads and news feed items meant to sway you. Once upon a time, people relied on a “second opinion” to verify what they read in the paper or saw on television; lately, “I saw it on the internet” seems sufficient for people to accept something as fact. But everything on the internet should be taken with a grain of salt.
Facebook may have pledged to work harder on implementing fact-checking tools on its site, but the Guardian recently reported that many fact-checkers who partnered with Facebook consider the effort a failure.
“I don’t feel like it’s working at all. The fake information is still going viral and spreading rapidly,” a journalist who fact-checks for Facebook told the Guardian. “It’s really difficult to hold (Facebook) accountable. They think of us as doing their work for them. They have a big problem, and they are leaning on other organizations to clean up after them.”
Another journalist said, “The relationship they have with fact-checking organizations is way too little and way too late. They should really be handling this internally. They should be hiring armies of moderators and their own fact-checkers.”
Essentially, we all need to be fact-checkers. (Check out Tim Gallagher’s column this month, a cautionary tale for newspapers working with tech companies; it will be published online Dec. 18.) Whether we work for a newspaper or a tech company, we are all responsible for tearing down the lies and building a foundation of truth.