Editorial: Trust Issues

In October, executives from Facebook, Google and Twitter appeared before the U.S. Congress to face the music—they were there to talk about the roles they played in Russia’s meddling during the 2016 presidential election. According to the New York Times, Facebook had revealed that more than 126 million users were exposed to political ads bought by Russian agents during the election. Meanwhile, Reuters reported Google had discovered Russian operatives had spent tens of thousands of dollars on ads on YouTube, Gmail and Google Search products, and BuzzFeed News announced Twitter had offered a Russian state-owned television network up to 15 percent of its total share of U.S. elections advertising.

As a result, lawmakers introduced a new bill called the Honest Ads Act. “(The) measure, in short, would require tech giants for the first time to make copies of political ads—and information about the audience those ads targeted—available for public inspection,” according to Tony Romm of Recode.

In addition, “the new online ad disclosure rules would cover everything from promoted tweets and sponsored content to search and display advertising. And it includes ads on behalf of a candidate as well as those focused on legislative issues of national importance.”

I welcome this new requirement. For too long, these tech companies have operated without a wrangler, and it’s about time they took responsibility for the misinformation being spread on their platforms. If the public expects newspapers to tell the truth, the same standard should be applied to tech companies.

But is the bill too harsh? Not according to Renee DiResta, a social media disinformation researcher with Data for Democracy, and Tristan Harris, a former Google design ethicist, who wrote in Politico Magazine, “We think technology platforms have a responsibility to shield their users from manipulation and propaganda. So far, they have done a terrible job doing that. Even worse, they have repeatedly covered up how much propaganda actually infiltrates their platforms.”

They continued, “Design ethics entails understanding and taking responsibility for the actions that users take because of product design. Facebook, Twitter and YouTube are not neutral tools. Their products actively harvest human attention and sell it to advertisers, which is both immensely profitable and vulnerable to manipulation by actors like Russia.”

It’s hard to imagine that you’re being manipulated when scrolling through cute baby pictures on Facebook or memes on Twitter, but next time, look closer and you’ll see ads and items in your news feed meant to sway you. Once upon a time, you usually sought a “second opinion” on what you read in the paper or saw on television; lately, it seems “I saw it on the internet” is sufficient for people to consider what they saw as fact. But everything on the internet should be taken with a grain of salt.

Facebook may have pledged to work harder on implementing fact-checking tools on its site, but the Guardian recently reported that many fact-checkers who partnered with Facebook consider the effort a failure.

“I don’t feel like it’s working at all. The fake information is still going viral and spreading rapidly,” a journalist who fact-checks for Facebook told the Guardian. “It’s really difficult to hold (Facebook) accountable. They think of us as doing their work for them. They have a big problem, and they are leaning on other organizations to clean up after them.”

Another journalist said, “The relationship they have with fact-checking organizations is way too little and way too late. They should really be handling this internally. They should be hiring armies of moderators and their own fact-checkers.”

Essentially, we all need to be fact-checkers. (Check out Tim Gallagher’s column this month, a cautionary tale for newspapers working with tech companies; it will be published online Dec. 18.) Whether we work for a newspaper or a tech company, we are all responsible for tearing down the lies and building a foundation on truth.

Published: December 8, 2017

2 thoughts on “Editorial: Trust Issues”

  • December 8, 2017 at 6:25 pm

    you really should wake up and stop demeaning american voters … this continuing effort to blame mrs. clinton’s loss on some mysterious (and blatantly non-existent) russian advertising effort only shows how perfectly illiterate are those who promote it: if you knew anything whatsoever about russian politicians’ thinking, you would have known that ms. clinton was their preferred candidate: her policies would weaken the u.s. and, perhaps even more importantly, she would be much easier to blackmail …
    any average american voter is smarter than the would-be elites that keep coming up with this russian involvement nonsense …
    and a minor aside: creating laws that control what the media can publish and decreeing what is not kosher sounds like censorship at first glance … at second glance too … which makes the law perfectly unconstitutional …

  • December 9, 2017 at 5:07 am

The evaluation of an online presentation is the responsibility of the reader. Once you start having some tech company or even Congress separate the good postings from the bad postings, you are damaging free speech. I have a perfect right to post lies, half-truths and similar without being edited (perhaps by computer). So Hillary lost. Big deal. Let’s move on.
