It was only five years ago that hundreds of thousands of internet users, companies and platforms waged one of the first widespread nationwide campaigns fought via social media. Sites like Google and Wikipedia went dark, turning their pages black and urging visitors to read up on the dangers of proposed anti-piracy legislation. The campaign grew as people changed profile photos to protest a Congressional vote on the issue, which eventually failed.
But this year, as the federal government again considered restricting internet access, the issue became a lot more complicated than changing profile photos or website banners. The Federal Communications Commission received a reported 7.75 million comments from fake email domains, some of the 23 million comments submitted were filed under the same 60 names, and 444,938 comments originated from Russia.
The story quickly turned from concerns about false identities to concerns that Russia was again playing a role in our politics, capping off a year-long wave of worry about the vulnerability of social media platforms to foreign interference, one that saw powerhouses like Google, Facebook and Twitter called to Capitol Hill to explain themselves.
The drumbeat of conversations about distrust hit a crescendo when executives like Facebook COO Sheryl Sandberg were called to testify and admitted thousands of Russian-created ads were posted on Facebook during the 2016 presidential election. The platforms first claimed there was no interference from Russia, later conceded there was some, and finally admitted the influence was far worse than expected, with tens of millions of people seeing the ads across the various platforms.
Tipping Point
In September 2017, Facebook released a statement admitting that 3,000 ads, allegedly linked to and operated out of Russia, were published on the platform in violation of its policies. The ads weren’t the types usually seen and recognized as campaign ads; many were disguised as news articles or memes, which helped them infiltrate further.
Facebook released content of the ads to investigators on Capitol Hill, stating “The ads and accounts appeared to focus on amplifying divisive social and political messages across the ideological spectrum—touching on topics from LGBT matters to race issues to immigration to gun rights.”
Next, Twitter admitted that Russia Today, a pro-Russia, Russia-controlled outlet, had spent $274,100 on ads aimed at influencing the U.S., specifically targeting news consumers. The company said the accounts sent 1.4 million election-related tweets.
Google wasn’t exempt. Already familiar with attempts by bots and content farms to game its algorithm, Google announced during the investigations into Russian interference that tens of thousands of dollars had been spent on ads on its platform.
But the news continued. An investigation by ProPublica found the purchasing of ads by Russian-connected accounts didn’t stop at the election. The website reported: “A sample of 600 Twitter accounts linked to Russian influence operations have been promoting hashtags for Charlottesville such as ‘antifa,’ a term for activists on the far left; and ‘alt-left,’ a term Trump used, which was interpreted by many as suggesting an equivalence between liberal demonstrators and white nationalists in the so-called alt-right.”
A study from the Associated Press-NORC Center for Public Affairs Research found the news about social media was having an impact. The report showed less than a quarter of people who use social media trust the news from those sources “a great deal or a lot.”
Another study from the Pew Research Center said Americans were losing faith in the integrity of the news they were reading. Pew reported 64 percent of American adults say made-up news is creating confusion about what to believe online.
This didn’t happen overnight because of the 2016 election, Parse.ly CEO Sachin Kamdar said. There’s a ladder to the widespread distrust people have of social media platforms, and we are at the peak.
The first rung, or “prerequisite,” of this distrust is the fact that social media platforms have become the powerhouse of how people consume information. According to the Pew Research Center, 67 percent of Americans get some of their news from social media.
“If you think about it, ten years ago although there were nascent forms of social media on the internet, it wasn’t really the way people experienced content,” Kamdar said. “They would go to the place they trusted. Inherent in visiting that site was a trust.”
Now, about 75 to 80 percent of all traffic that Kamdar’s publisher customers receive is through Google and Facebook, meaning those platforms are now the gatekeepers of what you do and don’t read.
Next Came the Algorithms
“The whole industry has gone the way with an algorithmic approach. They can be really great at finding the information that you would want to read or that you would engage with, but there’s no trust necessarily built into the algorithm; it’s not like somebody is vetting every single piece of content on Facebook—yet,” Kamdar continued. “I think when you combine that there is this algorithm telling me what information to experience on the web, and the fact that you have these really large looming issues around the last election around what people were seeing, then I think it just raises a ton of questions. So, how is this actually influencing or changing mass media instead of just what I’m seeing?”
The study from the Associated Press-NORC Center for Public Affairs Research delved more into what makes people feel a sense of trust, and what doesn’t.
Jennifer Benz, the study’s principal research scientist, said that before they ran an experiment testing which had more influence over whether a person trusted a news story—the publishing company or the person who shared it—people in a previous study said they were more likely to trust the company. But the data from the new survey didn’t show that. Using fake stories shared by trusted figures like Oprah Winfrey, a person was more trustful of a story published by an actual fake news website than of a story posted by the Associated Press.
“Overall, someone’s trust in the sharer of the news on social media seems to have a greater impact, both in terms of how they assess the content that they read and in how they might engage with that content or the publisher of that content after the fact,” Benz said.
Taking a cue from that, Democratic supporter Peter Daou created Verrit, a site aimed at those still mourning Hillary Clinton’s campaign loss.
The mission statement reads: “With the essence of American democracy at stake, 65.8 million people saw through the lies and smears and made a wise, patriotic choice. But they continue to be marginalized and harassed. Verrit’s purpose is to become their trusted source of political information and analysis—to provide them (and anyone like-minded) sanctuary in a chaotic media environment; to center their shared principles; and to do so with an unwavering commitment to truth and facts.” But shortly after Clinton tweeted a link to the site, which boasted a mission of fact-checking news shared on social media and the web, it was knocked offline by a DDoS attack.
People are more likely to distrust a news source shared on social media if they already dislike it, but their likelihood of trusting a source they don’t have an unfavorable opinion of stays flat, Benz said.
A partner in the survey, the American Press Institute, wrote, “When people see news from a person they trust, they are more likely to think it got the facts right, contains diverse points of view, and is well reported than if the same article is shared by someone they are skeptical of.”
But, as Kamdar said, social platforms are seeing the problems and leaning back toward the basics. In other words: how the “old media” does it.
“When (Facebook) presented in front of Congress that they are hiring people to help moderate what they’re presenting directly to the audience like how the media is…it’s back to how regular media works,” he said. “It’s a boon for publishers. They’ve done that for centuries. (Deciding) what information to show—Is it going to be valuable? And they tie it back to the editors who are deciding based upon those questions. I think people in general trust other people more than they trust the black box algorithm—that’s really hard to trust.”
Snapchat CEO Evan Spiegel also said he is planning to move back to the approach of humans sharing what they think is interesting, and creating algorithms based on that.
“With the upcoming redesign of Snapchat, we are separating the social from the media, and taking an important step forward towards strengthening our relationships with our friends and our relationships with the media,” he wrote on Axios. “This will provide a better way for publishers to distribute and monetize their Stories, and a more personal way for friends to communicate and find the content they want to watch.”
Give Trusted Readers a Voice
The distrust of the social platforms can’t stop publishers from using them.
“Don’t run away from the platforms either. It comes down to whether you’re trustworthy or not. We always tell our customers to not just look at individual metrics but a few different metrics, like engagement time, loyalty, and visiting the site on a regular basis,” Kamdar said. “The fact of the matter is you still have to do it. There’s still a large source of traffic.”
Benz said her project gave the team a lot of insight into what goes into trust, and that people’s perceptions of news give researchers a new perspective to explore. But there are also takeaways for publishers, platforms and the general public.
“The research points toward an opportunity for the platforms themselves to think carefully about how they are displaying news content, and perhaps there are ways for the actual reporting source to get a more prominent space in the way the news is displayed on the platform,” she said.
This recommendation stems from the finding that people were more likely to remember the name of the person who shared an article than the name of the person who wrote it.
Then, there’s a certain amount of learning the public can do.
“We also definitely pointed toward the need for additional research and ways of looking into how news literacy can be improved in the population overall,” Benz said. “There’s opportunity to better educate the public about evaluating what they’re taking in.”
And to the publishers and journalists who may be suffering from this distrust, the research shows social media ambassadors can go a long way.
“Your readers and their credibility have an impact on how other people who are getting your news perceive your content, and so does how you engage your readership besides just monetizing them for advertisements,” Benz said.
One suggestion is finding people who are trusted and who read your news and getting them to share articles—letting them be the voice.
“It’s important for outlets to think about that—that when they do lose control of the content, when it goes onto one of these social network platforms, the readers and people who are still sharing are sharing their brand,” Benz said.
Jennifer Swift is the co-founder and editor-in-chief of D.C. Witness, a website that tracks every homicide in Washington. Prior to moving to D.C., Jennifer worked for Connecticut magazine as their state politics reporter, and covered multiple topics at the New Haven Register including city hall, education and police.