The World Wide Web celebrated 30 years on March 12 this year. To mark the occasion, web inventor Sir Tim Berners-Lee wrote an essay about what to expect from the next 30 years.
“While the web has created opportunity, given (marginalized) groups a voice, and made our daily lives easier, it has also created opportunity for scammers, given a voice to those who spread hatred, and made all kinds of crime easier to commit,” he wrote.
Berners-Lee called for governments to create laws and regulations for the digital age; for companies to do more to ensure better business practices, especially with privacy, diversity and security issues; and for citizens to hold companies and governments accountable for the commitments they make.
Although these fixes won’t solve all the problems, Berners-Lee said they do signify a “shift.”
“It’s our journey from digital adolescence to a more mature, responsible and inclusive future,” he wrote. “The web is for everyone and collectively we hold the power to change it. It won’t be easy. But if we dream a little and work a lot, we can get the web we want.”
Berners-Lee is correct about one thing—the web is for everyone.
Just three days after the web turned 30, a mass shooter attacked two mosques in Christchurch, New Zealand, killing 50 people. What made the incident even more horrendous was the fact that the gunman livestreamed the attack on Facebook Live.
“The New Zealand massacre was livestreamed on Facebook, announced on 8chan, reposted on YouTube, commentated about on Reddit, and mirrored around the world before the tech companies could even react,” The Washington Post’s national tech reporter Drew Harwell wrote in a somber tweet.
According to Facebook, the video was viewed fewer than 200 times during the live broadcast, and no user reported it while it was live; the first user report came 29 minutes after the video started. In the first 24 hours, the company blocked more than 1.2 million copies of the video at upload and removed 300,000 more after they were posted. Since the attack, Facebook has pledged to improve its AI technology and video moderation to prevent such videos from being uploaded again.
“The platforms all say that they want to do something about this kind of content, but in almost every case they are left scrambling after the fact, sometimes for hours, to clean up the videos and links. Despite renewed pressure from governments—including laws against hate speech in countries like Germany and France—they seem almost completely unprepared and ineffective,” said Mathew Ingram in the Columbia Journalism Review.
In an article published on The Conversation, Jennifer Grygiel, an assistant professor of communications at Syracuse University, even called for Mark Zuckerberg to shut down Facebook Live in the interest of public health and safety.
“No child should ever see the sort of ‘raw and visceral content’ that has been produced on Facebook Live—including mass murder. I don’t think adult users should be exposed to witnessing such heinous acts either, as studies have shown that viewing graphic violence has health risks, such as post-traumatic stress,” said Grygiel.
On the other hand, Jack Shafer, Politico’s senior media writer, argued that the media shouldn’t censor the New Zealand video.
“Suppression of the news is the censor’s game,” he said. “When the 9/11 terrorists struck, the networks live-streamed that atrocity into American living rooms, including the human demolition of desperate people leaping from high windows to escape the flames. This attack was every bit as deliberate, calculated and web-savvy as the Christchurch assault, but the press didn’t retreat behind worries that the coverage might encourage another attack.”
Berners-Lee wants the web to be more responsible, and at the same time, he wants to protect the open web. But after what happened in New Zealand, I’m not sure how that will be possible.