By setting itself up as a gatekeeper against racism, Silicon Valley could weaken its arguments against legislation that targets other harmful online content
The tech industry’s crackdown on racism could complicate one of its biggest fights in Congress, where Silicon Valley is lobbying hard against legislation aimed at weeding out other harmful online content.
In opposing bills that target online sex trafficking, internet companies have argued that they provide platforms for the free exchange of data and should not be forced into serving as societal gatekeepers.
But at the same time, numerous tech companies have taken on the gatekeeper role with gusto this past week, using their power over the digital world to shut down neo-Nazi internet forums, kick white supremacists off fund-raising, ride-sharing, lodging and dating websites, and otherwise limit hate groups’ ability to spread their influence online. They included some of the industry’s biggest players, such as Google and Facebook.
Even with the best intentions, the industry’s reaction to last weekend’s white supremacist violence in Charlottesville, Va., raises complicated questions about who should determine what content is appropriate for the internet — as even some of Silicon Valley’s staunchest allies acknowledge.
“Every time a company throws a vile neo-Nazi site off the net, thousands of less visible decisions are made by companies with little oversight or transparency,” the Electronic Frontier Foundation, a digital rights group that has opposed many government attempts to meddle in the internet, wrote in a blog post late Thursday. “Precedents being set now can shift the justice of those removals.”
Already, supporters of the sex-trafficking bills are citing the quick response to Charlottesville as a reason to proceed with legislation holding online providers liable for knowingly hosting content that facilitates trafficking. The industry has lined up in force against the bills, which would whittle away at protections it has long enjoyed under a section of the 1996 Communications Decency Act.
“We applaud the tech community for taking a strong stance in the aftermath of Charlottesville and hope they will take a similar stand regarding online sex trafficking by supporting the Stop Enabling Sex Traffickers Act,” said a spokesman for Rob Portman, the Republican senator from Ohio who introduced the legislation along with Sen. Richard Blumenthal (D-Conn.).
The two anti-trafficking bills, introduced just before Congress’ August recess, are part of a long-running congressional battle with Backpage.com, a classified-advertising website that drew notoriety for its adults-only section. The measures would also allow states — not just the federal government — to prosecute online companies for facilitating sex trafficking.
Under the 1996 law’s Section 230, online companies are not legally responsible for policing the vast quantities of material people create and share on their networks. Supporters say the provision has allowed the internet to flourish, with a wealth of user-provided content available on sites such as YouTube, Facebook, Twitter and Google.
Groups like the Consumer Technology Association and the Internet Association, which represents tech giants including Google, Facebook and Twitter, have called the bills well-intentioned but overly broad, saying they would expose tech platforms to a litany of lawsuits and open the door to future legislation aimed at other kinds of content.
“This legislation is a misguided attempt that would destroy the foundations of the internet instead of doing the hard work of confronting the criminals who are exploiting the most vulnerable members of our society,” said a spokesman for Sen. Ron Wyden (D-Ore.), who opposes the bills.
Under current law, the companies can voluntarily take down material they consider inappropriate — as they did after Charlottesville.
“They were already not going to be held liable for it, but they weren’t so much concerned about the law here as they were about public backlash,” said Thomas Struble, the tech policy manager at the R Street Institute, a free-market think tank that opposes the bills. “They don’t want to be seen as supporting hate speech.”
The tech industry’s crackdown began even before last weekend’s rally turned violent. Home-sharing website Airbnb jettisoned the accounts of users it suspected of booking rooms in order to attend the “Unite the Right” event.
Then domain registrars GoDaddy and Google revoked the neo-Nazi publication The Daily Stormer’s registrations, saying the website’s content advocated violence and thus violated their terms of service. Cloudflare, a prominent security and web performance company that had been protecting The Daily Stormer from certain kinds of cyberattacks, terminated the site’s account because, as CEO Matthew Prince explained in a blog post, it had “made the claim that we were secretly supporters of their ideology.”
Prince did not take the move lightly, writing in the blog post that it was a “dangerous” decision.
“Without a clear framework as a guide for content regulation, a small number of companies will largely determine what can and cannot be online,” he wrote.
Facebook removed an event page for the rally and deleted links to a Daily Stormer article that assailed a peaceful protester killed in Charlottesville, The Washington Post reported, while a few Twitter accounts tied to extremists were taken offline.
Web services where users raise or exchange money — including PayPal, Apple Pay, Kickstarter and GoFundMe — announced they would no longer welcome certain websites, individuals and accounts associated with hate groups.
Even dating site OkCupid tweeted Thursday that it had banned one user accused of being a white supremacist and asked members to report “people involved in hate groups.” “There is no room for hate in a place where you’re looking for love,” the company tweeted. Another dating app, Bumble, said it was urging users to flag and report others thought to be promoting hate speech.
Tech supporters say there’s no inconsistency in taking action against hate groups while opposing legislation that would weaken the openness of the internet.
“Tech companies have used long-standing community standards to fight illegal activity, terrorism and racist content, and that hasn’t changed,” a tech industry insider said. “There is no conflict between enforcing community standards and the tech industry’s support for essential laws like CDA 230 that protect intermediaries and allow user-generated content online. In fact, CDA 230 helps platforms work with law enforcement to fight illegal content.”
But John M. Simpson, director of the Privacy Project at Consumer Watchdog, said tech companies are being “hypocrites” in the wake of the Charlottesville violence.
“I think they’re hypocrites, I think they’re doing this because of perception; it suits their political agenda and their bottom lines,” he said. “The websites that they’re talking about for the most part have not actively crossed the line in the way Backpage.com has, which is actively aiding and abetting sex trafficking.” (Backpage shut down its adult services section in January while rejecting Senate accusations of knowingly profiting from prostitution and child trafficking.)
Mary Mazzio, director of the recently released film “I Am Jane Doe,” which follows two mothers of sex-trafficking victims as they pursue legal action against Backpage, said the tech companies’ moves to ban hate speech from their platforms were laudable but hypocritical. Tech companies want to take responsibility for certain types of content only on their own terms, she said.
“I thought, here is an appropriate response — why aren’t they responding to criminal conduct and harm to women and girls with sex trafficking?” Mazzio said. “Since when did Facebook and Google get to say, certain content we think is harmful and noxious and others aren’t?”