Thursday, January 14, 2021

ANALYSIS/OPINION:

Thanks to a political uprising at the Capitol building and its online fallout, scrutiny of major tech companies has never been more intense. Following a flurry of deplatforming decisions by major corporations, Apple moved to remove Parler from the App Store for its alleged failure to adequately moderate posts inciting violence.

The choice wasn’t wise — and not just because it supports the narrative that tech companies are out to stifle conservatives. There’s a lot more at stake than that.


Google Play and Amazon, which hosts Parler’s servers, have followed Apple in revoking service to the social media platform favored by conservatives and Trump supporters as a “pro-free speech” Twitter alternative. While these moves could cripple Parler, they’ll also be watched by more mainstream competitors, who know that they too could find themselves on the firing line from the same app stores at some point. Facebook and Twitter have already responded to last week’s events by rapidly taking down a slew of pro-Trump accounts linked to election fraud allegations and the QAnon conspiracy theory.

The message seems simple. With the ability to reach users increasingly controlled by a limited number of companies behaving in similar ways, platforms must more proactively police what their users post or risk losing their business.

While the goal of curbing misinformation or violence is a good one, that doesn't make Apple's and Google's actions wise. Parler already prohibits content that explicitly incites violence. It has also created a temporary task force to bolster moderation efforts and has taken down posts such as Trump surrogate Lin Wood's call for Vice President Mike Pence to be hanged for disloyalty to the president.

Apple doesn’t think that this goes far enough. Besides, laws like Section 230 might shield platforms from legal liability for their users’ posts, but private companies can’t be forced to provide services to entities that they believe have failed to meet their own terms of service.

Yet Apple should consider that even much larger platforms like Facebook, which can afford to employ scores of moderators and advanced algorithms for flagging content, often struggle to catch every post that is violent, illegal or in violation of their terms of service. Error, human or algorithmic, means that some content will always slip through the cracks and, conversely, that non-offending content can unjustly be caught up in the system.

This isn't merely a problem for conservatives, either. Everyone from Middle Eastern journalists trying to expose human rights abuses on YouTube to left-leaning outlets has been a victim of overzealous moderators or algorithms that have shut them down.

Since journalists, media outlets and businesses often depend on these platforms for their livelihoods, the effects can be disastrous. They can also reduce the diversity of news coverage, since large media conglomerates are better placed than their smaller, independent peers to appeal bad decisions. That's bad news for local outlets, which are increasingly responsible for the kind of investigative journalism that mainstream news won't touch.

Reactionary moves can have unforeseen consequences for innocent users and for the health of our liberal democracy. Just consider the Patriot Act, hastily drafted amid post-9/11 fervor. It has long deprived Americans of civil liberties, even in ways that don't help combat terrorism.

That's why the utility of zealously taking down fringe or extremist content should also be questioned. Public platforms out in the open make it easier for security and intelligence agencies to track and police threats. Even 4chan's accessibility once led the government to a successful airstrike on ISIS, thanks to the hapless terrorists' open online postings. It's a bad idea to drive nefarious actors entirely underground to encrypted messaging apps or dark web platforms, where they can organize effectively without being seen.

These problems were acknowledged by former FBI assistant counterintelligence director Frank Figliuzzi. Pushing folks out of mainstream platforms can also isolate them “in an amplified extremist echo-chamber where they hear only their truth and their reality,” he noted.

That said, there's no point in making social media the scapegoat for governments' failures. Government agencies are better equipped to combat citizen violence than platforms, which cannot speedily monitor every post by countless users, and that's where the responsibility should lie. Despite the FBI flagging the threat of mob violence last week, possibly thanks to monitoring social media, there were still crucial security lapses instrumental to the storming of the Capitol. As a result, the chief of the U.S. Capitol Police was forced to resign.

But major tech companies are making a misstep by going after Parler so soon after a close election in which nearly 75 million Americans demonstrated support for the president. They could very well be deepening the country's existing divides. If nothing else, they'll reduce competition in the social media space and encourage an overzealous censorship culture, one with an unknowable victim tally. Tech companies are right to criticize President Trump for not being cognizant of the political consequences of his actions, but they should take care to apply the same standard to themselves.

• Satya Marar is a Young Voices senior contributor and tech policy fellow.


Copyright © 2021 The Washington Times, LLC.