Parler | Courtesy of the AV Club


This is Part 1 of a two-part series on the social media site Parler, founded by DU alumni.

Founded in August 2018 by DU alumni John Matze and Jared Thomson, the social media application Parler has wormed its way into the spotlight of big tech and politics. In the current media landscape, the overlap between these two realms frequently spells controversy. Parler’s most recent claim to fame is its role in the Jan. 6 insurrection at the United States Capitol, where it provided a platform for far-right extremists to plan the attack.

Parler is rooted in a simple concept: a free speech-oriented utopia for those dissatisfied with the moderation found on platforms like Twitter and Facebook. But once the true complexity of the internet is considered, it becomes clear that social media without regulation is nothing but an oversimplified dream.

In an interview with CNBC, Matze described Parler as an “[open] community town square with no censorship,” adding, “if you can say it on the streets of New York, you can say it on Parler.”

However, if one lifts the veil of free speech Parler uses to shield itself from criticism, “no censorship” is revealed to be nothing more than an excuse for selective censorship. 

In an interview with Kara Swisher of the New York Times, Matze explained that moderation on Parler was left to a “community jury.” Questionable content must be reported by users and then reviewed by a group of peers, who decide whether the post violates community standards.

But if a jury of one’s peers remains ideologically monolithic, it is hard to establish a proper system of justice. Parler’s user base, despite its founding principles, leans heavily conservative.

Supposedly, the app was not created for any specific political party. Matze has maintained that Parler was a nonpartisan endeavor, open to users of any political affiliation. In practice, however, the app was perfectly suited to conservatives who felt that Twitter and Facebook held liberal-leaning biases that led to the blanket shadow-banning of conservative content.

In recent months, as its popularity grew, the app attracted the likes of Republican Senators Rand Paul and Ted Cruz. Validation by prominent mainstream figures, along with Parler’s affiliation with conservative investors like Rebekah Mercer, created what has been called a “conservative echo chamber.” The initial nonpartisan goal gave way to a radical right-wing hotspot.

Contrary to Matze’s vision for the platform, and further exposing the hypocrisy of its free-speech approach, several left-wing users who joined the platform were banned. For all the high-minded free speech ideals touted by the minds behind Parler, the user agreement allows the company to remove any content for any reason, a provision that has impeded liberal users more than conservative ones.

As Parler gained traction, right-wing conspiracy theories grew in prominence with it. Unfounded narratives of a stolen election, quashed on Twitter and Facebook, became part of Parler’s culture. A platform intended to promote free speech instead encouraged the dissemination of misinformation and propaganda.

This tempest of hate and falsehoods was made worse by the reluctance of the company’s top brass to use algorithmic moderation. The higher-ups believed that algorithms served to suppress speech, so they required all of the site’s content to be judged by humans. The platform neared two million users before Apple and Google pulled its app and Amazon cut off its web hosting for inciting violence. Matze has since stated that he will adopt an algorithmic approach to comply, and the company continues to function. The company’s attempt to avoid self-censorship failed when faced with the self-confirming filter bubbles that can quickly lead to extremism.

Understanding that regulation must be implemented in one form or another, the question becomes whether it should be achieved federally or left to private entities. Section 230 of the Communications Decency Act already shields private entities from liability for the content posted on their platforms, and for good reason. If platforms were held accountable for content published by online users, the fast-paced and thriving digital ecosystem would screech to a halt under the weight of legal threats and court fees.

Nonetheless, steps must be taken to compel social media giants to make effective use of unbiased algorithms. Adequate resources must be dedicated to ensuring that A.I. prevents the spread of misinformation without hindering the conversation. By encouraging a stronger culture of corporate social responsibility (CSR) around these aims, it is possible to nudge corporations in the right direction without undue government oversight.

Using CSR to foster self-regulation is not a novel idea. It is the same strategy that led Parler to take a more serious approach to moderation upon its return to the web. Amazon, Apple and Google all acted on the issue out of a sense of social responsibility. The next step is to strengthen CSR within the social media landscape. Private entities must preemptively combat undesirable content before cultish violence manifests, as we now know it can.

The internet remains an unknown variable with unpredictable consequences when it comes to free speech. As our country’s founders knew, free speech is an essential right. But stories like Parler’s demonstrate how rapidly the First Amendment’s meaning shifts in the internet age. It is clear that more regulation is needed; the medium is too powerful for a hands-off approach. Matze’s idea seems constitutionally idyllic at first glance, but the events at the Capitol show how far down the rabbit hole some will go if left to their own devices with an unchecked extremist megaphone like Parler at their fingertips.
