Understanding Parler: Exploring the Evolution of an Unmoderated Platform and the Effect of Deplatforming on Extremist Communities
Extremist and violent content on social media poses an existential threat to the fabric of our democracy. Social media has democratized information dissemination, enabling a single message to reach millions of people in an instant. Individualized news feeds fuel polarization, delegitimize verified news sources, and threaten fundamental democratic processes. Yet not all social media platforms propagate such harmful information in the same way.
Parler, a social media platform, differentiates itself from others through its lack of content moderation. Drawing a base of right-wing users, Parler rose to national prominence throughout 2020, boasting nearly 20 million users by December 2020. Parler’s success reflected a dangerous and growing trend of unease with Trump’s loss. The platform was partially blamed for the insurrection on January 6th and was taken offline shortly after by Amazon Web Services, with Google and Apple removing it from their app stores. Parler represents a unique case study of a harmful online community growing at a rapid pace. In addition, Parler’s deplatforming sparked a national debate over how best to curb the hate that extremist groups spread while simultaneously protecting free speech. Investigating both the evolution of Parler and the effectiveness of its deplatforming yields important insights into this radical community and the connection between social media and democracy at large.
QAnon, a dangerous and hateful conspiracy theory, grew exponentially on Parler. The platform provided a home for radical and hateful rhetoric, hosting content banned on other platforms, and QAnon content skyrocketed in popularity as more users flowed in. As Parler’s user base expanded, its content reflected rising hatred and violence. Studying Parler’s evolution makes clear that the platform was inextricably linked with Trump: he remained popular throughout its lifespan, heavily influencing its content and rhetoric.
Deplatforming as a means of limiting the power of radical communities requires an intricate balance. While deplatforming succeeded in reducing the number of users and the volume of content on the platform, core users, those with the greatest number of posts, returned at the highest rates. Hosting services may have reduced Parler’s general appeal, but dedicated right-wing users were not dissuaded and returned to post regularly. Web hosting services hold tremendous power in the arena of controlling hate; to exercise that power properly, they must proceed with caution and a principled approach.
- Type of resource
- December 5, 2022
- June 15, 2022; May 13, 2022
- Degree granting institution
- Center on Democracy, Development and the Rule of Law (CDDRL)
- Social media > Influence
- Trump, Donald, 1946-
- QAnon conspiracy theory
- Use and reproduction
- User agrees that, where applicable, content will not be used to identify or to otherwise infringe the privacy or confidentiality rights of individuals. Content distributed via the Stanford Digital Repository may be subject to additional license and use restrictions applied by the depositor.
- This work is licensed under a Creative Commons Attribution 4.0 International license (CC BY).
- Preferred citation
- Guha, S. (2022). Understanding Parler: Exploring the Evolution of an Unmoderated Platform and the Effect of Deplatforming on Extremist Communities. Stanford Digital Repository. Available at https://purl.stanford.edu/hn011kb3770
Stanford University, Fisher Family Honors Program in Democracy, Development, and the Rule of Law