On January 8, 2021, Twitter announced that it was permanently suspending the account of President Donald Trump "due to the risk of further incitement of violence." Following Twitter's announcement, President Trump accused Twitter of "banning free speech" and violating the First Amendment. As a private company, however, Twitter is not subject to the First Amendment, and nothing in the law prohibits it from restricting who may use its platform.
This squares with Supreme Court precedent, which has long held that the First Amendment's prohibition on restricting free speech applies only to the government, not to private entities. Indeed, the Supreme Court reaffirmed this principle in 2019, holding in Manhattan Community Access Corp. v. Halleck, 139 S. Ct. 1921, that "[t]he Free Speech Clause does not prohibit private abridgment of speech."
Since Twitter is a private company, it can impose whatever restrictions it deems appropriate, up to and including banning violators from continuing to use its platform. Nonetheless, Twitter's actions do raise important questions about the role of social media in public discourse, which will undoubtedly be the subject of further policy debates.
Another developing social media story since President Trump's Twitter ban has been the fate of Parler, which was forced to shut down after Amazon Web Services announced it would no longer host the website. This came days after both Google and Apple removed Parler from their respective app stores, citing its failure to remove posts advocating violence. This has further fueled the partisan divide over the power of social media companies.
Parler is a Twitter clone that allows users to post and share short messages with other users. Since its founding in 2018, Parler has positioned itself as a free-speech alternative on the strength of its lax content moderation policies. This has made Parler a favorite of conservatives, and it counted 15 million users by the end of 2020. Its user count spiked again after Twitter's ban of President Trump led millions to join Parler, but the influx also brought a surge in content advocating violence.
Ultimately, Parler's approach to content moderation was its undoing. Since the attack on the Capitol, technology companies have been under pressure to strictly enforce their policies against inciting violence. Twitter and Facebook responded by cracking down on such content, but Parler maintained its hands-off approach even after Amazon confronted it with nearly 100 examples of posts advocating violence. As a result, Amazon, Google, and Apple cut ties with Parler until it comes into compliance with their respective policies.
Parler's downfall came at the hands of other Big Tech firms, which essentially cut its platform off at the knees by denying it access to necessary technology. Parler's offense, failing to police content on its platform, is a transgression of which Big Tech itself has been consistently accused.
It is notable that both Republicans and Democrats have been calling for repeal of Section 230 of the Communications Decency Act, the federal law that shields platforms from liability for third-party content by not treating them as the publishers of that content. Criticism of Parler's failure to regulate posts encouraging violence is certainly legitimate, but there is no shortage of irony in the fact that Parler, itself protected by Section 230, was handed a virtual death sentence by Big Tech.
While Parler's fate remains to be seen, it will be an issue to watch as Congress grapples with how to address the power and influence of technology companies.