A tale of two states: Social media & content moderation policy in Texas and Florida
The rise of social media and the associated concerns about its impact on users, especially youth, have led to numerous federal and state policies in the U.S. over the years. While increased awareness and actions addressing issues like content moderation, age verification, and parental consent are signs of progress, the U.S.’s varied legislative landscape can impose confusing and impractical compliance expectations on businesses and users alike.
Texas and Florida are two states that have passed various laws related to social media in recent years. In April 2023, Texas passed House Bill 18, a law requiring digital service providers, such as social media platforms, to obtain parental consent before entering into agreements with users younger than 18. On March 25, 2024, Florida enacted House Bill 3, which prohibits users under the age of 14 from becoming social media account holders and allows 14- and 15-year-olds to become account holders only with parental consent.
Opponents, such as social media companies, have argued that certain state-imposed measures, including age assurance requirements, can do more harm than good by undermining existing company safeguards and implicating user privacy and individual freedoms. Tech industry groups, including NetChoice, are challenging these policies on constitutional grounds. Speech-related policies, by their very nature, often walk the tightrope of the First Amendment to the U.S. Constitution.
On the other hand, these measures are critical to protecting youth from harmful, age-restricted, and illegal content, while giving parents more peace of mind about their children's online experiences. The reality is that children's safety is at stake: In early 2024, our research found that almost one in ten young people have come across Child Sexual Abuse Material on the internet, and social media channels account for 35 percent of all encounters with unsafe, age-restricted, or illegal content.
On Monday, July 1, 2024, the U.S. Supreme Court had the opportunity to weigh in on the regulation of speech on the internet. But instead of upholding or striking down the controversial social media laws in Texas and Florida, the Court vacated the decisions in Moody v. NetChoice and NetChoice v. Paxton and sent both cases back to the lower courts to reanalyse the totality of each First Amendment challenge under the appropriate standards of review.
The issue in these cases is whether Texas and Florida can prohibit large social media companies from de-platforming political candidates or restricting content based on viewpoints. Texas House Bill 20 bans social media platforms from limiting posts based on the viewpoint of the speaker and imposes content moderation disclosure requirements on social media companies. Florida Senate Bill 7072 prohibits social media companies from de-platforming political candidates and imposes penalties on platforms that violate this law.
While the Court did not rule on the merits, Justice Elena Kagan, who delivered the majority opinion, wrote that the First Amendment “does not go on leave when social media are involved,” and “[w]hen platforms use their Standards and Guidelines to decide which third-party content those feeds will display, or how the display will be ordered and organised, they are making expressive choices,” warranting First Amendment protection.
First Amendment protections raise complex and nuanced legal questions that require serious examination. It is, however, equally crucial not to lose sight of the negative impact the current state of the internet, including social media, can have and is having on users, especially minors. Addressing that harm is at the core of our business.
Businesses should engage and partner with subject matter experts in this area, including safety tech providers, to ensure the effective implementation of robust content moderation and reliable age assurance technologies. These are by far the most impactful methods of ensuring users have age-appropriate experiences online. So, while legal challenges like those before the Court and pushback at the state level are unlikely to disappear, the only option is to keep driving meaningful change to protect young and vulnerable people online.
If you'd like to learn more, we'd love to chat.