Social networks can’t be forced to filter content for kids, says judge


A federal judge issued a last-minute partial block on a Texas law that would require some large web services to identify minors and filter what they see online. Called HB 18 or the Securing Children Online Through Parental Empowerment (SCOPE) Act, it was signed into law last year and was set to take effect over the weekend on September 1st. But a late Friday court ruling determined that the “monitoring and filtering” requirements posed a significant threat to online speech.

The SCOPE Act requires a range of web services, particularly large social networks, to apply special rules to users whose registered age is under 18. That includes limiting data collection, banning targeted advertising, and not allowing financial transactions without parental consent. More unusually for a US-based law, it says services must implement a plan to “prevent the known minor’s exposure to harmful material,” including content that promotes or “glorifies” things like suicide, self-harm, substance abuse, and “grooming.” And any service whose content is deemed more than one-third harmful or obscene (as defined by an existing Texas statute) must implement a “commercially reasonable age verification method.”

The ruling didn’t find that the entirety of HB 18 posed a threat to First Amendment-protected speech, and some provisions — like the data collection rules and the age verification for sites with large amounts of adult content — remain in force. (Texas already required age verification on adult sites.) Meta and TikTok didn’t reply to a request for comment on whether they were planning changes to comply with the new law.

But District Judge Robert Pitman was highly critical of the monitoring and filtering rules. “Terms like ‘promoting,’ ‘glorifying,’ ‘substance abuse,’ ‘harassment,’ and ‘grooming’ are undefined, despite their potential wide breadth and politically charged nature,” he writes, echoing criticism from the Foundation for Individual Rights and Expression (FIRE), which noted that terms like “grooming” have been applied to all forms of LGBTQ content. “At what point, for example, does alcohol use become ‘substance abuse?’ When does an extreme diet cross the line into an ‘eating disorder?’” An attorney general enforcing the law could end up doing so selectively — by, say, deciding that posts or videos about marijuana were glorifying substance abuse “even if cigarette and alcohol use is not.”

And the judge points out that while social networks would have to filter out controversial material, the same rules wouldn’t apply to other media:

A teenager can read Peter Singer advocate for physician-assisted suicide in Practical Ethics on Google Books but cannot watch his lectures on YouTube or potentially even review the same book on Goodreads. In its attempt to block children from accessing harmful content, Texas also prohibits minors from participating in the democratic exchange of views online. Even accepting that Texas only wishes to prohibit the most harmful pieces of content, a state cannot pick and choose which categories of protected speech it wishes to block teenagers from discussing online.

While the injunction only covers a portion of the law, it makes HB 18 the latest state-level internet regulation to be at least partially blocked by courts, alongside California’s Age-Appropriate Design Code Act and other statutes in Arkansas, Ohio, and Mississippi. (At the federal level, Congress is still working on the Kids Online Safety Act, which has raised its own censorship concerns despite lawmakers’ efforts to allay them.) The legal battle over the SCOPE Act isn’t finished — but for now, Texas teens can keep watching videos about weed.


