
Why The EARN IT Act Isn’t Sufficient To Protect Children Online


The internet needs child-safety guardrails. The question is: What are they?

During his recent State of the Union address, President Joe Biden specifically called out the need to strengthen privacy protections for kids online, and lawmakers on both sides of the aisle are pushing to revamp and update existing child-focused safety laws, such as the Children’s Online Privacy Protection Act (COPPA).

But lawmakers are also hawking child-safety-focused bills that could serve as a distraction from the clear and present danger of widespread data collection and advertising targeted at children online.

In late January, Sens. Richard Blumenthal (D-CT) and Lindsey Graham (R-SC) reintroduced the EARN IT Act, which stands for Eliminating Abusive and Rampant Neglect of Interactive Technologies.

To do what its name suggests, the bill intends to pare back Section 230 of the Communications Decency Act, which protects platforms from liability for what their users post.

The goal of the EARN IT Act is to eliminate child sexual abuse material (CSAM) online. In practice, however, the proposal is “really unlikely” to help prevent the dissemination of CSAM, said Susan Israel, a privacy attorney at Loeb & Loeb LLP.

And that’s because, although the bill’s stated purpose is to protect children online, its real function would be to hobble Big Tech.

Which raises the question: What about the children?

Casualties

Containing the spread of CSAM online is a real problem.

As the law stands today, platforms already have an obligation to filter out and report CSAM. But the EARN IT Act would require platforms to go a step further and actively seek out that content, Israel said, and that’s a problem.

The bill calls for platforms to essentially act as “agents of law enforcement,” Israel said, which they are not. Put another way, any attempt to comply with the proposed law could amount to “illegal, warrantless searches that couldn’t be used to prosecute the [actual] perpetrators of the crime,” Israel said.


Beyond making it harder to catch criminals, the act would also disincentivize the use of end-to-end encryption, since encrypted communications can’t be scanned for illegal content, and that’s a double-edged sword. Weakening encryption might ostensibly make CSAM easier to find, but “removing encryption protections doesn’t just surface criminals – it makes everyone more vulnerable,” Israel said.

Deputizing online platforms as government agents liable for the content their users post might even lead them to walk away from the idea of hosting user-generated content at all, Israel added, and this could have serious downstream consequences. Beyond removing a vehicle for the dissemination of important information, it would likely drive some of the more heinous activity “further underground, where it would be [even] harder to track,” she said.

That’s not to say Section 230 is perfect, but “carving out individual crimes from Section 230 has not been proven to be useful in the past,” Israel said, which is why, in this case, the EARN IT Act is missing the mark.

In other words, there are ways to increase protections for children online, but the solution has to be more nuanced than just sticking it to Big Tech.

Alternatives

Instead of making platforms wholly liable for third-party content, one way to more effectively protect children online is to support law enforcement and related entities with boots on the ground.

“If the concern is that platforms aren’t reporting promptly enough, one thing [privacy advocates] suggest is providing more resources to those who prosecute the relevant crimes,” Israel said. For example, she noted, “most platforms report millions of pieces of content each year to the National Center for Missing and Exploited Children, but that group is under-resourced and isn’t able to follow up on [all] the reports it receives.”

Regardless, there’s already a separate law, outside of Section 230, that obligates platforms to do their due diligence in reporting CSAM.

Title 18, Section 2258 of the US Code requires prompt reporting of any incident described in the Victims of Child Abuse Act. According to Israel, this is the part of the law that’s “not working well enough.”

“It would make sense to [revisit] some of the language and the timeframe that Section 2258 sets forth rather than just removing liability protections for platforms and discouraging them from encrypting communications,” she said.

But these potential solutions are only pieces of the puzzle. Privacy advocates agree the real uphill battle, when it comes to protecting children online, is data privacy, not content moderation.

Focusing on data privacy

Although the issues of data protection and content moderation are related – one leads to the other – Gary Kibel, a partner and attorney at Davis+Gilbert LLP, warns that it’s dangerous to conflate the two.

And “privacy,” he said, “is the more urgent issue.”

Whereas laws governing illegal content and moderation exist (including Section 230), there’s still no national privacy law in the US, Kibel said. And while there are three states (California, Virginia and Colorado) that now have privacy regulations on the books with a fourth (Utah) on the way, the end result is “a patchwork of laws [for] a critical issue, and that patchwork is going to eventually have lots of holes,” Kibel said.

And kids can fall through the cracks.

Rob Shavell, CEO of DeleteMe, a for-profit company that deletes user data and digital footprints, cautions that keeping the data privacy of children on the back burner is a big problem.

“God forbid [just] one child is preyed upon by an adult online,” Shavell said. “But for that one child, there are thousands of kids whose choices and lives are shaped by a bunch of targeted algorithms that then build detailed profiles about them, steer them into certain kinds of behaviors and sell them on certain [kinds of] options in life, following them into adulthood.”

What’s next?

Until legislators can hash out a national law on data protection, there’s still room to amend existing child-focused privacy laws in the US, particularly COPPA, Kibel said. For example, some privacy advocates argue in favor of raising the age of protection to 16 from 13.

Doing so isn’t a panacea, however.

While it’s relatively easy to group together content directed at an 8-year-old, say, or a 9-year-old, it’s harder to draw those lines if the law raises the age of protection. Just try distinguishing content directed at a 15-year-old from content directed at a 17-year-old, Israel said.

If COPPA is revised to ban targeted advertising to children under 16 rather than 13, it could also stop young teens from “exploring freely online and acquiring information they may not want parental permission for,” like info on safe sex, which is one argument to keep the age at 13, she said.

But there is still some low-hanging fruit when it comes to COPPA, according to Kibel, which is to “narrow the knowledge exception [by] increasing verification obligations.” Doing so would put the onus on online platforms to ascertain whether their content could attract a young audience, rather than allowing them to feign ignorance of the age of their users.

“If your website has [videos of] a big, fluffy dinosaur singing songs, then you have to realize that children are going to be there,” Kibel said. “You can’t put blinders on.”

(Looking at you, YouTube.)
