Data privacy is a hot topic, but children’s privacy often gets left on the back burner.
Why?
Compliance with child data protection laws is hard to get right because it involves determining the exact age of your audience – and playing by different rules accordingly.
Although the Children’s Online Privacy Protection Act (COPPA) is well into adulthood at 24 years old – it was passed in 1998 – the ad industry continues to struggle with the challenge of age verification.
“It’s tricky to determine how old someone is,” said Virginia Lee, Cisco’s privacy officer for the Americas, speaking at an International Association of Privacy Professionals event in Washington, DC, earlier this week.
“If you get it wrong, you’re screwed,” Lee said. “And it’s very easy to get it wrong.”
Who knows?
Staying on the right side of COPPA isn’t easy – and a mistake is very expensive. Businesses can be fined more than $40,000 per violation.
Yet COPPA also contains a knowledge exception, meaning the operators of general audience sites or apps are only covered by the law if they have actual knowledge that they’re collecting personal information from children under 13 without parental consent.
Companies that claim ignorance of how old their audience is can use the actual knowledge standard as a loophole.
It’s a bit of a head-scratcher. Some lawyers argue that getting rid of the knowledge exception and imposing stricter age verification obligations are what’s needed to make COPPA work more effectively.
Otherwise, the incentives are out of whack. Companies can either err on the side of caution (which can hamper growth) or turn a blind eye (which isn’t good for child privacy).
But companies generally do want to get this right. It’s rare to see a business actively push back against child safety guardrails, said Dona Fraser, SVP of privacy initiatives at BBB National Programs.
There have been exceptions, however.
Take TikTok. Before its rebrand in 2018, when the app was still called Musical.ly, parents sent in thousands of requests demanding the app delete their children’s data. When Musical.ly refused to do so, those requests became complaints, and TikTok was eventually fined millions by the Federal Trade Commission the following year.
But this story could have ended differently for TikTok, Fraser said, if it had taken the feedback from unhappy parents into account and used it to do better by its young audience.
And even if those complaints hadn’t given Musical.ly actual knowledge that children under 13 were using its service, as the FTC found they did, the app should have had “constructive knowledge” of the age of its audience.
Actual knowledge means a company knows something definitively; constructive knowledge is what a company can reasonably be expected to know.
All Musical.ly would have had to do was Google itself to find scores of press reports about the popularity of its app with teens and young people.
Acting on constructive knowledge can help companies implement best practices and operate on the right side of the law even before they have actual knowledge of whether children under 13 are in their audience.
“If you have a platform that wasn’t designed for children, but then [discover] they’re starting to use it, you have to [address] this,” Cisco’s Lee said.
The constructive knowledge standard is starting to catch on. It makes an appearance in the Kids PRIVCY Act, a bill introduced in Congress last year that would require sites and services to get parental consent if they have actual or constructive knowledge that they’re processing child data. (PRIVCY stands for “Protecting the Information of our Vulnerable Children and Youth,” I kid you not.)
Aside from its comically overzealous acronym, the act’s emphasis on constructive knowledge could help address some of the endemic issues with COPPA, Fraser said.
Self-regulation
But speaking of issues, the Kids PRIVCY Act has a few of its own.
For example, the bill aims to update COPPA by repealing the provisions that allow for industry self-regulation, Fraser said. Doing so, however, would also prevent companies from proactively improving their own online platforms with children’s safety in mind.
Banning self-regulation could have “a chilling effect,” Lee said. Companies, looking to avoid a misstep, might stop providing meaningful online platforms for developing teens, she said.
And that would be both a shame and a disservice to young people.
“There’s harm in social media, but there’s also a lot of good that can [allow] teens to expand their worldviews rather than narrowing them,” said Kathryn Farrara, associate general counsel at Unilever.
It’s also difficult for legislation to be effective without input from all of the parties involved.
Policies might “sound good and look good on paper,” Fraser said, “but what does it actually look like in practice? That’s our ask. If you want to require anything, require the industry to come together to work with legislators about education.”
And proactive self-regulation is a force for good, she said.
“Self-regulation has helped [implement] COPPA over the last 20 years,” said Fraser, who noted that BBB National Programs has conducted more than 200 investigations since the law was passed.
By comparison, she claimed, the FTC has conducted fewer than 40.