We’re about to see a lot more enforcement against dark patterns from the Federal Trade Commission and at the state level.
Dark patterns involve using manipulative or ambiguous language that pushes people to take an action they either don’t understand or wouldn’t normally take, such as sharing their data or agreeing to recurring online payments.
Holding companies to account for using dark patterns has largely fallen to the FTC, which enforces against “unfair or deceptive” business practices as defined by Section 5 of the FTC Act.
For example, the FTC is in the middle of a “dark patterns” investigation into Amazon over its Prime promotion practices.
But now, US privacy laws are starting to mention dark patterns, too.
“We’re seeing references to dark patterns slip into privacy laws and other discrete marketing laws as specific items,” said Gary Kibel, an attorney and partner at Davis+Gilbert, speaking during a webinar hosted by the Association of National Advertisers last week.
Currently, three of the five US state privacy laws explicitly call out dark patterns: the California Privacy Rights Act (CPRA), the Colorado Privacy Act and the Connecticut Data Privacy Act, which just passed in May. All three take effect in 2023.
Apps and sites caught using dark patterns could now be penalized by state attorneys general in addition to the FTC, Kibel said.
Also, earlier this month, the attorneys general of several states, including Illinois, Delaware, Pennsylvania, Massachusetts and New Jersey, called on the FTC to address dark patterns in digital advertising. The FTC is already working on revising and modernizing its Dot Com Disclosures guidelines.
But why are dark patterns coming into the light now?
It’s partially because sensitive data, including health and location data, is becoming … well, more sensitive. Just look at the implications of the recent overturn of Roe v. Wade.
“It’s more important than ever, in those situations, to be crystal clear with the end consumer about how their sensitive data is going to be processed,” Kibel said.
Seeing in the dark
The most common dark patterns are subscription-related, including subscriptions that either renew automatically or are harder to cancel than they are to sign up for.
Several states already have automatic renewal laws expressly mandating that subscription cancellation be an “immediate process,” meaning it’s illegal to force someone who signed up online to email an unresponsive inbox or call a toll-free number in order to cancel, said Paavana Kumar, also an attorney at Davis+Gilbert.
But any website can engage in dark patterns, even sites aimed at kids.
ABCmouse, an educational site for young kids, agreed to pay $10 million in 2020 to settle FTC charges that it illegally billed tens of thousands of parents.
Kibel was one of them.
Specifically, this kind of billing violates the Restore Online Shoppers’ Confidence Act (ROSCA), which is actively enforced by the FTC and prohibits “recurring payments of a service a consumer had no intention of paying for.” This typically happens when a service demands a credit card for a so-called “free trial” without mentioning it plans to bill you every month.
ROSCA is the most comprehensive national law that goes after dark patterns, even if only implicitly. (The law targets duplicitous subscription models without actually calling them “dark patterns.”)
There are other bills in the works designed to address dark patterns more broadly, such as the Deceptive Experiences To Online Users Reduction (DETOUR) Act, introduced in 2019 to target any language designed to “impair decision-making autonomy.”
That bill didn’t make it very far, Kibel said, but its intention is “almost identical” to state privacy laws that are actually calling out dark patterns by name. (The DETOUR Act was reintroduced in December 2021.)
State of the state
Back at the state level, under California’s CPRA and Colorado’s CPA, any data obtained through the use of dark patterns is considered to have been collected without consent.
Both laws contain a clear definition of a dark pattern: any user interface “designed or manipulated with the substantial effect of subverting user autonomy or choice.”
This could apply to anything from deceptive subscription renewal terms to prefilled opt-in checkboxes, Kibel said, because they both lack “affirmative consent” from the end user.
Colorado also requires additional consent for the collection of “sensitive” data, such as precise geolocation or race and ethnicity. (California does not.)
Connecticut’s newly passed privacy law also makes a reference to dark patterns, but leaves the definition up to interpretation.
“Dark patterns are defined as any practice the Federal Trade Commission refers to as a ‘dark pattern,’” the law reads.
Although this might sound vague, there are only so many ways to interpret what “unfair or deceptive” means under Section 5 of the FTC Act, Kibel said. And that vagueness, he added, actually gives regulators “a lot of discretion to decide what is and isn’t a dark pattern.”
And so, let the enforcement begin.