What’s a good, succinct way to describe the current US system of data security and privacy regulation?
“It’s a mess,” said Daniel Solove, a professor of law at George Washington University.
That’s why the Federal Trade Commission (FTC) is hosting a series of hearings from mid-September through November to help it rethink its approach to policy and antitrust enforcement and better keep up with the pace of tech.
The first hearing, which took place Thursday in Washington, D.C., set the scene for upcoming discussions with an overview of competition, the consumer protection landscape and the state of privacy regulation.
Here are five issues the FTC must consider as it hammers out priorities for policy and enforcement in the 21st century.
Keeping up with tech
Virtually everything the FTC tackles today has something to do with emerging technologies.
But despite having more technology capacity than at any point in its history, the FTC isn’t equipped to monitor the rapid pace of innovation from the tech sector.
The FTC’s agenda should ensure that “its infrastructure and its resources [match] the challenges that the agency faces,” said David Vladeck, a professor at Georgetown University Law Center and a former director of the FTC’s Bureau of Consumer Protection.
Regulating algos
Predictive algorithms are extremely difficult to regulate. “You can’t put them under oath,” Vladeck said.
And yet, companies increasingly rely on algorithms for their decision making, which, depending on the scenario, could cause consumer harm.
Say a hotel chain uses an algorithm to predict whether a consumer has the propensity to mess up a room during his or her stay. Anyone identified as a room wrecker might be charged a higher rate or be barred from booking, full stop. “There’s nothing wrong with the algorithm and it might even be true – but you haven’t done it [yet],” Solove said. “And how do you argue with a prediction?”
By the same token, predictive algorithms underpin most of the antifraud tools out there, said Howard Beales, a professor of public policy at the George Washington University School of Business, who previously held multiple roles at the FTC, including acting deputy director.
“If it weren’t for those tools, there would be more identity theft than there is,” Beales said.
Enforcing rules
Regulating enormous platforms like Facebook, Google and even Amazon is just as tricky as trying to cross-examine an algorithm. But deputizing the platforms to police themselves isn’t a great option either, which is part of why more access to skilled tech talent should be a top agenda item for the agency.
Multiple companies, including Google and Facebook, are already under FTC order for past infractions, but violations of those orders are hard for a resource-strapped federal agency to detect. And when violations do surface, they inevitably end up in the press (especially if they concern one of the larger platforms), which “undermines the deterrent value of consent decrees or enforcement cases,” Vladeck said.
An order of magnitude smaller, but still a potential problem, is the universe of app developers out there and the mobile ad companies that help them monetize their work. Violations of COPPA, the Children’s Online Privacy Protection Act, are nearly par for the course in the children’s game app category.
On Tuesday, New Mexico’s attorney general filed suit against kid app maker Tiny Lab Productions and a bunch of its monetization partners, including Google’s AdMob and Twitter’s MoPub, for illegally collecting and sharing the location data of children under the age of 13 – but it’s hard to know if even a high-profile case like this will have any lasting impact.
“How do you enforce against an industry like the mobile app industry that’s highly diffuse?” Vladeck said. “There are thousands and thousands of app developers, many of whom don’t know what the law requires or just don’t really care.”
What’s the harm?
The FTC uses the concept of consumer harm to determine which cases and enforcement actions it brings, which means the burden is on the commission to demonstrate that there’s been deception or likely consumer injury.
“Focusing on harm reduces the need to design some overarching regulation that foresees all innovations,” said Maureen Ohlhausen, an FTC commissioner and the agency’s former acting chair. “The strength of the FTC’s approach is its case-by-case enforcement. Maybe it’s less predictable, but we trade that for flexibility.”
Defining harm, however, is harder than it sounds. Is it monetary, reputational, the irreplaceable loss of a loved one (multiple people committed suicide after the Ashley Madison data breach, for example) or all of these things and more?
“There is a remarkable unwillingness to articulate what the harms are that we’re worried about – or an inability,” Beales said.
And that’s because harm, at least considered within the FTC’s remit and mission, can be an amorphous concept.
“Harm, in a lot of cases, doesn’t mean someone is out financially or that their reputation was ruined – some is just consumer trust,” Solove said. “If consumers start losing faith, it hurts other companies, it undermines companies that are doing the right thing and saying what they’re doing.”
Patchwork of privacy laws
At the same time, compliance burdens that specifically require companies to say what they’re doing are stacking up around the world.
Europe has the GDPR, California just enacted a strict new data privacy statute, the California Consumer Privacy Act, and now there’s growing pressure for a uniform national standard in the United States, even as individual states weigh privacy laws of their own.
The FTC’s role in this unfolding drama is not totally clear. What does seem clear, however, is that there’s little chance Congress will make any immediate moves to pass federal privacy regulation, Vladeck said.
Unless the business interests that are unhappy with the California law succeed in scuttling it or changing it in some significant way, he said, “we’ll see other states moving to adopt a regime based on the California statute which is, in some ways, based on the GDPR.”