The Federal Trade Commission is making privacy a priority.
That’s hard to do without a federal data privacy law. But, in the meantime, the FTC is trying to fill the gaps with new rulemaking that would apply privacy practices more broadly to its consumer welfare enforcement.
The Advance Notice of Proposed Rulemaking (ANPR), which the FTC opened for public comments over the summer, is the first stage of that rulemaking process, said Rashida Richardson, attorney advisor to FTC Chair Lina Khan, speaking at AdExchanger’s Programmatic IO conference in New York City this week.
“We need to hear from consumers and stakeholders so we can get a better understanding not just of data practices in general but also of what’s scalable and what’s not,” Richardson said. “Feedback will [help] inform the FTC’s next steps.”
Feedback, please
The FTC recently extended the ANPR’s public comment deadline by another month because the commission values general consumer feedback as a way to cut through thorny privacy issues. And it needs more of it.
Richardson said the ANPR would be a way of “providing a lay of the land on exactly what’s allowed and what’s prohibited” in terms of the FTC’s enforcement of data practices.
The rulemaking would also add more “hooks” for data privacy enforcement, she said, such as cases brought by attorneys general in the growing number of states with their own data privacy laws.
“We have to go aggressively after [high-risk] data practices that pose the greatest risks to consumers, at least as a mitigating intervention,” Richardson said.
That means the FTC is targeting sensitive data in particular.
Data that could be subject to privacy protection is generally thought of as “monolithic,” she said. But, in practice, the strongest enforcement cases involve practices that pose very high risks to very specific groups of individuals – not necessarily many people at all.
What is sensitive then?
Location data and health-related data are common examples of potentially sensitive data because they can be used to suss out extremely personal facts about an individual.
The FTC is paying special attention to companies it suspects are less than careful with geolocation data in particular, hence the recent case against Kochava.
And, lately, the FTC is laser-focused on location data following the Supreme Court’s decision overturning Roe v. Wade, which ramped up “public attention to how that data can reveal very intimate aspects of people’s lives,” Richardson said.
The current liberal-majority FTC has felt some urgency to set precedent on geolocation data since that decision, according to Richardson.
Businesses, individuals, state governments and law enforcement can potentially use location data, especially when paired with mobile device identifiers or IP addresses, to identify and prosecute women who obtained an abortion, visited a clinic or traveled out of state for healthcare. Friends or family members who helped a woman procure an abortion, or even just helped her find healthcare information, could also be prosecuted. In a state like Texas, which offers bounties to people who bring forward abortion cases that are successfully prosecuted, purchasing commercial location data sets might prove profitable.
Consumers should know they have a choice about what data they share about themselves and, potentially, with whom, Richardson said.