Notice and choice just got dragged at a House subcommittee hearing on privacy.
The hearing, hosted by the House Subcommittee on Innovation, Data and Commerce, convened on Wednesday in Washington, DC, to discuss the need for a national data privacy standard in the US …
… something we nearly had.
The American Data Privacy and Protection Act (ADPPA), co-sponsored by Reps. Frank Pallone (D-N.J.) and Cathy McMorris Rodgers (R-Wash.), passed the House Energy and Commerce Committee last year. But the bill stalled before making it to a full floor vote in the House after then-Speaker Nancy Pelosi voiced concern about the preemption of state privacy laws.
Still, ADPPA made it further than any other attempt at a federal privacy law with bipartisan support, and the subcommittee is very keen to revitalize the effort.
“Right now, there are no robust protections,” McMorris Rodgers said. “Americans have no say over whether and where their personal data is sold … and they have no ability to stop the unchecked collection of their personal information.”
Hobson’s (notice and) choice
The privacy debate has long revolved around the question of control (or the lack thereof) that people have over the data collected about them.
For well over a decade, the online advertising industry has pushed the concept of notice and choice, sometimes referred to as “notice and consent.” The idea is that companies are good to go from a privacy perspective so long as they inform people about their data practices (notice) and give people a way to opt out (choice).
If someone accepts your terms, then voila, you’ve got consent. Never mind that there’s no way on God’s green earth anyone actually read, or even skimmed, what they agreed to.
Anyone who spends any time online “knows why this notice and consent model is broken,” said Alexandra Reeve Givens, president and CEO of the Center for Democracy and Technology, who appeared as a witness at Wednesday’s hearing.
But even if people could feasibly read and understand “these labyrinthian privacy policies,” she said, “they often have no real choice but to consent.”
“Many online services are such an important part of everyday life that quitting is effectively impossible,” she said.
Rep. Pallone referred to the dynamic as “coercive.”
Requiring people, including children and teens, to trade their personal data in return for access to essential services “is not a real choice,” he said, because of how pervasive digital communication is today.
Less is more
That’s not to say that giving people more transparency into a company’s data collection practices shouldn’t be part of the solution. People deserve to understand what data is being collected about them, and they should have the ability to control it.
But “by no means is [transparency] sufficient,” testified Graham Mudd, Meta’s former VP of product marketing for ads and business products, who left the company last February to launch a startup, Anonym, which is building privacy-enhancing technologies.
Rather than putting the burden on consumers to “make decisions that they’re not well informed to make,” Mudd said, the most logical way forward is to focus on privacy by default, data minimization and creating better baseline protections to cut down on the amount of data that’s being collected in the first place.
And the less data a company has (that it doesn’t need), the less exposed it is to a data breach, said Jessica Rich, a senior policy advisor for consumer protection at law firm Kelley Drye & Warren, who spent 26 years at the Federal Trade Commission.
“So many data breaches happen to data that’s sitting there and shouldn’t be, and the same with protections for sensitive information,” Rich said. “If people can prevent their sensitive information from being overcollected and stored, it’s less likely to be breached.”