“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Gary Kibel, a partner in the technology, digital media and privacy practice group at Davis & Gilbert.
The ad tech industry continually refines its use of data to segment users into distinct groups and target them with the right message at the right time. Data analytics is now an essential component of digital media buying, as the growth of data-management platforms attests. A campaign is judged effective when the data shows that targeting and segmentation have lifted ROI.
However, have you ever considered whether such big data analytics and retargeting carry unintended negative consequences? While campaigns can use detailed data analytics to reach a representative sample of consumers, that same data can just as easily exclude consumers of certain socioeconomic backgrounds from ads, along with any resulting offers, discounts or promotions.
With nothing but neutral, unbiased good intentions, all that data-crunching may still produce a discriminatory outcome. That possibility now has regulators’ attention.
“Big data analytics raises the possibility that facially neutral algorithms may be used to discriminate against low-income and economically vulnerable consumers,” FTC Chairwoman Edith Ramirez said in mid-2014. “There is the worry that analytic tools will be used to exacerbate existing socioeconomic disparities, by segmenting consumers with regard to the customer service they receive, the prices they are charged and the types of products that are marketed to them.”
The White House echoed these concerns in its 2014 report, “Big Data: Seizing Opportunities, Preserving Values,” when it stated that “big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education and the marketplace. Americans’ relationship with data should expand, not diminish, their opportunities and potential.”
Do not be misled by the term “personal information”: regulators long ago expanded it to include persistent identifiers and other tracking methods.
The digital advertising industry would argue that such discrimination is never the intent of the actors or the campaign. Most services do not target consumers based upon sensitive data elements, such as race or ethnic background. Despite the absence of intent, the outcomes can be real.
In 2012, the City of Boston launched an app called Street Bump that used smartphone sensors to detect and report potholes as residents drove around the city. Crowdsourcing such data sounds like a simple, brilliant and noncontroversial idea. However, the data could become skewed: lower-income and elderly residents may be less likely to carry smartphones, so potholes would be under-reported in their neighborhoods and over-reported in more affluent ones, as the sketch below illustrates. Although the City of Boston addressed these issues, many providers are unlikely to consider the potentially discriminatory outcomes of their data-driven products and services before launch, or to revisit the results afterward to look for such issues.
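To make that skew concrete, here is a minimal sketch in Python. Every neighborhood name, report count and smartphone-ownership rate below is hypothetical, invented for illustration; the point is only that normalizing raw report counts by the share of residents who could actually file a report changes the picture.

```python
# Hypothetical illustration of participation bias in crowdsourced pothole data.
# All neighborhood names, counts and ownership rates are invented for this sketch.

# (neighborhood, raw pothole reports, residents, estimated smartphone ownership)
neighborhoods = [
    ("Affluent Heights", 120, 10_000, 0.90),
    ("Midtown",           80, 10_000, 0.70),
    ("Lower Ward",        30, 10_000, 0.45),
]

print(f"{'neighborhood':18s}{'raw rate':>10s}{'adjusted':>10s}")
for name, reports, residents, phone_rate in neighborhoods:
    raw = reports / residents                      # reports per resident
    adjusted = reports / (residents * phone_rate)  # per resident able to report
    print(f"{name:18s}{raw:>10.4f}{adjusted:>10.4f}")
```

On the raw numbers, Lower Ward appears to have a quarter of Affluent Heights’ potholes; adjusting for who can actually report, a crude correction rather than a full model, cuts that apparent gap in half.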
There are no easy answers to this dilemma. The advertising industry has long used data and consumer information to target messages and offers, both online and offline. Marketers may be reluctant to collect new sensitive information simply to verify that their messages are delivered in a socioeconomically neutral manner. Further, one could argue that campaigns built on data analytics merely reflect the realities of our world: biased data going in produces biased results coming out.
However, just as data may exacerbate this problem, data can also help remedy it. Companies could audit the outcomes of their analytics techniques and refine their methods to ensure balanced, neutral campaign execution. A simple version of such an audit is sketched below.
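What might that audit look like? Here is a minimal sketch, assuming the campaign can attach a coarse segment label to delivered offers; every segment name and count is hypothetical. The 80% threshold borrows the “four-fifths rule” that U.S. regulators use as a rough disparate-impact screen in employment cases, applied here purely as a heuristic, not as a legal standard for advertising.

```python
# Hypothetical outcome audit: flag any segment whose offer-delivery rate falls
# below 80% of the best-served segment's rate (the "four-fifths rule" heuristic).

# segment label -> (consumers who received the offer, consumers eligible for it)
delivery = {
    "segment_a": (9_000, 10_000),
    "segment_b": (7_500, 10_000),
    "segment_c": (4_000, 10_000),
}

rates = {seg: got / eligible for seg, (got, eligible) in delivery.items()}
benchmark = max(rates.values())  # the best-served segment sets the benchmark

for seg, rate in sorted(rates.items()):
    ratio = rate / benchmark
    status = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{seg}: delivery rate {rate:.2f}, {ratio:.0%} of benchmark -> {status}")
```

A flagged segment is not proof of discrimination, but it tells a marketer where to look before a regulator does.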
Education has often been referred to as the great equalizer in our society, but technology can be a great equalizer too. At the same time, data’s unintended consequences must be held in check. It behooves the industry to be aware and stay ahead of this issue. Those in Washington are watching.
Follow Gary Kibel (@GaryKibel_law), Davis & Gilbert LLP (@dglaw) and AdExchanger (@adexchanger) on Twitter.