It’s been a weird four months for Aleksandr Kogan.
The 33-year-old went from an obscure research psychologist at Cambridge University to one of the villains at the heart of the Cambridge Analytica scandal.
Kogan, who testified Tuesday before a Senate Commerce subcommittee, apologized for his role in the controversy but disputed that the data his personality app provided to Cambridge Analytica swung the 2016 US presidential election.
“If the goal of Cambridge Analytica was to show personalized ads on Facebook based on people’s personalities, what they did was stupid,” Kogan said.
Instead, Cambridge Analytica attempted to reinvent the wheel and failed.
Kogan developed a Facebook personality quiz called “This Is Your Digital Life” that was downloaded by around 300,000 users. Under Facebook’s policy at the time, the app also collected data on those users’ friends without the friends’ permission, which gave Kogan access to data on between 50 million and 87 million people.
He eventually sold that data to Cambridge Analytica, which received names, locations, birthdays, genders and presumed personality traits, such as extroversion, agreeableness or adventurousness, inferred from page likes.
But the application of this data set left a lot to be desired, Kogan said.
It just wasn’t all that effective for microtargeting. That Cambridge Analytica somehow weaponized the internet as part of a mind-control effort to dupe people into voting in ways they normally wouldn’t is, he said, “science fiction.”
“Psychographics as a whole is a dead end for [marketing],” said Kogan, who claimed he went into the project thinking it could pan out. “There is value to the data if you’re trying to understand the general trends of big groups of people – but for any one person, it simply doesn’t work.”
After studying the data, Kogan found that his model predicted all five personality traits correctly for only around 1% of users, and for 5% to 6% of users it got every trait wrong.
Better accuracy, Kogan said, could have been achieved by “simply assuming everybody is average in everything.”
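To illustrate why an “everybody is average” baseline can beat a weak personalized model, here is a minimal sketch in Python with made-up numbers (the trait scale, population spread and model noise are assumptions for illustration, not Kogan’s actual figures): when a model’s prediction error is larger than the natural spread of a trait in the population, simply guessing the population mean produces a smaller average error.

```python
import random

# Hypothetical illustration, not Kogan's data: compare a noisy per-user
# predictor of Big Five traits against a "just assume average" baseline.
random.seed(0)

TRAITS = ["openness", "conscientiousness", "extroversion", "agreeableness", "neuroticism"]
N_USERS = 10_000
POP_MEAN, POP_SD = 50.0, 10.0   # assumed trait scores on a 0-100-style scale
MODEL_NOISE_SD = 15.0           # assumed: model error wider than the population spread

def mean_abs_error(errors):
    return sum(errors) / len(errors)

model_errors, baseline_errors = [], []
for _ in range(N_USERS):
    for _ in TRAITS:
        true_score = random.gauss(POP_MEAN, POP_SD)
        model_guess = true_score + random.gauss(0, MODEL_NOISE_SD)  # noisy personalized prediction
        baseline_guess = POP_MEAN                                   # "everybody is average"
        model_errors.append(abs(model_guess - true_score))
        baseline_errors.append(abs(baseline_guess - true_score))

print(f"per-user model MAE:     {mean_abs_error(model_errors):.2f}")
print(f"'average' baseline MAE: {mean_abs_error(baseline_errors):.2f}")
```

Run as written, the baseline’s mean absolute error comes out lower than the noisy model’s, which is the shape of the result Kogan described.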
In retrospect, Kogan said, there was no point in even embarking on the project. Cambridge Analytica tried to derive insights for microtargeted advertising based on a smattering of page likes and only 30 million profiles – a fraction of Facebook’s more than 200 million users in the US. In other words, the sample was too small and the targeting signals too scant.
Cambridge Analytica would have done better if it had just run a survey to collect a bunch of email addresses and used Facebook’s ad platform to create lookalike audiences rather than trying to develop its own models.
“Facebook will use a lot more of its data than just page likes to build models,” Kogan said.
But, regardless of whether Cambridge Analytica was able to perform ad tech alchemy, people still have a right to feel angry and violated that their data was collected by third parties they had no control over, he said.
“Just the fact that data can be accessed elicits a strong, visceral response,” Kogan said. “I’ve naturally taken a hard look in the mirror at my own role in the controversy. What is clear to me now is that I made a mistake in not appreciating how people would feel about us using their data, and for that, I’m deeply sorry.”