The Dark Side of That Personality Quiz You Just Took – The Atlantic

While Kosinski thinks people’s rapidly diminishing privacy online is indeed dangerous, he’s quick to point out that there are potential benefits to personality profiling, too. Targeted ad campaigns could get kids to quit smoking, he suggests. Personalized political messages could inform voters, not pull their strings.

Companies like Cambridge Analytica have a commercial stake in exaggerating their techniques’ reach, as well. “What they’re selling is not exactly snake oil, though it can work as a placebo for panicky candidates who are down in the polls with weeks to go before Election Day,” the journalist Leonid Bershidsky has argued. “But just like artificial intelligence or, say, the blockchain, [data science has yet to produce] killer apps that can ensure a political victory or business success.”

I’m not convinced the nerdy podcasts and obscure track-and-field clubs I like on Facebook will hand the reins of my life to some shadowy corporation anytime soon. But I do think the threat is real—real enough, at least, that I wouldn’t give away my profile information for a personality assessment.

Something else Kosinski told me gave me an uneasy feeling I haven’t been able to shake. There’s research on people’s trust in algorithms. A subject talks to an expert on a topic, and the expert offers some sort of insight, backed by one of two possible justifications: either a) the expert has thought about this for a long time, or b) the expert’s computer calculated the solution. The results show that people are more likely to trust the computer. “We’re being trained by algorithms that they’re always right,” Kosinski says.

Surely, such trust isn’t always misplaced. Vazire, the UC Davis psychologist, admits that she’d probably trust an algorithm over an expert—if she knew the algorithm to be accurate. But what if it’s not? What if, say, it’s built upon data collected by researchers who are prone to error and bias? Or what if it’s intentionally incorrect—sneakily incorrect? Conceivably, an algorithm could know so much about you that it could say exactly what would make you think, act, or feel a certain way.

That’s where the impulse to take a personality quiz keeps me up at night. I’m wired to seek out ways to reflect on who I am, but who I am is slippery—and that makes me open to suggestions. If people’s faith in algorithms continues to grow, it might not be long before I trust a computer to tell me about my personality more than I trust friends or family—or more than I trust myself.

That’s a strange future to imagine. But, hey, I am the Danube River. I’m adaptable. I’m sure I’ll adjust.
