Data-driven algorithms can penetrate the most intimate aspects of our psychology, enticing consumers to buy a particular product or pushing voters to support a specific political candidate.
While this technology can be invasive, it can also be empowering when used correctly, according to Sandra Matz, the David W. Zalaznick Associate Professor of Business at Columbia Business School.
Matz, a computational social scientist with a background in psychology and computer science, is the author of Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior. In the book, she draws on her experience growing up in rural Germany to illustrate how deeply algorithms, AI, and data-collecting companies understand human psychology.
In the small, close-knit village where she grew up, Matz's neighbors knew almost everything about her, giving them a deep understanding of her life and allowing them to influence her in ways she didn't have control over.
Speaking at a recent event hosted by CBS's Digital Future Initiative, Matz argued that we all have "digital neighbors" — algorithms, AI, and companies that collect vast amounts of data about us through our online activity. Like the physical neighbors in her village, these digital neighbors gain deep insights into our psychology and behavior, which they can then use to influence our choices. She shared three key insights about how these digital neighbors shape our lives and behavior, and how people can take control of their data rather than letting it define them.
Big Data Can Be a Tool for Good
Matz explained that while data and algorithmic insights can, at times, feel manipulative and infringe on our sense of privacy, they can just as easily provide novel insights into our psychology that would be difficult for human experts to uncover.
According to Matz, healthcare is one of the many fields where these insights can be immensely valuable. She posited that algorithms and AI, particularly generative AI, can help address the massive gap between demand for mental health support and a lack of available mental health professionals, for example.
AI performs well as a conversational partner, providing therapy and support to those who might not otherwise have access to it, and algorithms can also help identify early warning signs of mental health issues, enabling earlier intervention and support. For example, data collected through smart sensors (e.g., GPS records) can indicate whether a person's behavior deviates from their typical routine. As Matz's research has shown, spending more time at home than usual or exhibiting lower-than-normal levels of physical activity is associated with an elevated risk of developing clinical depression.
"Observing these patterns isn't the same as a full-on clinical diagnosis of depression," Matz says. "But it's a way for us to raise the alarm early and hopefully prevent an individual from entering a major depressive episode. Even if it turns out to be a false alarm in the end, we give you a chance to look into it and get support if you need it."
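The deviation-from-routine idea can be sketched in a few lines. The code below is an illustrative toy, not Matz's actual model: it learns a personal baseline from one week of hypothetical hours-at-home data, then flags days in a second week that fall more than two standard deviations from that baseline.

```python
# Illustrative sketch only: flag days whose sensor-derived behavior
# deviates sharply from a person's own baseline. The data and the
# two-sigma threshold are hypothetical, not from Matz's research.
from statistics import mean, stdev

def flag_deviations(baseline, recent, threshold=2.0):
    """Return indices of `recent` entries lying more than `threshold`
    standard deviations from the personal `baseline`."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:  # no variation in baseline; nothing to compare against
        return []
    return [i for i, v in enumerate(recent)
            if abs(v - mu) / sigma > threshold]

# Hypothetical hours-at-home per day: a one-week baseline, then a
# second week where the last four days spike upward.
baseline_week = [14, 15, 13, 14, 16, 15, 14]
second_week = [13, 15, 14, 22, 23, 22, 23]
print(flag_deviations(baseline_week, second_week))  # → [3, 4, 5, 6]
```

In a real system the baseline would roll forward over time and combine several signals (location, activity, sleep), and, as Matz stresses, a flag would trigger a check-in rather than a diagnosis.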
Why 'Nothing to Hide' Is a Risky Mindset
According to Matz, people who say that they have nothing to hide from big data are in a very privileged position. She argues that just because someone doesn't currently have to worry about their data being used against them doesn't mean that's the case for everyone.
Even if someone believes they have nothing to worry about today, Matz explains that data can reveal intimate details, such as political views, sexual orientation, and mental health status, that they may prefer to keep private and that could be used to discriminate against them. Circumstances, she argues, can change in a flash.
Matz noted that the overturning of Roe v. Wade in 2022 was one such catalyst: previously innocuous GPS tracking data could now be used against women traveling across state lines to seek reproductive healthcare.
"Data is permanent, but leadership isn't," Matz says, adding that "once our data is out there, you can't get it back, even if the leadership of your government or major private actors changes radically. It's a gamble that I personally wouldn't want to take."
The Data Cooperative Solution
To better balance the challenges and opportunities of big data, Matz argues that individual control and transparency alone are not sufficient; the complexity of the technology makes managing one's own data a burden for any individual. Instead, we should focus on systemic solutions that empower people collectively.
One such solution is a data cooperative, where participants volunteer to pool their data for a common good. According to Matz, data co-ops can empower people to collectively manage their data and derive value from it under shared governance. Co-op members can then use the insights gleaned from their pooled data to benefit themselves rather than just companies, maximizing the data's utility while mitigating its risks.
Matz highlighted MIDATA, a Swiss data co-op where users actively contribute to medical research and clinical studies by granting selective access to their personal data.