Hey Siri, I Love You: New Research Shows People Prefer Gendered Technology

New Columbia Business School Study Analyzes Why Companies Intentionally Design Products Like ‘Siri’ or ‘Alexa’ to Have Binary Genders

Published
February 14, 2023
Publication
CBS Newsroom
Topic(s)
Media and Technology
Organizations
Risk Management

NEW YORK, NY — Many Apple users might not realize that it's possible to change the gender of Siri on an iPhone from a female-sounding voice to a male one. Apple offers gendered options alongside a plethora of languages and accents for Siri to use. Most Americans own at least one technology that is human-like in design and motivates anthropomorphism, the attribution of human characteristics or behavior to an object. But there's a reason most of these devices are intentionally "gendered"; that is, given names, pronouns, and personalities that clearly demarcate whether they are "male" or "female." New Columbia Business School research finds that consumers are more attached to devices that are assigned a gender, even as Silicon Valley considers growing concerns that this design feature reinforces societal stereotypes.

The study, Hey Siri, I Love You: People Feel More Attached to Gendered Technology, shows that people feel more connected and attached to technology that is described as having a binary gender. Columbia Business School Professor Malia Mason and doctoral graduate, now Stanford Business School Professor, Ashley Martin found that this effect emerges, at least in part, because devices that have a binary gender are assigned gender stereotypes, and these gender stereotypes make devices seem more human. Across multiple experiments, Professors Mason and Martin show that assigning a binary gender to AI devices drives perceptions that they are humanlike and enhances feelings of attachment to a greater extent than assigning them to other social categories, such as age and race.

“Consumers feel more attached to devices that seem more human-like,” said Columbia Business School Professor Malia Mason, the Courtney C. Brown Professor of Business. “On the upside, when technology is assigned a gender, people see it as more ‘human’ and feel more attached to it; however, on the downside, this process is due to typical gender stereotypes that people ascribe to their technology.”

Professors Mason and Martin conducted five studies with various products including cell phones, autonomous cars, and vacuum cleaners, using data from approximately 10,000 Amazon reviews and 1,000 research participants. Across these studies, they examine whether gendering increases attachment to, and improves ratings for, technology with human characteristics, and rule out multiple potentially confounding factors, such as familiarity with the product and the race of the voice in the technology. Experiments ranged from analyzing whether Amazon reviewers who referred to their vacuum with a gendered pronoun were more inclined to use attachment language in their review, to dividing a panel of participants into three groups to rate their attachment to an autonomous vehicle, alternatively named the genderless "Miuu," "Iris," or "Jasper." Across the studies, the researchers find that the preference for gendered technology emerges because applying gender stereotypes to these products makes them seem more human to consumers.

Key Findings Include:

  • People Are More Attached to Gender Than Race or Age: Throughout the studies, Professors Mason and Martin found that gender was the only social category that drove increased humanization and attachment. In one study, they recruited a sample of participants who owned robotic vacuums like the Roomba. Across the studies, they also measured participants’ ascriptions of other social categories (e.g., race, age) and found that gender, above all other social categories, predicted humanization and attachment.
  • Gender Can Humanize Technology: Ascribing gender increases the perception of the technology as human, which catalyzes attachment. These effects arise because ascribing gender to products (and thus gender stereotypes) leads consumers to see those products as more human.
  • Findings Impact More Than Just Gender Research: These results have theoretical implications for research not just on gender, but also on the future of technology with human attributes and decision-making. They add to mounting evidence that gender plays an important role in seeing someone—or something—as human.

“The technology of the past often felt cold and impersonal, but advancements in AI have allowed people to feel a deeper connection to technology that has been given a voice with a binary gender,” Martin said. “As AI technology continues to evolve, benefits of gendering technological devices are accrued primarily by the companies that sell them, while the costs (i.e., reified stereotypes) are shared by society at large.”

Anthropomorphized technology is a growing phenomenon worldwide. While more companies appear to use and benefit from gendered technology, there have been calls to de-gender technology by groups like the Equal AI initiative, whose mission is to ensure the “AI we use does not perpetuate and mass produce historical and new forms of bias and discrimination.” However, this study highlights a paradox about doing so: though gendering technology reinforces stereotypes, it also makes technology seem more human-like, which increases consumers’ feelings of attachment to products.

To learn more about the cutting-edge research being conducted at Columbia Business School, please visit www.gsb.columbia.edu.

###
