How AI and Voice are Perpetuating Gender Stereotypes – And What You Can Do About It

Landor executive director of strategy warns marketers to be conscious about their brand choices on new technologies or risk impeding social progress on gender.

Brands must take care not to perpetuate outdated and harmful gender stereotypes as humanised voice-activated devices and artificial intelligence (AI) become part of every aspect of our lives, Landor’s strategy chief warns.

Speaking at this week’s Ad:Tech conference in Sydney, Daye Moffitt, executive director of strategy at brand consultancy Landor, told attendees the technology revolution is both electrifying and terrifying in how it is changing the way we behave, interact and communicate.

“We know technology is colliding with and changing social norms. What was once reserved for sci-fi is now reality. We are hyper real, personalised and intelligent,” she said. 

However, what is equally clear is that, through these technologies, brands are further perpetuating gender stereotypes that have no place in modern times, Moffitt said.

“More and more, we’re tapping into this humanity and realism through these advanced technologies. Just look at the human faces, names and voices being used for robots to make it easier for us to trust them. We’re also letting children play with technology, inviting it into our homes to take control of things for us.

“This technology is playing a key role in social conditioning.”  

As examples of how these technologies support gender stereotypes, she noted voice-activated devices in the home, such as Alexa, overwhelmingly use female voices. “Whereas Watson is an alpha male on the executive team,” she pointed out. “Isn’t it disappointing that brands we love are using prescriptive gender types?

“If we continue to do this to technology, we not only reaffirm these stereotypes and sexualisation of women, we are perpetuating and amplifying this to the cost of social progress.” 

According to Moffitt, there are five things marketers can do to minimise this risk. The first is letting the data decide.

“In some interactions, a female voice makes sense. But the decision should be based on the data, rather than social conditioning,” she said. “For example, Nest was based on data that proved children are more responsive to the maternal voice.”
