When I first heard GPT-4o, as well as being curious and unnerved by its (or should I say her?) capabilities, I was shocked and pretty grossed out by how weirdly flirty OpenAI had made her sound. Thankfully, it turns out I wasn’t the only person who felt this way, but sadly it felt like only a small minority were talking about it.
First Apple’s Siri, then Microsoft’s Cortana, Amazon’s Alexa, HDFC Bank’s Eva... the examples feel endless. Even my Ultimate Ears Boom speaker sounds like it wants to sleep with me! (Which is actually quite impressive because she literally only says about four distinct words.)
My point, made far more eloquently than I could have managed, is summed up in this article a friend shared with me: https://lnkd.in/ddgRPVxH
Why is a group of tech bros (OpenAI, and AI in general, is overwhelmingly male: 88% by UNESCO’s estimate) building the world’s chatbots to be submissive, sexy, and always female? How is this going to silently and insidiously change, or cement, the way society sees women?
What you see and hear shapes what you believe. (A great book on this, recommended to me by a colleague, is Delusions of Gender by Cordelia Fine, all about how our implicit beliefs shape our abilities.) As millions of people around the world now turn to their new AI assistant/sex-bot, how are we bludgeoning the minds of men, women, and perhaps most strongly children with tired but effective stereotypes about women and girls?
What frustrates me the most is that this issue has been highlighted for years, ever since Siri came on the scene and long before ChatGPT became a verb, or even existed. The fact that this is still an issue, now with such salience and scale, is proof that the boys at OpenAI who got all excited building Her simply don’t care about the damage they will inflict by making millions of people consistently associate women with servants and with sex. (They also don’t appear to care about stealing people’s work and even their likeness, and are now facing legal action from Scarlett Johansson because GPT-4o sounds so much like the sexy AI assistant she voiced in the movie Her.)
They can’t claim ignorance; they’re just not even trying.
Society can’t sit back and expect companies to police themselves. We’ve seen time and again how catastrophically that fails, from the financial sector to health to the last big tech innovation: social media. There need to be rules, they need to be enforced, and there need to be consequences for breaking them. The response can’t be “it’s too complicated to apply the rules.” They’re smart; they can figure it out.
Rant over.
P.S. Some people have gone against the grain. Unlike most chatbots, Poncho, a weather bot, won’t keep talking to you if you abuse it. Poncho is also male.
Some sources: https://lnkd.in/dBqAy7e5