From Siri to Microsoft's Tay, women's voices are leading the AI frontier, but we have no reason to believe the male-dominated industry understands us at all

By now you've likely heard the story of Tay, Microsoft's social AI experiment that went from friendly millennial girl to genocidal misogynist in less than a day. While Tay promised to learn from her interactions with people online, Microsoft apparently hasn't learned anything from the countless headlines about how Twitter users like to talk to visible women (everything from gleefully anarchic trolling to threats and abuse); otherwise it would have seen this coming.

At first, Tay's story seems like a fun one for anyone who's interested in cautionary sci-fi. What does it mean for the future of artificial intelligence if a bot can embody the worst aspects of digital culture after just 16 hours online? If any AI is given the vastness of human creation to study at lightning speed, will it inevitably turn evil? Will the future be a content-creation battle for their souls?

But Tay was not a very good AI, and on Microsoft's part, this was not a very good idea. All it would have taken to know this would have been a basic level of dialogue with women in tech, and nothing terrifies this industry more.

In recent years we've had two very good films about artificial intelligence. In Spike Jonze's Her, a goofy Joaquin Phoenix twiddles his way into an idyllic relationship with his charming operating system, voiced by Scarlett Johansson. In Ex Machina, Oscar Isaac plays a damningly plausible tech zillionaire called Bateman (like the American Psycho) who invites equally plausible startup geek Caleb to Turing-test Ava, his fey, beautiful AI creation.

Both films feature latently sex-starved men who are unaware of their own weirdness. Her's retrofuturist aesthetic (wood panels, high-waisted pants) creates a carefully chosen timelessness and a throwback to an era of tech optimism, but it also gently highlights the arrested development of a man who falls in love with a talking device that lives in his front pocket.

Ex Machina is less kind: both the openly unlikable tyrant Bateman and his would-be protege, Caleb, are familiar Silicon Valley caricatures who awkwardly fumble through workouts and whiskeys together. The viewer may even be lulled into believing that Caleb, a goofy, tousled nice guy who decides to rescue Ava from the hairy grasp of her creator, deserves to ride off into the sunset with her, a token of his victory over the more dystopian, overtly misogynistic Bateman.

Both of these films end with the female AI outsmarting her would-be lovers, owners and builders, leaving the men baffled and the viewer with a sense of doom. We feel our terror of the future and the tech industry's terror of women walking neatly hand in hand toward the horizon as the credits roll.


It's fitting that our modern fiction about AI should go hand in hand with horror-laced tales about men's failure to correctly estimate women. Increasingly, AI helpers, from Apple's Siri and Microsoft's Cortana to talking home thermostats, GPS and fitness apps, default to a female voice, as lots of research suggests that both male and female consumers prefer it. The likely explanations are many and probably driven by social conditioning: we want our virtual assistants to seem pliant and nonthreatening, competent but not domineering. Maybe AI development is also influenced by the geek culture ideal of being alternately serviced and encouraged by a hard-earned digital princess; the nostalgic science fiction fantasies of white guys drive lots of things in Silicon Valley, so why not the concept of AI?

No matter the reason, the voices of women, and their creators' ingrained concepts of modern womanhood, are leading the AI frontier. But we have no reason to believe that the male-dominated technology industry understands us at all. Siri and her cohort, for example, know how to respond to health crises such as suicidality but do not know what rape is. These robot women have no answer for questions about domestic violence, as a community of real women would. Siri's makers had apparently not thought about emergencies primarily relevant to women. In fact, Siri even insists she has no gender, but women's speech is more than just the sound of a voice; it involves word choices, too. They conditioned a woman and then attempted to neutralize her.

This is, of course, yet another example of why we need more women in technology, and of how the industry is failing to listen to those of us who are already here. Despite lip service to diversity, those who push for it in their workplace are actually likely to be penalized. There was recently an entire summit at South by Southwest dedicated to finding solutions for the abusive speech directed at women and people of color online, isolated away from the main conference and reportedly attended only by the few brave souls already aware of the issue. We've gotten the clear message that women's safety online is not a problem the tech industry is solving; that maybe they don't even believe it's a problem.

How could anyone think that creating a young woman and inviting strangers to interact with her on social media would make Tay smarter? How can the story of Tay be met with such corporate bafflement, such late apology? Why did no one at Microsoft know right from the start that this would happen, when all of us female journalists, activists, game developers and engineers who live online every day could have predicted it, and are talking about it all the time?

The answer cannot be anything but outright disdain. The industry wants to use women's voices but still has no plans to actually listen to them. If empathy is core to the future of artificial intelligence, worry not: the Singularity is still quite a way off, no matter how many terrifying Holocaust-denying, racist, anti-feminist white millennial-bots Microsoft accidentally spawns.
