Techno-Patriarchy: How AI is Misogyny's New Clothes
In the discussions around gender bias in artificial intelligence (AI), intentionality is left out of the conversation.
We talk about discriminatory datasets and algorithms but avoid mentioning that humans — software developers — select those datasets and code those algorithms. Any attempts to demand accountability are crushed under exculpating narratives such as programmers’ “unconscious bias” or the “unavoidable” opacity of AI tools, often referred to as “black boxes”.
Patriarchy is much older than capitalism; hence, it has shaped our beliefs about who has purchasing power and how they use it. Patriarchy wants us to believe that women don’t have money or power, and that if they do, they’ll spend it on make-up and babies and put up with services and products designed for men. Moreover, it wants us to believe that women are expendable in the name of profits. All this while, in 2009, women controlled $20 trillion in annual consumer spending, and by 2023 they owned 42% of all US businesses.
Tech, where testosterone runs rampant, has completely bought into this mantra and is using artificial intelligence to implement it at scale while helping others do the same. That’s why the sector disregards women’s needs and experiences when developing AI solutions, deflects accountability for automating and amplifying online harassment, purposely reinforces gender stereotypes, operationalises menstrual surveillance, and sabotages women’s businesses and activism.
Techno-optimism
Tech solutionism is predicated on the conviction that no problem is so tough that digital technology cannot solve it, and when you plan to save the world, AI is the ultimate godsend.
It’s only through understanding the pervasiveness of patriarchy, meritocracy, and exceptionalism in tech that we can explain how the sector dares to brag about its limitless ability to tackle complex issues at a planetary scale with an extremely homogeneous workforce, comprising mainly white, able-bodied, wealthy, heterosexual, cisgender men.
For instance, AI recruiting tools have regularly been portrayed as the end of biased human hiring. The results say otherwise. Notably, Amazon had to scrap its AI recruiting tool because it consistently ranked male candidates above women. The application had been trained on the company’s 10-year hiring history, which reflected the male prevalence across the tech sector.
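The failure mode behind the Amazon case can be sketched in a few lines: a model trained on a historically male-skewed hiring record reproduces that skew. The data and scoring function below are entirely hypothetical, a minimal illustration of the mechanism rather than anything resembling Amazon's actual system.

```python
# Toy illustration (not Amazon's actual system): a scorer trained on a
# male-skewed hiring history learns to penalise words associated with women.
from collections import Counter

# Hypothetical training data: (résumé keywords, hired?) pairs reflecting
# a history in which mostly men were hired.
history = [
    (["software", "chess club"], True),
    (["software", "football"], True),
    (["engineering", "chess club"], True),
    (["software", "women's chess club"], False),
    (["engineering", "women's college"], False),
]

def train(history):
    """Weight each keyword by the hire rate among résumés containing it."""
    seen, hired = Counter(), Counter()
    for words, was_hired in history:
        for w in words:
            seen[w] += 1
            if was_hired:
                hired[w] += 1
    return {w: hired[w] / seen[w] for w in seen}

def score(weights, words):
    """Average the learned weights; unseen words get a neutral 0.5."""
    return sum(weights.get(w, 0.5) for w in words) / len(words)

weights = train(history)
# The model has absorbed the historical skew: "women's ..." keywords
# now drag a résumé's score down, regardless of qualifications.
print(score(weights, ["software", "chess club"]))          # higher
print(score(weights, ["software", "women's chess club"]))  # lower
```

No explicit gender feature appears anywhere, yet the scorer still discriminates: proxy words correlated with being a woman inherit the low hire rate of the historical record, which is exactly why "we didn't use gender as an input" is not a defence.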
Another example is the assumption by manufacturers of smart, internet-connected devices that danger typically comes from outside; hence the cameras, VPNs, and passwords meant to preserve the integrity of the household. But if you’re a woman, the enemy may be indoors.
Tech is also a master at deflecting responsibility for how AI enables bullying and aggression towards women. For example, we’re told we must worry about deepfakes threatening democracies around the world because of their ability to reproduce the voices and images of politicians and world leaders. The reality is that women bear the brunt of this form of AI.
How do machines know what a woman looks like? The Gender Shades study showed that commercial facial analysis algorithms used to classify gender were biased against darker-skinned women, with error rates of up to 35% compared with around 1% for lighter-skinned men. Whilst Microsoft and IBM acknowledged the problem and subsequently improved their algorithms, Amazon blamed the auditors’ methodology.
Tech has a long tradition of capitalising on women and gender stereotypes to anthropomorphise its chatbots. The first one was created in 1966 and played the role of a psychotherapist. Its name was not that of a famous psychotherapist such as Sigmund Freud or Carl Jung, but Eliza, after Eliza Doolittle in the play Pygmalion. The rationale was that through changing how she spoke, the fictional character created the illusion that she was a duchess.
Tech actively sabotages women in areas such as self-expression, healthcare, business, finances, and activism.
AI tools developed by Google, Amazon, and Microsoft rate images of women’s bodies as more sexually suggestive than those of men. Medical pictures of women, photos of pregnant bellies, and images depicting breastfeeding are all at high risk of being classified as representing “explicit nudity” and removed from social media platforms.
It can escalate, too. It’s not uncommon for women’s businesses that rely on portraying women’s bodies to report being shadow-banned — their content hidden or made less prominent by social media platforms without their knowledge. This practice decimates female-led businesses and promotes self-censorship to avoid demotion on the platforms.
While AI is naturally associated with the virtual world, it is rooted in material objects. Moreover, most tech software and platform giants — Apple, Google, Amazon, Microsoft, and Meta (aka Facebook) — are hardware providers as well. Datacentres, smartphones, laptops, and batteries rely heavily on metals such as cobalt and women often play a key role in their extraction and recycling.
For example, the Democratic Republic of Congo supplies 60% of the world’s cobalt. The mineral is extracted via artisanal and industrial mines. Some sectors welcome the integration of women into the artisanal mines as a means to empower them financially and as a substitute for children’s labour.
What has tech done about this? Software-only companies continue to look the other way, while those manufacturing hardware have avoided their responsibility as much as they could.
There is also a gendered division of labour in electronic waste, a €55 billion business. Women frequently hold the lowest-tier jobs in the e-waste sector. They are exposed to harmful materials, chemicals, and acids as they pick apart and separate electronic equipment into its components, which in turn negatively affects their morbidity, mortality, and fertility.
Again, efforts focus on reducing child labour, and women’s working conditions are lumped in with those of “adult” workers. An additional challenge compared to mining is that hardware manufacturers control the narrative, highlighting their commitment to recycling materials across their products for PR purposes.
AI-powered misogyny beyond tech
Last but not least, tech companies are not the only ones using AI as a tool of misogyny. Organisations and individuals around the world are quickly ramping up their own use.
The baby-on-board market is a goldmine, and technology is instrumental in helping vendors exploit it. Retailers now routinely use AI algorithms to identify and target pregnant girls and women.
Then, there is sexual exploitation. According to the United Nations, for every 10 victims of human trafficking detected globally, five are adult women and two are girls. Overall, 50% of victims are trafficked for sexual exploitation (72% in the case of girls). Traffickers use online advertisements, social media platforms, and dating apps — all powered by AI — to recruit, exploit, and exert control and pressure over their victims.
And thanks to generative AI, it has never been easier for individuals to create misogynistic content, even accidentally. Examples include:
ChatGPT replicating gender stereotypes when writing professional profiles, stressing communal skills for women while highlighting financial achievements for men.
Tech has embraced the patriarchal playbook in its adoption and deployment of artificial intelligence tools. Hoping to reap massive financial returns, the sector is unapologetically fostering gender inequity and stereotypes.
As Black feminist Audre Lorde wrote, “The master’s tools will never dismantle the master’s house.” Whilst tech continues to be run by wealthy white men who see themselves as the next Messiah, misogyny and patriarchy will be a feature and not a bug of artificial intelligence applications.
We need diverse leadership in tech that sees women as an underserved market with growing purchasing and executive power. Tech also needs investors who understand that outdated patriarchal beliefs about women being a “niche” don’t serve them well.
Finally, tech needs to assume responsibility for the tools it creates, and that goes beyond monitoring app performance. It starts at the ideation stage, by asking uncomfortable ethical questions such as “Should we build that?”
Because not all speed is progress.
NOTE: This article is based on a piece that I wrote previously for The Mint.
PS. You and AI
Are you worried about the impact of AI on your job, your organisation, and the future of the planet, but feel it’d take you years to ramp up your AI literacy?
Do you want to explore how to responsibly leverage AI in your organisation to boost innovation, productivity, and revenue but feel overwhelmed by the quantity and breadth of information available?
Are you concerned because your clients are prioritising AI but you keep procrastinating on learning about it because you think you’re not “smart enough”?
I’ve got you covered. Reply to this email to book a call and explore how my Strategic AI Leadership program can help you harness the potential of AI for sustainable growth and responsible innovation.