Stereotypes at scale: How algorithms amplify gender stereotypes.
And what that means for all of us.
Following my previous article, Does LinkedIn have a problem with Female Founders?, several people asked me to write a follow-up about the culture at LinkedIn, and whether it could be playing a part in the deletion of female founders from the platform.
I don’t know anything about the culture at LinkedIn; I’ve never worked there[1]. I don’t know what the workforce is like, what leadership is like, and how that subsequently affects the experience every LinkedIn user has. (Because let’s be clear: how an organisation works on the inside is directly linked to how we experience it on the outside).
But what I can write about is the general composition of the global software workforce, how humans are choosing to use AI, what those choices are doing to the landscape of the Internet, and how these facts affect women on LinkedIn and other platforms.
The default male
The introduction of Caroline Criado Perez’s Invisible Women is entitled The Default Male. Its first sentence is: ‘Seeing men as the human default is fundamental to the structure of human society.’ The subtitle of the whole book is Exposing Data Bias in a World Designed for Men.
(Spoiler alert: the book includes a lot about the gender data gap).
I would agree with Criado Perez that the world has ended up this way by default, not design. It’s only “designed” for men because women are left out by default[2].
And the global software workforce is a good example.
Globally, the gender split between those identifying as male and those identifying as female within the software developer workforce is about 80% male to 17% female. The remaining 3% of the developer workforce identify as non-binary, other, or preferred not to say.
It’s part of a wider trend within the tech industry, where the global giants have the following gender splits across their workforces[3][4]:
[Chart: workforce gender split at the major tech companies, via Women Tech Network]
This over-representation of one gender automatically skews things towards the male experience, which in turn feeds into what is considered the ‘default’ and normalises it[5]. Even if developers have the best will in the world to be inclusive, empathy for another’s worldview is not the same as shared history, culture or experience. And if most of the organisation's leaders look the same as them, there’s even less diversity of thought.
I’ll illustrate with a personal story: in my previous corporate career, I was the one British person on a project team for a global organisation. The rest of my colleagues were from North America. The task was to choose tech that could create subdomain websites for our main site and offer translations into local languages, in order to remove a burden from our volunteers and maintain consistency across 300 different chapter websites all around the world. One colleague opined that we wouldn’t need to pay to translate North American spellings into British English because ‘they’ [British people] were used to reading things in North American English, so there was no need for British English; ‘they probably won’t even notice.’
I gently corrected him, explaining that we did in fact use British spelling as a matter of course across digital sites back home and would notice the difference between ‘color’ and ‘colour’, ‘favorite’ and ‘favourite’, etc. He was gracious and happy to concede the point[6].
He was acting as a polite, professional, well-meaning colleague who wanted to offer a solution to a problem. But he didn’t have the necessary lived experience to adequately solve that problem in a way that worked for the end user. And he thought his opinion - without any lived experience, without any proof - was enough to make a decision on behalf of about 5,000 people, our members within the UK.
Women were not just the original programmers, they were the original ‘computers’.
The current gender split of the developer workforce is accidental. But for a few twists of fate, I could be writing about a developer workforce where the gender split is 80% female to 17% male, and the problems end users, particularly men, face as a result[7].
Katrine Kielos (previously writing under the name Katrine Marçal) explains in her book Mother of Invention: How Good Ideas Get Ignored in an Economy Built for Men that ‘not so long ago, computers were in fact women. Literally. Before a computer was a machine, it was a job. You could get a job as a ‘computer’ and this would mean you would sit in a room calculating equation after equation for someone else. From the 1860s until some way into the 1900s, computing was one of a very small number of scientific careers deemed appropriate for women’[8].
And the arrival of the ‘computing machine’ did not automatically reassign the role of programmer to men. At Bletchley Park, where Alan Turing (widely seen as the father of computer science) worked on breaking Germany’s wartime ciphers, the world’s first programmable electronic computer, Colossus, was built during World War II. Guess who worked the switches and plugs to program the thing? Women. Members of the Women’s Royal Naval Service.
The computing staff at Bletchley Park during the war years was 75% female.
Computing remained a female-dominated field for the next two decades. To quote Kielos again, ‘programming still broadly involved the same work, but the industry had become more important to society’.
The shift started in the mid-1960s, when (despite being a decisive factor in winning the Second World War) computing’s image got a glow-up. Realising that computers were pretty key to a functioning, secure society, the UK civil service launched a public scheme encouraging men, particularly men from the right social class, to take an interest in computers. The women trained these men up with just enough computing know-how for them to become the women’s managers. And the women consequently left the industry in droves when it became clear there was next to zero chance of promotion.
Today the UK Government and employers want more women in tech. We now have Girls Who Code, Women in Data®, TechWomen and many other initiatives to promote tech as a viable career choice for women. There are hackathons and bootcamps and all manner of things to upskill women in the field; there are coaching programmes and mentorships and sponsorships to increase the ratio of female leaders in the space.
It might have all been unnecessary. If we’d let things stay the same.
The gutting of the UK’s female programming workforce is also partly why the world's cradle of tech innovation is in Silicon Valley in the USA, not Bletchley Park in the UK.
Think on the economic value, soft power and global capital the UK has lost out on as a result.
What algorithms want (and the downsides of how they work)
Nowadays we separate platforms from the people doing the programming. When discussing the vagaries of our content's performance or whether we can see updates from the creators we like, we talk about the algorithm.
Algorithms are trade secrets; black boxes that confer competitive advantage. We think of them as genderless, agenda-less equations. But they’ve all been created—at least initially[9]—by imperfect, normal, flawed human beings. And so they're subject to human imperfections.
When we're talking about the algorithm in terms of what social media content we see, we're actually talking about a recommendation engine. Here's IBM's definition:
A recommendation engine is an AI system that suggests items to a user. Recommendation engines rely on big data analytics and machine learning algorithms to surface patterns in user behaviour data and recommend relevant items based on those patterns. Recommendation engines help users discover content, products or services they might not have found on their own.
Most working-age people have been familiar with recommendation engines since the dawn of Amazon. Being recommended other books based on the book you’d just bought was a novel way to have us spend more money back in the early noughties. Now recommendation engines are a standard way to have us spend more time and attention on platforms.
And that’s problematic because all recommendation engines do is show us more of the same. The algorithms crunch data we’ve fed them through our behaviour, and, because their output depends on the input, they can only serve up similar content.
This form of recommendation engine leads users down an ever narrower, ever more uniform path. Discovering different content, stories or creators outside of the algorithms’ parameters takes conscious, concerted effort.
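To make that “more of the same” dynamic concrete, here is a minimal, hypothetical sketch of a content-based recommender in Python. The post names, tags and scoring are invented for illustration; no real platform’s system is this simple, but the feedback loop is the same: items that resemble what you’ve already engaged with always outrank everything else.

```python
# A toy content-based recommender: illustrative only, not any platform's real system.
from collections import Counter

# Hypothetical catalogue: each post is tagged with topics.
CATALOGUE = {
    "post_a": {"leadership", "ai"},
    "post_b": {"ai", "startups"},
    "post_c": {"ai", "automation"},
    "post_d": {"women_in_tech", "community"},
}

def recommend(history, k=1):
    """Rank unseen posts by how many of their tags the user has already engaged with."""
    profile = Counter(tag for post in history for tag in CATALOGUE[post])
    unseen = [post for post in CATALOGUE if post not in history]
    # Higher score = more overlap with past behaviour, so the feed narrows over time.
    return sorted(unseen, key=lambda p: -sum(profile[t] for t in CATALOGUE[p]))[:k]

# A user who has only engaged with AI content is shown... more AI content.
# post_c (more 'ai') outranks post_d (an unfamiliar topic, which scores zero).
print(recommend(["post_a", "post_b"]))  # -> ['post_c']
```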
And perhaps more pertinently: the content you get served by the algorithms also depends on who created them.
LinkedIn’s current view of professional women: Is it a bug or a feature?
In my previous article, I explored whether female entrepreneurs were being shadow-banned, outright banned or deleted from LinkedIn simply because they were showing up as female entrepreneurs.
Based on my research, I believe the truth is more nuanced than that—but LinkedIn's lack of nuance and diversity is a hugely aggravating factor, one that means women are affected by spurious deletion or shadow-banning in greater numbers than men.
Coupled with the politicisation of words like ‘women’, ‘female’ and ‘belong’, there’s been a notable rise in the kind of content you’d usually see on platforms owned by Meta.
During my research on the previous article, after about 10 minutes of clicking on content with the word ‘women’ in it, I was served this ad:
My instant reaction? I have never seen anything like this on a ‘professional platform’. I can’t believe that this is the kind of ad the LinkedIn algorithm ‘thinks’[10] any professional, of any gender, would want to be served. I can’t believe it passed checks. (I’m curious as to whether any of those checks were done by humans).
But what it shows is seriously interesting. Even if the image itself plays on tired old tropes.
The sexualised female executive: showing strength, sure, but also dressed in a way that shows her shape, reveals skin and is highly aestheticised. It’s a world away from the different women we saw in Sport England's ‘This Girl Can’ campaign or Sweaty Betty's ‘Wear the Damn Shorts’.
The projected aspiration that she wants to ‘feel like herself’ again. Which version of herself? In this instance, the model's ‘self’ just so happens to look like a version of femininity designed for a male gaze.
Before I started my research and fact-checking, my immediate thought on seeing this ad?
A lot of the people involved in making it (developers, AI trainers, decision-makers) must be men.
Men that have a very limited idea of what a Caucasian, western professional woman wants to look like and what her priorities are.
But they think they know.
And they believe that their opinion is enough.
I’d suggest that if the image is AI-generated, the prompt used was something like ‘a professional, attractive, mid-life woman working out’. And this is what got spat back out at us.
That shows how bias—how someone’s view on a group of people, which may or may not reflect what’s actually important to that group of people—gets encoded into what should be neutral, gender-and-agenda-less data.
And it affects how others view that group of people every day.
I could have shared this advert on a post but I didn’t want to give it the reach. And if this has been served to me by automation without enough human oversight, I don’t want to just shake my head and ignore it.
Because if it isn’t being challenged, it will be preserved. And if it is preserved, then it will feed back into the AI training material. And become entrenched. And then will become harder to challenge.
(As an aside I’d be curious to know how the advertisers, and LinkedIn, are measuring the success of this ad. It’s been in my feed for the last two weeks and the follower count never gets north of 4000. The one repost I’ve seen is a busy, professional woman sending it up. It might be getting eyeballs, but is it getting clicks, and are those clicks converting into sales? It might have been cheap to produce, but that’s a false economy if it’s actually repelling the customers it’s supposed to attract).
Treat the algorithm like a dog. Not a god.
I love this reframe[11] from Emma-Louise Munro Wilson FRSA. She shares how we humans have the power to shape the algorithm, even if it’s on an individual basis, rather than allowing it to shape us. Even if the price we pay is engaging with the platforms more (let’s not forget: in the attention economy, that’s what the platform leaders want us to do, because more eyeballs drive more advertising spend). Her original post in full is here, and the best bit, in my humble opinion (emphasis mine), is below:
She’s not the only one encouraging users to choose their own adventure on LinkedIn. Jane Evans talks about feeding the machines with what we want to see: with her recent event, 'Wrangling the Algorithm', and in her own network, The 7th Tribe, she’s started a fightback against the plummeting reach of posts that use the word ‘women’, encouraging women to keep creating, sharing and amplifying the kind of content they want to see, to help retrain the algorithms at scale.
Individual and group actions are a great start. But to scale we need some bigger interventions.
Without knowing the exact cultural makeup at LinkedIn and its parent, Microsoft, here are some suggestions to help it become more representative of all the professionals it serves:
Greater demographic diversity within its parent organisation at all levels – leadership, product, marketing, AI product etc. (Yes, there are issues here with accessing diverse talent pools but they can be mitigated through interventions like Learning & Development, coaching, ethical use of AI in recruitment to name but a few. Human Resources for the win!).
Rigorous assessment of the neutrality of algorithmic systems, and interventions to fix encoded biases before they’re allowed to scale (a minimal sketch of what one such check could look like follows this list).
Adequate, diverse, human oversight of content or functionality that may relate to a specific demographic – also known as nothing about us without us.
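By way of illustration, here is a minimal sketch in Python of one possible neutrality check, assuming a platform can log impressions per post alongside a creator's self-declared gender. Everything here (the log, the metric, the threshold) is a hypothetical simplification; a real audit would cover many metrics across many demographics.

```python
# A toy disparity check on post reach: hypothetical data and thresholds.
from statistics import mean

# Hypothetical impressions log: (creator's self-declared gender, impressions) per post.
impressions_log = [
    ("female", 420), ("female", 310), ("female", 95),
    ("male", 980), ("male", 640), ("male", 505),
]

def mean_impressions(log, group):
    """Average impressions for posts by creators in the given group."""
    return mean(n for g, n in log if g == group)

ratio = mean_impressions(impressions_log, "female") / mean_impressions(impressions_log, "male")
print(f"Female-to-male mean impression ratio: {ratio:.2f}")

# A ratio well below 1.0 for comparable content is a flag to investigate further
# (content mix, audience size and topic all matter), not proof of bias on its own.
if ratio < 0.8:
    print("Disparity flag raised: route for human review before the system scales further.")
```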
Never attribute to malice what could actually be stupidity
There’s a phrase often quoted in software development: “Never attribute to malice that which can adequately be explained by stupidity”.
Ultimately, ever-more-finely-tuned recommendation engines that prioritise what some men think women want might not necessarily be a nefarious, deliberate attempt to erase the female experience.
It could easily be something far more banal.
Groupthink.
Groupthink is not necessarily malicious.
It’s just narrow. It misses things.
The dangers of groupthink are precisely why companies like to have diverse points of view represented on their boards and senior leadership teams (and ideally, throughout the workforce).
The problem with groupthink that’s being powered by AI is one of scale. Groupthink at this scale could mute whole swathes of the human experience. It could—perhaps completely accidentally—silence an entire gender. It could entrench outdated ideas of what people of certain identities do and do not like, think or do not think, want or do not want.
Going back to my earlier personal example, when a North American colleague blithely spoke on behalf of All British People and How They Feel About Spelling (I’m still rolling my eyes eight years later): without that gentle correction, we could have made a poor business decision.
We could have decided to save costs on the project by skimping on translation and pouring the investment elsewhere in the tech. We would have saved a small slice of the project budget but ended up with a solution none of our British members wanted to use, which would have strangled our strategy of using the UK chapter website to grow the number of members in the UK.
So the net effect would have been failure. The project would have come in under budget, but failed due to lack of adoption, putting our growth in the UK back even further.
This is why diversity of thought and experience matters in business.
Without it we’ll all be worse off.
Further reading
For more on AI, generative AI, machine learning and the actors behind the platforms, I strongly recommend following scientist and NYU Professor Gary Marcus and subscribing to his Substack.
For more on the erasure of women from computing’s history (and other stories of how female-led innovation is stifled by male-dominated bias) read Katrine Kielos’ Mother of Invention (previously published under the name Katrine Marçal). You can find her, and how to buy the book, on LinkedIn here: KATRINE KIELOS LIMITED. She also has a Substack.
Caroline Criado Perez's Invisible Women is a classic as well as a clarion call to action. The chapter “The Myth of Meritocracy” delves into the double standards women in tech experience, how the difference between boys’ and girls’ socialisation can make a tech career less available and attractive to women, and how women in the US are earning degrees in maths and chemistry at almost the same rate as men (Invisible Women, pp. 92-111). She also has a Substack publication.
Women in Tech Network is a great resource for all things related to gender within the global tech workforce: this paper, Women Tech Stats 2025, is excellent: https://www.womentech.net/women-in-tech-stats
This is a great blog about women in tech, bro culture and the disparity between female and male talent in the developer pipeline: https://writefulcopy.com/blog/why-female-uk-programmers-minority
And of course, there's more in This Women's Work on Substack and LinkedIn.
Notes
[1] Although in a previous corporate life I visited their London offices a few times and the people have always been lovely. Exactly what you’d expect: collegial, professional, helpful and focused on success for their customers and their employees.
[2] The default being: they’re left out because of the historic, cultural connotations with their gender.
[3] Note: gender data has been calculated across the whole workforce of these companies. It does not drill down into the detail about gender split by roles. For example, Netflix has a fairly even gender split across its own whole workforce. But the data doesn’t show how many women are in developer or IT roles as opposed to marketing, content acquisition or other roles.
[4] Source, Women Tech Network: https://www.womentech.net/women-in-tech-stats#12
[5] Sidebar: perhaps this explains the disconnect between LinkedIn’s community guidelines and the experience women have from predatory men using the DMs as a dating site.
[6] Sadly, it probably helped that in this scenario I was in a senior position and the organisation was very hierarchical.
[7] Let us be clear. Neither gender is innately superior to the other. Equal representation and equity are the goals, not superiority and domination of one category over another.
[8] Marçal, Katrine, Mother of Invention: How Good Ideas Get Ignored in an Economy Built for Men (2021), p. 75.
[9] Algorithms that can be generated by other algorithms/code do exist, and are proliferating.
[10] Another problem: we anthropomorphise algorithms and datasets. Equations can’t think – but we’ve yet to land on language that adequately describes how GenAI, machine learning etc. work without anthropomorphising them.
[11] Original post here: https://www.linkedin.com/posts/emma-louise-munro-wilson_connectedleadership-linkedin-digitalempowerment-activity-7347867081360482304-XlZ-?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAU_pSQBUR8QqhRPYrVni_CiXF1tuYwe3ww