AI and the Pink House
How early decisions about AI gender and representation are creating lasting biases in our digital future
When I was around 14, my mother wanted our house to be repainted, and she insisted on it being painted pink. Yes, you read that correctly: pink. Actually, "Dusky Pink" was the name of the colour on the tin.
As a 14-year-old boy I was part of the free labour crew (alongside my brother) that brought this vision to life. From that point on, my life was split: I had a pre-pink house life and a post-pink house life.
Now, sure, my pre-pink house life was not perfect. I suffered the usual travails of any pubescent boy in the grim north of the UK, but my post-pink house existence became distinctly unpleasant. I'll spare you all the details, but this was the 1980s and the world was a little less understanding of why any normal boy would willingly live in a pink house.
The lady, however, was not for turning and the house remained steadfastly dusky until we moved, many years later.
Two Points of View
There are two opposing points of view here:
- The 14-year-old, who thought this was a huge mistake that would ruin his life.
- My mother (who, to be fair, owned the house), who thought it was fabulous.
I have been reflecting recently on AI and what 'it' would make of this little vignette and those two juxtaposed points. The answer (probably) is that it would view the Dusky Pink decade as a mistake. Here are the facts it would consider:
- My mother was female. You may see this as so obvious it doesn't warrant mentioning but bear with me.
- There was no tangible benefit to the house being pink: it didn't become more valuable, and there was no other physical change.
- The colour inconvenienced two young, white males.
Now, imagine this scenario happened many thousands of times across the country with similar results. Sure, some 14-year-old boys would be confident enough to handle the pink house, but they are outliers in our little thought experiment.
To the AI world of 2025, trained on this data from decades ago, the equation would be simple. A female made a choice that was detrimental to two white males, so it would view this as a mistake.
With the wisdom of hindsight and my human capacity for reflection, the house actually looked pretty cool. This is another issue with the flawed data we feed AI for lunch: the machine's capacity for self-reflection is non-existent.
How Alex Fits In
Ten years ago, I worked for the company that helped bring Alex to the ATO website. Alex was a very early chatbot (a new term then) that was actually powered by humans transposing the questions; it was a fancy FAQ rather than a Skynet overlord. (I don't work there now, so the technology may have changed.)
Alex was created to be a young girl; the only thing that differentiated her was the colour of her shirt, which changed depending on what website you were viewing.
We would see the questions as they were captured, and a lot of them were misogynistic, crude or utterly vapid to a degree we simply couldn't believe.
One of the first iterations of, and use cases for, what would become AI (as we know it today) was immediately misused. Now, of course, Alex was actually quite brilliant, a revolution in terms of what the technology could do, and you can still interact with her today; none of this is meant to disparage the excellent work that was done.
What Do These Two Examples Show?
The amplification of mistakes and inequalities from many years ago is now creating a real ripple effect in the source material AI draws on today.
"In the real world, the overwhelming majority of bots are built in the female form, an extension of the madonna-whore dichotomy: either servile voice assistants or full bodied sex toys." — Tracey Spicer, Man-Made (2023)
A couple of years ago, I met the inspirational Tracey Spicer and read her excellent book Man-Made (2023). The quote above ultimately made me reflect on these two little chapters from my own life.
It never occurred to anyone (including me) that Alex should be a man, and the long-term consequences of unquestioningly using a young female were not considered either. We just thought it was cool (I mean it really was) and what harm could it do?
We didn't see it as a mistake. We didn't foresee that in a decade the kind of drivel that people were saying to Alex would be what was used to feed her children.
The Bad News
It's too late. We blew it.
Humanity won't face the singularity in my lifetime, but what matters today is that we had this chance, a golden opportunity to make something inclusive, amazing and bold, and we blew it. We fed our young child a diet of evil, bias, discrimination, the plain old awful side of us, and we have raised an all-powerful Dr Evil.
There is no route back. There is no amount of oversight or law-making that can correct where AI is now.
The Pattern Recognition Problem
AI systems don't just learn facts; they learn patterns from decades of human behaviour, including our biases, prejudices, and historical inequalities. When we train AI on data that reflects a world where:
- Women were predominantly cast in servile roles
- Certain demographics were marginalised or underrepresented
- Harassment and discrimination were normalised
We create systems that perpetuate and amplify these patterns, often in ways we don't immediately recognise.
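To make the mechanism concrete, here is a minimal sketch in Python. The data is entirely hypothetical and the majority-vote "model" is deliberately crude, but it shows how a system that only counts frequencies will faithfully reproduce whatever skew its training records carry:

```python
from collections import Counter

# Hypothetical historical records: (role, gender) pairs with a deliberate
# skew, standing in for decades of real-world data.
records = (
    [("assistant", "female")] * 90 + [("assistant", "male")] * 10 +
    [("engineer", "male")] * 85 + [("engineer", "female")] * 15
)

def most_likely_gender(role: str) -> str:
    """Predict by majority vote over the records: pure pattern matching."""
    counts = Counter(gender for r, gender in records if r == role)
    return counts.most_common(1)[0][0]

for role in ("assistant", "engineer"):
    print(role, "->", most_likely_gender(role))
# assistant -> female
# engineer -> male
```

Nothing in the code asks whether the pattern is just or merely inherited; scaled up to billions of training examples, that indifference is exactly the problem.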
The Monopoly Effect
Like a game of Monopoly where early players buy up all the prime real estate, the companies and cultures that shaped early AI development have established patterns that are now incredibly difficult to change. The "board" is set, the rules are established, and newcomers find themselves playing a game where the fundamental assumptions were decided before they arrived.
Key Insights
- Early AI design decisions create lasting biases that compound over time
- Gender representation in AI interfaces reflects and reinforces societal inequalities
- What seemed like innocent technical choices have profound social consequences
- The data we use to train AI systems carries forward historical injustices
Looking Forward
While the fundamental patterns may be set, understanding how we got here is crucial for anyone working with AI systems today. Recognition of these biases is the first step toward making more conscious choices about how we implement and interact with artificial intelligence.
The pink house taught me that different perspectives on the same situation can lead to entirely different conclusions. What seemed like a disaster to 14-year-old me was simply a homeowner making a choice about their property. AI systems, however, lack this capacity for multiple perspectives or self-reflection.
They see patterns in data and optimise for those patterns, without questioning whether those patterns represent the world we want to create or simply the world we happened to inherit.
Recommended Reading
Read Tracey Spicer's book, Man-Made. I think in 10 years we'll look back and see how prescient this and other similar works truly were.
Special heat-resistant robots will be burning me and the other heretics at the stake by that point, so it won't matter for too long.