Sunday, November 16, 2025
Barbara Pesel:
“Algorithmic bias isn’t just a technical failure – it’s a communication emergency. When our digital systems fail to reflect the complexities of women’s health, they undermine access to information and erode trust in healthcare itself.”
In an era where technology promises to revolutionise healthcare, a hidden and troubling reality has been unfolding for some time, exacerbated by the rapid rise of AI. Algorithmic bias, particularly in women’s health, is not just a technical oversight; it is a systemic issue with profound implications.
This is a huge and ever-unfolding topic, but hopefully the following sheds a little light on the hidden gender divide in healthcare AI, the silencing of women’s health on social media, and the insidious broader societal impacts of this bias. We need to rethink our roles – not as passive observers, but as active advocates for health equity. By crafting compelling narratives and holding platforms accountable, we can do our bit to turn communication into a powerful tool for justice.
The hidden gender divide in healthcare AI
We are all aware that an AI system is only as fair as the data it learns from. Historically, datasets have often excluded or underrepresented women. For example, cars and crash-test dummies have long been designed around the ‘average’ male body. This lack of female-specific data means women are significantly more likely to be seriously injured or killed in crashes.
Silicon Valley, meanwhile, predominantly employs men aged between 25 and 40 to write its code. These foundational flaws mean that many of our most advanced diagnostic tools and predictive models continue to default to male-centric patterns.
Consider cardiology: AI diagnostic tools may miss signs of heart disease in women because women often present with different symptoms, such as fatigue or nausea, rather than the “classic” chest pain more common in men. Such symptoms are typically underweighted in male-focused training data, leading to dangerous diagnostic gaps.
The problem is compounded when algorithms learn from and replicate existing human biases. Research shows that clinicians’ tendencies to underestimate women’s pain can become embedded in the very code of our health systems: a documented phenomenon in which discrimination is automated, with serious consequences for patient care.
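For readers who want to see how quickly such a gap can emerge, the short sketch below trains a toy classifier on synthetic patient records in which women are underrepresented and, when unwell, report fatigue and nausea rather than chest pain. Every feature name, probability, and number in it is invented for illustration; it is not a real clinical model, simply a minimal demonstration of the mechanism described above.

```python
# Purely illustrative, synthetic sketch – not a real clinical model.
# It shows how a training set dominated by male presentations can leave
# a classifier far less sensitive to the way women tend to present.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

def make_cases(n, female_share):
    """Generate synthetic patients with features [chest_pain, fatigue, nausea]."""
    female = rng.random(n) < female_share
    disease = rng.random(n) < 0.5

    def symptom(p_healthy, p_sick_male, p_sick_female):
        # Probability of reporting the symptom depends on disease status and sex.
        p = np.where(disease, np.where(female, p_sick_female, p_sick_male), p_healthy)
        return (rng.random(n) < p).astype(float)

    # Stylised assumption: sick men usually report chest pain; sick women more
    # often report fatigue and nausea, which are also common when healthy.
    chest_pain = symptom(0.05, 0.80, 0.30)
    fatigue = symptom(0.30, 0.40, 0.80)
    nausea = symptom(0.20, 0.25, 0.60)
    X = np.column_stack([chest_pain, fatigue, nausea])
    return X, disease.astype(int), female

# Train on a male-dominated sample (10% women), evaluate on a balanced one.
X_train, y_train, _ = make_cases(5000, female_share=0.10)
X_test, y_test, female_test = make_cases(5000, female_share=0.50)

model = LogisticRegression().fit(X_train, y_train)
pred = model.predict(X_test)

for label, mask in [("men", ~female_test), ("women", female_test)]:
    # Recall: the share of genuinely sick patients the model actually flags.
    print(f"Share of true cases detected in {label}: {recall_score(y_test[mask], pred[mask]):.0%}")
```

In this synthetic setup the model catches most cases in men but misses many in women, purely because of how the training data was composed – a small-scale echo of the diagnostic gap described above.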
While frameworks to mitigate sex and gender bias in AI are emerging, they often lack the teeth of accountability. Experts argue the only way to effectively reduce this bias is by building multidisciplinary teams – clinicians, sociologists, and gender experts working together at every stage of AI development to ensure our technology serves everyone equitably.
When visibility on social media becomes a privilege
The challenge extends beyond clinical settings and into the digital spaces where we seek information and connection. On social media, a different kind of silencing is occurring.
A prominent UK survey of 4,000 young adults found that many believe women’s health content is frequently restricted or hidden, a practice often called “shadow banning”. Even when posts about periods, menopause, or female anatomy are purely educational, they risk being suppressed.
This isn’t just a feeling; it’s a measurable reality. One experiment revealed that women’s health content on Instagram saw a 66% drop in views and 69% fewer comments compared to equivalent men’s health content. Furthermore, posts using medically accurate terms like “vagina” or “period” are often flagged as adult content and hidden, impacting everyone from health influencers to trusted brands. For women and gender-diverse creators, this means constantly navigating opaque rules just to share vital health information.
The broader ripples of algorithmic bias
The issues are symptomatic of a larger dynamic where technology reflects and amplifies societal inequality. Search algorithms can deliver gender-skewed results for neutral queries, subtly reinforcing stereotypes and shaping public perception.
The moderation protocols that lead to shadow banning are anything but transparent. While platforms claim neutrality, their algorithms can invisibly manipulate visibility, polarise discourse, and suppress critical health stories without our knowledge.
In response, creators are developing strategies to bypass this suppression, using semantic adjustments or moving to alternative platforms. However, these are merely stopgaps. Lasting change requires systemic reform.
Why this matters for communications professionals
Algorithmic bias presents a direct challenge to our profession, but it also creates a powerful opportunity to lead.
A call to action
Algorithmic bias isn’t just a technical failure – it’s a communication emergency. When our digital systems fail to reflect the complexities of women’s health, they undermine access to information and erode trust in healthcare itself.
As communications professionals, we sit at the intersection of technology, healthcare, and public conversation, and we can be critical agents of change. It is our responsibility to translate technical critiques into public accountability, advocate for equitable visibility, and ensure women’s health stories reach the audiences who need to hear them.
+++
Barbara Pesel is Managing Director of Pesel & Carr in Melbourne, Australia. She is Chair of IABC APAC, a #WeLeadComms honoree, and a Strategic Columnist.
+++