A Communication Emergency: Women’s Health and the Algorithms

Reading Time: 4 minutes

Barbara Pesel:

“Algorithmic bias isn’t just a technical failure – it’s a communication emergency. When our digital systems fail to reflect the complexities of women’s health, they undermine access to information and erode trust in healthcare itself.”

In an era where technology promises to revolutionise healthcare, a hidden and troubling reality has been unfolding for some time, and the rapid rise of AI is accelerating it. Algorithmic bias, particularly in women’s health, is not just a technical oversight; it is a systemic issue with profound implications.

It is a huge and ever-unfolding topic, but hopefully the following sheds a little light on the hidden gender divide in healthcare AI, the silencing of women’s health on social media, and the insidious broader societal impacts of this bias. We need to rethink our roles: not as passive observers, but as active advocates for health equity. By crafting compelling narratives and holding platforms accountable, we can do our bit to turn communication into a powerful tool for justice.

The hidden gender divide in healthcare AI

We are all aware that an AI system is only as fair as the data it learns from. Historically, datasets have often excluded or underrepresented women. Cars and crash-test dummies, for example, have long been designed around the ‘average’ male body; this lack of female-specific data means women are significantly more likely to be seriously injured or killed in crashes.

Silicon Valley’s code is written predominantly by men aged between 25 and 40. This foundational flaw means that many of our most advanced diagnostic tools and predictive models continue to default to male-centric patterns.

Consider cardiology: AI diagnostic tools may miss signs of heart disease in women because women often present with different symptoms, such as fatigue or nausea, rather than the “classic” chest pain more common in men. These symptoms have typically been underweighted in male-focused training data, leading to dangerous diagnostic gaps.

The problem compounds when algorithms learn from and replicate existing human biases. Research shows that clinicians’ tendencies to underestimate women’s pain can become embedded into the very code of our health systems. It’s a documented phenomenon where discrimination is automated, with serious consequences for patient care.
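To make the mechanism concrete, the sketch below is a purely hypothetical illustration in Python, built on invented synthetic data rather than any of the studies referenced here. It shows how under-representation plays out: a simple model trained on a cohort that is mostly male learns to rely on chest pain and to discount fatigue, so it misses a large share of the women in a balanced test population.

```python
# Hypothetical illustration only: synthetic patients, invented probabilities.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

def make_cohort(n, female_ratio):
    """Toy cohort: disease presents as chest pain in men, but often as
    fatigue (without chest pain) in women."""
    female = rng.random(n) < female_ratio
    disease = rng.random(n) < 0.3
    chest_pain = disease & (~female | (rng.random(n) < 0.3))
    fatigue = (disease & female & (rng.random(n) < 0.8)) | (rng.random(n) < 0.1)
    X = np.column_stack([chest_pain, fatigue]).astype(float)
    return X, disease.astype(int), female

# Train on a cohort where only 10% of patients are women...
X_train, y_train, _ = make_cohort(5000, female_ratio=0.10)
model = LogisticRegression().fit(X_train, y_train)

# ...then evaluate on a balanced population.
X_test, y_test, female_test = make_cohort(5000, female_ratio=0.50)
pred = model.predict(X_test)

# Sensitivity = the share of true cases the model actually catches.
print("Sensitivity, men:  ", recall_score(y_test[~female_test], pred[~female_test]))
print("Sensitivity, women:", recall_score(y_test[female_test], pred[female_test]))
```

The numbers are made up, but the pattern is the point: the presentation a model rarely sees is the presentation it learns to ignore.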

While frameworks to mitigate sex and gender bias in AI are emerging, they often lack the teeth of accountability. Experts argue the only way to effectively reduce this bias is by building multidisciplinary teams – clinicians, sociologists, and gender experts working together at every stage of AI development to ensure our technology serves everyone equitably.

When visibility on social media becomes a privilege

The challenge extends beyond clinical settings and into the digital spaces where we seek information and connection. On social media, a different kind of silencing is occurring.

A prominent UK survey of 4,000 young adults found that many believe women’s health content is frequently restricted or hidden, a practice often called “shadow banning”. Even when posts about periods, menopause, or female anatomy are purely educational, they risk being suppressed.

This isn’t just a feeling; it’s a measurable reality. One experiment revealed that women’s health content on Instagram saw a 66% drop in views and 69% fewer comments compared to equivalent men’s health content. Furthermore, posts using medically accurate terms like “vagina” or “period” are often flagged as adult content and hidden, impacting everyone from health influencers to trusted brands. For women and gender-diverse creators, this means constantly navigating opaque rules just to share vital health information.

The broader ripples of algorithmic bias

The issues are symptomatic of a larger dynamic where technology reflects and amplifies societal inequality. Search algorithms can deliver gender-skewed results for neutral queries, subtly reinforcing stereotypes and shaping public perception.

The moderation protocols that lead to shadow banning are anything but transparent. While platforms claim neutrality, their algorithms can invisibly manipulate visibility, polarise discourse, and suppress critical health stories without our knowledge.

In response, creators are developing strategies to bypass this suppression, using semantic adjustments or moving to alternative platforms. However, these are merely stopgaps; lasting change requires systemic reform.

Why this matters for communications professionals

Algorithmic bias presents a direct challenge to our profession, but it also creates a powerful opportunity to lead.

  • In Healthcare AI: Bias in diagnostics and treatment demands our action. We can advocate for transparent AI, collaborate with multidisciplinary development teams, and help translate complex AI findings for public audiences, all while championing the use of sex-disaggregated data.
  • On Social Media: The suppression of essential women’s health information calls for strategic communication. We can craft advocacy campaigns, work with platform transparency groups, and develop alternative distribution strategies to bypass algorithmic censorship.
  • In Public Perception: The reinforcement of outdated stereotypes requires a narrative shift. We have the skills to create content that challenges bias, share the human stories behind the data, and lobby for ethical standards in algorithm design.

A call to action

Algorithmic bias isn’t just a technical failure – it’s a communication emergency. When our digital systems fail to reflect the complexities of women’s health, they undermine access to information and erode trust in healthcare itself.

As communication professionals, we sit at the intersection of technology, healthcare, and public conversation, and we can be critical agents of change. It is our responsibility to translate technical critiques into public accountability, advocate for equitable visibility, and ensure women’s health stories reach the audiences who need to hear them.

+++

Barbara Pesel is Managing Director of Pesel & Carr in Melbourne, Australia. She is Chair of IABC APAC, a #WeLeadComms honoree, and a Strategic Columnist.

+++

Reference List

  1. Alvaro, D. “The Gender Bias Built Into AI — And Its Threat to Women’s Health.” Pharma’s Almanac, accessed September 2025. https://www.pharmasalmanac.com/articles/the-gender-bias-built-into-ai-and-its-threat-to-womens-health.
  2. Joshi, A. “Big Data and AI for Gender Equality in Health: Bias Is a Persistent Barrier.” Frontiers in Big Data, 2024. https://www.frontiersin.org/journals/big-data/articles/10.3389/fdata.2024.1436019/full.
  3. Hoffman, S. “Race and Gender Bias in Medical AI.” Wiley Research Publishing, accessed September 2025. https://www.wiley.com/en-us/network/publishing/research-publishing/trending-stories/race-and-gender-bias-in-medical-ai.
  4. Roberts, Jessica L., and Peter Salib. “Algorithmic Discrimination and Health Equity.” In Research Handbook on Health, AI and the Law. National Center for Biotechnology Information, accessed September 2025. https://www.ncbi.nlm.nih.gov/books/NBK613222/.
  5. Isaksson, Anna. “Mitigation Measures for Addressing Gender Bias in Artificial Intelligence.” AI & Society, 2024. https://link.springer.com/article/10.1007/s00146-024-02067-y.
  6. Confalonieri, Roberto. “A Unified Framework for Managing Sex and Gender Bias in AI Models for Healthcare.” In Artificial Intelligence in Medicine, 2021. https://www.sciencedirect.com/science/article/pii/B9780128213926000042.
  7. Mitchell, Taylor. “Algorithmic Bias in Health Care Exacerbates Social Inequities—How to Prevent It.” Harvard T.H. Chan Executive Education News, 2023. https://hsph.harvard.edu/exec-ed/news/algorithmic-bias-in-health-care-exacerbates-social-inequities-how-to-prevent-it/.
  8. “Social Media More Likely to Suppress Women’s Health Content over Men’s, Say Young People.” The Sun (UK), May 2023. https://www.thesun.co.uk/health/35092348/social-media-suppresses-womens-health-content/.
  9. “Social Media Shadow Banning Limits Access to Women’s Health Education.” Noah News. https://noah-news.com/social-media-shadow-banning-limits-access-to-womens-health-education/.
  10. “Posts about Women’s Health Blocked by Instagram—While Men’s Are Promoted, Study Finds.” The Sun (UK), July 2023. https://www.thesun.co.uk/health/35853042/posts-womens-health-blocked-instagram-mens-promoted/.
  11. Pan, Jennifer. “Propagation of Societal Gender Inequality by Internet Search Algorithms.” Proceedings of the National Academy of Sciences 119, no. 40 (2022): e2204529119. https://www.pnas.org/doi/10.1073/pnas.2204529119.
  12. Liu, Wei. “Shaping Opinions in Social Networks with Shadow Banning.” PLOS ONE 19, no. 3 (2024): e0299977. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0299977.
  13. “How Shadow Banning Can Silently Shift Opinion Online.” Yale Insights, 2023. https://insights.som.yale.edu/insights/how-shadow-banning-can-silently-shift-opinion-online.
  14. “The Hidden Codes That Shape Our Expression: Understanding How Social Media Algorithms Obstruct.” Feminist Internet Research Network, accessed September 10, 2025. https://firn.genderit.org/research/hidden-codes-shape-our-expression-understanding-how-social-media-algorithms-obstruct.
