AI Development Needs More Women. Here’s What Leaders Can Do About It

Corinne Post Forbes Contributor
Corinne Post analyzes execs, C-suites and boards with a diversity lens
June 5, 2025

AI accountability, accuracy and societal impact could improve with more women AI developers.

The gender gap in generative AI isn’t just about who uses the tools—it’s also about who builds them. AI developers make crucial decisions—designing models, selecting training data, determining data usage, developing testing protocols—that affect AI’s accountability, accuracy, and societal impact.

Just 31% of AI professionals are women, according to a 2024 LinkedIn analysis. In areas such as algorithm development, machine learning and text mining, the representation gap is even wider.

The gender gap in AI expertise also plagues AI research: all-male teams author 75% of AI-related scientific publications, according to an analysis of over 74,000 AI-related scientific articles in fields like physics, math, computer science and engineering by the U.K.'s National Endowment for Science, Technology and the Arts.

Yet women do prioritize responsible AI values, according to a study by Harvard doctoral candidate Zana Buçinca, Bauhaus University professor Maurice Jakesch, and coauthors from Cornell University and Microsoft Research. Surveying 743 individuals from the general U.S. population, 755 crowd workers, and 175 AI practitioners, they found that women were 4-5% more likely than men to rate AI values like privacy, safety, accountability, fairness and human autonomy as very or extremely important.

Here’s how increasing gender diversity in AI development teams could ensure more accountable, accurate and socially responsible AI technologies—and how leaders can increase the representation of women in their AI teams.

AI Development For Social Impact

AI systems sound confident and knowledgeable, but they make mistakes. And when mistakes occur, AI firms and even technical experts may be hard-pressed to explain why.

AI’s gender biases—in healthcare, hiring, criminal justice and access to financing—are well-documented and often stem from biased training data. But gender bias may also arise when AI inaccuracies are tolerated more readily because they affect women. For example, wrist-worn Fitbit devices may be less accurate for women because the underlying AI has more trouble detecting the arm movements of people with shorter statures and stride lengths, according to a Centers for Disease Control and Prevention study.

AI development teams dominated by men may simply be oblivious to, or uninterested in, features that could be valuable for women. For example, when Apple’s health app launched in 2014, it tracked an extensive number of metrics—yet notably included none for women’s menstrual cycles.

To make matters worse, beyond unintentional biases, bad actors may corrupt AI technologies by feeding false or divisive information into AI training data. Elon Musk’s generative AI, Grok, made headlines for inserting unprompted mentions of “white genocide” into its responses after an unauthorized code modification. Stronger AI accountability can reduce such risks.

Adding the voices of women AI professionals to the development process could increase the accountability of AI systems and minimize the risks of opacity, bias and manipulation.

AI tools can make people more productive, but not necessarily happier at work. And concerns about AI’s role in eroding critical thinking are mounting, especially in the field of education.

Women, on average, tend to think about a broader range of stakeholders in decision-making, including the environmental risks of massive computational power, while also anticipating more risks. This may explain why AI research teams with at least one woman are more likely to explore topics with broad societal relevance—like fairness, human mobility and misinformation, as the NESTA analysis found.

The societal implications of AI reinforce the importance of ensuring that AI development teams consider a wide range of stakeholders, something more gender-diverse teams are more likely to do.

How Industry Leaders Can Close The Gender Gap In AI Development

Strategies for increasing the representation of women in AI development roles align with those used to broaden female participation in STEM more generally. Here are some you could follow:

Support And Partner With Educational Outreach Efforts

In doing so, you can support the slow but steady rise in women’s completion of technical degrees. For example, the 2025 executive order “Advancing artificial intelligence education for American youth”—which aims to promote AI literacy in K-12—offers concrete opportunities for narrowing the gender gap. Early exposure to STEM programs appears to encourage more girls to pursue technical degrees.

Approach Recruiting And Retention Deliberately

Firms can deploy a number of strategies to stand out as an employer of choice for female AI professionals. Check that your job ads reach a broad range of potential candidates. Use gender-neutral job descriptions to increase the number of women applicants. Consider how your recruiting process might correct for women’s tendency to underreport their technical skills relative to men. Involve technical women in the interview process: female job candidates who interview with female role models are more likely to accept job offers, according to Google’s analysis of its own hiring process.

Foster Inclusion On AI Development Teams

Recruiting and retention efforts without buy-in and culture change are unsustainable. Communicate the innovation value of AI development teams that represent a diversity of perspectives and lived experiences to get people on board. Invest in sharpening AI professionals’ inclusive leadership skills. Adopt interventions proven to reduce biases against women at work.

As our understanding of the ethical and societal ramifications of GenAI grows, one thing is clear: AI teams must listen to more diverse voices to improve the accountability, accuracy and societal impact of the tools they develop.
