Human Factors, with their person-centred approach, can help bridge the gender gap in research and design, enhancing the quality, safety, and effectiveness of the solutions developed. Only in this way can innovation be truly “sustainable” and genuinely aligned with everyone’s needs.
Gender Gap in research and design: when technology excludes
If a woman is involved in a road accident, she is significantly more likely than a man to be seriously injured. Not because she drives less well, but because cars were not designed for her. Automotive crash tests were originally developed using dummies modelled on the body of an “average man”. A female version was introduced in the 1960s, but it was essentially a scaled-down male dummy, reflecting neither the dimensions nor the anatomy of a typical female body, and likewise not accounting for older people or people with obesity, who also differ from the “average male” prototype.
This is a clear example of technology that does not work equally for everyone — not because it is poorly designed, but because it was developed with only one type of person in mind. Similar cases can be found across many sectors: medicines and medical devices; the ergonomics of objects, machinery, and workplaces; personal protective equipment; artificial intelligence tools. These examples show that when factors such as sex, gender, ethnicity, disability — and their intersections — are excluded from research, design, data collection and analysis, innovation ends up excluding, and even harming, a large part of the population, whether we are talking about a seatbelt, a medication, or an AI system.
It’s a Man’s World
Why do these biases exist? “It’s purely a historical and cultural matter,” says Mara Marzella, sociologist and Senior Consultant at Deep Blue. “Our world — from everyday objects to advanced technologies — has historically been designed by and for a single profile of individuals: white, non-disabled men.” “For a long time, only they had access to education and later to technical and decision-making roles, shaping technological, industrial, and cultural development,” Marzella continues. “Take cars, for example: who sat at the tables where decisions were made about shape, function, and safety? Men. Who had the economic power to buy a car? Again, men. As a result, the average user around whom products were modelled was male, with tangible consequences: from seatbelts designed for standard male bodies to airbags that inadequately accounted for different body types, including those of women or pregnant people.”
“History and culture have long assumed the man — and specifically the ‘average’ white man — as the universal measure of the human being,” adds Izabela Diana Ihnatiuc, Communication & Dissemination Consultant at Deep Blue. “This is why feminist thought has referred to women as ‘unexpected subjects’, a category that in fact includes all identities that diverge from the dominant model.”
Gender Bias in Clinical Trials and Research Teams
The male cultural reference model is so deeply embedded that it continues — explicitly or implicitly — to influence the design of products and technologies. This is evident, for example, in the selection of research samples. The medical and healthcare fields are among the areas where gender bias is most visible.
In the United States, the inclusion of women in clinical trials to test the efficacy and safety of medicines and treatments became mandatory only in 1993. Yet even today, women (mainly white women) represent just 40% of participants in trials on cancers, heart disease, and psychiatric disorders — conditions that disproportionately affect them. Non-inclusive samples translate into less effective and riskier treatments: because drug dosages have traditionally been calibrated on men, women face a higher risk of overmedication and, consequently, more severe side effects. “The bias affects diagnoses too,” Ihnatiuc notes. “Some conditions, such as cardiac diseases, have different physiologies and clinical manifestations in women and men. Building medical evidence solely on male subjects inevitably leads to misdiagnoses or delayed diagnoses — and delayed treatment — with potentially dramatic consequences.”
A lack of diversity in research and development teams further contributes to gender bias and undermines the adoption of an approach rooted in intersectionality — the understanding that identity is shaped by many dimensions beyond sex and gender, including ethnicity, socioeconomic background, disability, and more. This is made worse by the lack of training on these issues.
A Gender-Sensitive Approach to Research
The term Gendered Innovations was coined in 2005 by Londa Schiebinger, a science historian at Stanford University. The concept is simple: applying analytical methods based on sex, gender, and intersectionality to overcome past biases and, above all, to generate new knowledge and drive innovation.
Europe took up the issue later but decisively, making gender representation a strategic priority and transforming it into a policy requirement for research funding. Since 2022, public bodies, higher education institutions, and research organisations have been required to adopt a Gender Equality Plan (GEP) to access Horizon Europe funding. “The GEP has become an essential requirement for organisations seeking European Commission funding,” Marzella explains. “It includes commitments and actions to ensure gender equality in recruitment, representation, career development, and work–life balance. Crucially, it also embeds a gender perspective into research itself. With the GEP, gender-sensitive and intersectional analysis becomes a clear, systematic part of the research approach required by EU funding programmes.”
Human Factors to bridge the Gender Gap
Deep Blue’s commitment to inclusive research moves in this direction. “It could not be otherwise, given that our core business is Human Factors: whether we work on an object, a technology, or a procedure, we always place the person — their needs and specificities — at the centre of the design and development process,” says Marzella. “For this reason, we know that individuals cannot be defined by a single characteristic but are shaped by multiple identities: not only biological sex, with all its bodily and physiological specificities, but also cultural, social, economic, and organisational factors,” Ihnatiuc adds. “Human Factors follow an intersectional logic, recognising that human experience results from the interplay of many elements.”
Detailing Deep Blue’s concrete actions, Marzella explains: “In European research proposals, we are integrating gender-sensitive and intersectional approaches throughout the entire design process. This means embedding them into every work package and every stage of research. Even in highly technical projects — those closest to the ‘hard sciences’ — our social-science-based approach ensures that intersectionality and gender-related dimensions are properly addressed. We do this not only because Europe requires it, but because it is essential to produce innovation that is truly effective and inclusive.”
In practice, introducing gender and intersectional analysis into research projects means ensuring balanced representation — at least between women and men — among stakeholder groups involved in product development. It also means identifying from the early design stage the requirements needed to make a product accessible and usable for everyone. “Only then can innovation truly be sustainable,” Ihnatiuc emphasises. And this brings us back to the beginning: non-inclusive innovation is not only ineffective; it is dangerous because it can create or reinforce inequalities, harming those who are historically more vulnerable. In the age of AI, the issue is even more urgent.
The data problem in AI
From automated translation systems to speech-to-text tools, many algorithms “discriminate”. We all remember gender-biased translations that assumed certain professions to be exclusively masculine. A study by Stanford and Georgetown University researchers examined widely used speech-to-text tools and found that they made significantly more errors when interpreting — and therefore transcribing — the speech of Black women and men than that of white women and men: an average error rate of 35% versus 19%. This is no trivial matter, considering how important these tools are for people with visual or motor disabilities. Facial recognition systems show equally worrying disparities. According to a well-known study, several major commercial facial analysis tools display biases related to skin tone and gender: for light-skinned men, the error rate in identifying gender is below 0.8%; for darker-skinned women, it reaches 20% to 34%, depending on the system. The difficulty is not that darker-skinned women’s faces are inherently harder to recognise; the problem lies in the datasets used to train the algorithms: 77% of the faces were male, and 83% were white.
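The disparities described above are typically surfaced by disaggregating a system’s error rate by demographic group rather than reporting a single overall figure. As a minimal sketch of that kind of audit — with hypothetical group labels and made-up evaluation records, not the actual data from the studies cited — the idea can be expressed in a few lines of Python:

```python
# Minimal sketch of a per-group error-rate audit.
# Group labels and records below are hypothetical, for illustration only.

from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group, predicted, actual) tuples.
    Returns a dict mapping each group to its error rate."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical evaluation records: (group, predicted label, true label)
records = [
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
    ("darker-skinned women", "male", "female"),  # a misclassification
    ("darker-skinned women", "female", "female"),
]

rates = error_rate_by_group(records)
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%} error rate")
```

A single aggregate accuracy number would hide exactly the gap this breakdown exposes, which is why audits such as the facial-analysis study report results per skin-tone and gender group.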
“The quality and representativeness of the datasets used to train algorithms are crucial to closing the gender gap,” Marzella concludes. “We have the evidence — now we must correct the course. Otherwise, one of the most powerful innovations of this century, designed to improve daily life for everyone, risks paradoxically becoming a driver of greater social inequality.” Innovation is not neutral: it becomes fair only if we choose to build it that way.