Dec 09, 2022
5 min read
Gender bias affects every part of people’s lives – including design. As the world is slowly remodelled towards a more equal vision, we’d like to take a look at where design stands now and what can be done to change it.
What is gender bias?
Gender bias is defined as behaviours, beliefs, and mindsets that favour one gender over another. It is a form of unconscious bias that involves assigning certain characteristics and stereotypes to other individuals or groups. These ascribed attitudes affect how a person interacts, engages, and understands relationships with others.
In most cases, the term gender bias is used to describe the privileges, benefits, opportunities, and special treatment that white, heterosexual males receive in modern society. This advantage stems primarily from the deeply entrenched androcentric (male-centric) way in which culture, worldview, and history are constructed, perceived, and transmitted. This means placing the male point of view at the centre and taking the male experience as the norm and point of reference.
Gender bias statistics
A 2020 UN report revealed that nearly 90% of all people hold some kind of bias against women. This is a vast and fundamental problem that affects many areas of life, including politics, work, education, relationships, sports, religion, and design. Design, broadly defined, is the focus of this article, so let's take a look at how the androcentric perspective in this area impacts our lives.
How biased design affects our lives
First of all, it is important to note that design has the potential to create social narratives as well as give them direction and momentum. Therefore, when practised in an environment where gender bias is present, it can contribute to the perpetuation, validation, and distribution of that bias. This leads to widening social inequalities, offering only selected groups access to benefits and improvements in quality of life.
In the next section, I want to take you through examples of various products designed with a disregard or misunderstanding of the experiences of female users. These examples often highlight unconscious gender bias. While we can assume with a high probability that their designers did not intentionally want to exclude an essential part of the audience, we can see that deep-seated androcentric thinking influenced the final solutions.
Gender bias examples in everyday life
Car seatbelts reflect gender bias
One of the most infamous examples of biased design is the car seatbelt. Looking back, automotive product design and development was (and still is) defined primarily by men. The invention of the seatbelt dates back to the latter half of the 19th century, but even decades later, in the 1960s, its design standards were still configured around men. The vehicle crash test protocol that emerged at the time relied on dummies modelled after the average male, whose height, weight, and build sat at the 50th percentile. In effect, seatbelts were designed to be safe only for men, as women (especially pregnant women) were not considered at all. Indeed, studies have found that women are nearly 50% more likely than men to suffer a severe injury in a crash, and their risk of death is 17% higher.
Although this situation is slowly beginning to change, we are still far from satisfactory safety standards. Female dummies became required in crash tests in 2011, yet male dummies still make up the majority, which does not correspond to the actual driving population (according to statistics, in the U.S. and Europe, half of all drivers are women). Additionally, the few female dummies in use are rarely tested as drivers – they usually ride as passengers, or not at all.
Protective gear is gender biased
In 2016, the U.S. military began recruiting women into units that had previously been male-only. However, it turned out that women's armour needs were not adequately thought through. The army did add some smaller sizes, but other pieces of equipment, such as boots and helmets, were not fully accommodated. This state of affairs causes many problems for women, notably the inability to fire their weapons properly. Moreover, some women in the military have been forced to adapt their equipment themselves, which often involves removing protective panels or making makeshift alterations to protect their organs.
This case is not isolated. It turns out that most protective equipment for workers is designed primarily for male bodies. Recently, there has been a buzz about healthcare workers who, with the coming of the Covid-19 pandemic, began to point out that the lives of female healthcare professionals are at risk because PPE (personal protective equipment) is designed for men. For example, masks are produced based on a male template – all the more striking given that 75% of health workers are female.
However, these voices of outrage are starting to have an effect. For example, U.S. military officials have pledged to speed up the process of fitting body armour for women, and various initiatives to improve PPE are emerging worldwide, such as Fit for Women, launched by the Women in Global Health (WGH) movement.
Smartphones and video controllers
"Size matters" is a phrase many women who use smartphones know all too well. In their experience, many of the latest smartphone models have been designed without considering female users, as most do not fit comfortably in the average woman's hand. Smartphones are getting bigger and bigger, with display sizes often exceeding 6 inches (15.2 cm). This tendency can make it difficult or impossible for many women (and men with small hands) to use the phone with one hand, because women's hands are, on average, about an inch (2.5 cm) smaller than men's.
The situation is similar with video game controllers. Some gamers with smaller hands find that using standard controllers negatively affects their comfort, and the gaming experience itself is not as enjoyable as it could be.
Virtual reality headsets are unconsciously gender-biased
Motion sickness has afflicted VR since the technology's very beginning, and women tend to experience more VR-induced nausea than men. What explains this? One theory points to depth perception: most systems convey distance through motion parallax, a cue that is easier to program and render but that men tend to rely on more, whereas women tend to rely on shape-from-shading. These are two very different depth cues, so the implication is that VR sends the wrong signals to women's brains.
Another, simpler explanation is that women are much more likely to feel nauseous than men when using VR because roughly 90% of them have pupils set closer together than the default setting of a typical headset. In this case, the solution lies in minor adjustments to the headset design. Simple, isn't it?
Health apps are affected by gender bias
It is impossible to forget an infamous chapter in the history of the tech giant and leader in smartphone innovation. Following the release of Apple's Health app, the company was heavily criticised for overlooking important women's health issues. The app, which was meant to be a comprehensive solution and included features such as sodium-intake tracking, ignored a vital health aspect of half the population – the menstrual cycle. Apple users could not track their periods until the iOS 9 release. Could the explanation for this oversight be that periods simply aren't a concern for the vast majority of the company's employees, who are men? Maybe.
Facial recognition is gender-biased
Studies have shown that commercially available facial analysis programs (used, for example, in law enforcement) from major technology companies do not always provide satisfactory performance. They work best for men with white skin – for example, systems from IBM and Microsoft could correctly identify a man from a photo 99 per cent of the time. For women, however, especially those with darker skin, one study found error rates climbing as high as 35 per cent.
These findings raise questions about how today's neural networks are trained and evaluated and on which datasets they learn to perform computational tasks. After all, it is no secret that it is difficult to achieve satisfactory performance if images of white men dominate the dataset.
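A toy sketch (plain Python, with hypothetical numbers) can make the point above concrete: when one group dominates the evaluation data, a single overall accuracy figure can look reassuring while hiding a large gap for the underrepresented group.

```python
# Hypothetical per-sample evaluation results: (group, was_prediction_correct).
# Group "A" dominates the sample, mirroring the dataset imbalance
# described above; group "B" is underrepresented.
results = (
    [("A", True)] * 95 + [("A", False)] * 5    # group A: 100 samples
    + [("B", True)] * 13 + [("B", False)] * 7  # group B: 20 samples
)

def accuracy(rows):
    """Fraction of rows whose prediction was correct."""
    return sum(ok for _, ok in rows) / len(rows)

overall = accuracy(results)
by_group = {
    g: accuracy([r for r in results if r[0] == g])
    for g in sorted({g for g, _ in results})
}

print(f"overall accuracy: {overall:.0%}")   # 90% - looks fine in aggregate
for g, acc in by_group.items():
    print(f"group {g}: {acc:.0%}")          # A: 95%, B: 65%
```

This is exactly why disaggregated evaluation is now considered a baseline practice: the 90% headline number tells you nothing about the 65% experience of group B.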
However, it is consoling to see that, in response to criticism, training datasets are becoming more diverse, and the effectiveness of facial recognition systems is starting to increase.
Voice assistants contain gender bias
If you've dealt with a voice assistant even once in your life, the first thing you'll probably hear in your head right now is a gentle, maybe even slightly submissive or flirty female tone of voice. Apple's Siri, Google Home, Microsoft's Cortana or Amazon's Alexa – all these systems have default female voices and names. It is not difficult to identify the collective beliefs, stereotypes, and hurtful prejudices about women that are "hidden" here. These are easily transferred and replicated in the latest technologies and solutions, fostered by the lack of diversity and inclusion in the environment and industry in which they are created.
The roots of gender-biased design
The examples of gender bias I mentioned above are just the tip of the iceberg. Where can we begin to look for an explanation of this reality? Let's start at the very top of the ladder.
Owners, executives, leaders, and experts (engineers, designers) of technology companies – all of these high-profile positions are filled mostly by men. For example, in the U.S., only 26% of tech startups have at least one female founder (though this is still better than a few years ago).
This means that wherever ideas are generated and decisions are made, female representation is lacking. Such homogeneous groups carry more limitations than benefits. First, they may be unaware of the problems faced by other social groups, and the solutions they build tend to respond mainly to their own needs and expectations. Second, when one homogeneous group designs and develops most technologies, it consciously and unconsciously passes on its own biases. Decision-making on product launches by such groups can also be problematic due to a phenomenon called status belief transfer: an asymmetric negative bias whereby products made by women are disadvantaged in male-typed markets, while products made by men are not disadvantaged in female-typed markets. Research on this topic has provided compelling evidence that status beliefs are transmitted from makers to their products.
Someone may argue that the technology industry is changing after all, and the latest data shows there are more and more women in it. Indeed, the design industry has no problem attracting women: up to 60% of junior positions are currently held by them. The problem lies in retention and promotion. Many companies worry that women will take leave to have a baby, and as a result women have a harder time negotiating fair pay or earning promotions. These are just some of the obstacles women face in the work environment, and they significantly diminish women's opportunities to hold senior positions. It is worth emphasising here that diversified teams alone do not guarantee success. Even in gender-balanced teams, project briefs that focus on women may still be underrepresented, because those who approve budgets or write project development proposals (i.e. decision-makers) are guided by their own priorities, which may include gender bias.
Another critical driver of biased design is data. Project decisions are often data-driven, but they may not show gender-specific trends when considered as a whole. Additionally, the data sets used may simply be limited and problematic due to the overrepresentation of one group (as was the case with facial recognition systems).
Data issues also arise during prototype testing and user research. Undifferentiated – and often small – samples may fail to capture different needs and expectations based on gender. This is amplified by confirmation bias, where researchers and designers look to respondents' answers for confirmation of their personal beliefs.
How does this circle close? Data affects funding. Funding is decided by the people at the top... Do you see where this is going? In this environment, which is (still) male-dominated, you need a combination of big and small data to reduce gender bias. In other words, it is important to capture not only general trends but also those of specific groups. Otherwise, entrepreneurs developing new technologies will rely on their own experience and assumptions, which are not universal. Fewer data on women and other underrepresented groups means less chance of making ideas credible and getting funding.
Changes are coming. Ways to reduce gender bias in everyday life
Despite the many obstacles and problems facing the design and technology industry, we must admit that we live in the best of times to overturn the outdated and harmful standards caused by conscious and unconscious gender bias. And indeed, changes are coming. Beyond the product improvements mentioned above, legal standards are emerging on the horizon. For example, last year the U.S. Federal Trade Commission turned its attention to fairness in AI, and one of its commissioners publicly stated that the agency should expand its oversight of discriminatory AI. It's good to see that designers, developers, engineers, leaders and even legislators are responsive to previously overlooked user needs. It's just a shame that such changes happen slowly and only in response to loud public protest and criticism. It is no longer acceptable that, in the 21st century, thinking about products and services intended for everyone is still dominated by the male perspective.
So, how can we accelerate and facilitate this change to reduce gender bias? Let's conclude this article with a handful of tips and directions.
- Diverse teams
Building diversified teams should include not only project or research teams but also managers, executives, and engineers. Hire more women and promote them to leadership positions. Don't see inclusion and diversity as a box to check but as an opportunity to create products that don't discriminate against certain users. Build a better future and opportunities for all.
- Diversified and sex-disaggregated data
Collect diversified data and look not only for general trends but also gender-specific ones.
- Education & bias awareness
Educate yourself and your colleagues – even excellent data can fall short if those analysing it are unaware of the biases and stereotypes that influence the discovery of the right context and framing of insights.
- Algorithmic fairness
Companies using AI should use best practices to both identify and minimise instances where their AI generates unfair results. Emerging clear standards for fairness testing can help them do this. Companies can also draw on public guidance offered by experts in the field.
- Challenging priorities
Designers should be aware of decision-making priorities at earlier stages, such as funding or R&D, and question those that may contain gender biases.
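The "sex-disaggregated data" tip above can be illustrated with a toy example (all numbers hypothetical, loosely inspired by the smartphone case earlier): an overall average can suggest a design works for everyone, while a per-group breakdown shows it fails half the audience.

```python
# Hypothetical threshold: the thumb reach (in cm) a one-handed grip requires.
REACH_NEEDED_CM = 18.0

# Hypothetical measurement samples: (sex, thumb reach in cm).
samples = [("F", 17.0), ("F", 17.4), ("F", 16.8),
           ("M", 19.2), ("M", 19.6), ("M", 18.9)]

def mean(xs):
    return sum(xs) / len(xs)

# Aggregate view: the average reach clears the threshold, so the
# design "works" - if you never split the data.
overall = mean([v for _, v in samples])
print(f"overall: {overall:.2f} cm  ok={overall >= REACH_NEEDED_CM}")

# Disaggregated view: the same data, split by sex, tells a different story.
by_sex = {s: mean([v for g, v in samples if g == s]) for s in ("F", "M")}
for s, m in by_sex.items():
    print(f"{s}: {m:.2f} cm  ok={m >= REACH_NEEDED_CM}")
```

Nothing about the analysis changes except the grouping, yet the disaggregated view surfaces the design failure that the overall mean hides.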
What is gender bias?
Gender bias is a form of conscious or unconscious prejudice visible in people’s beliefs and behaviours. The term is most often used to describe bias against women.
What are the types of gender bias?
There are many types of gender bias. When it comes to career and job opportunities, you can find them in workplace conditions (the equipment not fitting women), stereotypes that decrease women’s credibility and belittle their expertise, or objectification (including sexual harassment).
What are the factors that influence gender bias?
Gender bias stems from culture, including gender roles and stereotypes.
What is another word for gender bias?
Another word for gender bias is sexism or sexual bias.