Feminist scientists criticise gender bias in the tech industry

Whatever you find when you google something depends on the people who developed the algorithms. An American researcher fears increased sexism and racism unless the IT sector starts promoting diversity.
“To secure more inclusive digital media and culture, we need more women and minority groups among developers and designers, among those who create the content and decide what types of technology should be purchased,” says Jill Rettberg. Illustration photo: iStockphoto

Today, human beings change the earth faster than natural processes do. That makes the humanities more important than ever, according to humanities scholars.

The way in which technology plays a part in this, and the way digital cultures arise, have thus become important fields of research. Does technology affect our perceptions of ethics, culture and aesthetics?

One researcher who has addressed this topic is associate professor Elizabeth Losh, director of Gender, Sexuality and Women’s Studies at William & Mary in the US.

Last spring, Losh visited Bergen together with Aristea Fotopoulou, a guest researcher at the University of Bergen, to talk about feminist digital methods.

One example of such a method is to construct digital archives of overlooked women’s accomplishments. The motivation behind the work is simple: the digitalisation of our collective memory – everything that is transferred from books and paper to digital platforms – should also include women and minorities.

Concealed through algorithms

“Aristea Fotopoulou and I have overlapping research interests when it comes to feminist IT studies, and we share the concern that companies specialising in social media and internet searches use algorithms that conceal information that can be found in digital archives. I also worry about what and who are concealed through these algorithms,” says Losh.

Elizabeth Losh is associate professor and director of Gender, Sexuality and Women’s Studies at William & Mary in the US. Photo: Private

She fears that digital cultures will preserve sexism and racism, but also that digital humanities as a discipline will continue to develop in patriarchal directions. For instance, data sets that exclude women and minorities, or that are built on stereotypes, may end up reproducing those stereotypes.

XML, a widely used markup language for encoding documents and data, previously used the numerical value 1 for men and 2 for women, which perhaps confirms Simone de Beauvoir’s claim that women are conceived of as the second sex. Feminist digital methods have thus emerged as a critique of the humanities’ and social sciences’ uncritical approach to their own digital methods.
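As a purely illustrative sketch – the field names and Python code below are hypothetical, not taken from any particular XML schema or from the researchers’ work – this is how such a legacy numeric coding tends to travel unchanged into new systems when old records are digitised:

    # Hypothetical sketch: a legacy 1 = male / 2 = female numeric coding, of the
    # kind described above, carried over verbatim when old records are digitised.
    LEGACY_SEX_CODES = {1: "male", 2: "female"}  # "male" is literally code number one

    def decode_record(record: dict) -> dict:
        """Replace the numeric code with a label; keep unknown codes visible rather than guessing."""
        code = record.get("sex")
        label = LEGACY_SEX_CODES.get(code, f"unrecorded (code {code})")
        return {**record, "sex": label}

    print(decode_record({"name": "Example, A.", "year": 1946, "sex": 2}))
    # -> {'name': 'Example, A.', 'year': 1946, 'sex': 'female'}

Whether a digitisation project keeps, relabels or documents such codes is exactly the kind of decision feminist digital methods ask researchers to make explicit.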

Cultural baggage

Although computer-assisted research in the humanities has a long history, digital humanities is a relatively new scholarly field.

Digital humanities is about analysing digital data within the humanities disciplines: studying digital practices, developing digital methods for humanistic research, or using digital tools to search large text corpora. In today’s information society this is an increasingly relevant field, and degree programmes in digital culture have consequently emerged at several Norwegian higher education institutions in recent years.

XML previously used the numerical value 1 for men and 2 for women, which perhaps confirms Simone de Beauvoir’s claim that women are conceived of as the second sex.

The American scholar and author Jacqueline Wernimont writes in her recent book Numbered Lives: Life and Death in Quantum Media that the way in which Anglo-American culture has measured and counted life and death is based on a gendered and racialised idea of categorisation. In other words, when we register data, we do so with cultural baggage.

Sustains sexism and racism

According to Elizabeth Losh, systems that rank, categorise and filter data often use machine learning, which draws conclusions from training data. With machine learning, it may become even more difficult to discover how the code arrives at these conclusions.
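To make the point concrete, here is a minimal, purely hypothetical sketch – not drawn from Losh’s or Rettberg’s research – of a toy “model” that simply learns the most common outcome per group from historical data. Any skew in that data becomes the rule the system applies:

    # Hypothetical sketch: a toy "classifier" that learns the majority outcome per
    # group from historical training data. Biased history becomes a biased rule.
    from collections import Counter, defaultdict

    # Invented historical decisions: (group, hired?) - deliberately skewed.
    training_data = [("A", True)] * 80 + [("A", False)] * 20 + \
                    [("B", True)] * 30 + [("B", False)] * 70

    def train(examples):
        counts = defaultdict(Counter)
        for group, outcome in examples:
            counts[group][outcome] += 1
        # The "conclusion" drawn from the data: predict whatever was most common per group.
        return {group: c.most_common(1)[0][0] for group, c in counts.items()}

    print(train(training_data))  # {'A': True, 'B': False} - the historical skew is now the rule

Real machine-learning systems are far more complex, which is precisely why, as Losh notes, it can be hard to see how they reach their conclusions.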

Jill Rettberg, professor of digital culture at the University of Bergen, says that some data sets preserve biases, and that machine learning algorithms carry those biases over into their conclusions.

To secure more inclusive digital media and culture, we need more women and minority groups among developers and designers.

“The algorithms are trained to identify the faces of white men, for instance. This can be a problem if the police use facial recognition to identify a suspect, and the algorithm makes mistakes more often for women and people with darker skin,” says Rettberg.

“To secure more inclusive digital media and culture, we need more women and minority groups among developers and designers, among those who create the content and decide what types of technology should be purchased. Data systems define more and more of our society and everyday life, and without diversity among developers, we have seen that unintended discrimination is often built into the systems,” says Rettberg.
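One simple way to make such discrimination visible is to measure a system’s error rate separately for each demographic group. The sketch below is a generic, hypothetical audit with invented data, not any actual police or vendor system:

    # Hypothetical audit sketch: compare error rates across demographic groups,
    # as in the facial recognition disparity Rettberg describes. All data invented.
    from collections import defaultdict

    def error_rate_by_group(samples):
        """samples: iterable of (group, predicted_id, true_id) tuples."""
        errors, totals = defaultdict(int), defaultdict(int)
        for group, predicted, actual in samples:
            totals[group] += 1
            if predicted != actual:
                errors[group] += 1
        return {group: errors[group] / totals[group] for group in totals}

    test_samples = [
        ("lighter-skinned men", "id_1", "id_1"), ("lighter-skinned men", "id_2", "id_2"),
        ("darker-skinned women", "id_3", "id_9"), ("darker-skinned women", "id_4", "id_4"),
    ]
    print(error_rate_by_group(test_samples))
    # -> {'lighter-skinned men': 0.0, 'darker-skinned women': 0.5}

If the rates differ sharply between groups, the system discriminates in practice, regardless of whether anyone planned it that way.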

Feminist theory on data

In Bergen, Elizabeth Losh primarily spoke about what feminist digital humanities entails.

Jill Rettberg is professor of digital culture at the University of Bergen. Photo: Eivind Senneset

“Feminist digital humanities does not only revolve around digital projects related to women or created by women; it is also about applying feminist theory to the digital archives that preserve stories from our past, or to the social media that tell stories about our present.”

“Many feminist digital humanities scholars are interested in digital work: who does the work, who gets the credit for it, and how it is compensated and valued. Drawing on the large body of feminist literature on housework, they are also interested in everyday life, experiences, emotions and the body, and in what all of this means within a rational, technical system,” says Losh.

Polarised digital culture

Losh uses relatively little data in her own research. She studies Donald Trump’s Twitter posts and the leaked emails from Hillary Clinton, in addition to feminism, media and innovation. She is also interested in fake news and digital harassment.

“Can social media polarise groups and thus strengthen both feminist and anti-feminist groups?”

“Internet researchers have known for years that polarisation and filter bubbles create problems on various platforms. You cannot trust technology companies to regulate themselves or to teach their users practices that benefit society,” says Losh.

It is about applying feminist theory to the digital archives that preserve stories from our past.

“Technology companies profit from targeted marketing and from information they sell to third parties. Fake news is lucrative, and filtering hateful content is time-consuming. Extreme hate groups are on the rise both in the US and in Norway.”

According to Jill Rettberg, it is not only social media that polarise.

“So-called recommendation algorithms, such as the ones on YouTube, nudge you towards increasingly extreme videos,” she says.

Challenges established stories

In addition to launching a book on hashtags and challenging the claim that Chris Messina invented the hashtag, Losh is digging into history this autumn to find people who have been significant to the development of technology. But it is not the big names that preoccupy her; it is the figures who at first glance seem insignificant in the history of technology.

“Mina Rees is a good example of such a figure. She helped create the infrastructure for computing in the US after the Second World War, which made it possible to use computers in many new ways,” she says.

Losh admits that she enjoys finding fault with men who have been hailed as technological pioneers.

“As a researcher, I am interested in how their sexism and racism have shaped the development of technology. While working on my first book, for instance, I discovered that the engineer and inventor Vannevar Bush excluded women from the workplace in various horrific ways.”

Archives must be constructed in new ways

The purpose of feminist digital humanities, as Losh also discusses in the anthology Bodies of Information: Intersectional Feminism and Digital Humanities, is to inspire people to broaden their imagination when they think of digital humanities.

“When people began to build archives documenting royal houses, kingdoms and regimes for the colonial powers, decisions were made about what was worthy of preservation and what was not, and about how materials were to be labelled and organised into collections,” says Losh, and continues:

“Modern nations have often applied the same techniques, particularly when it comes to estimating the value of labour and establishing what constitutes criminal conduct. When archives are digitised, they inherit these outdated, biased systems. Additionally, new systems become further biased when the training data are selected by white, male engineers, or when algorithms are concealed as trade secrets.”

More technological competence is needed, but so is a humanistic understanding of our past and of other cultures.

“What can we do to prevent this from happening?”

“More technological competence is needed, but so is a humanistic understanding of our past and of other cultures. I believe that rather than letting the technology companies make all the decisions, we need political participation and resilient processes.”

“But if the majority of those who work with data are men, why should we strive for gender neutrality?”

“I am not certain that the majority of those who work with data are men, since many of those who work with archives, digitisation and data entry are women. These forms of feminised work are often invisible, but they are significant for digital culture,” says Losh.

Translated by Cathinka Dahl Hambro.

See also: Intelligent robots may strengthen gender norms

    Read more about feminist digital humanities:
    • Jacqueline Wernimont: Numbered Lives: Life and Death in Quantum Media (2019).
    • Elizabeth Losh and Jacqueline Wernimont (eds.): Bodies of Information: Intersectional Feminism and Digital Humanities (2018).
    • Aristea Fotopoulou: Feminist Data Studies: Big Data, Critique and Social Justice (forthcoming 2020).
    • Christine Borgman: Big Data, Little Data, No Data (2015).
    Facts

    Feminist digital methods: 

    The aim of feminist digital methods is to challenge technology so that it does not normalise patriarchal perceptions by treating the masculine as the norm. For instance, a feminist method can be to consult a diverse selection of a society’s members when collecting data about that society.

    Digital humanities: 

    Has traditionally been about digitising paper archives and museum collections to make them accessible to researchers and the general public. Digitised archives make it easier to see connections and find structures and ideas. In recent decades, many digital humanities scholars have focused on collecting and preserving websites, and with the emergence of social media, hashtags have become increasingly important. According to some experts in the field, such as the Russian-American professor Lev Manovich, digital tools may enable researchers to pose new research questions, because human beings cannot survey large data sets in the way computers can.

    Digital culture: 

    Digital culture studies aesthetic, ethical and cultural aspects of technology.  
