Are Apps Feminist (Enough)?
Dr. Regina Müller on feminist-ethical perspectives on digitization processes and why they are important
What does digitization have to do with feminism? Why does artificial intelligence need feminist influence? And how can this be achieved? Dr. Regina Müller is carrying out research on digitization with a focus on feminist ethics at the University of Bremen. Unlike established ethical theories, feminist ethics primarily analyzes power relations and discriminatory or exclusionary structures. In her lecture “Algorithms in pink? Why Digitization Needs (More) Feminism,” Regina Müller will talk about exactly this topic at the SCIENCE GOES PUBLIC! event series on 6 April. In up2date., the research assistant from the Institute of Philosophy at the University of Bremen explains in advance why digitization needs more feminism and what could make an app feminist.
Ms. Müller, in your contribution at SCIENCE GOES PUBLIC! you talk about digital tools such as apps, which can be discriminatory and sexist. What does this mean?
In recent years, I’ve primarily conducted research in the fields of medical ethics and digitization. For example, I have studied sleep trackers, fitness trackers, and similar tools that record and measure body data. In doing so, I came across digital tools that incorporate discrimination and sexism. Take the example of fitness apps aimed at women: they often use pink designs and depictions of slender young women that convey beauty ideals and stereotypes, thus suggesting what a woman should look like. Another example from the medical field is AI-based diagnostic tools for skin cancer detection. These tools work better on lighter skin tones than on darker ones, because the majority of the training data came from white people and the AI has “learned” from them. We see something similar in clinical research, which in the past was frequently carried out on healthy young men, while women were often excluded. This created what is known as the “gender data gap.” We should not transfer these data gaps unquestioned into the digital world.
What are the consequences?
The fitness app example raises questions such as: What should a “woman” be like? Who is considered a “woman” – and who is not? Stereotypical representations in apps disseminate and even reinforce social norms, with the risk of excluding those who do not conform to them. One-sided representations can marginalize groups and render them invisible. In the case of skin cancer detection, one-sided data input can lead to delayed or even incorrect diagnoses in people with darker skin. The digital tools mentioned are concrete examples from practice and raise questions of fairness. Feminist-ethical research aims to uncover, understand, and discuss precisely these problems. I am not concerned with a feminism that deals exclusively with women, but with a concept of feminism that includes all groups whose interests the dominant society considers less important.
Do these problems accompany all kinds of digitization? Are these questions relevant for any organization that intends to become completely paperless?
I think that feminist-ethical values can be applied to all digitization processes. Taking the example of the AI-based diagnostic tool again: it’s not just about the data fed into the system, but also about how the system interacts with its environment and “learns” from it. Developers who build a system can also incorporate sexist or discriminatory perspectives. This applies to the entire process, from the idea and development to sales and marketing. The example of a paperless organization may seem harmless at first glance. But even in such a process, questions should be asked such as: Are digital documents accessible? Who speaks in what order, and for how long, in online meetings? If we have places of refuge in physical space for people affected by social exclusion and disadvantage, do we also need digital safe spaces?
Let’s stick to the example of apps: What makes an app feminist?
This is exactly the question I’m dealing with in my current research, so I don’t (yet) have a conclusive answer. To assess whether an app is feminist, I look at the concept behind it, the design and marketing, the features, the audience it addresses, data protection and privacy, and the working conditions under which it was created.
What would be examples of assistance for users in finding a feminist digital tool?
It is rather difficult, for example, to develop a catalog of criteria, as each digital tool should be considered in the specific context of its use and with regard to its individual users. In addition, the assessment depends on the feminist approach used for the evaluation; these approaches can differ greatly in their objectives and judgments. If we take feminist approaches from bioethics, for example, user autonomy, privacy and data protection, as well as power relations, relationality, and intersectionality play a role. The following questions could then serve as a guide for users: Who developed the app? Who is it intended to address? Who benefits from it? How is the app accessed? Who was considered – and who was not? Not all criteria carry equal weight for all users. However, the intention of a feminist ethic is not to set rigid guidelines, but rather to make problems visible and provide orientation.
Feminism and Digitization in a Bar
If you find this topic exciting and want to learn more about it, you can meet Dr. Regina Müller in the Kono Bar at this year’s SCIENCE GOES PUBLIC! on 6 April. From 8:30 p.m., in her lecture “Algorithms in pink? Why Digitization Needs (More) Feminism,” she will speak about feminist-ethical perspectives on digitization processes and why they are important.
More information about the program and the other free pub lectures from 2 March to 6 April can be found on the website of the SCIENCE GOES PUBLIC! event series.