In this reading, Batya Friedman and Helen Nissenbaum describe the biases that can appear in computer systems. I chose this reading because before I learned about gender bias in technology in Informatics 200, I was never aware of it, and now I realize how much of a problem it is. According to the authors, “Computer systems, for instance, are comparatively inexpensive to disseminate, and thus, once developed, a biased system has the potential for widespread impact” (Friedman and Nissenbaum). In other words, when people aren’t aware of the biases in their system, the effect can be much bigger than they expect once the system spreads everywhere. Also, when developers and designers aren’t careful about diminishing the biases in their systems, the people who are excluded can be left feeling unimportant, which is somewhat how I felt. An example that we talked about in Informatics 200 and also in this class was Apple’s Health app, which did not include menstrual cycle tracking. Apple partnered with experts from the Mayo Clinic and other organizations to make this application, and the fact that they left out something that pertains so greatly to women is very frustrating to me. Ren Gerecke summed up what a lot of women were probably thinking and tweeted, “Further evidence that tech doesn't care about women: no period tracking in HealthKit in iOS8” (Lewis).
In the reading, the authors mention a study by Huff and Cooper that looked into gender bias in learning software for seventh graders. One group of designers was asked to create the software for boys, another group for girls, and a third group for students whose gender was unspecified. They found that the “gender-unspecified group closely resembled the designs proposed by subjects who designed for boys and were significantly different from the designs proposed by subjects who designed for girls” (Friedman and Nissenbaum). This really shocked me because even when people are consciously trying to make things gender neutral, their preexisting bias gets in the way and pushes the design toward boys. It shows our society’s tendency to place importance on males over females even when we are trying hard not to, and it reflects the gender biases rooted in our culture. One critique I do have is that the reading does not specify the gender distribution of the designers in each group, which I think is vital for analyzing the study myself. Nonetheless, this study is an example of why we should think about gender bias in computing systems, because it will be difficult for women to feel equal to men when products are still catered to males. Hopefully, findings like this will encourage more women to join the industry and have a voice in the design process.
Another detail I found interesting was Friedman and Nissenbaum’s point that bias is hard to eliminate because it can be hidden in the code. At first, I didn’t understand this, because I assumed it would be easy to figure out what the bias is and fix the code that corresponds to it. But then I thought back to Abbate’s book and how she describes some coders’ experiences with longer programs that “frequently ended in disaster” (Abbate), and I realized that eliminating the bias might be difficult. Because of that difficulty, developers may leave the bias in the system, which leads to harmful consequences like the ones surrounding the Apple Health app.
I think these gender biases in technology should not be taken lightly. Developers should put more effort into eliminating bias in their systems instead of letting it become pervasive. It makes me, and I’m sure many other women, feel less important and forgotten by the people designing these systems.
Works Cited:
Abbate, Janet. Recoding Gender: Women's Changing Participation in Computing.
Cambridge, MA: MIT, 2012. Print.
Friedman, Batya, and Helen Nissenbaum. "Bias in Computer Systems." ACM Transactions on
Information Systems 14.3 (1996): 330-47. Web.
Lewis, Tanya. "Apple's Health App Tracks Almost Everything, Except Periods."
LiveScience. TechMedia Network, 26 Sept. 2014. Web. 11 May 2015.