Eliminating Gender Bias in Computer Science Education Materials

Jared O'Leary

In this episode I unpack Medel and Pournaghshband's (2017) publication titled "Eliminating gender bias in computer science education materials," which examines three examples of "how stereotypes about women can manifest themselves through class materials" (p. 411).

Article

Medel, P., & Pournaghshband, V. (2017). Eliminating gender bias in computer science education materials. In Proceedings of the 2017 ACM SIGCSE Technical Symposium on Computer Science Education (SIGCSE '17) (pp. 411–416). Association for Computing Machinery.


Abstract

“Low female participation in Computer Science is a known problem. Studies reveal that female students are less confident in their CS skills and knowledge than their male counterparts, despite parallel academic performance indicators. While prior studies focus on limited, apparent factors causing this lack of confidence, our work is the first to demonstrate how, in CS, instructional materials may lead to the promotion of gender inequality. We use a multidisciplinary perspective to examine profound, but often subtle portrayals of gender bias within the course materials and reveal their underlying pedagogical causes. We examine three distinct samples of established CS teaching materials and explain how they may affect female students. These samples, while not a complete display of all gender inequalities in CS curriculum, serve as effective representations of the established trends of male-centered representation, imagery, and language that may promote gender inequality. Finally, we present easily implementable, alternative gender equitable approaches that maximize gender inclusion.”


Author Keywords

Gender, Diversity, Confidence, Gender Equitable


My One Sentence Summary

This study examines three examples of "how stereotypes about women can manifest themselves through class materials" (p. 411).


Some Of My Lingering Questions/Thoughts

  • Is the shift toward animals and monuments a form of dehumanizing CS?

  • When creating or sharing materials with students, what kind of demographic balances do you strive for?

    • Are you trying to demonstrate equal representation (e.g., 1/3 female, 1/3 nonbinary, and 1/3 male), match demographic proportions (e.g., 50% female, 1% nonbinary, and 49% male), or lean more toward marginalized identities to counter existing trends (e.g., 70% female, 20% nonbinary, and 10% male)?

      • When might an approach like this unintentionally communicate messages that a certain demographic is not welcome in the CS community?

  • How might we as a field start engaging in conversations around gender without making assumptions about people?

