Blog



[Image: an animated GIF of a woman looking into the camera]

How We’ve Taught Algorithms to See Identity

The outputs of facial analysis systems are premised on their training and evaluation data: the images used to “teach” a system what a subject looks like. In our study, we analyze race and gender in training databases from a critical discursive perspective. Specifically, we investigate how race and gender are codified in 92 image databases, many of which are publicly accessible and widely used in computer vision research.


[Image: a person smiling, their face framed by a bounding box]

How Computers See Gender

Have you ever thought about how you, as an individual, are seen by your technology? What does it think of you? How does it classify you? What labels make up who you are? Many technologies are doing this as we speak, making simple determinations about the humans they come into contact with and bucketing them into labels like “woman,” “female,” or “age 21–24.”


[Image: a person running on a rainbow road, holding a transgender flag]

Safe Spaces and Safe Places: Unpacking Technology-Mediated Experiences of Safety and Harm with Transgender People

As researchers and designers, we have the privilege and the power to construct spaces like social media platforms and apps. In our paper, we discuss the benefits of intentionally applying an intersectional understanding of how power shapes space, so that marginalized users, such as transgender users, can better experience and shape safety in a digitally connected world.