Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech

Wachter-Boettcher, Sara. Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. United States, W. W. Norton, 2017.

This book examines how digital products are designed and why, and specifically how the people designing technologies operate, given how central digital technology has become to everyday life. Wachter-Boettcher examines tech hiring practices and work culture and how they exacerbate bias in digital designs. In particular, she highlights how sexism and racism inside tech companies connect directly to the bias and insensitivity found in digital products.

She criticizes the use of personas as a design tool. Personas are meant to be aggregate descriptions of real people that designers and engineers internalize while working on a product, but they can also embed stereotypes that alienate users. Tailoring a product to a specific imagined, ideal user often unintentionally others everyone who falls outside that image. Default settings likewise normalize specific types of users and actions, a phenomenon called "the default effect." While defaults are often helpful and time-saving, they are never neutral; they encode specific norms into technology, like placing the United States at the top of a default country list. In dealing with personas, she and others recommend removing demographic data so that team members do not treat a persona's demographics as representative of its actions and needs.
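The "default effect" can be made concrete with a small sketch. The following is a hypothetical illustration (not from the book) of how a seemingly neutral country dropdown encodes a norm: whoever chooses the promoted entry decides which users are treated as typical.

```python
def country_options(countries, promoted=("United States",)):
    """Order a country list for a dropdown, floating 'promoted'
    entries to the top.

    The 'promoted' tuple is a design decision, not a neutral fact:
    it determines which users the form treats as the default case.
    """
    pinned = [c for c in promoted if c in countries]
    rest = sorted(c for c in countries if c not in promoted)
    return pinned + rest


# The first option is what the default effect favors: many users
# never change it, so the chosen ordering becomes the norm.
options = country_options(["Kenya", "United States", "Japan"])
# options[0] is "United States", a norm baked in before any user acts.
```

Removing the `promoted` argument does not make the form neutral either; alphabetical order is simply a different, more defensible design choice.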

She criticizes how the term "edge case" characterizes excluded users as "extreme" and atypical, and insinuates that there is some average user. She points out that the average is actually unlikely to fit any individual, citing Air Force researcher Gilbert Daniels's discovery that not a single pilot fit all ten average physical dimensions. In response, the Air Force designed for both "extremes" of smallest and largest pilots, and engineers then had to devise solutions that worked across those ranges. She proposes "stress cases" in place of edge cases, because the term points instead to where design work breaks down.

Exclusion also makes its way into forms and interaction design, the places where users meet interfaces. Forms are often designed normatively to collect data without imagining the interactions that data may later produce (e.g., disclosing a sexual assault on a medical form can lead to uncomfortable conversations with doctors). With its real-name policy, Facebook was normatively defining which names were "real" and which were not, and the labor of proving one's authenticity falls most heavily on marginalized people.

She discusses the issue of forced enrollment, like Facebook's On This Day feature, which you can now turn off but can never remove. The goal of engagement leads to absurd and upsetting interactions, some of which encourage destructive behavior (e.g., health apps pushing you to lose weight). "Marketing negging" is a technique for shaming consumers into making different choices (e.g., "No thanks, I hate saving money"). Other mechanisms involve intentional deception to get at your data, like Uber removing the option to use location only while the app is open and defaulting to collecting location data all the time, even when the app is closed.

She also dedicates a chapter to harassment and trolling, and to how the design of platforms like Twitter, Reddit, and Facebook has enabled such behavior. Twitter is built around the value of open communication, which makes it easy for trolls to attack the people who build a public platform on the site. Reddit's subreddit structure allows harmful communities to persist, particularly alongside its lack of anti-harassment enforcement. And Facebook's trending algorithm allowed fake news to proliferate into the mainstream once the site shifted toward algorithmically curated content.