I work at the intersection
of data science, technology and human-centred design.
Former astrophysicist &
e-Research/data consultant.

Earlier this week I had the privilege of seeing Sara Wachter-Boettcher talk about how tech industry biases get baked into the digital products we use today. Not surprisingly, many outcomes are far from good. In some cases the result has been the stuff of ethical nightmares: chatbots that harass women; signup forms that fail anyone who's not straight; apps that make you look black or Asian – racial ignorance at its finest. It's astonishing to think that (a) these were considered good ideas in the first place, and (b) they made it through the development lifecycle without anyone noticing how incredibly broken they were – in both the ideological and technical sense.

She tells the story of her long-time developer friend Eric Meyer, who co-wrote Design for Real Life. On Christmas Eve 2014, Eric was checking his Facebook account and found a collage featuring a lovely, smiley photo of his daughter Rebecca, as part of a celebratory Year in Review. The photo was the most popular post of his year, which is why it made it into the collage, with the caption "Hey Eric! This is what your year looked like!". While this may seem like a happy memory, it was in fact the worst year of his life. His daughter had died of an aggressive brain tumour on her sixth birthday.

This really struck a chord with me. Unfortunately, I've had a similar experience with the Apple Photos app on my iPad. For the most part I love what Apple have done with Memories, but sometime in August last year I was greeted with a cheerful Best of June gallery containing dozens of photos of my best friend. Little did Apple Photos know that she had lost her battle with breast cancer that month, so while it was lovely to see her smiley face again, I really wasn't in the mood to be reminded that she was, in fact, dead. An unfortunate mistake, but one that seems to recur time and time again, on apps and social media platforms such as Facebook, Instagram, and Medium. Each time the company in question apologises for the "mistake" and sets out to fix that one seemingly isolated issue.

One of the many things I love about Sara is that she is not afraid to call out bullshit when she sees it. I suspect she would be an awesome person to work alongside. Check out the talk she gave at Google late last year:

"You don't get to decide what circumstances somebody is going to be in when they use your technology"

— Sara Wachter-Boettcher

"This is what happens when you assume the technical [problem] is neutral."

— Sara Wachter-Boettcher

"Design makes the biases look like facts"

— Sara Wachter-Boettcher


About Sara

Sara Wachter-Boettcher (@sara_ann_marie) is the principal of Rare Union, a digital product and content strategy consultancy based in Philadelphia. Her most recent book, Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech (W.W. Norton, 2017), was named one of the best tech books of the year by Wired, and one of the top business books of the year by Fast Company.

She is also the co-author, with Eric Meyer (@meyerweb), of Design for Real Life (A Book Apart, 2016), a book about creating products and interfaces that are more inclusive and compassionate, and the author of Content Everywhere (Rosenfeld Media, 2012), a book about creating flexible, mobile-ready content. Sara speaks about design, tech, and digital publishing at conferences around the world, and consults with startups, Fortune 100 companies, and academic institutions. Her work has been featured in The Washington Post, Slate, The Guardian, Salon, Quartz, and more. Find her on Twitter @sara_ann_marie or at sarawb.com.

The IDEO +Acumen Course for Human-Centred Design

Gloomy Sunday