Misdesign: intentionality and bias in innovation

At the beginning of the semester, we started our first class with a discussion of social media. In particular, we addressed the role of social media design in its uses and abuses. We watched and discussed videos from Margaret Gould Stewart, Facebook’s director of product design, on the principles for design at scale, and Tristan Harris, “design thinker” and former design ethicist at Google, on how social media are designed to be addictive. As we’re approaching our trip and the end of our class, I want to revisit the question of design in innovation, with a new question this time: Who are we designing for?

It’s a perennial question in entrepreneurship; understanding your market is key to acquiring funding and establishing a sustainable and realistic business model. I won’t repeat the quote, but suffice it to say the market is essentially the be-all and end-all of venture capital. And the market is made up of people. Understanding the market is just understanding people.

[Image: a market research infographic]
There are lots of infographics like this one.

When the user base is addressed in discussions of entrepreneurship and innovation, however, it is often from a business perspective. What are the characteristics of the people who will buy this product or use this platform? Are there enough of them; is there a market for what I want to create or sell? How can I convince the people I’ve identified to support what I produce, or expand to other groups?

While these are valid questions, they are largely post hoc; they presuppose the existence of a product, whether merely a concept for a business or a realized physical prototype. Moreover, they address the issue of user characteristics only insofar as they relate to fiscal sufficiency for a successful business.

To design products which are ethically, socially, and fiscally sound, this isn’t enough.

People from different groups use technology in different ways, and in many cases, in ways which are not necessarily obvious. Culture and nationality, for instance, influence the way people use websites and what kinds of websites they prefer. These differences have been well documented over the past 15 years across a variety of aspects, ranging from preferences for textual or visual presentation of information to viewing patterns measured with eye tracking.

Age also plays a significant role in how people use mice and touchpads, according to a study from the University of Copenhagen. Adults are more adept than both early adolescent and elderly users, but in different ways: teenage users are quicker but less accurate, while elderly users take longer to move a cursor to its final destination but are about equally accurate. Interestingly, differences in maximum speed were not observed, although both young and adult users reached maximum speed sooner than elderly users did. Instead, the researchers found that elderly participants made more submovements, while adult participants moved the mouse in a more direct line to the target.

The study found that larger displays and smaller, more densely distributed items increased these age differences, in particular disadvantaging elderly users. In other words, poorly designed websites are bad for everyone, but they’re especially difficult for elderly users to navigate because of the strategies they employ with the mouse or touchpad.
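
The study itself isn’t framed this way, but its findings line up with Fitts’ law, the standard model of pointing difficulty, which predicts that movement time grows with the distance to a target and shrinks with the target’s size. Here is a minimal sketch in Python; the coefficients are illustrative, not taken from the Copenhagen study:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon form of Fitts' index of difficulty, in bits.
    distance: how far the cursor travels; width: target size."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance: float, width: float,
                            a: float = 0.2, b: float = 0.25) -> float:
    """Predicted pointing time in seconds: MT = a + b * ID.
    a and b are hypothetical per-user coefficients; a user who makes
    many corrective submovements behaves like one with a larger b."""
    return a + b * index_of_difficulty(distance, width)

# A large, nearby target is far "easier" than a small target
# across a big display:
print(predicted_movement_time(distance=200, width=100))   # ~0.6 s
print(predicted_movement_time(distance=1200, width=20))   # ~1.7 s
```

Under this model, bigger displays mean longer cursor travel, and smaller, denser items mean narrower targets; both raise the index of difficulty, and users who rely on repeated submovements pay that cost several times over.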

Consider how this affects the narrative of who adapts most naturally to new technology. Building products (touchscreens, for instance) that are increasingly quick and responsive might actually make them harder for some populations to use, forcing those users to adapt to products which are adapting away from them.

When developers fail to adequately consider who they picture using their products, they risk unintended, and sometimes drastic, consequences. Invisible Women: Exposing Data Bias in a World Designed for Men, a just-released book by Caroline Criado Perez, documents myriad examples of how a historic failure to consider gender as a factor in design has resulted in a world replete with systems that are safe for men but unsafe, uncomfortable, or just unintuitive for women.

The examples range from the mundane to the extraordinary. The formula for office temperature was built with data on men’s metabolic rates, resulting in workspaces which are, even today, on average five degrees too cold for women. Because crash test dummies are traditionally designed to average male proportions and many regulations do not require testing with dummies matching typical female height and weight, women are almost 50 percent more likely to be seriously injured in a car crash, 70 percent more likely to be moderately injured, and almost 20 percent more likely to die.

In general, women’s medical concerns are less well researched and documented. Safe exposure levels for carcinogenic chemicals do not take into account women’s higher percentage of body fat, in which toxins accumulate over time, or their thinner skin, which increases absorption rates. There is little data on the safety of chemicals used in combination in industries such as nail salons, or on injuries to women in construction, even though women sustain sprains and wrist strains at higher rates. Standard sizes for dust and hazard masks often do not fit women’s faces, rendering them unsafe.

Similar patterns of unintentional misdesign show up in recent technology. According to an article published in 2016, Siri, Google Now, and Cortana at that point recognized complaints of a heart attack, but not of rape, sexual assault, or abuse. An artificial heart developed by one company fit 86 percent of men’s chest cavities, but only 20 percent of women’s; the company was quoted as having no plans to pursue an alternative more compatible with female physiology, as it “would entail significant investment and resources over multiple years.”

The original version of the Health app on iPhones did not include menstruation or other aspects of women’s reproductive health. As phone sizes have increased in recent years, using the devices has become more difficult for people with smaller hands, who are largely women. Features such as zooming and taking photos one-handed are considerably more difficult, if not impossible, for some women.

For all its purported impartiality, artificial intelligence has historically amplified rather than corrected biases in design. In 2016, Google’s speech recognition software was 70 percent more likely to recognize men’s speech accurately. Users have reported that voice recognition in cars works far better for lower voices, if it works at all for higher ones. Racial bias in facial recognition software is likewise well documented. A study released just last year by the ACLU found that Rekognition, Amazon’s facial recognition software, made disproportionately more errors identifying members of Congress of color. A study from MIT found similar issues in IBM’s and Microsoft’s systems.
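
Audits like these essentially compare error rates across demographic groups instead of reporting one overall accuracy number, which can hide large per-group disparities. A minimal sketch of that kind of per-group tally, with hypothetical data (this is not the auditors’ actual code, dataset, or methodology):

```python
from collections import defaultdict

# Hypothetical evaluation records: (demographic group, prediction correct?)
results = [
    ("group A", True), ("group A", True), ("group A", False),
    ("group B", False), ("group B", False), ("group B", True),
    # ... a real audit would use thousands of labeled examples per group
]

def error_rate_by_group(records):
    """Aggregate error rates separately for each demographic group."""
    totals = defaultdict(lambda: [0, 0])  # group -> [errors, count]
    for group, correct in records:
        totals[group][0] += (not correct)  # True counts as 1
        totals[group][1] += 1
    return {group: errors / count for group, (errors, count) in totals.items()}

print(error_rate_by_group(results))
# e.g. {'group A': 0.33, 'group B': 0.67} — identical models can look
# "accurate" overall while failing disproportionately for one group.
```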

Certainly, designers and developers of technology, entrepreneurs, and venture capitalists consider more than financial concerns when they try to understand the market; nor, I’m sure, are they unaware of the issue of bias. Yet products continue to be released which, through fundamental error or simple oversight, are designed in ways that exclude or impede users from certain groups.

If we don’t think carefully enough about who we are designing for, the result is products designed against: products designed, whether intentionally or not, to exclude and minimize certain groups of people. In other words: who are we failing to design for?

One thought on “Misdesign: intentionality and bias in innovation”

  1. Really informative article, Rachel. Great use of actual studies and data to back up your points. I learned a lot. This is definitely one area where diversity at all levels in the workplace could help lessen these design inconsistencies.

