WTF…
A non-user is someone who has a reluctant, resistant, or involuntary relationship with a technology.
To use something makes you a user. But that thing you use, you don’t use it for everything, and you don’t use it all of the time. There are tasks for which you don’t use it, and times when you’re not using it. So why privilege “use” as the predominant framework for labeling yourself?
Sometimes you don’t use a thing because you don’t find it useful. Sometimes it just doesn’t appeal to you. But sometimes, you choose not to use that thing because using it might have negative consequences for you. Or else you choose not to use it because if lots of people used it, it might have negative consequences for society.
A non-user doesn’t have to be perfect.
Non-use, like use, is not an absolute. It exists in degrees. You don’t have to never use to not use.
Maybe you don’t want to use a thing, but that thing is really cool and everyone else is using it, so… you start using it. Maybe you feel guilty, maybe not. Maybe you stop using it. Maybe you start using it again. Maybe you keep using it, but you never fully trust it.
Chances are, you are a non-user of something.
So are we, and we’d like to hear your story.
Ooookay… but why is this important?
The language of “use” and “user” is fundamental to the culture of interaction design, yet the “user-centered” framework reduces the complexity of people’s encounters with technology to measures of utility. This mindset leaves little room for other normative values—such as privacy, equity, or community—and subverts our responsibility as designers to recognize and account for individuals’ more nuanced moral concerns.
As our environments are increasingly shaped by automated decision-making, the voices of people who actively resist data-collecting technologies become ever more critical. Though such “non-users” are commonly seen by technologists as adversarial or warranting conversion, an empathetic consideration can reveal values-based concerns and potential social implications.
For example, while a user-centered designer may interpret a person’s aversion to their smartphone’s fingerprint detection system as a misunderstanding of technical details—a solvable problem!—their actions may instead signal concern for a vulnerable community’s privacy and safety. By recognizing expressions of non-use as essential feedback, we reduce the opportunity for ethical blind spots and reveal the broader impact of our work.
What if the design process were better equipped to address people’s internal struggles? How might a deeper understanding of non-use reform our own design process and shape the digital products we develop?
HMW…
How might we prepare designers to recognize nuanced moral responses to their products?
Learning from (Non)Users
MozFest, 10.2019
Central St. Martins, 10.2019
IxDD: Trust & Responsibility, Georgetown University, 09.2019
Beyond Frameworks: Designing with Ethics, IxD19 Education Summit, 02.2019
How might we equip a non-user to signal an assertion of their rights, and how might others recognize and respond to such a signal?
FYI…
Non-use Project Publications
Harvard Kennedy School Shorenstein Center’s Privacy Forecast 2019: My Terms of Service Button
New America and Harvard Kennedy School Belfer Center’s paper, Understanding Data Privacy Protections Across Industries:
“Case study #4: Learning from Non-Use: Active Resistance + Data Privacy Workshop for Designers”
Further Reading: Consent
Jones, Meg Leta, Ellen Kaufman, and Elizabeth Edenberg. “AI and the Ethics of Automating Consent.” IEEE Security & Privacy 16, no. 3 (2018): 64-72.
Jones, Meg Leta. “Your New Best Frenemy: Hello Barbie and Privacy Without Screens.” Engaging Science, Technology, and Society 2 (2016): 242-246.
Jones, Meg Leta. “Privacy Without Screens & the Internet of Other People's Things.” Idaho Law Review 51 (2015): 639-660.
Further Reading: Non-Use
Satchell, Christine, and Paul Dourish. “Beyond the User: Use and Non-use in HCI.” In Proceedings of OZCHI 2009: Design: Open 24/7, 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group (CHISIG) of the Human Factors and Ergonomics Society of Australia (HFESA), The University of Melbourne, 23–27 November 2009.
TL;DR
Actively resist technological determinism.
Don’t let the supposed inevitability of technological advancement determine our social and moral norms.
Learn from non-users.