What is it about?

This article explores why people sometimes refuse to follow recommendations made by algorithms—computer systems that make decisions about things like hiring, healthcare, and navigation. Rather than viewing this refusal as a sign of ignorance or stubbornness, the authors argue it is often a deliberate, informed choice. They draw on the idea of "strategic illiteracy," which describes how communities throughout history have intentionally refused to adopt dominant knowledge systems as a way to resist oppression. The article suggests that when people reject algorithmic advice, they may be pushing back against systems that carry hidden biases, threaten privacy, and concentrate power in the hands of technology companies. This reframing treats algorithmic aversion as a meaningful form of social and political resistance rather than a problem to be fixed.


Why is it important?

Algorithmic systems increasingly shape critical life decisions—from who gets a job interview to how law enforcement operates. These systems are often presented as objective and efficient, yet research shows they can reinforce racial discrimination, deepen inequality, and erode personal autonomy. By reframing people's reluctance to trust algorithms as a rational and even courageous act of resistance, this article shifts the conversation. Instead of asking "How do we make people trust algorithms more?", it asks "What are algorithms doing wrong that people are right to resist?" This perspective is crucial because it validates the lived experiences of marginalized communities who are disproportionately harmed by biased technologies. It also provides a foundation for building fairer, more inclusive algorithmic systems—ones designed with community input rather than imposed from the top down. The article points toward practical solutions like community-governed data trusts, participatory design, and stronger legal accountability for tech companies.

Perspectives

This work opens an important new way of thinking about technology adoption and resistance. For policymakers, it highlights that people's reluctance to embrace algorithmic tools is not simply a communication or education problem—it signals genuine flaws in how these systems are designed and governed. For technologists and developers, it serves as a call to build algorithms collaboratively with the communities they affect, especially those historically excluded from the design process. For researchers, it charts a rich agenda: empirical studies are needed to understand how different cultural, racial, and socioeconomic groups experience algorithmic resistance, and whether refusing biased algorithms actually leads to better outcomes for individuals and communities. The concept of strategic illiteracy also extends well beyond algorithms—it offers a lens for understanding resistance to organizational change, media narratives, and other top-down systems. Ultimately, this article reminds us that choosing not to engage with a powerful system can itself be one of the most powerful things a person or community can do.

Siyuan Yan
East China University of Science and Technology

Read the Original

This page is a summary of: Understanding Algorithmic Aversion: A Bold Rejection of Digital Dominion, Communications of the ACM, February 2026, ACM (Association for Computing Machinery), DOI: 10.1145/3758088.
You can read the full text via the DOI above.

Contributors

The following have contributed to this page: