The Techno-Authoritarian Backlash: Inclusive Design Is an Act of Resistance (Part 1)

I have been grappling with my identity as a technologist committed to inclusive applications, responsible technology, and cultural competency in this increasingly destabilized world. That isn’t to say that my commitment to any of these has wavered, but rather, as a current job-seeker in an overcrowded market, I’m scared.
As I watch companies and organizations waver, stumble, and sometimes double down on their commitment to DEIA and inclusivity, I wonder where I will ultimately land. Some nights, during my usual 3 a.m. ruminations, I consider toning down my resume and even this blog. I even briefly unpublished an article (it’s back now). Self-censorship is a powerful thing.
My commitment to this work is so much a part of me that sometimes I am not sure how to exist in a reactionary world where it may shut doors rather than open them. I find encouragement in watching companies like Target and Walmart face significant backlash after dismantling their DEIA programs, with some even walking back their decisions. But this struggle extends far beyond corporate policies.
To be clear, when I talk about DEIA, I am not just referring to workplace culture. I’m talking about the foundational work of making digital spaces accessible to everyone. Building with intention. Building teams and applications that are intentionally inclusive, and creating technology that serves all of humanity, not just a privileged few.
What’s happening right now in our government demands a technologist’s perspective. We’re witnessing a corporate takeover dressed in patriotic clothing. As I’ve posted on social media (and as more people are now becoming hip to), there is a larger architecture being constructed for technocratic rule, a form of techno-fascism.
This world view reduces humans to capital and envisions society as better governed by tech oligarchs rather than elected officials or public servants.
Despite mounting pressures, my ethos isn’t going away. The necessity for inclusive technology development isn’t going away either. Yet we face the very real danger of surrendering to a neo-palingenetic ultranationalism that positions techno-fascism as innovation.
Though it deploys through smartphones, databases, and AI, rather than brownshirts, it’s built on the same foundational principles that defined fascism in the 1920s-1940s.
What we are witnessing in real time is the manifestation of an ideology cherished by techno-capitalist (techno-fascist) leaders like Musk, Thiel, and Andreessen, and by right-wing zealots like Steve Bannon: the Dark Enlightenment, started by ideologue Nick Land and online blogger Curtis Yarvin. The Dark Enlightenment (also termed the neoreactionary movement, or NRx, and closely tied to accelerationism) essentially wants to throw out American democracy in favor of a type of techno-corporate state.
Imagine democracy reimagined as a corporate org chart: a CEO/monarch (formerly POTUS) leading from the top, with states transformed into competing government-corporations. It rejects democratic principles as inefficient and celebrates hierarchical authority structures built on technological control. We’re watching this occur in real time as Musk (and Trump, sorta) take a chainsaw to the federal structures that run our current democracy.
If you want to go deeper, a few sources:
- Ana Teixeira Pinto: Artwashing – on NRx and the Alt right;
- A Brief History of a Terrible Idea: The “Dark Enlightenment”;
- Dark Enlightenment: The Emergence of Neo-Reactionary Thought;
- and this recent NYTimes article I’ve gifted: The Interview: Curtis Yarvin Says Democracy Is Done. Powerful Conservatives Are Listening.
I believe, in this moment, inclusive design isn’t just a good practice – it is outright rebellion. Every diverse team we assemble, every accessible interface we build, every ethical AI framework we implement is a direct challenge to techno-authoritarianism.
This isn’t theoretical. Resistance is encoded in our work.
While they build systems to monitor, extract, and control, we build digital spaces that distribute power, protect privacy, and amplify marginalized voices. Our code, our designs, our methodologies are battlegrounds. When we reject “move fast and break things” for “move carefully and fix things,” we’re not just changing development practices, we’re fighting for a different future.
This is why inclusive design terrifies the techno-authoritarian crowd. It exposes their “innovation” as mere consolidation of power. It proves that their “efficiency” comes at the cost of human dignity. It reveals their “meritocracy” as a rigged game.
Every accessible product we launch is living proof that their exclusionary vision is not an inevitability but a deliberate choice. One that we refuse to make.
To dismantle our resistance, they've weaponized a potent lie – that quality requires exclusion. This calculated deception sits at the heart of their attack on our society, our humanity. It targets the elderly, disabled people, immigrants, and every marginalized community by designing systems that exclude rather than include.
Understanding this strategy isn't academic; it's essential for survival in an industry increasingly hostile to the very idea that technology should serve everyone.
For example, the deliberate blurring of lines between government and technology corporations creates the perfect architecture for mass surveillance. When we design with privacy as a fundamental right rather than an afterthought, we're directly challenging their business model. Every time we build systems that minimize data collection, implement strong encryption, or give users true control over their information, we're sabotaging the surveillance infrastructure that powers techno-authoritarianism.
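To make the data-minimization point concrete, here is a small sketch in Python. The field names and the `minimize`/`pseudonymize_email` helpers are hypothetical, invented for illustration; the point is the pattern, not a production recipe (real deployments need proper key management and a considered threat model). The idea: drop every field the feature does not need at intake, and when only uniqueness matters (not contactability), store a salted hash instead of the raw identifier.

```python
import hashlib

# Fields this hypothetical feature actually needs -- everything else is
# dropped at intake, so it can never be leaked, subpoenaed, or sold.
REQUIRED_FIELDS = {"display_name", "email"}

def minimize(raw_submission: dict) -> dict:
    """Keep only the fields the feature needs; silently discard the rest."""
    return {k: v for k, v in raw_submission.items() if k in REQUIRED_FIELDS}

def pseudonymize_email(email: str, salt: str) -> str:
    """Store a salted SHA-256 hash instead of the raw address when the
    system only needs to recognize repeat users, not contact them."""
    return hashlib.sha256((salt + email.lower()).encode()).hexdigest()

record = minimize({
    "display_name": "Ada",
    "email": "ada@example.org",
    "birthdate": "1815-12-10",   # never needed, so never stored
    "location": "London",        # likewise dropped
})
record["email"] = pseudonymize_email(record["email"], salt="per-app-salt")
```

The design choice matters more than the code: a system built this way has less to surveil with by construction, which is exactly why it resists the business model described above.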
This is particularly critical now, when governments increasingly deploy surveillance technologies developed by private companies with minimal oversight. The same facial recognition tools marketed for "convenience" become weapons of control when turned against protesters, journalists, or marginalized communities. Our commitment to privacy-respecting design isn't just about user experience – it's about preserving the conditions necessary for democracy itself.
Right now, we're at a turning point. What once seemed like a weird internet philosophy is quickly becoming real-world policy. A small group of tech leaders and political thinkers are trying to reshape how our society works, pushing an idea that technology should control more of our daily lives.
Every time we create a piece of technology that respects people's privacy, that works for everyone (not just a privileged few), we're pushing back against this vision. When we build apps that protect user data, when we create teams that include people from all backgrounds, we're doing more than good design. We're protecting the basic idea that technology should serve people, not control them.
This isn't just about making nice software. This is about keeping our fundamental human rights safe in a world where technology is becoming more and more powerful.
The fight for good technology is the fight for human dignity. And every single design choice we make matters.
I don't pretend this resistance comes without personal cost. As I update my resume and prepare for interviews, the tension between my values and the market's shifting winds is palpable. But I also know that surrendering these principles means conceding defeat before the battle has even begun. The temporary comfort of self-censorship pales against the permanent damage of complicity.
The struggle for inclusive technology isn't separate from the larger fight for democracy—it's one of its most critical battlegrounds. And in that fight, each accessible interface, each equitable system, each ethical design decision matters more than we might realize.
In Part 2 of this series, I'll explore how the false binary between excellence and inclusion serves as both shield and sword for the techno-authoritarian movement. We'll dissect the weaponized language of "meritocracy" and examine how it conceals a system designed to protect power rather than promote talent. By understanding these rhetorical strategies, we can better counter them in our daily work as designers, developers, and digital citizens.