You Can't Un-Invent Fire: Designers Should Explore, Not Block

A startup wants you to design an interface for AI that analyzes human behavior from video. It could revolutionize eldercare—or enable mass surveillance. Do you take the project?
Today's tech ethics creates pressure to avoid potentially harmful technologies. While few designers explicitly refuse projects outright, many hesitate, delay, or find ways to distance themselves from technologies with potential for harm. This creeping reluctance is common across the design profession. But it fundamentally misunderstands what designers do and why that matters.
This false choice between refusing the work and becoming complicit undermines both innovation and ethics, pushing technological development into less thoughtful hands while reducing society's visibility into emerging capabilities.
The Professional Duty to Explore
In law, attorneys have a professional duty to zealously advocate within legal bounds, regardless of personal opinion. This duty of zealous advocacy ensures everyone receives competent representation. The constitutional right to legal counsel requires someone to explore every legitimate legal avenue for their client. Designers should regard their responsibility to design in the same way.
Consider fire. Once our ancestors discovered flame, there was no going back. Fire would cook food and provide warmth, but also burn forests and enable warfare. The question was never whether fire should exist; that was inevitable. The question was how it would be used, and by whom.
Today's technologies follow the same pattern. Someone will develop advanced AI and autonomous systems. The only question is whether that happens in the light, with thoughtful exploration of consequences, or in shadows with less ethical consideration.
When designers refuse to engage with difficult technologies, we don't prevent development—we ensure it's done by people less concerned with ethics, usability, and societal impact.
Where to Draw the Line
I'm not saying designers should have no ethical boundaries. There's a crucial distinction between exploring technology with the potential for harm and explicitly designing tools intended for harm.
Responsible exploration: Investigate possibilities, document scenarios (good and bad), implement safety considerations, communicate transparently about capabilities and risks.
Crossing into complicity: Design explicitly for oppression or aggression, ignore foreseeable harms, conceal potential risks.

Why Designer Gatekeeping Fails
Individual designers are poorly positioned to serve as effective gatekeepers for technological development, for three reasons:
Innovation is distributed. Technologies emerge through thousands of people across academia, industry, and government. When one designer says "no," others continue with less ethical consideration.
Wrong institutional level. Designers lack the context and authority for society-wide technology decisions. These belong to democratic institutions and informed public discourse, which work better with a clear understanding of technological possibilities. The designer's role is to surface and communicate potential issues so that discourse is well informed.
Perverse outcomes. Designer self-censorship often reduces societal safety. Thoughtful designers abandon difficult spaces to less scrupulous actors. Society gets the technology anyway, with less ethical input.
Take facial recognition: many ethical designers avoided this space due to surveillance concerns. Development continued elsewhere with fewer constraints. Society might have been better served by thoughtful designers understanding the technology's capabilities and limitations.
Common Objections
"You're enabling harm." This conflates potential misuse with intended use. The knife inventor isn't responsible for every stabbing. Intent, context, and transparency matter.
"Someone has to say no." Yes, but institutions, not individual designers. Democratic societies have regulatory agencies, elected officials, and informed discourse. The designer's role is providing clear information about possibilities, not making societal decisions.
"What about dangerous technologies?" Catastrophic risks require institutional responses: government oversight, international cooperation, systemic safeguards. Not ad hoc moral choices by individual designers.
"This sounds like moral abdication." On the contrary, responsible exploration requires courage to work on difficult problems, intellectual honesty about implications, and professional skill implementing safeguards. The designer who explores AI decision-making while documenting failure modes contributes more than one who refuses to engage.
Responsible Exploration in Practice
Your job is to design, just as a lawyer's job is to defend their client. You have to do your job.
But doing it responsibly means:
Transparency: Clearly communicate what you're building, what it can do, and potential misuse. Enable informed decision-making.
Scenario planning: Don't ask "should this exist?" Ask "if this exists, what are possible outcomes?" Explore beneficial applications and potential harms.
Cross-disciplinary collaboration: Work with ethicists, policymakers, and affected communities. Bring technical understanding; receive other expertise.
Professional standards: Distinguish legitimate exploration from weaponization. Create guidelines separating technological development from explicit harm.
The Future Will Be Designed
We need to reframe designer responsibility from gatekeeper to informed explorer. This means engaging with difficult technologies rather than avoiding them. Building institutional capacity for technology governance rather than relying on individual moral choices.
The designer exploring AI-human interaction while documenting risks contributes more to ethical development than one refusing to work on AI entirely. Better to have a conscientious designer who thinks deeply about consequences working on contentious technology than someone who doesn't consider implications at all. The former gives society better information for collective decisions; the latter leaves decisions to less informed actors.
Professional courage means intellectual honesty about what's possible and ethical clarity distinguishing exploration from exploitation. It means building tomorrow's technologies with today's wisdom.
The future will be designed by someone. The question is whether it will be designed by people who take responsibility for understanding implications, or by people who simply build what they're asked to. In a world of rapidly advancing technology, society needs designers willing to explore difficult territories with ethical intentions and transparent methods.
You can't un-invent fire. But you can ensure it's developed by people committed to illuminating both its promises and its dangers.