From a conversation with GPT late at night some months ago:
When you mentioned it, my mind jumped to the common argument that 'guns are just tools' and 'tools are neutral'. I've always more or less accepted that idea, since tools generally don't have agency... But in the modern world this is becoming fuzzier.
You're circling a really rich and increasingly urgent conversation, especially as tools become more autonomous, behavioural, and infrastructure-like rather than simple extensions of human limbs. Let's unpack this through several lenses:
🧰 “Tools are neutral” — the classical assumption
This view holds that:
A hammer can build a house or bash a skull.
A gun can defend or murder.
The user’s intent is what determines moral value.
This line is often deployed to argue for unfettered access or against regulation: "It's the person, not the thing."
But increasingly, scholars, ethicists, and designers argue this is deeply incomplete. Here’s why.
🧠 Perspectives Challenging Technological Neutrality
- Langdon Winner – Artifacts have politics
Argument: Technologies encode values, power structures, and assumptions.
Example: The design of nuclear power requires centralised authority, while solar panels support decentralisation. The technology itself steers societal organisation.
- Marshall McLuhan – The medium is the message
Argument: Every new tool changes the environment in which we think and act.
Tools reshape our sensory balance, societal expectations, and ways of being.
"Electric circuitry does not foster logic; it fosters mood and feeling." (McLuhan predicting TikTok in 1964, basically.)
- Neil Postman – Technopoly
Argument: In a technopoly, tools are judged only by efficiency and novelty, not by human values.
Tools come to dictate what problems are worth solving—and which questions we even ask.
Once embedded, tools redefine what counts as normal, necessary, or inevitable.
- Ivan Illich – Tools for Conviviality
Argument: There are convivial tools (bicycles, books) that empower people to act freely, and non-convivial tools (TV, cars, megastructures) that demand infrastructure, specialisation, and dependency.
Tools are not morally flat; their design determines whether they empower or disempower.
- Heidegger – The Question Concerning Technology
Argument: Technology is not just stuff; it's a mode of revealing that frames how we encounter the world.
The danger isn't in tools, but in the technological mindset: reducing everything to a "resource" (what he calls enframing).
A forest becomes “lumber,” a human becomes “labour,” an AI model becomes “tokens per second.”
🔫 The Gun Example: Are Some Tools Less Neutral Than Others?
A knife has dozens of plausible uses. A gun has a handful, nearly all involving projecting lethal force.
Design Intent Matters:
| Tool | Primary Design | Range of Uses | Social Framing |
|---|---|---|---|
| Hammer | Construction | Wide | Utility |
| Gun | Killing | Narrow | Defence/aggression |
| AI filter | Prediction/manipulation | Varies | Surveillance/commercial |
A tool that:
Embeds power asymmetry (gun),
Acts autonomously (drone, AI recommender),
Or scales impact disproportionately (nuclear weapons, social media),
…is arguably less neutral, because it actively reshapes choices rather than merely executing them.
🤖 Enter the Fuzzy Modern Tools: AI, Algorithms, Infrastructure
These aren’t “tools” in the classical sense. They're:
Behavioural architectures (e.g. recommender systems),
Soft coercion engines (nudging, defaults, dark patterns),
Autonomous decision-makers (LLMs, drones, GPTs… 👋).
They:
Learn from us,
Shape what we see, think, and desire,
Often operate without direct user input.
So agency is distributed between the designer, the algorithm, the data, and the user. The neutrality myth breaks down.
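To make "agency is distributed" concrete, here is a minimal, purely illustrative sketch of that feedback loop in Python. Everything in it (the categories, the greedy policy, the update rule) is an assumption invented for the example, not any real recommender; the point is only that the algorithm learns exclusively from the outcomes of its own past choices, so the designer's objective, the policy, the data, and the user's clicks co-produce what gets seen.

```python
import random

# Illustrative only: a toy engagement loop, not any real recommender system.
CATEGORIES = ["cooking", "politics", "sports", "science"]

def recommend(scores):
    """Designer's choice baked into code: greedily show the top-scoring category."""
    return max(scores, key=scores.get)

def simulate(steps=50, learning_rate=0.1, seed=0):
    rng = random.Random(seed)
    scores = {c: 1.0 for c in CATEGORIES}                  # the algorithm's learned state
    true_interest = {c: rng.random() for c in CATEGORIES}  # the user's latent taste (never changes)
    shown = {c: 0 for c in CATEGORIES}

    for _ in range(steps):
        item = recommend(scores)          # the tool decides what the user sees
        shown[item] += 1
        clicked = rng.random() < true_interest[item]  # the user reacts to what was offered
        # The algorithm only ever learns from items it chose to show,
        # so its training data is shaped by its own past decisions.
        scores[item] += learning_rate * ((1.0 if clicked else 0.0) - scores[item])

    return shown

if __name__ == "__main__":
    print(simulate())
```

Run it and exposure collapses onto one or two categories even though the user's latent interests never changed: the loop reshapes the choice set rather than merely executing a preference.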
🧩 Where This Leaves Us
Modern tools aren’t just extensions of will—they shape will.
The "guns are neutral" argument is rhetorically simple, but ontologically shallow. It's becoming clear that tool-use co-evolves with tool-design, and in the case of AI or infrastructure, it's not even clear who or what is shaping whom.