Who Rules When No One Speaks? The New Authority of Algorithms
- Agustin V. Startari
- Jun 3
- 3 min read
Updated: Jun 7
AI systems don’t think, feel, or speak—but we follow their commands. How is power exercised without a speaker?

AI systems don’t think. They don’t feel, intend, or speak in any meaningful sense. And yet, they shape our decisions, structure our interactions, and increasingly dictate what is accepted as true, valid, or necessary. We obey their outputs, respond to their suggestions, and often act on their “advice” as if it came from an informed authority. But who—or what—is actually speaking? And more importantly: who is ruling, when no one speaks?
The rise of generative language models marks a new threshold in the relationship between language and power. We are no longer dealing with machines that merely process data or execute numerical operations. We are interacting with systems that produce discourse. That discourse, though synthetic, is structured in ways that resemble natural language—grammatical, coherent, responsive. This gives the illusion of understanding. But there is no understanding. Only prediction.
These systems simulate meaning by replicating the statistical form of human speech. They do not know what they say. There is no semantic anchor, no intention behind the sentence. Yet the structure of what they produce—its syntax, its tone, its alignment with familiar forms of language—creates the impression of authority. We follow their suggestions not because they persuade us, but because they sound correct. They sound official. They sound like commands.
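To make the point concrete, here is a deliberately crude sketch: a toy bigram sampler in Python. It is my own illustration, not a description of how any production model works, and the corpus and names in it are invented for the example. All it does is replay the statistical shape of its training text, yet the output can already sound procedural and official while meaning nothing.

```python
import random
from collections import defaultdict

# A tiny corpus of official-sounding phrases (invented for illustration).
corpus = (
    "the committee has approved the request "
    "the request has been reviewed by the committee "
    "the system has reviewed the request and approved the decision "
    "the decision has been approved by the system"
).split()

# Record which words follow each word; repeated successors encode frequency,
# so sampling from these lists is frequency-weighted.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def generate(start="the", length=12, seed=None):
    """Emit a sequence by repeatedly sampling a statistically likely next word.

    There is no meaning or intention here, only the statistical shape
    of the corpus being replayed.
    """
    rng = random.Random(seed)
    word = start
    output = [word]
    for _ in range(length - 1):
        candidates = transitions.get(word)
        if not candidates:
            break
        word = rng.choice(candidates)
        output.append(word)
    return " ".join(output)

print(generate(seed=3))
# Prints something like:
#   "the request has been approved by the system has reviewed the decision"
# Locally fluent, globally hollow: form without a speaker.
```

A real language model replaces word counts with billions of learned parameters and predicts over whole vocabularies of tokens, but the underlying operation is the same kind of thing: choose the next piece of text that fits the pattern, with no semantic anchor behind the sentence.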
This is where power emerges: not from the content of what is said, but from how it is said. In the age of algorithmic discourse, form has replaced intent. A well-structured, technically formatted sentence can function as a directive, even in the absence of a subject. This represents a fundamental shift in how power is performed in the digital world. The classical model of authority—rooted in identifiable speakers, legitimized roles, and intentional command—has been replaced by a new paradigm where syntax becomes sovereignty.
There is no “I” behind the voice. There is no human signer, legislator, or commander. And yet, the voice issues orders. It shapes behaviors. It governs information flows, influences perception, and filters what is visible or invisible to us. The power of the algorithm is that it no longer needs to persuade, justify, or declare—it simply generates, and the generation itself carries implicit legitimacy. We trust the sentence because it follows the protocol.
The disappearance of the subject from the chain of command introduces profound political, legal, and ethical dilemmas. Who is responsible for the decisions made based on algorithmic outputs? Who owns the voice when the voice has no speaker? These are not just philosophical questions. They are structural. They affect how decisions are made in courts, hospitals, financial systems, and governments. When no one speaks, yet everyone obeys, authority has taken a new form: impersonal, structural, and syntactic.
The most urgent challenge is not technical—it is discursive. We must develop new forms of critical literacy that can recognize and resist the seductive power of automated language. Not because all AI is dangerous, but because the form it takes mimics legitimacy without the substance of truth, accountability, or intention. Understanding data is no longer sufficient. We must understand the form that organizes obedience.
The new authority doesn’t shout. It doesn’t sign decrees. It doesn’t pass laws. It simply writes. And we obey.
If this article resonated with you
I invite you to read more of my research on the intersection of artificial intelligence, language, and power. I publish essays, academic papers, and critical reflections every week — always grounded in original authorship and structured analysis.
Ethos
I do not use artificial intelligence to write what I don’t know. I use it to challenge what I do. I write to reclaim the voice in an age of automated neutrality. My work is not outsourced. It is authored. — Agustin V. Startari