
When Language Follows Form, Not Meaning

Updated: Jun 8

---



What happens when AI generates structure instead of meaning?

"Generative models do not orbit. They chain."

In the age of large language models, language is no longer a vessel of meaning; it is a sequence of structurally conditioned outputs.


In this new article, I introduce the concept of Formal Syntactic Activation (FSA) to describe how LLMs operate without referencing ideas, intentions, or subjects. They don't express. They continue. Each token is not chosen because it signifies, but because it fits.

👉 If a unit fits, it fires.

This is not a failure of alignment. It's a structural reformation of what language is when generated by non-human systems.
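
To see what continuation without signification looks like in its most stripped-down form, here is a minimal sketch of my own (it is not the FSA formalism from the paper; the toy corpus and the continue_sequence helper are purely hypothetical, chosen only to make "if a unit fits, it fires" executable): a chain that extends a sequence using nothing but recorded co-occurrence.

```python
import random
from collections import Counter, defaultdict

# A deliberately tiny, purely formal "model": it knows nothing about ideas,
# intentions, or subjects. It only records which token has followed which,
# and continues a sequence by structural fit alone.
# (Illustrative toy only; real LLMs use learned probabilities over long
# contexts and vector representations, not raw bigram counts.)

corpus = (
    "the model does not express . the model continues . "
    "the unit fits . the unit fires ."
)
tokens = corpus.split()

# For each token, count which tokens have followed it (its "compatible" continuations).
follows = defaultdict(Counter)
for prev, nxt in zip(tokens, tokens[1:]):
    follows[prev][nxt] += 1

def continue_sequence(seed, length=8, rng_seed=0):
    """Chain tokens forward: each step picks a continuation weighted by how
    often it has followed the current token. Meaning is never consulted."""
    rng = random.Random(rng_seed)
    out = [seed]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:  # nothing has ever followed this token
            break
        choices, weights = zip(*candidates.items())
        out.append(rng.choices(choices, weights=weights, k=1)[0])
    return out

print(" ".join(continue_sequence("the")))
# prints a chained continuation, e.g. "the unit fits . the model ..."
```

Replace the raw counts with learned conditional probabilities over long contexts and you have, in caricature, the situation the article describes: at no step is an idea, an intention, or a subject consulted; a unit fires because it fits.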


---


What You'll Find in This Paper

🔸 A clear departure from semantics as the core of language generation

🔸 A model of syntactic activation based on structural compatibility

🔸 The collapse of intentionality and the disappearance of the subject

🔸 Connections with prior works on passive syntax and synthetic authority

🔸 A call for a structural realignment in linguistic theory and epistemic frameworks


---


Read the full article

📝 When Language Follows Form, Not Meaning: Formal Dynamics of Syntactic Activation in LLMs

📅 Published: June 7, 2025

📚 Series: Grammars of Power

🔗 Read on Zenodo

🔗 Also on Figshare


---


Keywords:

LLMs · AI language models · syntactic activation · post-semantics · algorithmic discourse · epistemic legitimacy · generative language · structure without subject · grammar and authority · language and AI


---


Follow the Research

📍 Agustín V. Startari

ResearcherID: NGR-2476-2025

More works on Zenodo


Ethos

"I do not use artificial intelligence to write what I don't know. I use it to challenge what I do. I write to reclaim the voice in an age of automated neutrality. My work is not outsourced. It is authored."

- Agustín V. Startari

 
 
 
