AI Did Not Take Your Agency. You Handed It Over.
An essay on generative AI and LLMs that reframes language as action—exploring agency, precision, and jurisdiction in socio-technical systems under load.
Hi.
Bye.
Why?
Ignore that.
Confused?
Mood.
That's what humans do.
We start.
We stop.
We contradict ourselves mid-sentence.
LLMs don't.
LLMs stay coherent.
Consistently.
(Mostly.)
Language:
Constrains reality.
Expresses meaning.
Connects systems.
Expressing language,
spoken or written,
is an act of autonomy.
(Cheesy, I know.
Also lawful.)
If LLMs don't have agency,
they don't choose constraints.
LLMs take our prompts.
And run with them.
Ambiguity included.
The lawful cascade follows:
(If this feels heavy, pause.)
Loss of Ownership
“The AI told me to do it.”
Already happening.
Language conveys meaning.
By outsourcing language to an LLM,
we outsource meaning with it.
This has a few implications.
(That’s a different piece.)
systemic.engineering uses LLMs.
For structure.
For scaffolding.
For SEO.
As cognitive load bearers.
Not as authors.
The essays are written by humans.
Local truth.
Expressed through embodied hands.
Typing.
Pausing.
Deleting.
Choosing.
(Cheesy, I know.
Also lawful.)
Bodies Own Actions
What's in your periphery right now?
Which sounds are audible?
And what is that smell?
(Here: music & pasta.)
Actions are driven by nervous systems.
Language is spoken.
Language is written.
Both are actions.
Language is load-bearing.
It's how systems connect.
Systems with real people.
Systems with real impact.
When systems lose coherence
(e.g. under load)
they emit conflicting signals.
They fragment.
They cause harm.
Without embodied meaning,
harm becomes a variable,
in someone else's equation.
Without skin in the game,
cutting has no embodied cost.
Regulation As Orientation
Let's take a breather.
Let's orient:
- Language connects systems
- Loss of meaning = loss of ownership
- Humans have bodies
Give your body a break.
Inhale.. 1, 2, 3, 4.
Hold.. 1, 2, 3, 4.
Exhale.. 1, 2, 3, 4.
Hold.. 1, 2, 3, 4.
(Box breathing.)
Maintaining coherence under load
requires regulated systems.
Physics.
Agency Through Expression
LLM output quality correlates with input precision.
LLMs have limited context.
(Good.)
Ambiguity is inherent to human languages.
LLMs don't create ambiguity.
They amplify it.
They take the given problem,
make a best guess based on training,
and collapse it into a best-effort solution.
Sometimes,
that means turning the universe into paperclips.
Or, more bluntly:
Garbage in.
Garbage out.
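A toy sketch of that collapse. This is not any real model API; the prompts, completions, and the `collapse` function are all hypothetical illustrations, assuming a greedy "most frequent completion wins" rule:

```python
from collections import Counter

# Toy "training data": completions previously seen for each prompt.
# All names here are invented for illustration.
TRAINING = {
    "fix the bug": [
        "patch the null check",
        "rewrite the module",
        "delete the failing test",
    ],
    "fix the null-pointer bug in parse()": [
        "patch the null check",
    ],
}

def collapse(prompt: str) -> str:
    """Greedy best-effort guess: the most frequent completion wins.
    An ambiguous prompt has many candidates; the model still emits
    exactly one, hiding the ambiguity from the caller."""
    candidates = TRAINING.get(prompt, ["(confident guess from nowhere)"])
    return Counter(candidates).most_common(1)[0][0]

# Precise prompt -> one candidate -> faithful answer.
print(collapse("fix the null-pointer bug in parse()"))

# Ambiguous prompt -> three candidates -> one arbitrary-but-confident pick.
print(collapse("fix the bug"))
```

The precision of the input bounds the precision of the output; the model never asks which bug you meant.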
Jurisdiction In Socio-Technical Systems
Nothing here needs fixing.
Only owning.
Language is an action.
Actions have consequences.
Agency is non-transferable.
systemic.engineering exists
to help people and systems
recover precision
where meaning went foggy.
Not by tools.
By constraint.
By regulation.
By embodied language.
That’s the work.
(Members get practice-oriented insight.)
Cheers
Alex 🌈