The Bronze Brain and the Criminal Law
What a corroded Greek computer can teach us about the danger of believing our systems are smarter than we are
Discovery: The Bronze Brain Beneath the Waves
The diver didn’t know what he’d found.
Just a lump of bronze, fused with centuries of coral, pulled from a shipwreck off a Greek island in 1901. Only later, after decades of puzzling, would X-rays reveal what it really was. Beneath that crust was a machine! Gears upon gears, meant to turn with a precision no one today would have believed possible 2,000 years ago.
The Antikythera Mechanism, as it came to be called, was said to predict eclipses, lunar phases, even the timing of ancient games. A computer, almost two millennia before computers.
You can imagine that, as with modern computers, the users trusted the machine to tell the truth, to give them good answers. (They had never heard of obsequious, lying chatbots.) Fire up the machine, crank the gears, and everything comes out all right.
We like to believe our laws work the same way. Feed in the facts, turn the crank, and out comes justice. And we can’t be blamed for that, because, after all, judges and prosecutors look at it that way. They tell us that’s how it works. It’s all objective, analytical, fair.
But that’s not how things work in the real world.
[A] pair of researchers at Argentina’s National University of Mar del Plata have integrated previous research on the Antikythera mechanism into a computational program that tested how effectively the device might have functioned given these errors. Their conclusion? It was somewhat temperamental and likely never functioned perfectly.
— Richard Whiddington, Turns Out, This Ancient Analog Computer Didn’t Function Very Well, New Study Finds (April 16, 2025)
As often as not, the machine malfunctioned. In the case of the Antikythera Mechanism, it ground to a halt.
We’ve no such luck with the machinery of the law. The system does malfunction more often than not. But the wheels keep turning. Grinding up lives. And my clients are just grist for the mill.
Sometimes I think of myself as one of them — not ground down, exactly, but caught in the motion. In Foundlings (which I referenced in my last two Substack posts) I once wrote about “the rhythm of law’s machinery beating out the measure of our days.” That still feels true. The courthouse hums like an engine, and all of us — clerks, lawyers, defendants — move to its metronome whether we mean to or not.
But every machine hides assumptions in its design. The Antikythera predicted the heavens because its builders believed the cosmos itself was an orderly clock. Our courts do something similar. Judges and prosecutors pretend that human behavior, guilt, and punishment can be reduced to ratios and algorithms.
It makes it a lot easier to jump to conclusions.
And now our modern mechanisms have gone digital. Algorithms predict risk, data models forecast recidivism, and even bail decisions can come from code, except when they suggest something more lenient than judicial preference permits.
The gears have changed, but the dream — and the danger to the accused, guilty or innocent — are the same.
We’ve always tried to build something that could outthink us. The goal? A system that would make judgment automatic and remove the burden of choice. That’s the lure of every great mechanism. Once it’s built, you don’t have to wrestle with doubt. The machines take responsibility for you.
As I wrote in “Trust Me, I’m a Chatbot,” we don’t outsource truth to a circuit; we outsource discomfort. And when the machines make mistakes? No one feels the blame, in part because no one ever cared for the people ruined by them; in part because of the false belief that the machines normally work as expected.
The Illusion of Precision
The genius of the Antikythera wasn’t that it worked. (It likely didn’t.) It was that it seemed to work. It offered the illusion that the universe could be calculated, that if we only had enough gears we could make chaos behave.
That’s what our criminal law tries to do every day: translate human messiness into mechanical certainty. But anyone who’s ever watched a courtroom knows how easily the teeth slip. “Due process” jams when the wrong pressure is applied, or when the human element is treated as friction to be minimized rather than a person to be understood.
That’s what makes artificial intelligence so tempting in criminal justice. It offers a fantasy of perfect neutrality. It gives the appearance of a clean answer no human could reach. Prosecutors love it because it feels objective; judges love it because it makes the hard call look like the only right choice. They love it most when it agrees with them. When it validates what they already wanted to decide, all’s well. Otherwise, the machine is merely fighting their confirmation bias — and losing.
The arc of prejudice is long and bends toward punishment.
But the data feeding those systems isn’t divine knowledge. It’s drawn from the same arrest reports, plea bargains, and convictions that reflect every bias the law already carries. The machine isn’t erasing our flaws; it’s preserving and embedding them with better math. So, often enough, the government gets what it wants.
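The point can be made concrete with a toy simulation. This is pure illustration, with made-up numbers and no connection to any real risk-assessment product: it assumes two neighborhoods with identical behavior but unequal policing, and shows that a “model” which learns from the resulting arrest records simply reproduces the disparity.

```python
import random

random.seed(0)

# Toy world: two neighborhoods with the SAME true offense rate,
# but neighborhood "A" is patrolled twice as heavily, so the same
# behavior produces roughly twice as many recorded arrests.
TRUE_OFFENSE_RATE = 0.10
PATROL_INTENSITY = {"A": 2.0, "B": 1.0}

def historical_arrests(neighborhood, population=10_000):
    """Simulate the biased arrest record the 'risk tool' learns from."""
    offenses = sum(random.random() < TRUE_OFFENSE_RATE for _ in range(population))
    # Heavier patrols catch (and record) proportionally more offenses.
    caught = offenses * min(1.0, 0.3 * PATROL_INTENSITY[neighborhood])
    return caught / population  # the observed "risk" in the data

# The "model" just memorizes the historical arrest rate per neighborhood,
# which is essentially what a risk score trained on arrest data does.
risk_score = {n: historical_arrests(n) for n in ("A", "B")}

print(risk_score)
# Residents of A look roughly twice as "risky" despite identical behavior.
```

The arrest record faithfully measures policing, not behavior; any score trained on it inherits that substitution, no matter how sophisticated the math around it.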
Law’s Clockwork Temptation
Machines are built to repeat; people are built to change. But our legal machinery rewards repetition. Precedents stack on precedents. Sentencing grids pretend justice scales like steel. Judges and prosecutors talk about “accountability,” reformers about “humanity.” Each word is a gear that is mostly — pardon the pun — geared toward grinding the person into mincemeat.
Every gear needs a scapegoat — the meat to be minced — when the system grinds.
That’s where defense lawyers live. We wander the gap between mechanism and mercy. We’re the ones who hear the rattle before the gear seizes, who know that a life doesn’t fit inside the tolerances of a statute, or “the facts.” We translate what can’t be computed: fear, addiction, trauma, bad luck, or simply a mistake. We remind the machine that its inputs bleed.
I’ve seen what happens when the machine agrees with the government. A risk score spits out a number that fits the prosecution’s theory, and suddenly it’s the Gospel According to Saint Peter. But when the same software leans the other way the glow fades fast. When the “risk assessment” suggests release, or leniency, or a risk score too low for comfort, the algorithm suddenly can’t be right, so it’s ignored. The machine only matters when it confirms what power already believes.
Again, from Confabulations Cause Hallucinations:
Let’s not kid ourselves: the real danger isn’t just that AI lies — it’s that people believe the lies. And not just ordinary people. Judges. Prosecutors. Jurors. Even defense lawyers sometimes.
So much for science. If we can’t sanctify bias with circuitry, what good is it?
Maybe that’s why corrosion shows up so often in what I write. In my old poems it meant spiritual decay — the slow loss of empathy under pressure. Now it’s systemic: judges, lawyers, even reformers polishing the same legal gears, believing they can make them pure again. But the rust isn’t on the surface; it’s in the design.
The Digital Antikythera
Now the gears have become as invisible as the original bronze artifact encased in coral. The new Antikytheras are encased in servers: pretrial risk assessments, predictive policing models, sentencing software. Each promises impartiality; each comes coated with bias.
And when its numbers support the state’s story, that bias gets a badge of science; when they don’t, well, it’s only a recommendation from an imperfect tool. That’s why we have the “corrective” of prosecutors and judicial officers: it’s an adversarial system, and the state’s adversaries cannot be allowed to prevail. Defendants, their advocates, and even the machine, if it arrives at the “wrong” result, can’t be trusted.
But the machine is a tool invented by the state, so it doesn’t often go wrong. It reflects the bias of the system, and the system rewards the reflection.
And once a tool like that takes root, it’s almost impossible to pry out. Counties buy them with grant money; vendors promise efficiency and lower jail populations. By the time defense attorneys notice the hidden assumptions — the illicit weighting of “flight risk,” the proxy values for race, the absence of context — the machine is already embedded in the bureaucracy.
You can’t cross-examine a line of code. And as I pointed out in The Mirage of Reasoning Machines,
the justice system does not need an oracle; it needs discipline. The more persuasive the paragraph, the more discipline it demands.
Yet courts reward fluency over transparency — the very opposite of what justice requires.
These algorithms are not oracles; they’re mirrors polished by whoever pays for the reflection. They don’t foresee the future; they replicate the past, re-coding yesterday’s inequities as tomorrow’s predictions.
In Confabulations Cause Hallucinations, I warned that
the real danger comes when we forget how these systems are built — from curated datasets, full of human biases and pattern-matching tricks — and we start treating their pronouncements like prophecy.
Yet the sales pitch sounds the same as it did two millennia ago: trust the mechanism.
But the justice system was never meant to be a trust exercise with technology. It’s supposed to be an adversarial process — human minds pressing against one another to test the truth. Once you replace that with deference to an algorithm, you’ve traded judgment for automation. You haven’t made justice smarter; you’ve just made it quieter.
Beneath the Corrosion
In my undergraduate days, I studied anthropology. My professor, Dirk van der Elst, whom I came to think of as one of the most amazing people I’d ever known, required us to keep a journal.
I was already an avid journaler. I literally spent up to four hours each day journaling. (I owned a medical transcription business and only needed to work a few hours a day.)
I filled a journal with notes about ritual and order. One line still haunts me:
We rehearse order until we believe it. The danger is not chaos but conviction.
I didn’t know it then, but I was describing courtrooms.
Maybe that’s why I keep thinking about that lump of metal from the seafloor. Time didn’t destroy it. Buried, encased, preserved, the artifact remained for us to find.
No. It was belief that destroyed it. Its makers assumed their cosmos was a clock — and so they built one.
We’re doing the same with justice. Like those geniuses of old, we’re building machines to confirm what we already believe.
But gears don’t feel the weight of the world they measure. People do.
The brave new world of artificial intelligence is already here — and, as I wrote earlier this year, “it’s leaking into our courtrooms through evidence, arguments, and even the lips of the dead.” That’s why belief — not time — is what corrodes both bronze and justice.
And as long as this is true, the criminal law will need something no mechanism can provide.
Me.
A defender willing to reach into the gears — sometimes with a wrench; sometimes with my own self — whatever I have — and risk getting mauled.
Because sometimes that’s the only way to stop the gears from turning at all.