A Working Model
I talk about Semantic AI and reading English – people don't see the obvious: if you understand the text and add a few formulae, you have a working model.
Take the specification of a fighter jet, which details every component and system. Put all those together, turn the working bits into operators, and what do you have? You need a formula for drag (which changes with bomb load, fuel load, and altitude), a formula for engine thrust, for fuel rate, for atmospheric conditions.
You start the engine, the fuel flows, the engine delivers thrust, the plane takes off and stows the undercarriage, changing the drag. The plane climbs. Yes, only numbers and states change, in accordance with the specification. You are flying the paper version, but incoherence and inconsistencies will show up. For the Navy version, you do an emergency landing with a full fuel load on a carrier at maximum vertical acceleration (of the carrier in a surging sea). The undercarriage breaks. You fly at supersonic speed through a rainstorm – the stealth paint washes off.
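As a sketch of what flying the paper version might look like, here is a toy step loop in Python. Every number and formula in it is invented for illustration – a stand-in for what a machine would extract from a real specification, not taken from one:

```python
# A toy "working model" built from a specification's formulae.
# All coefficients are invented for illustration only.

def drag(speed, gear_down, mass):
    # Drag grows with speed squared and with mass (fuel/bomb load);
    # a lowered undercarriage adds a large penalty.
    cd = 0.030 + (0.020 if gear_down else 0.0)
    return cd * speed**2 + 0.001 * mass

def thrust(throttle):
    return 180_000.0 * throttle   # flat thrust curve, for illustration

mass, fuel, speed, gear_down = 20_000.0, 8_000.0, 0.0, True
for t in range(60):                  # one-second time steps
    burn = 5.0                       # fuel rate at full throttle
    fuel -= burn
    mass -= burn                     # fuel flows, mass falls
    accel = (thrust(1.0) - drag(speed, gear_down, mass)) / mass
    speed += accel
    if speed > 80.0 and gear_down:   # past takeoff speed:
        gear_down = False            # stow the undercarriage -
                                     # the drag term changes

print(round(speed, 1), gear_down, round(fuel, 1))
```

The point is not the numbers but the structure: each formula from the text becomes an operator, and a state change in one place (stowing the undercarriage) flows into another (the drag).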
But the machine is only reading text!
When you read text, you bring along your imagination – why shouldn't the machine do likewise? Do the formulae exist? Yes. Do the attributes of the objects exist – the mass of the aircraft, the drag coefficient, the atmospheric conditions, the engine thrust curve, the fuel load? Yes. You can't do this in your head – it won't fit (the Four Pieces Limit) – but it easily fits inside a computer.
But no-one has ever done this before!
There is always a first time.
Another example – Robodebt, a scheme in which bad actors in the employ of the government rewrote a program that worked out the entitlement of people on the dole, changing income averaging from a single fortnight to six months. Dole recipients were harassed by debt collectors for amounts up to $18,000 that they couldn't pay – some took their own lives. Get a machine to read the legislation and build a working model that does precisely what the legislation says (using operators to represent the exact words). As well as avoiding the suicides, it would have avoided $2.4 billion in reparations.
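A toy calculation shows how the averaging change manufactures a debt out of nothing. The payment rate and income free area below are invented for illustration; the real rules are in the legislation:

```python
# Hypothetical figures for illustration only: a flat payment of $700
# per fortnight, reduced dollar-for-dollar by income above $300.
PAYMENT = 700.0
FREE_AREA = 300.0

def entitlement(fortnight_income):
    return max(0.0, PAYMENT - max(0.0, fortnight_income - FREE_AREA))

# A person works for 3 fortnights of a 13-fortnight half-year,
# earning $2,600 in each of those fortnights, and nothing otherwise.
incomes = [2600.0] * 3 + [0.0] * 10

# What the legislation says: assess each fortnight on its own.
correct = sum(entitlement(i) for i in incomes)

# What Robodebt did: average the income over the whole period.
avg = sum(incomes) / len(incomes)
averaged = sum(entitlement(avg) for _ in incomes)

print(correct, averaged, correct - averaged)   # 7000.0 5200.0 1800.0
```

Assessed fortnight by fortnight, the person was paid exactly what the legislation allowed; averaged over the half-year, the same person appears to have been overpaid $1,800.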
OK, it would be slow and only suitable for edge or test cases, but as a verification of the program it would have been ideal.
But these are isolated cases – we will be more careful next time.
People have limits – where complexity is concerned, the most expert among us has the same limit as the man in the street.
There are other examples:

Horizon UK – a vicious attempt to protect a reputation – 13 suicides, 1.7 billion pounds in reparations (so far).
Boeing 737 MAX MCAS – bad actors taking glee in snowing the FAA – 346 deaths.
Constellation Class warship –
https://www.nytimes.com/interactive/2025/12/11/opinion/editorials/us-military-industry-waste.html
The US Navy made a too-many-cooks butchering of a foreign design, itself intended to circumvent the overspecification of an American-designed ship. The article makes clear that the 2,000-page procurement manual was not understood or followed – another candidate for a text working model, where when you turn one knob, another knob you hadn't thought of turns. A working model of the ship's specification would have made crystal clear what the result of so many changes would be. The project was cancelled, billions lost.
You want more? There are more – I prefer not to highlight them.
People don’t
do complexity well. They need help. Mix in some bad actors and you have a real
mess.
