LLMs and the Military

 

The news that LLMs will be gifted to the US Military is not good. LLMs fulfil their intended function of getting more useful information back from a search engine: instead of a million hits, if you are lucky you get an informative article that may be on point. But when it comes to life and death, an LLM is not a reliable tool. Put simply, it doesn’t understand the meaning of a single word of English; instead, it relies on word associations.

As an example, here are two sentences talking about a movie set.

We watched a movie set in Hawaii.

We visited a movie set in Hawaii, and spoke to the cast.

Notice how you read them differently, unconsciously inserting a “that was” into the first one. Because this handling of words happens unconsciously, many people forget it is happening at all, but unless a system emulates a person’s unconscious handling of language, it will make a real mess of even small pieces of it.
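To make the word-association point concrete, here is a small illustrative sketch in Python (not the author's Active Structure, and a deliberately crude stand-in for what an LLM actually does): it compares the two sentences purely by which words they share. On that associative view the first sentence is almost entirely contained in the second, even though a reader gives the word "set" a completely different role in each.

# An illustrative sketch only: a crude, purely associative comparison of the
# two sentences. Real LLMs use contextual embeddings rather than bags of
# words; the point is simply that surface word overlap says nothing about
# which reading of "set" is intended.

def words(sentence):
    """Lower-case the sentence, strip punctuation, and keep only the words."""
    return {w.strip(".,").lower() for w in sentence.split()}

s1 = "We watched a movie set in Hawaii."
s2 = "We visited a movie set in Hawaii, and spoke to the cast."

w1, w2 = words(s1), words(s2)
shared = w1 & w2

print("Shared words:", sorted(shared))
print(f"Words of the first sentence also in the second: {len(shared)}/{len(w1)}")
# Six of the first sentence's seven words reappear in the second, yet one
# sentence is about a film and the other about a physical location with
# actors on it. The overlap cannot say whether "set" is a participle
# ("a movie [that was] set in Hawaii") or a noun ("a movie set"); resolving
# that is exactly what understanding requires.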

Admittedly, communications in a warfighting environment are pretty basic, which makes it all the more important not to mistake what is meant. Something like

“They shot the command post to hell”

makes perfect sense, but the internet is not the place to look for its meaning.

The person asking for help will assume that the person or machine they are communicating with is as competent in language as they are, meaning either another person or an Active Structure using semantics, and that it already has a grasp of battlefield issues (having maybe even read and understood Sun Tzu).

An LLM just isn’t going to cut it.
