Will Active Semantics Get to AGI?

 

If it can’t, nothing else will.

Gary Marcus has led the attack on LLMs, but his attack is blunted by his liking for neurosymbolics, which replaces the logical structure of English text with a much cruder form and then tries to answer a problem that neurosymbolics cannot even describe (although it does have ML – good grief).

When a search engine comes back with a million hits, it is essentially saying “do something else”. LLMs are that something else, and they usefully extend the power of the search engine. But pushed beyond their limits they are unreliable, because they have no idea what the words mean, other than that some words are often neighbours of other words.

So why would semantics be better? We need to stress that we are talking about active semantics, not just about meaning. The semantic structure can do things, reason about itself, and add hypothetical structure in any direction. Words are turned into objects, which may be active (verbs) or passive (most nouns). Many words have multiple meanings – only about 12% of words have a single part of speech and a single meaning, some words can have half a dozen parts of speech, and other words can have 60 or 80 meanings, ranging from active to passive objects. The richness of expression is both a blessing and a curse.
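As a rough illustration only (the class names Sense and Word below are invented for this sketch, not taken from the Active Semantics system), a word object might carry several parts of speech, each with its own list of meanings, some active and some passive:

```python
from dataclasses import dataclass, field

@dataclass
class Sense:
    gloss: str     # one particular meaning of the word
    active: bool   # verbs are active objects, most nouns passive

@dataclass
class Word:
    spelling: str
    senses: dict = field(default_factory=dict)  # part of speech -> list of Sense

desire = Word("desire")
desire.senses["noun"] = [Sense("a strong feeling of wanting something", active=False)]
desire.senses["verb"] = [Sense("to strongly wish for something", active=True)]

print(len(desire.senses))   # 2 parts of speech behind the one spelling
```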

[Figure: the semantic structure of “a strong desire to know or learn something”]

The figure shows a relatively simple statement – “a strong desire to know or learn something”. Each object has parents: for a word, both the word and the particular meaning are captured; for a verb, the verb, the particular form of the verb, and the meaning. The semantic structure can therefore describe very complex situations without ambiguity.

Many words have two levels of parents – the word and a particular meaning. Some words come in multiple flavours: attributive and predicative for adjectives; transitive, intransitive, and about ninety others for verbs; simple, infinitive, and clausal for nouns. The particular word “desire” in the text of the word definition is a noun supporting an infinitive – “a desire to learn”.
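A minimal sketch of the two-level parenting, again with invented names (Word, Meaning, Use): a use in text links to one particular meaning, which links to its parent word, and the flavour travels with the use:

```python
from dataclasses import dataclass

@dataclass
class Word:
    spelling: str

@dataclass
class Meaning:
    parent: Word          # first parent level: the word itself
    part_of_speech: str
    gloss: str

@dataclass
class Use:
    parent: Meaning       # second parent level: the particular meaning
    flavour: str          # e.g. "noun supporting an infinitive", "transitive"
    span: str             # the text this use came from

desire = Word("desire")
wanting = Meaning(desire, "noun", "a strong feeling of wanting something")
use = Use(wanting, "noun supporting an infinitive",
          "a strong desire to know or learn something")

print(use.parent.parent.spelling)   # walk both parent levels back to "desire"
```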


The parenting may look expensive, but each additional use of the word with the specific meaning costs only one extra link.
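A self-contained toy of that cost claim (plain dictionaries standing in for the real structures): the word and meaning objects are built once, and every further use of that meaning adds just one link back to it:

```python
word = {"spelling": "desire"}
meaning = {"parent": word, "gloss": "a strong feeling of wanting something"}

uses = []
for span in ("a desire to learn", "her desire to travel"):
    uses.append({"parent": meaning, "span": span})   # one extra link per use

assert all(u["parent"] is meaning for u in uses)     # meaning is shared, not copied
```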

We (everybody) are so used to our Unconscious Mind doing all the work of parsing text and building relational structure that it comes as a bit of a shock to realise how little we know at the conscious level. Still, if we want to build an AGI machine, there is no choice but to emulate the Unconscious Mind. The benefit will be that the AGI machine can handle much more complex text than a human can – we will see that in both a reduction in snafus and an increase in the complexity of what humans are willing to tackle.

What about mathematics? Child’s play in comparison. The links in the network allow numbers, ranges, and logical values to flow, with objects carrying the appropriate attributes – mass, location, velocity, inertia, temperature. We built an analytic system for numbers and logic 40 years ago; it needed a way to describe a complex problem in English, because mathematics isn’t very good at that – too much gets left out.
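As a hedged sketch of values flowing along links (Range and Obj are hypothetical stand-ins, not the analytic system itself), an attribute known only as a range can still propagate to a derived attribute:

```python
from dataclasses import dataclass, field

@dataclass
class Range:
    lo: float
    hi: float

    def scaled(self, k: float) -> "Range":
        # scale a range by a known constant, e.g. multiplying by a mass
        return Range(self.lo * k, self.hi * k)

@dataclass
class Obj:
    attributes: dict = field(default_factory=dict)  # mass, location, velocity...

# an object with physical attributes; velocity is only known as a range
ball = Obj({"mass": 0.5, "velocity": Range(9.0, 11.0)})

# a value flowing along a link: momentum = mass * velocity, still a range
ball.attributes["momentum"] = ball.attributes["velocity"].scaled(ball.attributes["mass"])
print(ball.attributes["momentum"])   # Range(lo=4.5, hi=5.5)
```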

www.activesemantics.com
