Getting AI Back on Track
“Altman admits scaling isn’t the answer” – how long do we have to wait until Altman admits LLMs aren’t the answer? What this episode shows is that a large percentage of people working in Computer Science do not understand what they themselves are doing when they read or write.

How can that be, when they can write programs? The symbols used in programs each have a single meaning, so there is no confusion. But a specification is different – it is written in words, and many words have multiple meanings. How do people handle this? Humans have very limited bandwidth, so most of the parsing is done unconsciously, leaving enough bandwidth free to understand the meaning of the message. That is how they can be fooled into thinking an LLM is “reading” the text, when it is doing no such thing.

An example: a high-jump bar, which competitors have to clear. “Raising the bar” literally means increasing its height, making it more difficult to jump over. But “raising the bar” can also be used as a figurative allusion, meaning something is harder...
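To make the point about multiple meanings concrete, here is a minimal sketch (in Python, assuming the NLTK library is installed and its WordNet corpus has been downloaded) that lists the dictionary senses WordNet records for the word “bar”:

    # Minimal sketch: list the WordNet senses of the word "bar".
    # Assumes: pip install nltk, then nltk.download('wordnet') has been run.
    from nltk.corpus import wordnet as wn

    senses = wn.synsets('bar')
    print(f'WordNet lists {len(senses)} senses for "bar":')
    for s in senses:
        # s.pos() gives the part of speech; s.definition() is a gloss of that sense
        print(f'  {s.name()} ({s.pos()}): {s.definition()}')

Running this prints well over a dozen distinct noun and verb senses – the high-jump crossbar, the drinking establishment, the legal profession, and so on. A human reader disambiguates among all of these unconsciously; that is exactly the ambiguity the paragraph above describes.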

