You know the question. How do you know that it is not the answer?
‘Oh, you should never, never doubt what nobody is sure about...’
– Willy Wonka
Is this microphone on?
That AI thing. You’ve heard about it. Billions in investment. It is going to change the world. Mysterious and unfathomable it is; even rockstar PhDs don’t know what’s going on inside.
How in the world can the average Mr. and Mrs. Normal understand its complexities?
Step by step.
AI predicts what’s next. That’s it.
The future is based on the past multiplied (yes multiplied) by the evidence you have in your hand, divided (yes divided) by all the evidence that exists.
Rephrase this (the future is the past, given what happened, out of all possible ‘happenings’) and it kind of feels like common sense, doesn’t it?
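For anyone who wants the textbook version, that plain-language recipe loosely tracks Bayes’ theorem. The mapping below is mine, not a formula pulled from any particular AI system:

```latex
% Bayes' theorem, mapped onto the plain-language version above:
%   "the past"                     -> the prior          P(H)
%   "the evidence in your hand"    -> the likelihood     P(E | H)
%   "all the evidence that exists" -> the total evidence P(E)
\[
  P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)},
  \qquad
  P(E) = \sum_{H'} P(E \mid H')\, P(H')
\]
```

That denominator, the sum over every possible alternative, is the part that causes trouble in a moment.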
Why does AI sometimes get that wrong? (The current term for this is hallucination.)
All the possible evidence, all the ‘happenings’, is effectively infinite and a bit challenging to calculate. In fact, there is not enough compute (processing power) on the planet to do it!
As a result, AI must approximate ‘all the evidence that exists.’
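To make ‘approximate’ concrete, here is a minimal toy sketch in Python. The coin-flip setup, the function names, and the sample count are all invented for illustration; no production model works this way. The point is only that the denominator can be estimated by sampling instead of enumerating every possibility:

```python
from math import comb
import random

# Toy sketch: estimate the "all the evidence that exists" denominator
# P(E) = sum over hypotheses H of P(E | H) * P(H)
# by random sampling, instead of enumerating every possible hypothesis.
# The coin-flip example below is purely illustrative.

def sample_prior():
    """Draw a hypothesis from a toy prior: an unknown coin bias in [0, 1]."""
    return random.random()

def likelihood(heads, flips, bias):
    """P(seeing `heads` heads in `flips` flips | the coin has this bias)."""
    return comb(flips, heads) * (bias ** heads) * ((1 - bias) ** (flips - heads))

def approximate_evidence(heads, flips, num_samples=200_000):
    """Monte Carlo estimate of P(E): average the likelihood over sampled biases."""
    total = sum(likelihood(heads, flips, sample_prior()) for _ in range(num_samples))
    return total / num_samples

if __name__ == "__main__":
    # With a uniform prior, the exact answer is 1 / (flips + 1) ~= 0.0909,
    # so the estimate should land close to that without any enumeration.
    print(f"Approximate evidence: {approximate_evidence(heads=7, flips=10):.4f}")
```

Real systems sidestep the sum in their own ways, but the spirit is the same: estimate, don’t enumerate.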
Hallucinations, however, are not the result of this approximation but rather a curious byproduct of the AI being uncertain about whether it is uncertain.
Unlike Sherlock Holmes, with his certainty, AI cannot eliminate all the ‘infinite’ possibilities to see what remains; it is uncertain about its own uncertainty, yet trained to be helpful, so it answers anyway.
Another way to say this is that there is no basic reasoning here.
However, there is a super-intelligent worldview that can be leveraged when the human co-pilot crafts the reasoning.
With that said, the challenge is not implementing a customized AI platform, one where you own state-of-the-art large language models as a zipped file. You own your AI, you turn off the internet, and it still works … securely.
The challenge is the co-pilot.
You now have access to the most powerful tech ever created, at a remarkably low cost. It can do anything digital that you can reason it into doing.
Here is the truly hard part.
What are you going to do with it?