What can Machine Learning tell us?
I earlier wrote a rather naive, philosophical article called "What did I learn while learning Machine Learning". But as I dive deeper into the field day by day, reading articles about Artificial Intelligence defeating humans at certain tasks, under headlines like "Computer Shows Human Intuition—AI Breakthrough!", I have become more intrigued, but also skeptical. What excited me most was how the program had been conceived and implemented: instead of being fine-tuned by proficient players, AlphaZero started with nothing more than the rules of chess. It learned how to play, and how to "win" (I emphasize that), by playing against itself.
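This is of course nothing like AlphaZero's actual machinery (deep networks guided by Monte Carlo tree search), but the core idea of learning from nothing but the rules, by self-play alone, can be sketched with a few lines of tabular reinforcement learning on the toy game of Nim (take 1–3 stones per turn; whoever takes the last stone wins). All the names and hyperparameters below are my own illustrative choices:

```python
import random

random.seed(0)

def self_play_nim(pile=12, episodes=50_000, alpha=0.3, eps=0.3):
    """Learn Nim purely by self-play: no strategy is ever given, only the rules."""
    Q = {}  # Q[(stones_left, move)] -> estimated value for the player to move
    for _ in range(episodes):
        stones, history = pile, []
        while stones > 0:
            moves = [m for m in (1, 2, 3) if m <= stones]
            if random.random() < eps:  # explore occasionally
                move = random.choice(moves)
            else:                      # otherwise play the best-known move
                move = max(moves, key=lambda m: Q.get((stones, m), 0.0))
            history.append((stones, move))
            stones -= move
        # The player who took the last stone won. Walk back through the game,
        # crediting +1 to the winner's moves and -1 to the loser's.
        reward = 1.0
        for state, move in reversed(history):
            old = Q.get((state, move), 0.0)
            Q[(state, move)] = old + alpha * (reward - old)
            reward = -reward
    return Q

Q = self_play_nim()

def best_move(stones):
    return max((m for m in (1, 2, 3) if m <= stones),
               key=lambda m: Q.get((stones, m), 0.0))
```

In Nim, the winning move is the one that leaves the opponent a multiple of four stones; given enough self-play games, the learned `best_move` rediscovers this from nothing but wins and losses, which is the same spirit in which AlphaZero bootstrapped itself.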
But what has the computer learned so quickly that we haven’t in all those years? Unfortunately, the neural network isn’t telling us.
Is the software really showing human intuition, loosely defined as cognition without reasoning?
Also, a few years ago, Libratus from Carnegie Mellon University was shown to approximate a Nash Equilibrium strategy in heads-up poker, one in which neither player can gain by unilaterally changing strategy, which would result in a tie between equally proficient and knowledgeable participants.
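Libratus's actual algorithms (counterfactual regret minimization over an enormous game tree) are far beyond a blog snippet, but their simplest building block, regret matching, can be watched converging to a Nash equilibrium on rock-paper-scissors. Everything here, the names and the iteration count, is purely illustrative:

```python
import random

random.seed(42)

ACTIONS = 3  # 0 = rock, 1 = paper, 2 = scissors
# Row player's payoff: PAYOFF[my_move][opp_move]
PAYOFF = [[0, -1,  1],
          [1,  0, -1],
          [-1, 1,  0]]

def strategy_from_regret(regret):
    """Play each action in proportion to its positive cumulative regret."""
    pos = [max(r, 0.0) for r in regret]
    total = sum(pos)
    return [p / total for p in pos] if total > 0 else [1.0 / ACTIONS] * ACTIONS

def train(iterations=100_000):
    regret = [[0.0] * ACTIONS for _ in range(2)]
    strat_sum = [[0.0] * ACTIONS for _ in range(2)]
    for _ in range(iterations):
        strats = [strategy_from_regret(regret[p]) for p in range(2)]
        moves = [random.choices(range(ACTIONS), weights=strats[p])[0]
                 for p in range(2)]
        for p in range(2):
            me, opp = moves[p], moves[1 - p]
            for a in range(ACTIONS):
                # Regret: how much better action a would have done
                # than the action actually played.
                regret[p][a] += PAYOFF[a][opp] - PAYOFF[me][opp]
                strat_sum[p][a] += strats[p][a]
    total = sum(strat_sum[0])
    # The *average* strategy over all iterations approaches the equilibrium.
    return [s / total for s in strat_sum[0]]

avg = train()
print(avg)  # each probability approaches 1/3
```

The equilibrium it finds, playing each move a third of the time, is exactly the "playing not to lose" strategy: against it, no opponent can do better than break even.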
The more this intrigues me, the more I notice a common pattern in the strategies these systems use:
Computers play not to lose; humans play to win, and in doing so, they lose.
This seems counterintuitive, and a profound understanding of it still eludes me. For instance, AlphaZero sacrifices pieces to gain position far more often than human players would usually attempt.