AI and the Project Manager: Rise of the Machine
Extract from 'AI and the Project Manager' by Peter Taylor (author of 'The Lazy Project Manager')
AI and the Project Manager: How the Rise of Artificial Intelligence Will Change Your World
1.1 Myths and legends
The history of Artificial Intelligence (AI) began, arguably, in antiquity, with myths, stories and rumours of artificial beings endowed with intelligence or consciousness by master craftsmen.
Thousands of years before machine learning and self-driving cars became reality, the tales of a giant bronze creature called Talos, an artificial woman named Pandora and their creator god, Hephaestus, filled the imaginations of people in ancient Greece.
The story of Talos, first mentioned around 700 B.C. by Hesiod[1], offers one of the earliest conceptions of a robot, Adrienne Mayor[2] has suggested.
“The myth describes Talos as a giant bronze man built by Hephaestus, the Greek god of invention and blacksmithing. Talos was commissioned by Zeus, the king of Greek gods, to protect the island of Crete from invaders. He marched around the island three times every day and hurled boulders at approaching enemy ships.
Although much later versions of the story portray Pandora as an innocent woman who unknowingly opened a box of evil, Mayor said Hesiod’s original described Pandora as an artificial, evil woman built by Hephaestus and sent to Earth on the orders of Zeus to punish humans for discovering fire.
In addition to creating Talos and Pandora, mythical Hephaestus made other self-moving objects, including a set of automated servants, who looked like women but were made of gold, Mayor said. According to Homer’s recounting of the myth, Hephaestus gave these artificial women the gods’ knowledge.”
Mayor argues[3] that they could be considered an ancient mythical version of artificial intelligence.
First beginnings and winters
The field of AI research was founded at a workshop held on the campus of Dartmouth College in Hanover, New Hampshire, during the summer of 1956, where the term ‘artificial intelligence’ was coined – before even the author of this book was born.
But achieving progress in this new field was challenging. Interest dropped off from 1974 to 1980 (a period which became known as the first ‘AI Winter’), but the field later revived in the 1980s when the British government started funding it again, in part to compete with efforts by the Japanese.
The field experienced a second ‘AI Winter’ from 1987 to 1993, mostly due to reduced government funding.
But research slowly resumed and in 1997, IBM's Deep Blue[4] became the first computer to defeat a reigning world chess champion, Garry Kasparov, in a match under standard time controls. And in 2011, the IBM system ‘Watson[5]’ won the quiz show ‘Jeopardy![6]’ by beating two of the show's greatest champions, Ken Jennings and Brad Rutter.
High points from fantasy to reality
Here are some ‘high points’ in the AI journey, or rise of the machines, that clearly show that the ‘AI Winters’ of the past are exactly that: history, and not present or future:
1637 – Long before robots were even a feature of science fiction, scientist and philosopher René Descartes[7] pondered the possibility that machines would one day think and make decisions. He identified a division between machines that might one day learn to perform one specific task and those that might be able to adapt to any job. Today, these two fields are known as narrow and general AI.
1726 – Jonathan Swift[8] publishes ‘Gulliver's Travels’, which includes a description of the ‘Engine’, possibly the earliest known reference to a device in any way resembling a modern computer. The Engine is a device that generates permutations of word sets. It is found at the Academy of Projectors in Lagado: “…everyone knew how laborious the usual method is of attaining to arts and sciences; whereas, by his contrivance, the most ignorant person, at a reasonable charge, and with a little bodily labour, might write books in philosophy, poetry, politics, laws, mathematics, and theology, without the least assistance from genius or study.”
1921 – Czech writer Karel Čapek[9] introduces the word ‘robot’ in his play ‘Rossum's Universal Robots’; the word comes from the Czech ‘robota’, meaning ‘work’.
1956 – The Dartmouth Conference, where John McCarthy[10] coined the term ‘artificial intelligence’.
1966 – ELIZA, developed at MIT[11] by Joseph Weizenbaum[12], was the world's first chatbot – and a direct precursor to the likes of Alexa and Siri – conversing with humans in natural language, albeit through typed text rather than speech.
1980 – Digital Equipment Corporation's XCON expert system was credited with generating annual savings for the company of $40 million. This is significant because, until this point, AI systems were generally regarded as impressive technological feats with limited real-world usefulness.
1988 – A statistical approach. IBM[13] researchers publish ‘A Statistical Approach to Language Translation’, introducing principles of probability into the until-then rule-driven field of machine translation. It tackled the challenge of automated translation between two human languages – French and English.
1991 – The birth of the World Wide Web. CERN researcher Tim Berners-Lee put the world's first website online and published the workings of the hypertext transfer protocol (HTTP), and it does not get much ‘bigger’ or more ‘impactful’ than that.
1997 – Deep Blue defeats world chess champion Garry Kasparov.
2005 – The DARPA Grand Challenge. This was the second year that DARPA held its Grand Challenge – a race for autonomous vehicles across more than 100 kilometres of off-road terrain in the Mojave Desert. In 2004, none of the entrants managed to complete the course. The following year, five vehicles successfully completed it, with the team from Stanford University taking the prize for the fastest time.
2011 – IBM Watson’s Jeopardy! win.
2012 – The true power of deep learning is unveiled to the world. Researchers at Stanford and Google publish the paper ‘Building High-Level Features Using Large Scale Unsupervised Learning’, building on previous research into multilayer neural networks known as deep neural networks. Specifically, they singled out the fact that their system had become highly competent at recognising pictures of cats.
2014 – The ‘Turing test’[14] was (arguably) passed for the very first time by the computer programme ‘Eugene Goostman[15]’, although many suggest it was a muted success at best: it only convinced a third of the ‘judging’ humans, and since it was profiled as a 13-year-old boy, its gaps in knowledge could be readily excused whenever it could not answer a question.
2015 – Machines can now ‘see’ better than humans. Researchers studying the annual ImageNet challenge (where algorithms compete to show their proficiency in recognising and describing a library of images) declare that machines are now outperforming humans.
2016 – Gameplay has long been a chosen method for demonstrating the abilities of thinking machines, and AlphaGo[16], created by DeepMind, defeated world ‘Go[17]’ champion Lee Sedol over five matches. Although Go moves can be described mathematically, the sheer number of possible variations of the game (there are over 100,000 possible positions after each player's first move in Go, compared with 400 in chess) makes the brute-force approach impractical. AlphaGo used neural networks to study the game and learn as it played.
2018 – Self-driving cars hit the roads.
2020 to present – Many amazing points of progression, too many to note here (I do encourage you to do some personal research on this), but one that is well placed in the current (as I write this book) pandemic is the detection of COVID-19 in the lungs by AI. Scientists at the University of Central Florida conducted a study using artificial intelligence to detect COVID-19 in the lungs, with results as accurate as those of a specialist medical doctor. They trained AI algorithms to identify COVID-19 pneumonia from computed tomography (CT) scans with ninety percent accuracy, correctly identifying 84 percent of positive cases and 93 percent of negative cases.
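For readers wondering what those last two percentages actually measure, they correspond to what statisticians call sensitivity (the share of genuinely positive cases the system flags) and specificity (the share of genuinely negative cases it correctly clears). The short Python sketch below illustrates the calculation; the scan counts in it are hypothetical, chosen only to mirror the figures quoted above, and are not taken from the University of Central Florida study itself.

```python
# Illustrative sketch of how sensitivity and specificity are calculated.
# The counts below are hypothetical, chosen only to mirror the figures
# quoted above; they are not drawn from the UCF study.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Share of genuinely positive cases that the model flags as positive."""
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives: int, false_positives: int) -> float:
    """Share of genuinely negative cases that the model clears as negative."""
    return true_negatives / (true_negatives + false_positives)

# Hypothetical results: 100 confirmed-positive and 100 confirmed-negative scans.
print(f"Sensitivity: {sensitivity(84, 16):.0%}")   # -> 84%
print(f"Specificity: {specificity(93, 7):.0%}")    # -> 93%
```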
The rise of the machines has to be acknowledged as something pretty amazing, and the acceleration in development and successes seems to be exponential.
Get your copy of 'AI and the Project Manager' to explore how this rise of Artificial Intelligence will change Project Management forever.
[1] Hesiod was an ancient Greek poet generally thought to have been active between 750 and 650 BC, around the same time as Homer.
[2] Adrienne Mayor is a historian of ancient science and a classical folklorist. Mayor specializes in ancient history and the study of "folk science", or how pre-scientific cultures interpreted data about the natural world, and how these interpretations form the basis of many ancient myths, folklore and popular beliefs.
[4] Deep Blue was a chess-playing computer developed by IBM. It was the first computer to win both a chess game and a chess match against a reigning world champion under regular time controls.
[5] Watson is a question-answering computer system capable of answering questions posed in natural language, developed in IBM's DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named after IBM's founder and first CEO, industrialist Thomas J. Watson.
[6] "Jeopardy!" is a classic game show -- with a twist. The answers are given first, and the contestants supply the questions.
[7] René Descartes was a French-born philosopher, mathematician, and scientist who spent a large portion of his working life in the Dutch Republic.
[8] Jonathan Swift was an Anglo-Irish satirist, essayist, political pamphleteer, poet and Anglican cleric who became Dean of St Patrick's Cathedral, Dublin.
[9] Karel Čapek was a Czech writer, playwright and critic. He has become best known for his science fiction, including his novel War with the Newts and play R.U.R., which introduced the word robot.
[10] John McCarthy was an American computer scientist and cognitive scientist. McCarthy was one of the founders of the discipline of artificial intelligence.
[11] Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts.
[12] Joseph Weizenbaum was a German American computer scientist and a professor at MIT. The Weizenbaum Award is named after him. He is considered one of the fathers of modern artificial intelligence.
[13] International Business Machines Corporation is an American multinational technology company headquartered in Armonk, New York, with operations in over 170 countries.
[14] The Turing test, originally called the imitation game by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human.
[15] Eugene Goostman is a chatbot that some regard as having passed the Turing test, a test of a computer's ability to communicate indistinguishably from a human.
[16] AlphaGo is a computer program that plays the board game Go. It was developed by DeepMind Technologies which was later acquired by Google. Subsequent versions of AlphaGo became increasingly powerful, including a version that competed under the name Master.
[17] Go is an abstract strategy board game for two players in which the aim is to surround more territory than the opponent. The game was invented in China more than 2,500 years ago and is believed to be the oldest board game continuously played to the present day.