23 Things I Learned Co-Editing a 650+ Page Book on Generative AI and Its Impact on Education
I spent a lot of time over the past 6 weeks co-editing Chat(GPT): Navigating the Impact of Generative AI Technologies on Educational Theory and Practice, a 650-page volume about the impact of generative AI on education. That involved reading at least a few thousand articles, following what many of the most highly qualified and influential people are saying on LinkedIn, and providing a content edit of most of the very thoughtful chapters written by 32 incredible contributors. Based on that, I’ve reached the following tentative conclusions about where we are with generative AI in education and what we should think about going forward.
(12a) They need to learn the basics of AI and how LLMs function, both to understand what these tools can do and what they cannot do. If one more person says ChatGPT is useless because it can’t produce bibliographies, my head will explode.
(12b) They need to understand the basic differences between tools, including which ones can access the web, which ones include calculators, etc.
(12c) They need guidance for how they are expected to manage student use in the classroom.
(12d) Ideally, they need guidance on how to teach students to use it productively.
(12e) They need to learn how to use these tools to strengthen their own capacities as teachers, including saving themselves enormous amounts of time generating lesson plans, materials and other forms of assessment. I have a theory that once they learn how to use it to assist with their own work, they'll be more comfortable with students using it to help with theirs (and they'll even be able to help students do that).
(12f) They need to have their feelings heard and their opinions accounted for in decision-making related to the technology.
13. The biggest immediate effect on education is on how teachers evaluate students, since assessment still relies on "original" work in the form of essays. Many teachers and professors falsely believe that "AI writing detection tools" are useful. And they falsely assume this problem (the tools' ability to generate text) is not going to get a lot worse: yes, teachers who are familiar with GAI output can currently distinguish it from student work, but we are not far from the point where individual bots will be able to replicate a student's current writing style and ability. It's also the case that many teachers are not familiar with the existing, detectable language patterns produced by tools such as ChatGPT, and thousands of students are already passing off AI-generated material as their own work. This is why teachers are inevitably struggling to manage the problem.
14. Thinking is too reactive. There is an astonishing lack of appreciation for how much this technology will continue to advance and for the fact that the current GAI tools are really nothing more than prototypes designed to see how people use them. These tools will overcome many (or all) of their limitations and improve, becoming able to accomplish more and more of what humans can do. There is even strong reason to believe the most advanced technology these companies already possess is being held back and released slowly. Educators need to stop saying this technology is no big deal and that it won't impact the classroom.
15. Discussions of artificial general intelligence (AGI) are relevant. AGI refers to the idea that machines will eventually match average human intelligence. No one knows precisely what that means. Does it mean the intelligence of your average remote office worker (Altman)? Does it mean a machine has enough intelligence to do what most workers can do? Does it mean it can do what every remote employee can do (that would be impressive!)? Does it mean a machine can do what most students or teachers/professors can do? And no one knows exactly what ChatGPT5 and other emerging LLM and non-LLM models will be able to do, but we know that soon these technologies will be able to do a lot more (e.g., write in our own voices rather than generic, though grammatically perfect, GPT-speak, and talk like us). WHEN that happens (perhaps as early as the start of the spring 2024 semester), can education (especially K-12) be any more prepared than it is now?
At a minimum, though, it makes sense to start thinking this through, as more and more highly qualified individuals believe AGI, whatever your definition, will arrive by the fall 2025 semester (Shapiro). And even if you refuse to believe that (it is debatable), pinning down exact criteria for AGI and then trying to determine whether a machine meets them at any given moment is a waste of time from an educator's perspective, because AGI or not, these technologies will continue to grow and gain the ability to do things many humans currently do and are trained to do in school. That is going to undermine much of the value of both what we are teaching students and how we are teaching it.
16. What is a useful, employable skill will change radically. A lot has been written in the last few years about how students should learn how to code so that they’ll always have job security. Now we know that one of GAI’s greatest strengths is coding and that millions of coders will likely lose their jobs. Educators need to think about how to help students develop skills in a world where many of the jobs that exist now will either not exist or will only exist in limited quantities in the future.
17. The discussion about this technology in education is focused on extremes: use it everywhere or don't use it at all. This isn't helpful. The discussion needs to be about where, when and how to use it. I think everyone would agree that first graders need to learn to write sentences and that second graders need to learn to write paragraphs. At the same time, it's not clear that there is value in teaching technical writing to college students, as machines are probably better at that (or soon will be). Until brain-computer implants are widespread or VR glasses can instantly put relevant information in front of us, we are all going to need foundational knowledge such as very basic math. Educators need to invest time and energy into figuring out what that essential foundational knowledge is and how best to teach students to acquire it. They should think this through in their own subject areas. And they need to think about how to assess this knowledge in this new world and reduce their reliance on the essay.
18. If we continue to sit back and pretend that we don't need to engage this technology, quickly and directly, we are in a lot of trouble. There is no way to keep it out of school buildings, as technology is porous in many ways. And attempting to isolate students from it only (further) isolates schools from the "AI World" (Bill Gates) that we are otherwise starting to live in. We can lock students in the buildings and track their every movement while they are there, but we cannot lock networked society out of schools, either physically or as part of the larger world our students are continually connected to. Students need to be educated to thrive in the world they live in, not the one we wish they (and we) lived in.
19. There is tremendous value in integrating AI into schools. As we outline in the book, this includes individualized instruction and assessment, critical thinking, career readiness and support for college admissions.
20. No matter what teachers/professors and administrators think of the technology, they have to manage the growing impact it is having on education. I think that, at a minimum, they have a responsibility to teach students about the dangers it can present when not used properly, just as they talk with students about the harms of social media and potential addictions. Ideally, they would work on some integrations in order to properly prepare students for the AI World and help teachers lessen their own workloads.
21. AI has enormous potential to strengthen human capacity as long as humans take advantage of it properly.
22. Humans need to retain control of the technology and educators need to retain control over how it will be used in our space. We need to welcome the technology, but on terms that enable it to work to benefit us and our students. We do not exist to benefit the technology and its developers.
23. Educators need to wake up and start asking hard questions. AI is here; this is no longer a future projection. We can post all we want on social media about the limitations of these technologies, and we can debate forever about what AGI is and when it will arrive (and whether we can even know when it does), but the reality is that even if current technologies never improved at all, the number of people we will need to do any of the jobs below will decline massively in the near future, as AI can already do them as well as or better than most people.
Copy editors
Paralegals
Computer coders
Software engineers
Data analysts
Technical writers
News writers
Financial analysts
Traders
Graphic designers
Customer service agents
This list could be much longer; I just don't think it needs to be to make the point. Yes, people in these areas who are exceptionally brilliant, talented and able to use these technologies well will survive; those who are not will not, because it's inconceivable that they could do one of these jobs better than a machine, and any company needs only a limited number of people to leverage these technologies.
Given this truth, how will the educational system respond? Are universities really going to collect $300,000 from students, putting many of them into a lifetime of debt, to train them to be coders and copy editors? Are high schools going to continue graduating students who can write 10-page reports in their own words but who don't know how to use a copilot? Is K-16 education going to start teaching students about the social repercussions of this technology and the potential mental health downsides? What's the next step?