How ChatGPT and other tools are being incorporated into independent school classrooms
The world shook last winter when the latest advance in artificial intelligence – the strikingly humanlike language model known as ChatGPT – became a household term.
Analysts and experts quickly began contemplating the ramifications of this awe-inspiring technological leap. Some were calling it a force as powerful as a nuclear weapon, with the same risk to humanity. Debates mushroomed over whether the technology should be halted, or at least heavily regulated.
Many of Maryland’s independent schools had been incorporating technology into classrooms for years, and had strengthened honor codes to adapt to changing technologies. Educators are already guiding students on how to incorporate AI into their research, and teachers are beginning to use it for lesson planning, exercises, and activities.
For those versed in technology and education, ChatGPT and its competitors seemed less a revolution than a logical extension of recent developments. For anyone paying attention, artificial intelligence was already everywhere, from Siri and other voice-command apps to transcription services.
“At first we banned ChatGPT,” says Kellie Riley, director of academic technology at the St. Paul’s Schools in Baltimore County. “And then over break, we did some research and started to realize this is just going to keep evolving, and we asked, ‘How are we going to move forward?’”
Riley helped lead an effort that offered a series of presentations and briefings for students and staff, along with the development of a new policy that will be in effect for the 2023-24 school year.
The policy states: “At St. Paul’s, we are committed to embracing artificial intelligence as a tool for education. We recognize its potential to enhance learning experiences, encourage critical thinking, and cultivate creativity while emphasizing academic integrity’s importance.
“During orientations and training sessions, students and faculty will receive information about the commitment to promote understanding, responsible use, and ongoing learning about AI within the educational community,” it continues.
Riley has shared the document with colleagues from other schools. She says she is pleased that St. Paul’s is embracing technology that has the potential to assist students with many different learning styles.
“I have a background in assistive technology and helping students with learning differences,” she says. “Having them be able to type their thoughts into ChatGPT, or some other AI program, and use it as a sentence starter – it’s pretty incredible.”
At the Friends School of Baltimore, a group of educators planned to meet for about 50 hours over the summer with the objective of reaching a consensus on how to approach AI, says Joel Hammer, chair of computer science.
“We have a really good group that’s trying to think about how we can protect students from an artificial intelligence that deprives them of critical thought and culture,” Hammer says. “But we also have a group that sees the power to enable kids to do more with what they already know, and expand their knowledge.”
Hammer says that he wants his students to be “not just wily consumers of technology…but to be the masters, rather than to serve our new benevolent robot overlords.”
“There’s a sense that you can just kind of punt to the chatbots, and if you want, it will write your essay for you, but it’s not going to be very good, and you won’t learn anything,” he says. “But you are really missing an opportunity to have artificial intelligence actually help you think more deeply about the topics at hand. It’s actually good at taking care of mundane things.”
Friends does not yet have a formal policy on AI usage, Hammer says. “I’m a computer science educator, which means that if it wasn’t this, it was something else. I’ve always been challenged by the constantly evolving field that I teach.”
In school settings, a prime concern about artificial intelligence and the advancing quality of language models is plagiarism. ChatGPT took off over winter break last year, at a point when teachers had already been working with their students for months and knew their writing styles. This coming school year could be different, however.
One tool that several schools use to check for plagiarism, Turnitin, announced last spring that it was adding an “AI detector” to estimate how many sentences in a written submission may have been generated by artificial intelligence.
“Educators told us that being able to accurately detect AI written text is their first priority right now,” says Chris Caren, CEO of Turnitin, in a news release. “They need to be able to detect AI with very high certainty to assess the authenticity of a student’s work and determine how to best engage with them.”
Independent schools generally have robust honor codes. At Loyola Blakefield, the college prep school in Towson, upper school principal Brian Marana notes that the school’s academic integrity policy was rewritten fairly recently. “We weren’t talking about banning particular tools,” he says. “But it was saying if you are using a piece of technology in an unauthorized way, or getting unauthorized assistance, you were setting the groundwork for an honor violation.”
That integrity policy was one of three frameworks at Loyola Blakefield that have helped faculty and students integrate AI technology into their learning. The others are what he calls a technology structure and a pedagogical structure.
On the technology side, Loyola Blakefield has been working since 2018 on a strategic plan, led by director of information technology Steve Morill, that aims to “encourage our students in the discerning use of technology,” says Marana, deploying tools that amplify students’ abilities while also teaching them when not to use technology.
Marana says the pedagogical structure includes opportunities for faculty to engage in professional development conversations about ChatGPT. Ryan Bromwell, assistant principal for academics, plays a key role in those efforts.
“Having these three structures in place helped us navigate things,” Marana says. “The message we sent to our students and to our faculty was that this in many ways is just another step in the technological evolutionary journey.”
“If years ago you were wringing your hands over calculators, and how calculators might undo mathematical ability, and if 20 years ago you were wringing your hands a little bit about Google, and how Google would undo students’ ability to learn basic information… all of those concerns in their own way are valid,” he says. “Yet we also know that all of those tools are useful and helpful. And that’s been our basic approach to ChatGPT.”
At Gerstell Academy in Finksburg, a group of teachers and administrators has been examining the strengths and weaknesses of ChatGPT, and determining how its usage fits into the school’s mission of producing strong leaders.
“What we are really looking at is related to the benefits of ethical decision making, and coaching our students and educating our students to use a critical lens when gaining information from a source,” says Brian Abbott, head of the Gerstell Academy upper school. “It’s not that we are encouraging the use of AI right now. But it’s important to do that critical thinking when you are putting your information together.”
“We don’t want to put our heads in the sand and ignore it,” says Meghan Jothen, Gerstell’s director of instructional technology. “We want to teach our students how to compete and how to use AI and harness it.”
At Maryvale Preparatory School, “we used the past year to monitor the ongoing advances in AI and observe its uses in the classroom,” says Academic Dean Gracie Smith. One example: the school’s 9th grade biology teacher used ChatGPT to generate a description of protein synthesis and asked students to evaluate the results.
“At the end of the school year, we collected detailed feedback about AI from our faculty,” Smith says. The school is using the data to inform policies and practice for the 2023-24 school year.
“There are so many opportunities for the use of AI in the classroom, the workplace, and the world,” Smith continues. “For Maryvale to best prepare young women for life, it is essential that our students learn the benefits and drawbacks of AI, and develop the skills to ethically interact with it.”
At Boys’ Latin School of Maryland, “we are currently involved in professional development around AI to ensure that AI is carefully incorporated into our teaching philosophy, and that it is used to enhance learning, not replace it,” says Brandon Mollet, the school’s academic dean.
“As experts in all-boys education we focus on experiential, hands-on learning and are committed to this teaching philosophy across the entire school, grades K-12. Our students are up and moving, collaborating and applying critical thinking skills to real-world problems. Our experiential, hands-on approach to education will never change, and new technology will always be viewed through the lens of how it can be incorporated to enhance the student experience.”
From Boys’ Latin to St. Paul’s, officials at Maryland’s independent schools know the debate isn’t going away, and will continue to evolve. They are ready to respond and adapt, just as educational institutions should.
This article is part of the 2023-2024 Guide to Baltimore Independent Schools.