Bringing AI into the Classroom


Associate Professor Matt Parfitt has experimented with incorporating generative AI into his courses at CGS and sits on a BU task force on the topic.

Matt Parfitt was teaching in CGS’ London Program during the summer of 2022 when he came across a generative AI program called Copy.ai. Parfitt, an associate professor of rhetoric with a passion for tech and coding, learned about the program from a coder he follows on X. “This coder tweeted, ‘I just posted my latest blog post, composed entirely with Copy.ai,’” Parfitt says. “This was just before ChatGPT came out. Copy.ai allowed you to enter a prompt and write an essay—something we’re totally familiar with now.”

Parfitt was so intrigued that he downloaded the program and began experimenting with it.

Parfitt teaches a rhetoric course that covers cultural and technological revolutions throughout history, from the industrial revolution all the way up to the digital revolution. The course spans two semesters; it starts in the spring and continues in the summer at the London Program. He wanted to see what kind of responses he’d get from Copy.ai to some of his usual course assignments.

One of his assignments asks students to write an essay on a topic related to the digital revolution. Parfitt remembered a student the year prior who had written an essay about cyberbullying. So he asked the AI program to write a 400-word essay on that topic. “As it happened, we were discussing artificial intelligence in class that day, related to a reading that we’d been doing,” he says. Parfitt brought the AI-generated essay into class and, without letting students know who wrote it, asked them what they thought of the piece. “Out of three sections, only one student guessed that AI had written it. The other students were not aware that there was an AI that was capable of this kind of thing,” he says. “Now, of course, they’d guess it immediately.”

Parfitt set out on a path to explore AI’s capabilities. “I wanted to know: how does this technology work? It occurred to me that this is a moment in history where an entity other than a human being is producing language—kind of a big thing,” he says. Soon after he dabbled with Copy.ai, ChatGPT was introduced to the mainstream in the fall of 2022.

“And boom, it was everywhere,” Parfitt says.

He began showing colleagues how to use ChatGPT. He also knew it was inevitable that he’d soon see students using generative AI, and that it was better to embrace the technology than to fight it. So he became a member of BU’s inaugural AI Task Force, created in September 2023 amid instructors’ concerns about the implications of generative AI, including possible threats to academic integrity and the weakening of students’ critical thinking skills.

“The thing we most wanted to prevent is people just ignoring AI as though it’s going to go away,” Parfitt says of the task force. “It’s obviously not going to go away. Students were already using it in all kinds of ways, often in very legitimate ways.”

At the end of the 2024 academic year, the task force released a final report that addresses ethical concerns around generative AI and includes best practices and recommended policies for using the technology in classes. The report notes, as one example, that “in tasks that incorporate writing, GenAI tools can assist with language development, genre awareness, and the structuring of arguments.”

This is something Parfitt learned firsthand when, during the spring 2024 semester, he and a group of professors from the College of Arts & Sciences piloted AI-intensive writing courses, with support from a grant from BU’s Shipley Center for Digital Learning & Innovation. Parfitt incorporated AI into his rhetoric class about revolutions.

Thanks to the grant, students were given subscriptions to ChatGPT Plus, the paid version of the generative AI chatbot, and each section had an “AI Affiliate” teaching assistant.

Parfitt says one thing he discovered through incorporating generative AI into his course is that the software can act as a tutor of sorts for students. He points to an assignment he gave early in the semester, where students had to write a paper about the Neolithic Revolution. “A lot of students don’t even know what those words mean, at first. They’re at sea. They don’t even know what the search terms should be if they’re looking up scholarly articles,” he says. “So being able to ask [ChatGPT] a very naive question and get back a response that at least begins to tell them who the major voices are in the conversation about a particular topic is really valuable. You can’t go to Google or Wikipedia to get that kind of answer, because you already have to know what the search terms are in order to get a good response. Just to kind of launch them and get them started on a research project is huge.”

Writing Awareness

Parfitt didn’t change the course assignments for this special AI-intensive iteration of the class; rather, he wanted to see how AI could be applied to them. He and the CAS colleagues teaching AI-intensive writing classes agreed on a policy of letting students use AI to write up to 40 percent of an assignment, as long as they acknowledged where they used it and how they incorporated it. That kind of explanation of how they engaged with AI and integrated it with their own writing was one method of promoting critical thinking. Even with permission, though, Parfitt says students never came close to using that much AI-generated work in their assignments.

Parfitt also found that using generative AI opened up important discussions around diction, syntax, and grammar. In the middle of the semester, he discussed with students the essay as a literary form, one that is distinct from academic papers. As part of the unit, he had students read an essay by the English author Geoff Dyer, “Blues for Vincent.” “It’s in four segments, and each of the segments are related to one another, but exactly how they relate to one another is not immediately very obvious,” Parfitt says. “So we read the essay together, and we talk about how they’re connected.”

Then, Parfitt asked his students to write a fifth segment of the essay on their own. And as an addition this past spring, he asked students to have ChatGPT write a fifth segment. “What I wanted the students to do was really closely compare the strengths and weaknesses of each—their version and ChatGPT’s,” Parfitt says. Students discovered that ChatGPT’s version lacked a clear, interesting voice. “ChatGPT does a good job with diction and sentence structure, and a pretty impressive job of doing something that actually does seem to fulfill the assignment, but the student’s own writing is going to have a personality that the ChatGPT writing really doesn’t. It’s going to be kind of neutral or cliched, in a way that their own writing isn’t.”

He hopes to spend more time diving into these nuances of language and voice as he incorporates generative AI into future courses. “I think I would have needed a two-hour class to really get into the pros and cons of the ChatGPT versions versus student versions. Maybe I’ll spread the discussion over two classes.”

Parfitt says he and the CAS instructors are sharing what they learned from these writing classes with their colleagues, passing along teaching tips and techniques. When asked what he thinks the future implications of AI for research, writing, and critical thinking will be, he has a cautious but optimistic outlook. In some ways, he thinks AI might encourage academia to pay more attention to the nuances of the human voice in academic writing. “I kind of hope that it’s going to change academic writing more than anything else,” he says. “I think and hope that we’re going to be emphasizing those human qualities of writing a little bit more, emphasizing voice and writing from your own specific subject position, rather than some sort of neutral stance, which academic writing often has. Maybe we’ll value that a little bit more than we have.”