BU writing instructors brainstormed this list of teaching ideas in the spirit of exploration and experimentation. We will continue to refine and update it. You’ll notice that almost all of the suggestions end with a recommendation that students reflect on, debrief, compare, or discuss the AI-generated text—a step crucial to developing their critical AI literacy.
Grammar and Vocabulary
Please note that per Writing Program guidelines, students in all WR classes are authorized to use AI tools for input on grammar, usage, and vocabulary and don’t need to cite the tool when used in these ways. Refer to Section 2 of the syllabus templates for specific language.
Feedback on grammar and diction. Have students ask AI to provide feedback on grammar and diction in their drafts and to suggest corrections and explanations. Encourage follow-up questions as part of the assignment (e.g., require students to ask the AI tool at least two follow-up questions so that they engage it in a conversation). Complete as an in-class activity, or ask students to do it as homework and share their AI conversation with you. Follow with a class discussion of the pros and cons of this kind of access to “correct” English. How might this kind of AI feedback empower students for whom standard American English is not a home language? How might it disempower them?
Building confidence using particular grammatical constructions. Have students identify a verb tense they struggle with, or a particular kind of phrase or clause (apposition, participial phrases, etc.). Then have students ask the AI tool to generate ten sentences using that specific construction. Ask students to evaluate those sentences, confirming that AI has produced correct sentences and noting how the construction is used in the different sentences. Then ask students to have AI write a summary of a text, or an analysis of or response to a short passage, using that construction, and to evaluate the writing and usage. You may choose to provide students with focused questions/points of analysis.
Generating text for revision. Ask AI to produce a flawed example of writing. This should be something that your students are currently working on in the class, e.g., a paragraph with quote integration errors, grammar problems, or cohesion issues. Then ask students to evaluate and revise the text using criteria discussed in class. Finally, you can ask AI to revise the text, so that the AI-produced revisions can be compared to the student revisions. What are the differences between the revisions? Where was the AI tool more or less effective than the human at revising the prose for a particular genre, audience, and purpose?
Learning new vocabulary from class readings. Ask students to identify two or three new/unfamiliar/challenging vocabulary words they have come across in their reading for the day, and complete the following steps:
- Ask AI to act as a teacher and explain the meaning of a word, the different contexts the word is often used in, and the word’s grammatical properties.
- Ask AI to use the word in 5-8 sentences.
- Use your judgment regarding the quality of these sentences. If you would like to see how the word may be used in more advanced settings than what AI has produced for you, ask AI to give you more options in longer and more complex sentences.
- Produce a few novel sentences or a short paragraph incorporating the words, and share them with peers.
Generating keywords. Have students ask an AI tool for a group of words related to the topic/theme they are writing or presenting about. This is especially helpful when students are writing about, presenting on, or researching an unfamiliar topic and need help building an adequate vocabulary pool. The instructor may ask students to engage further with AI, asking for the meanings of the words, synonyms, and examples of usage in sentences and different contexts. Students could be asked to share samples of their AI conversations with the instructor or in a group setting. Depending on the topic, it may be valuable to discuss whether and how the keywords reflect generative AI’s documented biases.
Analyzing connotations. Ask students to produce three or four synonyms for a keyword in their paper or presentation. The synonyms could be generated using AI or an online dictionary. Next, have students ask AI to explain the connotations of each word and the contexts in which they tend to be used. Students can also ask AI to produce sample sentences that use these words. Students should then carefully consider the differences between the synonyms and decide which are most appropriate to use in their paper/presentation.
Alternatively, this activity could be used to understand tone and/or register.
Style & Language Choices
Analyzing style. Instructors and students ask AI for feedback on a text and to identify key stylistic features (elegance, coherence, brevity, audience, diction and tone, etc.). Consider using a text students are familiar with first to help scaffold the activity. Discuss and assess the AI output. What does it notice, neglect, or assume? Follow up by prompting students to perform the same exercise on a piece of their own writing, perhaps in conjunction with an upcoming assignment focused on style. Have them discuss and debrief in small groups.
Identifying linguistic bias. Introduce the idea of linguistic bias and explain how the source material from which AI generates its output can reproduce human bias. In small groups, ask students to input one or more prompts relevant to your topic into the AI tool. Ask them to identify instances of linguistic bias in the AI-generated text. For example, look for stereotypes, gender-specific language, and cultural assumptions embedded in the language. Use this exercise as a way to raise awareness of linguistic bias in texts students read and/or produce for the course.
Revising sentences. Students collaborate with AI on sentence-level revisions. Give the tool a sample sentence, either a student or AI-generated example, and ask it to revise for clarity and concision, to make it more or less formal, to add a particular emphasis, etc. Prompt students in groups or as a class to explain the revisions and propose possible additional adjustments. Ask the AI to offer its own explanations and discuss the differences. You might also ask AI to provide several revisions in response to your request, then discuss the different versions AI generated.
Generating topic-specific example sentences. Use AI to help generate “unclear” sample sentences specific to your course topic (containing nominalizations, abstract subjects, nonspecific verbs, jargon, etc.). Specify your desired confusions. Use these examples to help illustrate stylistic principles of clarity. Prompt students to revise for clarity and to explain their revisions. Compare student revisions with AI revisions and discuss.
Considering language variation. Ask AI why its writing sounds generic. Does its prose reflect all speakers of English, or only the English of certain groups of speakers? Is it capable of code-switching and code-meshing? Discuss its answers.
Feedback & Peer Review
Soliciting specific feedback. Ask students to interact with AI as another peer reviewer. First, have students solicit feedback on their paper draft or an excerpt from their draft, listing two or three criteria that they want AI to use in evaluating their draft. These criteria should be directly connected to the assignment that students are working on. Next, encourage students to critically evaluate the feedback they received from AI and to decide whether or not they agree with it and how they might implement it. Optionally, students could also be asked to reflect (orally or in writing) on the similarities and differences between the feedback they received from AI and from their human peers. A variation on this theme: Students ask a peer to create a reverse outline of their draft and identify coherent and incoherent paragraphs, and clear and unclear transitions. Then ask AI to do the same and compare notes.
Soliciting general feedback. In this variation on the above activity, you can ask students to solicit more general feedback from AI on their writing. For example, have students ask AI to identify sentences that may be hard to understand or ideas that need elaboration or illustration. Students can also ask AI to identify grammar or word choice issues in their writing, citation errors, cohesion problems, etc. (These may depend on the assignment goals and language strategies that are currently being practiced in class.) This activity can be combined with one of the grammar activities listed above, e.g., after generating a list of grammar errors, students can ask AI to explain the grammatical rules/word meanings and to generate a list of sentences that feature these rules/words.
Modeling how to give feedback. Some students feel uncomfortable giving feedback—especially critical feedback—to other writers. This issue may be more common among international students because they may not feel confident in their use of English or because overt critique may be considered impolite or “face threatening” in some cultures. AI-produced texts can be used in class to model giving feedback and offering constructive critique. For example, after asking AI to produce an essay similar to the one the class is currently working on, have students work in groups and identify a few strengths of the essay as well as some areas for improvement. You can also ask students to offer revisions or additions that would strengthen the essay. Finally, consider asking students to reflect on what they learned about giving feedback, including consideration of emotional aspects of giving feedback to a human rather than an AI tool.
Working with Texts
Revising discussion questions. Ask students to develop a few discussion questions for a given course reading. Have students enter them into an AI tool and ask whether the questions lend themselves to summary or analysis of the text. Next, students solicit feedback from AI on the discussion questions. They should list specific criteria they want AI to use in evaluating their questions in terms of both content and style. Finally, students critically reflect on the feedback and revise their questions to focus on analysis.
Identifying features of a summary. Use AI to produce a summary of a class reading. (Note: AI programs like ChatGPT do not have access to recent texts, so consider using a text that is at least several years old. Alternatively, you can upload a PDF to Claude and ask it to summarize.) Then, guide students in analyzing the rhetorical moves and language used in the summary. The activity can be used to generate a list of summarizing conventions and serve as an introduction to a summary assignment. Students can also be asked to evaluate the AI-generated summary based on a list of key criteria, like accuracy, fairness/neutrality, etc., and to practice giving constructive feedback. This activity can be adapted to other genres, like argument or rhetorical analysis.
Evaluating paraphrases. Ask the AI tool to produce four paraphrases of a given sentence. Identify the paraphrasing strategies that were used in each paraphrase and discuss how, if at all, the different word and grammar choices could affect interpretation of the sentence. Finally, evaluate and rank each paraphrase: How accurately does it represent the original meaning and tone?
Checking for plagiarism. Ask AI to compare a paraphrase or summary you have produced to the original text with an eye to plagiarism. How close is yours to the original text? Where does it diverge? How likely is it to be regarded as plagiarized? Use the AI tool to alter your text so that it is more linguistically distinct from the original text, and analyze the particular kinds of changes it made. Reflect on what you learned about paraphrasing from this exercise.
Understanding elements of argument. Students crowdsource a list of qualities they believe must be present for an argument to exist and then prompt AI to evaluate or add to their list. Which additions or critiques are important and which are trivial? Why? As a follow up, students could share their own written arguments with both a peer and an AI tool. The author then compares the results of peer- vs. AI-produced analysis and identifies different ways each might be helpful.
Evaluating and refining premises. Invite students to generate a claim (or use one that is commonly believed or disbelieved) and then use AI to determine that claim’s stated and unstated assumptions. Students could also ask for a breakdown of factual versus logical grounds for the claim or ask AI to rank the assumptions from least to most controversial, subtle, etc. This activity could lead to interesting conversations about what counts as “controversial,” “subtle,” etc.
Evaluating and refining claims. Have AI produce a series of provisional claims about a topic. Facilitate group discussion in which students analyze and critique the claims produced by AI. Alternately, have students produce a series of provisional claims about a topic and have AI rank them from least to most complex, from least to most sophisticated, or in some other way. Have students discuss whether or not they agree with the rankings and why. These are opportunities to talk about what criteria we use in deciding what makes a “good” claim.
Exploring counterclaims. Have students ask AI to generate counterclaims by entering their own working claims and prompting the tool to “act as a contrarian” and push against their argument to get them to clarify their position. Encourage students to continue the chat by listing their reasons one by one, along with the evidence, to see what counter-argument AI provides or listing all their reasons and evidence for the AI tool to respond to as a set.
Using evidence and analysis as foundation for claims. Invite students to compile a list of quotations or pieces of evidence from a source, along with some preliminary analysis. Students then come together in small groups to compare their lists and analyses and together generate a provisional claim that best acknowledges the complexity and nuances of the analyses brought to the table. At this point, an AI tool can be prompted to create an outline for a critical essay using this provisional claim and the evidence used to arrive at it. Students discuss and critique the outline together. What do they like/dislike about the AI-generated outline? Does it do justice to their analyses? Why or why not? A variation on this exercise is to have each student ask the AI tool to generate several different outlines, perhaps deliberately asking it to emphasize one theme or aspect of the argument over another. What happens as a result? Students could discuss the results in small groups. This activity could serve to show how the frame of reference, or context, affects the development of the claim.
Developing genre awareness. Ask the class to name a familiar or favorite genre and prompt AI to generate a large number of examples. Facilitate a discussion in which students identify a few static features of a given genre as well as its social or practical function (genre as “social action”). This activity could give rise to conversations about the ethics (and etiquette?) of AI use. For instance, are there genres or contexts that it would feel silly/gauche/unethical to collaborate with AI on? Why or why not?
Discerning elements of scholarly genres. Have students use AI to compare and contrast sources geared toward scholarly vs. non-scholarly audiences. In small groups, students arrive at a list of features they believe are distinct to the scholarly sources. They then prompt AI to do the same and discuss the significance of these findings. A further step could be the introduction of the concept of critical language awareness, discussion of power relations and bias (including in the materials LLMs have been trained on), in scholarly writing. What do the language and genres we use tell us about the social worlds we come from and the social worlds we are aspiring to enter?
Imitating genres. Students collaborate with AI to identify/decode genre conventions and practice deploying them in appropriate rhetorical situations. Have AI produce short texts illustrative of familiar genres already scaffolded in class. Prompt students to discuss the output, citing features and conventions specific to the genre, and ask them to evaluate whether the AI is successful in reproducing those elements. Repeat this process with a new (or target) genre depending on your course level. Ask students to note differences with a familiar genre and to identify some key affordances of the new/target genre.
Transforming genres. Use AI to model how writers can rework material to communicate with a new audience in a new context. Give the AI tool an example from a familiar genre and ask it to adapt the text for a target genre (from a formal essay to an op-ed, e.g.). Have students study the output and discuss it in small groups. Debrief together as a class and assess what the AI produced. What adjustments and revisions are apparent? Are they adequate and appropriate? What could be improved? Another option is to invite students to use AI to translate research projects into different genres targeted for different audiences and media. For instance, students can be given three different target audiences or media types, and then use AI to produce short versions of their research paper targeted for those audiences or designed for those media (such as a slide deck, a speech, a pamphlet, etc.). Have students evaluate AI’s facility with these different genres and identify any potential biases that might be affecting assumptions about the target audience.
Creating genre templates/examples. Instructors can use AI to produce model templates for a new target genre or example sentences responsive to a new rhetorical situation.
Bot or not? Share this CNN article with students, then create a similar challenge based on your course materials or assignments. As the authors note, “Be on the lookout for oddly generic or repetitive writing, as well as factual errors.”
Fact checking. Use AI to generate information about a topic, and then have students verify or fact-check that information using sources that they can vet for credibility. Discuss which sources should be considered credible for verifying particular kinds of information (e.g., established scientific facts vs. cutting-edge research findings vs. recent news vs. government data). Alternatively, ask AI to include sources in its output. Then ask students to use information available through library databases to confirm the legitimacy and accuracy of the sources (including authors, titles, and dates of publication).
Finding and selecting sources. Students can ask AI to produce an annotated bibliography about a topic. Then, students can check the annotated bibliography sources using more traditional strategies, such as looking for those sources via BU Libraries search or Google Scholar. Have students identify which sources are real and which have errors or may be entirely fabricated, and then reflect on the dangers and benefits of relying on information obtained this way. Students can also use AI to identify potential exhibit source texts, e.g., with prompts that ask AI to produce a list of primary sources in one or more media types associated with the course topic.
Forming new research questions. When helping students understand what makes a good research question, ask students to use AI as an interlocutor in this process, to help them brainstorm ideas about questions that interest them or to provide a range of possible views about an issue or topic. Have students use AI to produce a series of currently debated topics in a field or area of study. Students can also ask AI to translate those topics into questions. Afterward, students can evaluate the questions produced or consider pursuing variations on those questions. Then have students draft a map (visual or written) of perspectives, related concepts, or adjacent topics discovered during their conversation with AI, as well as any new questions that occurred to them during and after the interaction.
Learn More: Writing Instruction in the Age of Generative AI