Inside Higher Ed: GPT-4 Is Here. But Most Faculty Lack AI Policies.
Excerpt from Inside Higher Ed | By: Susan D’Agostino | March 21, 2023
“No.” “Nope.” “Not at this time.” “Not yet!” “Just discussing it now.” “I have not.” “I will do this in the future.” “Yes.” “No way.” “Not yet, but I have a lot of ideas …”
This is a representative sample of faculty responses to the question “If you have successfully integrated use of ChatGPT into your classes, how have you done so?” in a 2023 Primary Research Group survey of instructors’ views on and use of AI writing tools. (Note: The survey is behind a paywall.) A few other responses of note were “It’s a little scary,” “Desperately interested!” and “I’m thinking of quitting!”
A few short months after OpenAI released ChatGPT—a large language model with an unusual ability to mimic human language and thought—the company released an upgrade known as GPT-4. Unlike the earlier product, which relied on an older generation of the tech, the latest product relies on cutting-edge research and “exhibits human-level performance,” according to the company.
"To be sure, 2023 is still young, and some students, professors and colleges are hard at work drafting artificial intelligence policies. An undergraduate Data, Society and Ethics class at Boston University, for example, has drafted a blueprint for academic use of ChatGPT and similar AI models that they hope will be a starting point for university discussions."