Since the advent of generative AI, and throughout its constant evolution, professors at Drake University have grappled with how to address its possible uses in the classroom. Because Drake does not have a university-wide AI policy, individual professors have created their own policies based on what they require of students and what they want students to learn in each course.
Molly Shepard teaches organizational leadership at the Zimpleman College of Business and is director of the Master of Science in Leadership at the School of Education. She varies the work she assigns, but all of her assignments are intended to build students’ critical thinking skills. Her AI policy reflects her views on how students could use it.
“I don’t have a problem with students using AI technology, but much like any other source, it should be referenced as a source,” Shepard said.
Shepard created this policy two years ago, and though she has considered changing it, she decided against doing so. Few students have acknowledged AI use in their essays, though Shepard doesn’t know if that means they aren’t using it at all.
“It can be very clear in some assignments that students have used AI in terms of a written response, which is why I have altered assignments to try to engage in more critical thinking activity that allows you to express opinions, that allows you to bring in sources that support that opinion as opposed to it being just something that AI can come up with,” Shepard said.
Ultimately, Shepard said that students aren’t helping their learning or career prospects if they rely on AI as a replacement for critical thinking. She herself has thought critically about its use, including the possibility that it is biased or that it generates output from other people’s content.
“It’s generally something that I think can be embraced and considered, but I think we also have to think through the ethical implications that there is behind it,” Shepard said.
Professor of Political Science Kieran Williams assigns students a mix of short essays, research pieces and reading analyses, all of which require research and writing. For his AI policy, he extrapolated from his plagiarism policy.
“I’m not going to forbid the use of AI, but if you are using it, you have to be transparent in the way that you would with any other source,” Williams said. “I think students need to get in the habit of learning to cite AI, like actually start using citation formats for AI, and that would usually mean providing quite a lot of information about how you used it.”
In the 1990s and 2000s, Williams watched the advent of the internet bring greater access to information to education. In his eyes, though, the internet developed more slowly than AI has, and AI has the potential to change education even further.
Students who use AI in Williams’ class will need to cite the query they used, the generated response and the time and date the response was generated, as the AI is constantly evolving. Williams also wants students to check the information against a verified human source to avoid AI hallucinations, made-up or incorrect information that the AI presents as fact.
“For a lot of Drake students, it’s not going to be their primary source. They don’t want it to be their primary source. They understand its limitations,” Williams said.
Williams added that employers will likely expect graduates to have a working knowledge of generative AI.
Timothy Harrington-Taber, an adjunct professor of physics, does not have an explicit AI policy in his syllabus. Harrington-Taber said that generative AI cannot yet produce solutions to the problems he assigns, and he is against students using AI to generate answers.
“The principle that comes across is that students are graded on their own work. What generative AI produces is not your work,” Harrington-Taber said.
Harrington-Taber said that he accepts students using AI for editing functions, such as rephrasing for clarity, and for gathering background information, but that students should check any citations it produces because hallucinations are common.
“The convention in physics, and I believe most other fields of science, is that when you are citing a source for a specific bit of background, you are supposed to cite the original source. Now, I don’t know how well generative AI does that,” Harrington-Taber said.
Gabriel Ford of the English department does not have an AI policy this semester because the assessments he plans to assign are quizzes, tests and hands-on projects. Over the past few years, however, he has pivoted his assignments toward in-class presentations and discussions, in part to discourage AI-generated work.
“It gives a more robust and vivid and visualizable audience to my students. They’re not writing for the void, but they’re writing to contribute to the classroom intellectual community in ways that other people can receive, respond to and build on in our space,” Ford said.
Ford plans to continue moving toward these types of assignments in future courses. He is unsure what his AI policy would look like if he crafted one, though it is important to him that students actually do the research they say they are completing.
“I’m still watching and learning on what the second quarter of the 21st century’s university space will look like with respect to the digital affordances and what thinking and demonstrating thought and demonstrating research looks like for those students,” Ford said.