While Daily acknowledges that this technological growth raises new concerns in academia, she doesn’t think it’s entirely unexplored territory. “I think we’ve been in some version of this territory for a while,” says Daily. “Students who commit plagiarism often borrow material from ‘somewhere’, for example a website without a clear author’s name. I suspect the definition of plagiarism will be extended to include things that generate text.”
Ultimately, Daily believes, a student using text from ChatGPT will be viewed no differently than someone copying and pasting snippets from Wikipedia without attribution.
Student opinion of ChatGPT is another matter entirely. Some, like Cobbs, can’t imagine putting their name on anything generated by a bot, while others see it as just another tool, like spell-check or even a calculator. To Jacob Gelman, a sophomore at Brown University, ChatGPT is merely a handy research assistant and nothing more.
“Calling the use of ChatGPT to pull trusted sources off the internet ‘cheating’ is absurd. It’s like saying that using the internet to do research is unethical,” says Gelman. “For me, ChatGPT is the research equivalent of [typing assistant] Grammarly. I use it for practical reasons and that’s all.” Cobbs expressed a similar sentiment, comparing the AI bot to “an online encyclopedia.”
But while students like Gelman use the bot to speed up research, others exploit its speed and generative capacity to produce completed assignments for submission. It may seem obvious what qualifies as cheating here, but schools around the country take different approaches.
According to Carlee Warfield, president of Bryn Mawr College’s Student Honor Board, the school considers any use of these AI platforms plagiarism. The tool’s popularity simply demands closer attention to the intent behind student violations. Warfield explains that students who submit essays produced entirely by AI are categorically different from those who borrow from online tools without knowing standard citation practices. Because the ChatGPT phenomenon is still so new, students’ confusion about the ethics is understandable. And at any school, it’s unclear which policies will remain in place once the dust settles.
Amid fundamental changes in both academia and technology, universities are being forced to rethink their definitions of academic integrity to reflect the conditions of society fairly. The only problem is that society shows no sign of standing still.
“Villanova’s current academic integrity code will be updated with language that prohibits the use of these tools to generate text that students then present as text they generated independently,” explains Daily. “But I think it’s a work in progress. And what it can do, and what we need to keep an eye on, will also be a bit of a moving target.”
Beyond the increasingly complex question of whether ChatGPT is a research tool or a plagiarism engine, there is also the possibility that it can be used for learning. In other educational settings, teachers see it as a way to show students the shortcomings of AI. Some instructors are already adjusting how they teach, giving students assignments that bots couldn’t complete, such as those requiring personal details or anecdotes. There’s also the matter of detecting AI use in student work, a burgeoning cottage industry all its own.
Ultimately, says Daily, schools may need rules that reflect a range of variables.
“I suspect a broad general policy will be developed that essentially says that unless you have permission from a professor to use AI tools, using them is considered a violation of the code of academic integrity,” says Daily. “That gives the faculty a wide latitude to use it in their teaching or in their assignments, as long as they explicitly indicate that they allow it.”
As for ChatGPT, the program agrees. “Advances in areas such as artificial intelligence are expected to drive significant innovation in the coming years,” it says, when asked how schools can combat academic dishonesty. “Schools should continually review and update their academic honor codes as technology evolves to ensure they address the current ways technology is used in academic settings.”
But a bot would say that.