ChatGPT Sparks Debate Throughout Fordham’s Campuses

Whether to reprimand students who use the program is under discussion following cases of plagiarism found in classrooms

By DELBERT MEAR III and MARYAM BESHARA

ChatGPT, the latest in artificial intelligence technology, has sparked discussions at Fordham and other colleges and universities about how it can be used in higher education without compromising academic integrity. 

OpenAI, a San Francisco-based AI company, developed ChatGPT and released it as a free public application on Nov. 30, 2022. The software responds to a user's input, which can range from mathematical equations to prompts asking it to generate ideas, arguments or other elements of an essay. The program can also answer follow-up questions, admit mistakes, challenge incorrect premises and reject inappropriate requests. 

The application has garnered mass attention for its succinct and coherent responses as well as its breadth of knowledge spanning different fields. Upon its release, Fordham faculty, students and administrators voiced their opinions on the software's place within an academic setting. 

Faculty Share Initial Thoughts on ChatGPT

The program elicited mixed reactions throughout the university, with some faculty members finding it interesting and others approaching it with hesitation. 


Melissa Labonte, an associate professor in the political science department, shared that her initial thoughts on the software stemmed from campus conversations regarding what ChatGPT might mean for academia. She noted that she is unafraid of the software and thinks of it like Google. However, she added that while AI can supplement some tasks humans perform, she is unsure that it would ever replace “critical thinking, problem solving (or) human beings.”

Unlike Labonte, Steven Stoll, professor of history at Fordham, shared that upon learning about ChatGPT, he was afraid of the new program because it could replace writers. He mentioned that the software is a concern in the humanities department because it allows those seeking an education in reading and writing to bypass that process. 

“It really is throwing away a liberal arts education simply to see it as a game or something purely transactional, in which one’s task is to outsmart the professor and figure out a way to have an essay written for you,” he said.

Stoll added that he sees no benefit or use for the program, particularly as an aid for presenting, thinking, expressing or writing. 


“It’s brilliant, but there’s a lot of things that people invent that turn out to be very destructive,” he said.

Stoll emphasized that the prominence of ChatGPT jeopardizes not only the craft of writing but also grading methods. 

“For the future of the university, it’s a constant threat that undermines the teaching of writing,” he said. “It’s very disturbing the idea that professors would be reading and grading the dumb output of a machine, as though it were the work of a student, and then just calling that the legitimate work of the university.” 

According to Labonte, faculty who criticize the software are doing so "from a place of good intentions." She noted that ethics and standards of academic integrity are important pillars when it comes to identifying oneself as a recipient of a Fordham diploma. 

Use of ChatGPT Found in Classrooms

In an email about the use of ChatGPT sent on Jan. 25 to students enrolled in his "Capitalism" course, Stoll expressed his disapproval of the software and of any student who engages with its services. 

“Using these machines violates every principle of education,” he said in the email. “They make a mockery of the entire enterprise.” 

Stoll said he urges students in his class to consider alternative actions if they are thinking of using ChatGPT, such as requesting an extension on an assignment or going to the writing center for extra credit.

Stoll added that he is determined to use other software programs to detect the use of ChatGPT and related bots. He said that he uses ZeroGPT, a free tool that detects ChatGPT-generated content, but he believes that programs like ChatGPT will become more advanced.

He noted that using anything to "fabricate your work is throwing your money away on a college education."

Kathryn Kueny, professor of theology at Fordham College at Lincoln Center (FCLC), said that she has already begun to see students using ChatGPT for assignments. She shared that although she understands how the software can help generate ideas or bring topics together, she noticed that the content it produces is often formulaic. 

Kueny said the patterns she has seen when her students use ChatGPT include submissions that do not fully address the requirements of the assignment, a lack of specific sources, "stiff" language, different vocabulary and bibliographies that cite sources students would not typically have access to. She believes any ideas the program generates would need to be revised and edited prior to being published. 


Like Kueny, Labonte noted that one of the telltale signs of ChatGPT use in her students' assignments is the bibliography: the sources submitted are ones she believes students would not normally have access to, or they are not reflective of her students' work. According to The Oxford Review, ChatGPT is able to generate bibliographies with sources that are nonexistent.

“As human beings, we need to be more mindful and more deliberate and more creative in terms of how we think about it,” she said. 

According to the Faculty Senate meeting minutes from Jan. 20, Vice Provost Jonathan Crystal noted that there have been a few cases of plagiarism at the university. He added that Fordham Law School has suggested drafting a policy in conjunction with academic integrity committees at Fordham’s other schools, which have been asked by their respective deans to study the software. 

The meeting minutes indicated that another senator mentioned during the ChatGPT discussion that, given the university's investment in Blackboard, a web-based virtual learning environment where professors can upload their coursework for students, the platform should propose solutions to AI plagiarism problems. Following this input, others noted that there is a larger problem than plagiarism and believed that the university should approach it with the "same kind of concerted effort expanded on the challenges of online learning."

Students Speak on ChatGPT

The program's usage at Fordham and its overall usefulness were topics of discussion among Fordham students. Although the software sparked interest upon its launch, it also created a sense of worry among students who want their Fordham degrees to carry the university's reputation for academic integrity. 


Adib Belal, FCLC '26, said that he sees ChatGPT as a "very exciting prospect." He added that he believes it can do more good than harm.

“I definitely knew people were going to misuse it,” he said. “But I feel like it opens a whole new world of opportunities for both good and bad usage.” 

Kei Sugae, FCLC '26 and a theater major, said that he does not anticipate using ChatGPT for classwork, explaining that he does not need a program to produce texts or scripts because he is at Fordham to learn how to be creative.

“There is a part of me that sees it as a threat to creativity and people who desire to create for a living,” he said. “Being able to just produce a script, just by putting in the basic concepts of an idea, is taking away the process that many playwrights go through.”

Sugae mentioned that, outside of an academic standpoint, he believes it is "amazing" that people have been able to develop a software program like ChatGPT, but he noted that there will be a new level of mistrust regarding what information is released into the world. 


In a similar vein, Belal said that while the software can replace certain structures, he does not believe it can replace writing or literature. He finds that humans are more skilled in those fields, but he cited examples of the software being useful as a tool for essay writing or for developing answers and explanations to practice problems. 

“It’s a very efficient way of getting things done and studying,” he said. “You can use it to learn in that sense.”

Regarding the administration, Sugae said that he wants some form of acknowledgement from an academic integrity standpoint. 

“I want Fordham to take measures to limit ChatGPT or at least come up with a method that ChatGPT can be used to the students’ advantage and educate students on how to use it toward a higher level of academia,” he said.  

Sugae noted that if students rely on ChatGPT without repercussions, the work and talent produced from Fordham will be limited and the university’s reputation will be affected. 

“The more students who don’t produce their own work, the less qualified and less skilled labor coming out of Fordham will be,” he said. 

Faculty and Administration Comment on Action

ChatGPT has produced a consistent response among Fordham faculty and administrators: a consensus that the university must ensure students are not misusing the software and that their education is not threatened. 

Maura Mast, dean of Fordham College at Rose Hill, likened the program to Chegg, an educational technology service that students used in 2020 to get answers to tests. She noted that part of the administration's responsibility to students is to help them understand how to navigate this technology and "when is it okay to rely on it as a tool and when you need to rely on yourself for learning."

Laura Auricchio, dean of FCLC, explained her belief that when a new type of technology is introduced, it opens opportunities for both great possibility and tremendous misuse. She added that the technology's impact depends on the intent behind its use and cited essay writing as an example of where ChatGPT should not be used. 


“Just because something is digitally produced instead of produced by a human, it doesn’t make it of less value,” she said. “It makes it a different value and to be judged on different standards.”

Mast noted that a separation needs to be made regarding what ChatGPT does and how it can be used effectively. 

Both Mast and Auricchio believe that the university should enact a policy to ensure the academic integrity of students, but Mast said that she thinks “there’s very little we can regulate in terms of what students do.” Mast added that although the university does have rules and policies in place regarding academic integrity, there is a limit to how the administration can “police” the software program.

Like Mast and Auricchio, Labonte does not believe the university will be able to prohibit the software’s use. She shared that she understands the criticisms surrounding ChatGPT but believes that the conversation needs to be centered around regulation and adaptation rather than prohibition.  

“There’s room to adapt our policies to ensure our students do not engage in unethical practices by using this tool,” she said.

Labonte added that at the end of the day, students are the ones who are responsible for their work, and if they choose to utilize ChatGPT as a tool in a way that is “ethically inappropriate” or is not in line with Fordham’s policies, she believes some sanctions are warranted. 


She also said ChatGPT should be carefully monitored, and it is still too early in the software’s release to understand the impact it will have on different forms of writing. She noted that the program’s power should not be underestimated, but it should also not be overestimated because it is in its early stages. 

Labonte noted that in order to effectively address the implications this program may have, all stakeholders must engage in conversation with one another, rather than administrators, faculty and students all speaking separately. 

“I don’t think this is something that’s going to ever disappear; it’ll be something we’d have to get used to and plan for and think seriously about,” she said.