Professor Chris Girman has an idea for an in-class assignment with his students at Point Park University. It would bring a widespread concern in higher education directly to his classroom.

He’d like them to generate essays using ChatGPT, an artificial intelligence tool attracting national attention over fears it will lead to more elusive cheating on writing assignments. Girman, chair of the Department of Literature, Culture and Society, doesn’t want his students to submit AI-generated essays for a grade. He’d like them to critique the chatbot’s work in class. 

If the essays are “beautiful,” what makes them so? What do the students dislike? Are they happy with the paper that the tool generated on their behalf? Ultimately, Girman wants his students to take ownership of their work and become deeply invested in the process of writing. Through that, he hopes they’ll say: “I don’t want this bot to take this away from me.” 

“We confront it, and that huge fear goes away. We realize how we can control it,” he said of the technology. “Students don’t want to be robbed of the product. They don’t.”

Across the country and in Pittsburgh, universities and professors are wrestling with students’ potential use of artificial intelligence tools like ChatGPT, which can rapidly produce essays and other written materials in response to prompts. Though they have questions and concerns about the technology discouraging original thought, several English and writing professors told PublicSource that AI could also spur improvements in teaching and learning.

“If AI can produce an essay because there’s so much out there already on that topic, and it’s a good, factually sound essay on that topic, well-written — do we need to be pushing ourselves to take a fresh approach?” said Sigrid King, director of Carlow University’s English program.

As professors nationwide look to adapt, some fear that overly punitive responses and a reliance on assignments that restrict the use of technology — such as handwritten and oral exams — will be unproductive for educators and harmful for students of color and those with disabilities.

“We don’t want concerns about the use of these tools to result in significantly more policing of students,” said Lindsay Onufer, senior teaching and learning consultant at the University of Pittsburgh’s teaching center.

How are Pittsburgh professors navigating AI in the classroom?

Ask James Purdy about the ways professors can tackle AI in the classroom, and the head of Duquesne University’s writing center will outline numerous paths forward. Among them? Discussing the technology openly with students, explaining the value and purpose of writing and focusing on the creative process in class — instead of asking only for the final result. 

Assignments that incorporate images and other mediums could be valuable, he added, and those that reference class discussions and local issues could be more difficult to generate with AI. 

“Rather than kind of fall into a ‘gotcha’ culture, where it’s like, ‘Oh, we have to catch students who are using this,’ I think it’s important to engage the technology and to consider its affordances and constraints,” he said.

Some professors are reassessing their approach to essay assignments to get students thinking creatively on their own. This spring at Carlow, King is teaching a course on Shakespeare in the 21st century. She figures there’s a significant amount of material that ChatGPT could draw on to produce an essay for her class, as the technology was trained on a vast amount of online text. In an attempt to outsmart the AI, she’s considering essay prompts that draw on students’ individual experiences.

Even before the rise of ChatGPT, in-class writing assignments were a component of Carlow writing courses. Now, faculty are using these assignments not only to note their students’ style, voice and skill early on but also to help flag when a chatbot may be the author of later work.

King doesn’t view the technology as bringing about the demise of essay-writing in college, but she’s concerned that students will lose out on critical thinking skills if they use the tool for that purpose. 

“They’re missing a step in their education about what it means to be an engaged and informed citizen in the world,” she said.

Sigrid King, English program director at Carlow University, navigates AI software ChatGPT in her office on Wednesday, Jan. 25, 2023. (Photo by Amaya Lobato-Rivas/PublicSource)

Girman said he aims to learn how his Point Park students have used ChatGPT and provide them with opportunities to use the tool for research. He said universities should adjust their policies on academic integrity, but he doesn’t believe scare tactics are wise. Instead, he’d like policies to appeal to students’ morals, demonstrating that the improper use of these tools undermines the “entire reason why you’re here.”

Outside of Pittsburgh, not all responses to AI in the classroom have earned the endorsement of professors. As news of ChatGPT spread online this winter, Torrey Trust, an associate professor of learning technology at the University of Massachusetts Amherst, was taken aback by some of the responses she saw from educators. 

She said some had a “knee-jerk reaction” to potentially shift to more handwritten essays and oral exams, which she worries will disadvantage students with disabilities. She added that efforts to detect AI-generated writing may disproportionately harm students of color, whose work professors may check for integrity violations more frequently.

“Those are not good approaches to teaching, quite frankly. They’re not inclusive, and they’re not accessible,” Trust said. She has created a presentation on the capabilities and constraints of ChatGPT that also suggests how educators can respond.

At Duquesne, faculty were already revamping their curriculum prior to the national news around ChatGPT, English professor Greg Barnhisel said. The goal is to find new ways for students to demonstrate learning beyond an essay or exam, including through the use of multimedia. In one of Barnhisel’s classes, a student submitted a GPS mapping project on American writer William Faulkner. 

“This will just accelerate this conversation,” Barnhisel said. In the end, he said he believes the technology will demonstrate that the college essay is a “historical artifact.” 

“It may die, as other historical artifacts have. I think the key thing is we need to keep our eye on our goal here: to develop critical thinking in students and to develop effective communication.”

What support might faculty need?

Some professors in Pittsburgh aren’t sure how they’ll approach AI in the classroom. Robert Ross, a professor in Point Park’s Department of Literature, Culture and Society, said that having students write only in class would resolve concerns to an extent, but he recognizes that some may be unable to write comfortably in that environment. “I need a whole empty house to write,” he said.

Chris Girman, an English professor at Point Park University, would like his students to become so invested in the writing process that they don’t want to rely on ChatGPT. (Photo by Amaya Lobato-Rivas/PublicSource)

He said he trusts his current students, but he imagines he’ll have to make changes in the future. Like other professors who spoke with PublicSource, he’d like to know more about the technology. ChatGPT was released in November 2022, and Ross learned of the technology after he had already crafted his syllabi for the spring. 

Professors said they would value discussions with faculty across disciplines, the inclusion of writing instructors in administrative decisions around AI in the classroom, greater availability of detection tools and broader standards in higher education for navigating this technology.

At Carnegie Mellon University, the Eberly Center for Teaching Excellence and Educational Innovation is encouraging faculty to set clear expectations with students about the appropriate use of these tools in class, Director Marsha Lovett said in a statement. She added that CMU’s academic integrity policy — which prohibits cheating, plagiarism and “unauthorized assistance” — covers improper uses.

And at the University of Pittsburgh, faculty can access a list of resources on ChatGPT that the University Center for Teaching and Learning published online last week. The webpage states that faculty can use AI tools to produce test questions, draft lesson plans and provide writing feedback in language learning classes, among other opportunities. It also states that tools claiming to spot AI use aren’t always reliable and shouldn’t be used to monitor for plagiarism.

“There really is no reason to panic,” said John Radzilowicz, interim director of teaching support at the center. “We want to talk to faculty about how you can use the opportunities the technology presents to actually improve your practice.”

At their worst, AI tools such as ChatGPT can spark panic over plagiarism and turn the act of writing into “technical Mad Libs,” Purdy said. But at their best, he said the technology can prompt educators to take a more thoughtful approach to teaching. 

“Broadly, they ask us to rethink what writing is for and to embrace the messiness of learning through writing.”

Emma Folts covers higher education at PublicSource, in partnership with Open Campus. She can be reached at emma@publicsource.org.

This story was fact-checked by Kalilah Stein.
