By Julie Riddle '92
The sudden rise of generative AI in 2023 and its subsequent spread across the technological landscape may strike some people as an invasion. But for professors Jessica Clements (English), Pete Tucker '92 and Scott Griffith (both computer science), "GenAI" is just the latest evolution in their respective academic disciplines.
Unlike many earlier technological evolutions, however, GenAI "has hit hard and fast and is easily accessible," says Clements, whose specializations include technology and digital writing.
Across campus, faculty and students are grappling with GenAI – its opportunities, pitfalls and complexities – thoughtfully and deeply. Here, Clements, Tucker and Griffith share some of their perspectives on using – and not using – this (r)evolutionary technology in class.
At Whitworth, decisions about whether and how to engage with GenAI in the classroom are up to each professor. Clements and Tucker advocate that faculty members take a proactive and nuanced approach that helps students critically evaluate GenAI's benefits and limitations and equips them to use the technology effectively and ethically.
Both professors practice what they promote. Clements allows her English students to use GenAI selectively in all course levels. Appropriate uses include defining terms, explaining concepts and assisting with generating ideas. In fall 2024, Clements themed her EL 110 writing class around GenAI. Her first-year students analyzed an AI interface, crafted research-based arguments on ethics and other AI-related issues, and composed an AI digital literacy autobiography.
"Critical engagement with generative AI may be even more important for my first-year students," Clements says. "Why let potentially problematic habits get ingrained when you can help guide and establish guardrails early on?"
In contrast, Tucker says computer science students should not use GenAI during their first three semesters. "In our first-year coding course, [students] could put each assignment prompt in ChatGPT and get a fine answer," he says. "But we need students to learn that foundation. If they don't, they're not going to be able to go out and take on any field in computer science."
For now, GenAI isn't capable of reliably producing accurate advanced code. In Tucker's upper-division Software Quality Assurance course, students have GenAI build a complex program, test the results, and identify the bugs and how to fix them.
As GenAI rapidly advances, Whitworth's computer science faculty meets every six months to update the department's academic honesty policy and discuss its application to individual classes. "We're learning alongside the students and adjusting as we go," Griffith says. "It's a really liminal space to be in."
The department's current policy references a model from Stanford University that treats GenAI as another collaborator: just as students must cite work done with other people on assignments, they must cite their use of GenAI.
At the start of each semester, Tucker reviews the policy with his students and tells them, "I don't want to give AI feedback. I want to help you get better." If faculty suspect a student has turned in AI-generated code, Tucker says, "we use it as a formative opportunity and say, 'Let's have a conversation about this so I make sure you understand the concepts I need you to learn.'"
Whitworth is vital in equipping students to understand and critically evaluate the applications of GenAI in different fields, Clements says. "Our Christian, liberal arts professors from varying disciplinary perspectives are uniquely situated to come alongside our students to help them think about generative AI in rich, multifaceted ways."
The university's new B.A. degree in interdisciplinary computer science advances this kind of multifaceted thinking: Majors are required to earn nine upper-division credits in another discipline, such as psychology or education. "Just staying in computer science isn't enough anymore," Tucker says. "Our solutions are being applied in a lot of different fields, and students should know how those fields speak."
The critical thinking, problem-solving and contextual understanding skills students gain at Whitworth are differentiators when it comes to working in fields being revolutionized by GenAI. "We need graduates who can go out and say, 'Wait a sec – where did that information come from?' and who understand the larger context and implications of AI systems," Griffith says. "Whitworth challenges students to think deeper about the 'why' behind things."
Even as GenAI transforms a spectrum of career fields, "We'll always need human beings to add that element of creativity and to mold what AI puts out," Clements says. "AI can be integrated well into editing, publishing and other English studies-related fields without usurping the human value in that work."
This story appears in the spring 2025 issue of Whitworth Today magazine.