Change is the only constant, and college educators are facing yet another potential sea change in how students interact with technology: ChatGPT, a sophisticated natural-language chatbot making headlines for its ability to quickly deliver anything from MATLAB functions to fully fleshed-out essays.
To get in on the ground floor of these changes, faculty in the Walter Scott, Jr. College of Engineering are ramping up discussions around using ChatGPT as a natural part of the educational journey.
Provost’s Ethics Colloquium on the Academic Impact of ChatGPT
Feb. 16, 4-6 p.m., Lory Student Center Theatre. Learn more.
Dan Baker, teaching associate professor in the Department of Civil and Environmental Engineering, and Sam Bechara, associate professor of practice in the Department of Mechanical Engineering, are co-coordinators of the college’s Master Teacher Initiative, a program housed in The Institute for Learning and Teaching. They recently hosted the first in a series of lunchtime discussions around the threats and opportunities that ChatGPT – and other technologies sure to follow – present in the context of a rigorous engineering education.
Teaching in the context of artificial intelligence
Both Baker and Bechara approach the artificial intelligence software with curiosity and excitement, even as questions around academic dishonesty and changes to the learning process demand answers as such tools become ubiquitous. The Provost’s Office, in collaboration with TILT, has published guidelines around how programs like ChatGPT intersect with the university’s academic policies, and a Provost’s Ethics Colloquium on Feb. 16, 4-6 p.m. in the Lory Student Center Theatre, will dissect the issue further.
“This is not the first technology tool that we’ve faced in an educational setting,” Baker said to the small group of colleagues gathered to learn more about ChatGPT. Indeed, throughout history disruptive technologies have caused consternation about how humans would continue to teach and learn. There was the slide rule, the calculator, the desktop computer, Baker said. The internet came after that.
“We’re trying to learn from the past as well,” Baker continued. “So in that context, (ChatGPT) is a tool.”
While times are a-changing regarding best practices for pedagogy, they were changing long before ChatGPT made its debut, noted Bechara.
“One of the things we’ve done historically as instructors was this whole carrot and stick thing, and a lot of times the carrots and sticks are the assignments,” Bechara said. “And I think we really have to move away from that mentality. Assignments can no longer be mechanisms we use for motivation, because there’s no reason for students to do them anymore except for learning. Emphasizing that the learning is what’s important, not the assignment, is what’s really going to be important.”
And since it’s become so “trivially easy” for students to instantly find answers to homework problems using tools like ChatGPT, instructors need to rethink the types of problems they assign, Bechara continued. Case in point: Bechara showed an engineering problem he once assigned to students but doesn’t anymore, because it’s too easy for ChatGPT. Sure enough, asking ChatGPT to “write a MATLAB function that returns an array that contains all prime numbers between 0 and input of n” yields a perfectly correct response.
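To see why the prompt is so easy for a chatbot, note that it fully specifies the behavior: one input, one well-known output. A minimal sketch of that behavior, written here in Python for illustration (the article's example asks for MATLAB, and the function name below is hypothetical), might look like:

```python
def primes_up_to(n):
    """Return a list of all prime numbers between 0 and n (inclusive)."""
    if n < 2:
        return []
    # Sieve of Eratosthenes: assume every number is prime,
    # then mark off the multiples of each prime found.
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, prime in enumerate(is_prime) if prime]
```

A problem this self-contained and this common in textbooks is exactly the kind a large language model reproduces reliably, which is Bechara's point about retiring it as an assignment.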
Baker and Bechara’s approach so far: Be open and honest with students, show them how the tool works and what it can be used for, and emphasize that learning and personal growth are the reasons they are at CSU, not just to get a degree.
There’s more discussion to come. Pinar Omur-Ozbek, a faculty member in civil engineering, attended the seminar and came away with lots of questions. “I’m open minded, and I’m glad these tools are developing,” she said. “We want to embrace these things as a learning tool, and to ensure that it doesn’t hinder us.”
Learn more about CSU’s approach to ChatGPT and AI
The Institute for Learning and Teaching (TILT) has developed a new website called Artificial Intelligence and Academic Integrity. Academic Integrity Program Director Joseph Brown and TILT staff developed it to provide faculty with short-term strategies for ChatGPT, as they continue to monitor the availability of technology-based solutions already in development. The website also includes information about how content created by an AI engine and submitted for credit is covered by the CSU Student Conduct Code.