University of Wyoming to Release Guidelines on Student and Faculty Use of Artificial Intelligence Chatbots
Working group addresses impact of ChatGPT software in education
- Published In: Other News & Features
- Last Updated: Feb 27, 2023
ChatGPT, created by OpenAI, allows the public to freely use its services. OpenAI has also developed DALL-E, the AI model that generates images. (Wyoming Truth photo by Kaycee Clark)
By Kaycee Clark-Mellott
Special to the Wyoming Truth
LARAMIE, Wyo.— The University of Wyoming (UW) Artificial Intelligence Chatbots Working Group is expected to publicly issue recommendations on student and faculty use of chatbot software and artificial intelligence (AI) early this week.
“Heading into the [spring] semester, we realized that we’d probably need to start thinking about how to adjust policies and our curriculum or assignments in relationship to the technology,” said Rick Fisher, a group member and lecturer in the department of English.
The 11-member group was co-chaired by Anne Alexander, vice provost of strategic planning and initiatives, and Renée Laegreid, professor of history and chair of the faculty senate. Members included Steven Barrett, vice provost of undergraduate education, and Gabrielle Allen, director of the school of computing.
The guidelines were distributed internally at UW on Feb. 24. They include barring “unpermitted use of Artificial-Intelligence-based applications,” surveying faculty for “future programming” and establishing avenues to communicate with students about the appropriate use of AI in courses.
Additionally, the working group recommended that UW prepare for further advancements in AI and endorsed the following action items:
- develop a template for course syllabi that includes the phrase “unpermitted use”;
- allocate resources to address AI-driven shifts in campus-wide practices;
- promote diversity, equity, inclusion and privacy, with specifics on equitable access to chatbot software, as a “plus” version of ChatGPT is available for $20/month;
- take action to limit the bias or inaccuracies students may encounter in relation to DEI;
- update faculty retention, tenure and promotion policies to “reflect disciplinary impacts of AI-variants”; and
- establish an ongoing working group to address future AI developments.
UW faculty are split on how AI software can impact education, Fisher said. He reached out to colleagues across campus and discovered some embraced it, but others were skeptical.
AI and ChatGPT experimentation, concerns
The working group specifically looked at ChatGPT, one of the most popular chatbots. It was created by OpenAI and released last November. ChatGPT draws on data from a significant portion of the internet and uses a flexible model with “billions of degrees of freedom,” said Lars Kotthoff, an assistant professor of computer science and head of UW’s Meta-Algorithmics, Learning and Large-scale Empirical Testing (MALLET) lab.
The program is intended to answer questions in a conversational format. On social media, ChatGPT has been shown answering online exam questions, brainstorming business ideas and writing essays. In theory, students could use the software to write papers for their classes.
“The overall idea is to eventually get [the software to] something close to the human brain,” Kotthoff said.
Both Kotthoff and Laegreid compared AI to a previous technology innovation: the calculator.
“When calculators first came around, people might have been quite resistant to [their] introduction in academia because, ‘You should be able to do that in your head,’” Kotthoff said. “A similar example is word processors… If a student submits an assignment and they didn’t put it through a spell checker, I’m going to get quite angry with them because it’s a simple thing to do.”
While the software isn’t perfect, it is tempting to try it out. UW President Ed Seidel related his experiment with the software during the university’s ChatGPT roundtable; he used it to compare Japanese imperialism during World War II to America’s Manifest Destiny. The result: five “really good” paragraphs, Seidel said, at which point he requested the software transform the comparison into a Bob Dylan song.
Laegreid and Fisher also tested ChatGPT to see its capabilities firsthand.
“I asked it to write my annual narrative [for 2022],” Laegreid said, “and oh my goodness, was it so wrong. Not only did I graduate from high school as valedictorian in 2022, I graduated from Harvard, got my postdoc and I’ve been publishing extensively.”
Fisher, meanwhile, was curious about how the software would fare against his op-ed assignment about UW’s Saddle Up program. After he refined the information he shared and provided a clarification to distinguish UW from the University of Washington (also known as UW), Fisher was amazed at how ChatGPT performed.
“It was uncannily like the critiques that my students were making,” Fisher said. “I was surprised at how [well] it could make up an answer that’s a generic critique. On the other hand, it lacked many advanced features I would be expecting” from students.
Some members of the working group are concerned that the software will prevent students from engaging in critical thinking or developing original ideas. However, Kotthoff believes it can help develop a different kind of critical thinking.
“I think it is going to help with critical thinking, because students can actually focus on the concepts behind it rather than the nitty gritty,” Kotthoff said. “It provides a different angle on things [and] that is going to be extremely valuable in all kinds of different contexts.”