While some struggle to understand what ChatGPT is all about, Rose-Hulman Institute of Technology senior Jay Jochheim is already adept at using it.
Jochheim, a mechanical engineering major from Chicago, uses the artificial-intelligence-powered tool for a variety of purposes, including with his tabletop role-playing game group.
“It helps me with creative aspects” in developing encounters for the role-playing group, he said.
He has also used it to bridge gaps in assignments he is trying to work through. In one programming class, for example, he sought assistance when his code kept producing errors.
ChatGPT is an artificial intelligence (AI) system that can generate human-like text in response to prompts from users.
OpenAI’s artificial intelligence chatbot was opened to the public in November 2022 and surpassed one million users in less than a week, with people using it for tasks such as writing code and essays, according to Forbes.
ChatGPT has generated a major buzz in education circles, both higher ed and K-12, with some concerns about the potential for cheating. Some schools have already seen students using ChatGPT to do homework or write essays.
“It definitely has generated a huge balloon of talk in higher education,” said Kay C Dee, Rose-Hulman Institute of Technology associate dean of learning and technology.
She and faculty members Kim Tracy and Michael Wollowski are part of a Rose-Hulman committee charged “to identify issues and opportunities with ChatGPT and identify which areas of the institute are best equipped to address those,” said Dee, who chairs the committee.
Jochheim sees both the benefits and drawbacks.
Among the positives, he said, “There are a lot of ways for it to help people compress a lot of information and filter out a lot of the garbage and chaff.”
Among the negatives, he’s concerned that general knowledge attainment will decline in some respects, because people won’t have to pursue and acquire as much knowledge on their own when ChatGPT does it for them.
Other negatives include potential for misuse and copyright infringement issues, Jochheim said. He believes laws and regulations in response to the new technology will be slow to develop.
Interacting with chatbot
In an interview, Tracy, an assistant professor of computer science and software engineering at Rose-Hulman, provided a demonstration.
He used the prompt, “Write an upbeat introductory story for a local newspaper reporter about ChatGPT.”
It generated the following response, in part:
“As the world becomes more reliant on technology, artificial intelligence has become a hot topic of conversation. Among the many AI models out there, ChatGPT is quickly gaining popularity as a go-to virtual assistant for many individuals and businesses.”
According to Tracy, ChatGPT “absorbs public information from the web and other sources to produce its best answer. It’s like a sophisticated search engine … It is trying to match all this existing prior information to what you are telling it you want in order to give its best response.”
But because it draws on and restates existing information, its responses may contain factual inaccuracies, biases or even offensive language, depending on the sources.
“Computers do not have minds,” Wollowski said.
For a class he teaches, he asked ChatGPT about the sensory organs of an ant. It got most things right but “really failed about vision,” stating that ants have great vision when, in reality, their vision is blurry, he said.
Users need to know their topic and ask very specific questions, Wollowski said.
A new technology tool
There are concerns about ChatGPT, including the potential for cheating, such as a student having ChatGPT write an essay for them.
“Certainly we don’t want students using it and passing off ChatGPT’s work as their own. That’s very hazardous as we know ChatGPT tends to be wrong a lot,” Dee said. “So if they blindly trust it, that’s a problem.”
A good use, she said, was the example given by Jochheim on using it for the creative aspect of role-playing games. “Give it a prompt, see what it generates and you can take it from there,” Dee said.
Tracy says it can help with productivity, such as generating a first draft of a letter or grant proposal that can then quickly be refined.
The response from educators runs the gamut, Dee said.
“For some, this is an existential threat. This is taking our jobs,” she said, characterizing one reaction. “Others are excited about a new tool we can use in new ways.”
In higher education, there has been a lot of collaboration and sharing on how different colleges are approaching it.
There is a compilation of different policies and syllabus statements from colleges across the nation “so that others have a model to use when they try to deal with this in their classrooms,” Dee said.
Rose-Hulman as a whole doesn’t have a formal policy yet, but each faculty member is free to set their own policy in their course syllabus, Dee said.
In her class, the syllabus states that students can use ChatGPT as a way to generate creative prompts, but they “need to be aware ChatGPT output is often wrong. It can be objectionable. It can be offensive. It can be biased.”
She’s teaching a medical device regulatory affairs course right now, and most assignments require factual correctness. “They can’t depend on ChatGPT for that,” she said.
She tells students they can use it, but they must cite the source. If they use it and pass it off as their own work, that is academic dishonesty and “here is the penalty you will face.”
Also, if the information is wrong, they risk failing the assignment, Dee said.
‘Next evolution’ of society
At Indiana State University, James Gustafson, associate professor of history and faculty senate chair, said that personally, “I am less and less concerned the more I learn about it.”
In his opinion, “It would only be useful for cheating if the tests were prepared very poorly and could be cheated on easily without it,” he said. “It was actually a good reminder to everyone to prepare meaningful evaluations.”
For example, Gustafson does a lot of “identification” quizzes, where he has students explain major themes in the course back to him.
“Part of that could easily be done by anyone with internet access,” he said. “But what ChatGPT and other AI tools lack is participatory knowledge. How did we discuss it in class, what context did we develop around that idea to make it meaningful, how can you explain this through the primary sources we read together?”
He has asked the faculty senate student affairs committee to look at academic dishonesty policies and make recommendations on any changes they deem appropriate.
Molly Hare, director of ISU’s Faculty Center for Teaching Excellence, says, “We are adamant that we want to use this technology and these opportunities to benefit teaching and learning,” rather than banning the technology or prohibiting people from using it.
“This is the next evolution of our society,” Hare said. “It’s time for us to reap the benefit of this information.”
With ChatGPT and AI overall, “We’re still learning more and more,” she said. Her office is working with faculty on how they can incorporate the technology into their classrooms.
Areas of discussion include issues related to ethics and academic honesty.
K-12 concerns
In K-12 education, some school districts across the nation have banned ChatGPT from all school devices out of concerns students will use it to cheat.
But in a New York Times article, Kevin Roose, a technology columnist, states his belief that banning ChatGPT from the classroom is the wrong move.
“Instead, I believe schools should thoughtfully embrace ChatGPT as a teaching aid — one that could unlock student creativity, offer personalized tutoring, and better prepare students to work alongside AI systems as adults.”
He suggests that in the short term, schools should treat ChatGPT the way they treat calculators — allowing it for some assignments, but not others, “and assuming that unless students are being supervised in person with their devices stashed away, they’re probably using one.”
Then, over time, teachers can modify their lesson plans — replacing take-home exams with in-class tests or group discussions, for example — to try to keep cheaters at bay, Roose wrote.
According to a March 3 Education Week story, a new survey shows that many teachers have a positive view of the artificial intelligence technology and are even using it more than their students.
Among those surveyed in the Impact Research/Walton Family Foundation survey, 59% of teachers agreed that ChatGPT will likely have legitimate educational uses that cannot be ignored.
Teachers reported using the AI program for lesson planning, generating creative ideas for their classes and putting together background knowledge for their lessons, according to the March 3 Education Week article.
In the Vigo County School Corp., “We want to be familiar with all of the new technologies that might benefit our students. We want to continue to evaluate the merits of the new technologies and teach our students how to use digital tools respectfully and responsibly,” said Karen Goeller, deputy superintendent, and Doug Miller, director of technology.
Currently, the VCSC does not allow students to access ChatGPT using district-provided Chromebooks. Users must be 18 or older, according to OpenAI’s conditions of use.
However, all staff devices are able to access ChatGPT. By allowing staff to access it, “We are able to build an understanding of how we may implement the technology in the future,” Goeller and Miller stated.
Kevin Bolinger, ISU education professor, said he has used ChatGPT to create documents using assignment prompts from various instructors. He finds it “incredibly efficient and accurate.”
He believes K-12 will adapt. Persuasive essays or technical writing may have to be done in class to ensure students have the ability to express themselves through writing in these ways.
Otherwise, teaching strategies will change to focus on what artificial intelligence creates, Bolinger suggests.
For example, ChatGPT can create book summaries, persuasive essays and original poetry. Rather than have students, as they have in the past, create these documents, educators will need to shift toward evaluation of the AI-created product — evaluating it for how closely it matches the objectives of the assignment.