Putting the AI in Academia

USC Upstate faculty are exploring ways to responsibly incorporate artificial intelligence in the classroom.

By Elizabeth Anderson


Since breaking into the public consciousness a little over a year ago, generative AI has been impossible to ignore. AI has powered many forms of technology for some time, but the release of ChatGPT – a type of generative AI that allows users to interact with a chatbot – put a powerful tool into the hands of anyone who was interested in using it.

USC Upstate is among the universities now grappling with AI’s potential to transform the workplace and, by extension, the schools preparing students for the careers of the future. ChatGPT in particular is driving a re-examination of curriculum and teaching methods to ensure graduates develop competency in a fast-evolving technology.

“We are preparing students for a very vibrant, cutting-edge, forward-facing technological economy,” says Celena Kusch, executive director of the Center for Academic Innovation and Faculty Support at USC Upstate. “We know that the places that our students are going to get careers need us to stay on top of these things.”

AI on college campuses is not new. Chatbots, for example, are embedded on many university websites to answer admission questions. Plagiarism detection sites, virtual test proctoring, and early alert systems for academic underperformance are other types of commonly used systems that include AI.

Now educators are looking for ways to extend the technology into the classroom. Because many students are already experimenting with ChatGPT on their own, several faculty members have begun incorporating lessons into their classes to help students learn how to use the technology effectively and ethically.

Talking heads

AI has been part of the training process at the nursing school for a few years already. The simulation lab uses AI-powered software that can be programmed to produce specific symptoms. When nursing students are checking vital signs, the simulators can produce different heart rates, blood pressures, or respiratory rates, says Logan Camp-Spivey, director of simulation. Or if students are learning to recognize signs of a heart attack or breathing problems, the program can be changed to show those symptoms.

“We can really adapt all of our simulators based on what content is being covered in the course,” Camp-Spivey says.

The simulators can also provide simple responses to questions posed by students. While the technology is fairly basic right now, Camp-Spivey says she’s already seen news about models that will allow for in-depth conversations between student and simulator.

AI technology has improved virtual simulations as well. Based on the action a student takes, a program will adapt to produce the outcomes that would follow. That helps the student learn, rather than just following one path that automatically provides the right outcome, Camp-Spivey says.

Since simulation is an important training tool for future nurses, providing a safe way to make mistakes and learn from them before going into a clinical setting, Camp-Spivey is excited about AI’s potential to make the experience even more realistic.

“I think AI is something we need to embrace,” she says. “It’s learning how we as educators can use it in a positive way for our students to enhance their learning and make it more meaningful.”

Beyond the bedside

Assistant professor Kristi Miller, who teaches nursing informatics, makes sure her students gain an understanding of how AI can be used to improve patient care. Because AI is capable of collecting and analyzing huge amounts of data, it can identify areas for improvement and assess the effectiveness of any changes. For example, AI could analyze data to determine if changing the way nurses take temperatures helps reduce infection rates at hospitals, Miller says.

AI also has the potential to help nurses provide very individualized care. If you’re working in a hospital that specializes in cancer treatment, you could use AI to analyze years of patient records to see which kinds of treatment have worked best for people of specific ages, races, and genders – a task so vast no human would have time to do it, Miller explains. With that information, medical professionals could improve care and make it more equitable.

“So to me, what we really need the future nurse to be is someone who has a dual degree in computer science and nursing,” she says. “Because what we have right now is people who know nothing about health care designing our electronic health records and data collection. Nurses need to be involved in designing the interface.”

Miller is also teaching students how to use AI to analyze scans, so they can explain the benefits to patients. One advantage is improved diagnostic accuracy, since AI can sometimes catch something a technician may have missed.

“Everyone thinks the main role of nurses is giving medication and wiping butts, but our main role is actually assessing and educating the patient,” Miller says. “So we’re trying to teach the public what AI is and how it can help them.”

Miller notes that as with any technology, AI is a tool that can be used to improve how nurses do their job and care for their patients. It’s not a replacement for nurses. “The thing we really have to emphasize to people is that there’s always got to be a human involved,” she says. “AI isn’t taking over the world.”

Think about it

Associate professor Wei Zhong, who teaches computer science, agrees with the sentiment. “Right now, people think AI can think, but that’s memorization,” he says. “It’s just learning from patterns. I tell students, creative thinking, that’s where we excel.”

Zhong offers one-hour labs in his AI courses that allow students to experiment with AI and become familiar with technologies such as facial recognition and object detection (like the alert systems used in cars). Students also work on projects using the ChatGPT API, which allows developers to integrate ChatGPT into a product or service, such as ordering food or answering common customer questions.

“It’s very application based,” Zhong says. “I’m not going to teach them theory. The goal is to get them interested.”
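
To give a sense of what such a project involves, here is a minimal sketch of a chatbot that answers menu questions for a hypothetical cafe, written in Python against the OpenAI chat completions client. The cafe menu, the answer_customer helper, and the model name are illustrative assumptions, not details from Zhong’s course.

    # Minimal sketch: wiring the ChatGPT API into a small customer-service helper.
    # Assumes the OpenAI Python client (pip install openai) and an API key stored in
    # the OPENAI_API_KEY environment variable; the menu and model name are placeholders.
    from openai import OpenAI

    client = OpenAI()  # reads the API key from the environment

    MENU = {"veggie wrap": 7.50, "chicken bowl": 9.00, "iced tea": 2.25}

    def answer_customer(question: str) -> str:
        """Send a customer's question to the chat model along with the menu as context."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; any chat-capable model would work
            messages=[
                {"role": "system",
                 "content": ("You are a polite assistant for a small cafe. "
                             f"The menu with prices is: {MENU}. "
                             "Help customers order and answer questions about the menu.")},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(answer_customer("What can I get for under eight dollars?"))

In a real product, the ordering and payment logic would live in the application itself; the API call supplies only the conversational layer, which is the kind of integration Zhong describes.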

He also wants students to understand the limitations of AI. While ChatGPT might be able to write code better and faster than a human, Zhong says it’s not perfect. “Every code has a reasoning behind it,” he tells his students. “Always think about the why.” His assignments incorporate that critical-thinking element, so students analyze why a piece of code works, or doesn’t, or whether there’s a better approach than what ChatGPT has generated.

While coding is an area that will likely see job losses in the future because of AI, Zhong believes there will be a growing demand for workers with ChatGPT API knowledge. With so many new online services constantly in development, knowing how to incorporate the API into software will likely be an important skill, Zhong says. “You don’t need to understand the theory behind it, but you can know how to use it to make software smarter,” he says.

Testing the waters

Other faculty at Upstate are learning how to use AI alongside their students. When Uma Gupta, associate professor of business analytics, decided to start teaching about AI this past fall, she reached out to Kusch to find out if there were others on campus already using AI in their classes. A small group formed so faculty could learn from each other, trade ideas, and identify areas for professional development.

Gupta admits that teaching generative AI has been a “sobering experience.” The amount of information available is overwhelming and requires a different approach than what faculty are used to, she notes. “In other disciplines, you can say, OK, here is the foundation, here are the fundamentals, and build on it,” she says. “But with AI, the fundamentals are shifting every day.”

One of her students had to go back and revise their research paper two days after completing it to incorporate new information that had just emerged. Gupta says she advises her students to be fearless, because the technology isn’t going away and they have to learn how to use it.

Just as AI stands to transform the health care industry, it has major potential for business as well. The ability to analyze big data and look for patterns that can improve operational efficiency, or the transportation of goods, or product safety, makes AI an extremely valuable tool for industry, Gupta notes. “So we’re talking about machine learning, where the machine is learning how to do things far better than what humans can do,” she says.

That means the traditional role of a data analyst is likely to change. Gupta wants her students to be able to evaluate their skills as the technology continues to evolve, so they can add value above what AI can do. “You won’t be able to keep your job unless you do,” she says.

To help her students explore AI, Gupta has her class research chatbots – what they are, how they can be used, how they might be applied in their careers. One student, for instance, looked at how chatbots can be incorporated into music therapy to improve a child’s well-being. Another looked at applications in manufacturing. “They could not have done that if I went into class and said, let me teach you about chatbots,” she says.

Balancing act

When ChatGPT became widely available, many initial conversations in academia focused on the potential for cheating – students asking AI to write their papers for them, for example. But discussions have gradually pivoted to how AI can be incorporated into class assignments so students understand its limitations as a shortcut for classwork.

Kusch, who also teaches English, says one assignment she likes to do is have students prompt ChatGPT to write a poem in the style of someone they’re reading in class. The results are always laughably bad, but that’s a perfect opportunity to employ critical thinking skills, Kusch says. “Why is AI so terrible at writing poetry? What makes the real poetry so much better?”

She and other professors in the department also have students use ChatGPT to do background research for a paper. Then the students have to analyze the results ChatGPT has provided and compare them to the research they’ve done on their own. “So it eliminates the temptation to use the process in unauthorized ways, and it prompts them to think about, what time savings did you get out of this?” she says.

Kusch also sees AI as a tool to provide feedback. Ideally when writing, “it’s so good for students to do peer review, or have a conference with a faculty member to talk about a draft of their paper,” she says. But some students aren’t comfortable with that, or don’t have time during the day to do it. With ChatGPT, however, they can easily ask, “What do you think about this draft? What suggestions do you have? And they can do that in the middle of the night, with no shame,” Kusch says.

Teaching the teachers

Like Kusch, Stephen Bismarck, chair of the department of education, wants his students to learn both the capabilities and shortcomings of AI. In his case, however, the skills he’s teaching students are not only for their personal benefit, but also to pass along to their future students.

The first part of the process is helping his students understand how to effectively use the technology. One strategy Bismarck has found helpful is to have students use AI to develop a teaching unit. For many of them, getting started is the biggest hurdle. “They want to teach a unit on, say, linear functions. And they can look to their textbook and see what’s in there, but I want them to go beyond that,” he says.

Instead, Bismarck has students prompt ChatGPT about a specific topic to see what kind of lessons it comes up with. The students quickly realize ChatGPT is not the perfect solution – the results are usually pretty basic, with instructions such as “find three problems related to this topic.” “But it’s a good starting point,” Bismarck says. “So now that you have that starting point, your next step is to use your expertise to analyze that and make it better.”

As students progress through the semester and learn different methods of teaching math, they return to ChatGPT to get ideas on how to apply each method to hands-on class activities. “So that’s really what I’m utilizing it for – a jumping-off point, a way for students to analyze versus just getting stuck on creating,” he says.

Beyond developing their own AI proficiency, education students are also learning how to teach their future students to use it. Bismarck notes that when he taught his class last semester and asked how many people had used ChatGPT before, about half of them had. This semester when he asked the question, every hand went up. That means the students his students teach are likely using it too. By having his classes take a critical look at what ChatGPT produces, Bismarck hopes they will do the same in the classroom. “It’s sort of a trickle-down effect, where the emphasis is on analysis,” he says.

Something for everyone

AI isn’t just valuable for teaching core subjects, either. In specialized areas, such as teaching multilingual classrooms, the technology provides opportunities to create resources for future teachers that are currently limited or nonexistent.

Refika Turgut, assistant professor of literacy and language education, works with graduate students who teach English as a second language and undergraduates who plan to teach in multilingual classrooms. Because her students represent different majors, Turgut says a “one size fits all” approach isn’t helpful for them.

Physical education can be a particularly challenging subject to find material for, since there aren’t many resources for teaching multilingual PE classes. With ChatGPT, Turgut can enter a few prompts to get recommendations of research-based strategies her physical education majors can use. The result is an article framework she can develop herself. And she can do the same for every major in her class, so every student has something to read that’s specific to their subject area.

And the customization doesn’t end there. Turgut can create specific scenarios that her students might encounter – a sophomore math class with Russian-speaking students who are new to the country, for example – and see what kind of strategies ChatGPT suggests in those situations.

“I’ve heard from teachers who say, ‘I can’t differentiate my teaching based on all these different types of learners,’ so they might give up or say it’s too hard,” Turgut says. “But now, if they know how to use AI as a tool, it will make it easier for them to design activities or to get suggestions for instructional practices they can institute in their classroom.”

Turgut was quick to explore AI’s possibilities in the classroom after the release of ChatGPT. She realized immediately it was something faculty would have to incorporate into the classroom, and by extension something future teachers would need to know how to use.

“As a teacher-educator, I’m really excited about using AI, and changing my courses, making them better and making them more relevant for my students and for teachers,” Turgut says.

Being human

Faculty across most disciplines agree that the curriculum of the future is likely to look very different from how it looks today. That raises multiple issues that universities across the nation are beginning to grapple with, from what types of policies to put in place to govern the use of AI to who will help develop them. Without any federal regulations yet on ethical use of AI, institutions are mostly on their own in setting up guidelines.

Also in the background is the recognition that some entry-level jobs in fields students are preparing for will likely be replaced or greatly reduced through use of AI. White-collar jobs that are typically a starting point for graduates, such as computer programming, medical records coding, customer service, and legal research, are among those expected to see an impact from increased use of AI. The result will be a higher bar to meet for entry-level work, says Jess Stahl, vice president of data science and analytics with the Northwest Commission on Colleges and Universities.

“Part of preparing students for this is going to be your own deep understanding, and you staying current on how AI is impacting your discipline and its various career pathways,” Stahl told faculty during a recent virtual presentation.

Kusch says this doesn’t mean every faculty member needs to be an expert on AI. More important is ensuring there are enough faculty within each department who are excited about and engaged with the technology and can help students learn how to use it.

And while some jobs are likely to disappear because of AI, Kusch sees definite limits to that. Nurses provide a level of personal care that cannot be replicated by a machine, and ChatGPT will never be invested in a student’s success the way a teacher is. Nor are virtual experiences of the arts the same as sharing them in community with others. As Kusch sees it, “What we can do better than the most advanced technology is to be human.”
