Graphic of green text spelling ‘UNC’ (Graphic by Adrian Tillman via UNC Media Hub)
Lesley Gonzalez via UNC Media Hub
UNC-Chapel Hill has pushed full steam ahead with its approach to AI as the campus seeks to serve as an example for other higher education institutions.
The UNC Generative AI Committee, run by Director of the Writing and Learning Center Kim Abels, has issued guidelines that faculty, staff and students can reference while using AI on campus. But the committee’s work stops short of setting policies for the whole campus, according to the following statement on the UNC Office of the Provost’s website: “The committee’s charge does not include developing university-wide generative AI policies but serves to support each school develop its own policies appropriate to its unique circumstances.”
Mike Barker, vice chancellor of IT and chief information officer, has been working closely with Information Technology Services and other campus departments to bring the AI discussion to the forefront and work on getting campus engaged with the new technology.
“The chancellor is engaged, the provost is engaged, deans and academic leaders are engaged. It’s an exciting time to be in this particular space at the university,” he said.
As campus leaders collaborate to craft functional university rules around AI, students and staff have voiced both support for and concerns about the use of AI at the school.
Tom Collopy, professor of the practice and the entrepreneur-in-residence for the Shuford Program in Entrepreneurship, teaches ECON 327: Venture Building using ChatGPT-4. The program is hosted by the College of Arts and Sciences and is built to empower students by giving them entrepreneurial experience and mentorship.
“In my entrepreneurship class, I want them [the students] to come out with tools they can use no matter what they’re doing. And I think ChatGPT’s that kind of tool,” he said.
Collopy’s class follows his textbook and dives into how to use ChatGPT to the fullest, teaching students how to develop better, more effective prompts in order to support their entrepreneurial goals. Everything done with the program is documented and submitted in order to keep track of what is AI-produced and what is contributed by the student. This is in accordance with the Generative AI Committee’s policy that the use of AI must be open and documented.
“I feel like we’re at that point where I think people are cautious of [AI], you know, but, if you allow people to try experiments and see where the bounds should be, I think that’s good,” Collopy said.
Aastha Modi, a business administration major from Morrisville, is a student in Collopy’s class, and has been exploring AI in many of their other classes as well. In a previous semester, Modi conducted market research into how AI is used in movie production, and has since continued to look into AI as a tool across different fields.
“I know that it’s a hugely growing field right now. I feel like knowing how to use it now would be beneficial,” they said.
Modi doesn’t tend to use AI in their daily life, but has come to be increasingly familiar with it through Collopy’s class and other professors who have encouraged its use.
“I’m in an ECON 410 class right now, and I struggle a lot, and sometimes I’ll use ChatGPT to explain concepts to me when I don’t have people around me who can,” Modi said, “but I don’t really see a use for it in a creative space.”
“There’s a lot of activity, a lot of discussion about what are the appropriate use cases for AI,” Barker said.
For example, Collopy restricts students from using AI on class quizzes.
“Everything that they’re doing has a foundation in ChatGPT in terms of learning, it’s just the quizzes are the only limitation of ‘can’t do that.’ If they [the students] did it, it wouldn’t actually help them,” Collopy said, “because the quiz questions are designed to be specific to what was read. It would just take longer.”
Adjunct instructor Lee Meredith of the UNC Hussman School of Media and Journalism has also started to embrace AI use in his classes, particularly his Media Management class. He encourages students to experiment with and report their AI use on things like research projects and essays.
“This (AI) is with us, it’s going to be with us. And the quicker we learn how to use it, the quicker we learn the attributes and the benefits of AI. The quicker we tap into those, the better off we’re going to be,” Meredith said, “and also, I think, in lockstep with that, the quicker we learn the dangers that are associated with the use of AI. You know, you can’t learn about the dangers if you’re not engaging. And so I do want my students to engage.”
Meredith believes that the current guidelines for AI are well crafted and support appropriate use of AI in the classroom. He believes more official policies should be made with specificity in mind to help faculty and students alike keep up.
“I don’t think the process should be so onerous that people are tempted not to comply. Or even worse, that people are tempted not to use AI because the compliance process scares them off. I think we want to encourage the use of AI, and we do want to document its use appropriately,” Meredith said.
Meredith has also used AI to help get feedback on his lectures and bring some art into his syllabus. Running his lecture points through generative AI programs for feedback has helped him develop some new ideas, he said.
“I don’t rely on AI as the final source for much of anything,” Meredith said, due to AI’s occasional “hallucinations” that can provide inaccurate information and false sources.
Despite this, Meredith greatly appreciates how AI can be used in the creative sphere.
“I have no art skills whatsoever. So, I have creative ideas. I could think of something that would be cool,” he said. “Could I personally implement that as an artist? No, no way. But now with AI, you know, I can do some things.”
Overall, the current goal expressed in campus discussions is to familiarize students and staff with AI as much as possible.
“We’re in a phase with this where we’re providing the opportunity for people to become more familiar,” Barker said of the university’s work to improve AI awareness and literacy. “Broad familiarity is a win, so [we’re doing] things like exposing the chat-like interface to GPT-4 via Microsoft Copilot.”
Copilot is Microsoft’s recently launched generative AI tool that ITS on campus has made accessible to all students, faculty and staff. It’s meant to help those at UNC-CH have a secure AI resource to experiment with and improve understanding and familiarity.
This may help students like Savannah Noel, a sophomore from Utica, New York, who majors in biology and psychology. She has used other AI tools as guides for her study sessions, but isn’t completely sure about campus boundaries related to their usage.
“I honestly don’t know too much about UNC-CH’s policies and their restrictions on AI. But I know a lot of professors are strict with anti-AI and all of that stuff, but I don’t know the details behind it, and to what extent,” Noel said.
Barker and other campus leaders are working to clear up these unknowns for students. As professors continue to develop standards for AI in class, students are picking up their own routines for AI usage in their academics.
The university has come together to break down how AI affects different areas of campus life, and as the discussion develops over the coming years, UNC is working to stay at the forefront of it.
“People are going to use it anyway, truthfully, so if we teach them the right ways to use it, it’ll benefit everyone in the long run,” Noel said.