Much like the arrival of the World Wide Web in the 1990s, the introduction of AI into the culture has been met with fear, excitement, and a rush to keep up. AI is already reshaping daily life, the workplace, and now universities themselves. In a short span of time, tools such as ChatGPT, Runway ML, and others have changed how students write and learn, how professors teach, and how employers evaluate skills. AI can be a powerful tool for efficiency but may erode the integrity of learning itself. The world of higher education now has a choice: Should it resist, regulate, or embrace this new technology?
For Peter Hans, UNC System president, the only choice is to adapt accordingly. In a speech to the Board of Governors in September, Hans struck a hopeful yet cautious tone, remarking, “I suspect the AI story will play out in a similar muddle somewhere between utopia and annihilation,” a reference to how society once grappled with the rise of the internet. He outlined an ambitious vision in which North Carolina universities lead the charge for AI education and equip students with the skills they need to enter the job market. The plan includes partnerships with companies such as Google and Microsoft, an AI Advisory Council, and the launch of a systemwide “AI Skills Module.”
Hans’s vision mirrors a broader statewide push to integrate AI responsibly. The N.C. General Assembly recently introduced House Bill 1004, which allocates $16 million for “AI Hubs” across certain UNC System institutions. The initiative aims to bring AI into the classroom. In September 2025, Governor Josh Stein signed Executive Order No. 24, which established the North Carolina AI Leadership Council and an AI Accelerator within the Department of Information Technology. These bodies are tasked with ensuring “North Carolina’s leadership in AI literacy, governance, and deployment,” according to the governor’s press release. The UNC System is following suit, thus signaling a coordinated effort between policymakers and higher education to make North Carolina a national model for AI education.
As the job market evolves, universities have a responsibility to align education with workforce demands, ensuring that students gain real value from their degrees—in other words, a clear path to employment. According to Hans, the UNC System’s AI projects will help guarantee as much. Still, the news is not all positive. Hans warned of AI’s “tempting invitation to mediocrity” and the danger that it could deepen student isolation and anxiety. His remarks captured one of the central questions facing higher education: Can universities use AI to enhance human learning, or will they slowly teach students to outsource the very thinking that they were built to cultivate?
Hans’s remarks focused on three main points: access, coordination, and preparation. First, the System Office team has coordinated licensing agreements with AI providers such as Google, Amazon, OpenAI, and Microsoft, ensuring faculty, staff, and students will have access to the best technology and tools. The agreement with Google makes the UNC System one of the first to provide career certificates, allowing students and staff to sharpen their AI credentials across 17 institutions.
Second, the UNC System Office will focus on coordination across the system, with schools sharing best practices. Currently, the UNC System is working on hiring a chief AI officer to “oversee AI strategy” and integrate best practices. A new AI Advisory Council is also being formed to help identify needs, find productive uses of AI, and coordinate the exchange of knowledge across institutions.
Third, the system will emphasize workforce preparation, a core component of universities’ obligations. Hans announced plans to launch a systemwide “AI Skills Module” by early 2026, designed to make sure UNC students gain essential technological skills for an economy “fundamentally changed by AI.” According to Heather McCullough, director of learning technology and open education for the UNC System, the module will comprise six sections: introduction and goals; AI fundamentals; benefits, risks, and academic integrity; responsible application; practical exercises; and industry-focused approaches. McCullough said the system anticipates that a biannual update cycle will work well as technology changes.
Across the UNC System, campuses are already experimenting with AI in meaningful ways. NC A&T, UNC Charlotte, and NC State are leading AI research-and-development projects that explore how technology can drive innovation in science, business, and education. At NC A&T, the AI Accelerator initiative gives faculty, staff, and students hands-on AI training. UNC Charlotte offers an AI graduate certificate, which teaches AI proficiency online. And NC State is incorporating AI into the classroom through its Advancing AI in Education measures.
East Carolina University, UNC Asheville, and Winston-Salem State University have released statements regarding AI and have started using it. Hans said, “At ECU, there’s a professor assigning students to teach things to ChatGPT, having it play the role of an eager protégé so students can demonstrate their mastery of a topic.” The North Carolina Community College System announced a partnership with Google to offer AI and tech training programs. These efforts show a growing recognition that AI will play a central role in higher education in the future.
AI isn’t just convenient; it can aid in academic discovery. AI tools have the power to analyze data, identify trends, and pursue new theories … quickly. On campus, AI has the potential to streamline tedious processes having to do with scheduling, financial aid, parking, and various administrative tasks so that the main focus of professors can be teaching. Now, more students have access to advanced technologies, such as tutoring support and research tools, that were once limited to enrollees at larger universities or those who could afford them. All these upsides mask a harder question, however: Could AI undermine the purpose of higher education? Integrating AI into higher education shouldn’t be just about advancement and new tools but also about who controls them.
The UNC System’s licensing agreements with major companies seem like a step forward, but they raise some transparency issues. What data will be shared with these companies and under what terms? Will the reliance on corporate platforms shape the direction of curricula and learning outcomes across participating universities? Without clear oversight and accountability, these partnerships risk prioritizing corporate interests over the educational missions and core values of universities.
In 2024, UNC-Chapel Hill launched the faculty-focused Provost’s AI Committee and AI Acceleration Program, designed to guide the adoption of AI across the university. That same year, UNC chancellor Lee Roberts announced a Generative AI Working Group to explore the implications of generative AI. UNC Greensboro has its own “AI Working Group.” Yet, with so many overlapping bodies, it remains unclear who is actually steering the effort or whether these groups are producing more than reports and recommendations. The solution is transparency. To ensure that these groups are effective, the UNC System should clarify the roles and responsibilities of each and make their findings accessible to the public.
Workforce preparation is another major factor. As previously mentioned, the system plans to launch an “AI Skills Module,” which equips students with essential knowledge for a changing job market. A central role of universities is to prepare students for the workforce so they can support themselves financially and develop skills for long-term professional success. McCullough told the Martin Center that “many job reports indicate that analytical thinking, technological literacy, and skills such as creative thinking, resilience, [and] flexibility are going to remain in high demand. We believe this training program will address these key workforce skills.” This ambition is both practical and logical. But should universities define success by how well they meet labor demands or by how well they cultivate independent, critical thinkers?
Ideally, they should strive for both. Nevertheless, there is a danger that overreliance on AI could erode students’ ability to think critically and form unbiased opinions. The risk that students will become passive consumers of machine-produced information rather than actively analyzing, evaluating, and reasoning should be a concern for everyone, especially those shaping AI policies in higher education. The UNC System must harness AI’s practical benefits while safeguarding critical thinking. It can do this through deliberate strategies to preserve independent thinking in the classroom. For example, rather than assigning essays that students might outsource to ChatGPT, instructors could have students read and critique AI-generated articles in person.
Just like the internet and social media, AI has the potential to worsen student anxiety and depression. Hans warned in his speech, “There are too many souls retreating into digital distractions at the expense of real human connection.” The allure of fast-paced tools designed to offload thinking may deepen students’ dependence on screens, which in turn will lessen their human interactions and genuine connections. Already, students spend hours completing coursework, networking, and socializing through screens rather than in person. Disconnection, loneliness, and burnout could be defining features of an AI-driven academic culture. To combat this, universities need to prioritize real human interaction alongside technological innovation, or they risk producing graduates with digital skills but no social ones.
A word that everyone has become familiar with over the last decade is misinformation. With AI, the line between fact and fiction is growing even more blurred. Social media is littered with AI-generated videos and photos depicting celebrities doing or saying things that never happened. For universities, which should exist to promote truth and critical thinking, allowing this potentially deceptive tech to seep in is a dangerous game. There’s a responsibility to teach students how to identify what is real and what is not—or, as Hans said, how to “find true knowledge in a sea of questionable information.” If higher education doesn’t rise to the challenge, then students will leave university without being able to discern truth from deception.
Even optimistic observers should question whether the UNC System is prepared for the challenges that, no doubt, lie ahead. There will inevitably be mistakes, setbacks, and tough questions about what works and what doesn’t. Administrators may have to face ethical dilemmas, policies that need revising, and initiatives that don’t work out as planned. The true test will be whether leadership can course-correct and make changes instead of pushing ahead and calling it innovation. Hans’s vision may create a future in which North Carolina universities lead the country in AI education. If done carefully, with transparency, attentiveness, and a focus on real learning, it could make the UNC System a national example of how to use technology without losing what makes education valuable. But success won’t be measured by how many AI initiatives are in place or which companies the system partners with. It will be measured by whether students leave college curious, disciplined, and able to think for themselves.
Reagan Allen is the North Carolina reporter for the James G. Martin Center for Academic Renewal.