[Editor’s note: The following article concludes a two-part series on AI and American colleges. Please click here to read coverage of three possible AI futures.]
The email dropped into my university account with a quiet ding, an inauspicious start to what would become an unwanted foray into the messy world of AI “false positives.”
With the arrival of the short missive from my professor informing me that my essay had been flagged for AI use, I was caught up in what would become a weeks-long process to appeal the flag and clear my name of alleged AI use on an assignment in my graduate counseling course. The ordeal thrust me into the university’s AI-detection administrative machine, where I joined the many others across higher education experiencing a similar false-positive fate. Without wanting to be, I had become part of the broader AI conversation taking place across the university ecosystem.
These discussions often revolve around how AI is disrupting the age-old university-educational model, as well as how universities are attempting to maintain the status quo through the use of AI-detection tools such as Turnitin, which have been roundly criticized as ineffective and expensive. This has led to some major universities pulling the plug on AI-detection software entirely.
The issue is reaching a watershed at the same time that, “for the first time, more college students will take all of their classes online than will take all of their classes in person,” according to a recent projection from the Hechinger Report. A less discussed aspect, however, is that generative AI is putting at risk not only the university-educational model but the entire business model of online-only programs, as well.
Universities are slow-moving creatures by default and are not known for breakneck innovations in educational approach. The university I attend is no different, despite its large online student body and many asynchronous programs, such as the one I am enrolled in. Like most universities, mine is closely governed by federal regulators, whose findings carry implications for financial-aid eligibility. Yet generative AI has made many of their traditional methods moot.
If, for example, federal regulators require me, as a student, to prove my attendance and involvement in an asynchronous class by writing a few discussion-board posts or completing a few assignments per week, then it is important that I actually do that work. If I can use AI to write my posts, then clearly the model is broken: regulators cannot guarantee that it is I who have met the attendance requirements. Enter the band-aid solution: universities deploy problematic, ineffective AI-detection software and attempt to go on with business as usual despite the glaring flaws in this approach.
An alternative solution to this problem would be for universities to re-engineer their curricula and assignments in such a way as to mitigate the use of generative AI. This seems like a logical idea on first pass but quickly loses steam in the details, especially where online learning is concerned.
Universities, as previously stated, are slow-moving vessels and do not respond quickly to external pressures, let alone to those as disruptive and fast-moving as AI. Further, many university leaders report a lack of resources and institutional preparedness as they adapt to a world with AI. A survey released in January by the American Association of Colleges & Universities and Elon University reveals some troubling statistics. While 95 percent of leaders say “the teaching models at their schools will be significantly or to some degree affected” by AI, only 48 percent believe the change will be “significant.” Fifty-nine percent of surveyed leaders believe that cheating has increased with AI, while 41 percent state that faculty are “not very effective” in catching the cheating.
Nor can universities simply capitulate to AI usage without major inconvenience. Most of the surveyed leaders note significant hindrances to AI adoption in the university setting, including faculty unfamiliarity, a general dislike of AI on the part of professors, and concerns about negative educational outcomes.
The issue of curricular innovation is made worse by accreditation standards. Even if universities want to adapt to changing conditions, they are often limited in their ability to do so by strict accreditation and state regulations.
For example, multiple universities have experimented with the seemingly simple idea of a three-year bachelor’s degree, reducing the traditional 120 credit hours to as low as 90 in some cases. Many universities see the merits of the three-year degree, which could help address such issues as flagging student enrollments, the forthcoming demographic cliff, and the costs of traditional four-year programs. While some universities have leapt ahead, many others are on the sidelines waiting to see how accreditors respond.
Would accreditors stand in the way of the kind of radical change that AI could require on campus? Maybe, and maybe not. What is certain is that universities can’t afford to move at accreditors’ glacial pace. OpenAI’s Sam Altman shocked the world with his September 2024 statement that super-intelligent AI could be a reality within “a few thousand days.” Time is not on higher ed’s side.
An additional hurdle for universities is uncertainty regarding how and in what ways they will embrace AI (or reject it completely). This lack of clarity is worrying for students and threatens online schools especially, as they lack the ability to implement the kind of AI-proof, in-person assignments (e.g., blue-book exams) that are becoming more popular at some brick-and-mortar universities. On the one hand, many universities “require” students to forgo the use of generative AI on assignments and deploy AI software to ostensibly police them, even as professors use AI to develop lesson plans and evaluate doctoral dissertations. On the other hand, students look to universities as leaders in AI and are disappointed that they are not receiving training in these skills. Worse, they are often discouraged from developing their own AI knowledge as they try to adapt to a changing job market.
A survey released in July 2024 by Cengage Group showed that 70 percent of graduates want generative AI training in the college classroom, while 55 percent said their degree programs did not prepare them to use AI in the real world. Additionally, 39 percent of respondents expressed concern that AI poses a risk to their future employment. These concerns are echoed by parents of high schoolers, who are applying heightened scrutiny to a college education in the age of AI. Parents want to know whether colleges will give their kids job prospects in an AI-powered market while also offering them something they cannot simply learn online for free.
A survey from the College Guidance Network reinforces these worries. It shows that AI has influenced the opinions of two-thirds of parents about the value of college. Thirty-six percent of responding parents are concerned about the AI-skills curriculum at universities, and 62 percent had discussed AI and its effects on the future of work within the previous two weeks. Universities are feeling the pressure to demonstrate their value in a definitive way.
The university-educational model and the online business model are under duress like never before. The much-discussed demographic cliff is upon us, and the rising cost of university tuition is forcing frank conversations about the value of college among students and parents alike. Students increasingly look to universities to lead from the front on AI use but are either discouraged from doing so or caught up in bureaucratic turmoil. For universities to remain relevant during this generation-defining technological shift, they must emphasize curricular change and full AI acceptance rather than pour precious time and resources into the lost cause of AI policing. But even that may not be enough.
John Stuart is a master’s student in clinical mental-health counseling.