The Problem with AI Adoption in Higher Education

Eric Sandosham, Ph.D.
6 min read · May 19, 2024


Learning with AI instead of about AI.

Photo by Leon Wu on Unsplash

Background

Academic professors can’t seem to wrap their minds around having AI in the workspace. In fact, the vast majority of them actively resist, and some outwardly reject, any kind of serious AI implementation that can impact the lives of students, faculty, or administrators. I’ve had friends in academia bemoan the use of Gen AI (pre-ChatGPT days) by students to ‘assist’ in their assignments. These friends were disgusted by the students’ lack of integrity. On the contrary, I told my friends that I would hire such students because they clearly knew how to be more productive at work; they would be an asset to the workforce. And therein lies the problem. Academia is in the business of generating and disseminating knowledge. Even when universities teach applied courses, the focus and assessment still prioritise knowledge acquisition. Even Stanford and MIT, which spin out some amazing application-based start-ups, are simply monetising knowledge. Students do not acquire skills. Faculty do not acquire skills.

Just two to three generations ago, the corporate world was more than willing to accept the handover and provide job-specific skills to fresh graduates. In fact, they distinguished their corporate brands on that very activity. Universities provided the inputs on critical thinking and conscientiousness, which were then harnessed and honed into corporate assets. But now, the corporate world is increasingly looking to the universities to produce work-ready graduates. Fair or not, that is the new reality.

And so I’m dedicating my 39th weekly article to deconstructing the challenges of AI adoption in higher education.

(I write a weekly series of articles where I call out bad thinking and bad practices in data analytics / data science, which you can find here.)

Ready for Work

In the past, being work-ready meant familiarity with ‘office automation’ (e.g. Excel and PowerPoint); then it became internet and web literacy; and now it is about leveraging AI. In each of these paradigm shifts, it took longer than desired for universities to embrace the technologies in their campus environment. A good example is the digitalisation of academic libraries. Even in 2010, there were discussions within academic circles as to why digital libraries were a bad idea, at a time when Wikipedia was thriving.

There are academics who think they are asking good questions, such as “What is the evidence that integrating AI into student life will improve learning outcomes?” But in fact, this is not a useful question. Academics approach most things the way they approach research: “Show me the evidence.” And this only reinforces the ‘ivory tower’ perception from the non-academic world. To be work-ready, students must be able to develop critical thinking and acquire knowledge in an environment akin to the one in which they will themselves be working. The campus environment cannot be an isolated ‘fishtank’. Graduating would otherwise be like releasing captivity-bred wildlife into the natural environment: most will not thrive and many will die.

Testing the effects of technology adoption in the campus environment can be a nice research activity, but universities cannot, and must not, wait until the research is available to proceed with adoption. They need to separate the work they do in knowledge acquisition (i.e. research) from the business of higher education. It’s a catch-22: the evidence can only be generated after adoption, so universities cannot wait for research in their own domain to conclude before implementing changes.

Instead, academics must spend the time to understand the evolving work environment outside of academia. If employees are using digital devices to interact, learn and manage work, then the students need to do the same in their campus life so that post-graduation integration into the workplace becomes more natural and effective.

Learning with AI

There is a fundamental difference between learning with AI and learning about AI. Universities are prepared to learn about AI, but not with AI. Yet modern universities can no longer afford to teach about innovation and technology (the underlying foundations and breakthroughs) without integrating that technology into the classroom. Technology cannot be taught without direct experience. How many Marketing professors can actually do real Marketing in the corporate world? I don’t know what that percentage is, but I’m prepared to wager it’s a really, really small number. These professors could get away with teaching Marketing knowledge rather than Marketing skills and competencies, but it would be hypocritical to do the same with technology topics. Students deserve better.

If the corporate workplace is looking to use AI to reduce the friction in “seek & search”, to summarise research and information, to conduct first-level analysis, to reduce uncertainties in decision-making, to automate decisions, … then students need to experience that in their campus life too. It’s the only practical way for them to appreciate the technology, develop deeper awareness of and curiosity about it, and learn to apply the tools in novel ways.

There are so many possibilities for integrating AI into campus life. Consider the solution category of recommendation engines. Universities can leverage this mature AI solution to improve course selection, increase the relevance of reading and research materials, choose better extra-curricular activities to achieve the desired campus life experience, make better internship choices, and so on. Any time a choice is involved, we can frame it as a matching problem and optimise it with a recommendation engine, as the sketch below illustrates. Students’ interaction with these recommendation engines will teach them how to fine-tune and improve the outcomes through better quality of data inputs and feedback.
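To make the matching framing concrete, here is a minimal sketch in Python. It assumes a simple content-based approach: a made-up tag vocabulary, a hypothetical course catalogue, and a hypothetical student profile (none of these names refer to a real campus system). Each course is scored by the cosine similarity between the student's interest vector and the course's tag vector, and courses are ranked by fit.

# Minimal sketch: course selection framed as a matching problem.
# A student's interests and each course are encoded over the same tag
# vocabulary; cosine similarity ranks courses for recommendation.
# All names and data here are illustrative, not a real campus system.

import numpy as np

TAGS = ["statistics", "programming", "marketing", "design", "ethics"]

# Hypothetical course catalogue: tag weights per course.
courses = {
    "Intro to Data Science": np.array([0.9, 0.8, 0.1, 0.0, 0.2]),
    "Digital Marketing":     np.array([0.3, 0.2, 0.9, 0.4, 0.1]),
    "AI & Society":          np.array([0.2, 0.3, 0.1, 0.1, 0.9]),
}

# Hypothetical student profile, e.g. built from past courses and feedback.
student_profile = np.array([0.7, 0.9, 0.0, 0.1, 0.4])

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two tag-weight vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Rank courses by similarity to the student's profile.
ranked = sorted(courses.items(),
                key=lambda kv: cosine(student_profile, kv[1]),
                reverse=True)

for name, vec in ranked:
    print(f"{name}: {cosine(student_profile, vec):.2f}")

A production recommendation engine would of course draw on richer signals (enrolment history, collaborative filtering, explicit feedback loops), but the framing is the same: represent the student and the options in a common space and rank the options by fit.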

Raise the Bar with Gen AI

Generative Artificial Intelligence, or Gen AI for short, is probably the technology that causes the greatest discomfort among academics. In essence, Gen AI can accurately summarise readings, write essays and even perform exploratory analysis on data. Professors feel that this significantly reduces the effort of learning, which they perceive as a bad thing. I completely disagree.

Let’s consider the domain of writing. Broadly speaking, there are three different objectives to writing: to regurgitate information, to make a differential argument, or to illuminate conscious experience. Gen AI should replace the first, as there is minimal learning to be had in that activity. Gen AI should be used as a research or assistive tool for the second, as it brings efficiency without sacrificing the learning outcome. And Gen AI will be useless for the third, since it is not designed for individuality (Gen AI is an amalgamation by design). Allowing students to leverage Gen AI in their assignments exposes them to these realisations, which is itself a valuable, work-ready learning outcome.

And because the use of Gen AI in student assignments is encouraged and expected, professors must raise the bar on the passing grade, just as calculators and computers raised the bar on the passing grade for mathematics: no one expects students to perform computations by hand. The default should be to encourage the use of computer technology to get work done.

Conclusion

Academic professors are trying to get their students to behave like them: they want their students to develop a love for knowledge. But that is the wrong objective. The purpose of higher education should be to create work-ready graduates. That is the expectation that higher education must rise towards. And this means immersing students in environments that are similar to corporate workspaces, so that the learning is representative and leverages the expected technology of the day. Asking whether these technologies are beneficial to learning can be a topic of research interest, but it is ultimately a pointless question, since the modern workplace has already committed to using them. Universities simply need to adopt these technologies as much as they can, and adjust by raising the bar on what is considered an acceptable learning achievement.


Written by Eric Sandosham, Ph.D.

Founder & Partner of Red & White Consulting Partners LLP. A passionate and seasoned veteran of business analytics. Former CAO of Citibank APAC.
