Abandoning AI is abandoning students
AI is here to stay, and it’s time for higher education to catch up. University leaders have an obligation to ensure that effective AI use is embraced on campus. Institutions have a legal obligation and a duty of care to prepare their learners for the ‘real world’; that now includes effective, principled AI use in the workplace and in learners’ personal and professional development.
As a multi-hundred-billion-dollar industry, AI is not going anywhere.
First and foremost: higher education is intended as preparation and a launchpad to a career – whether further in academia or in the private or third sectors. AI use is now a core component of business operations, and, at a minimum, students should graduate with baseline knowledge of how to use AI.
In practice: 75% of respondent companies said they are likely to adopt AI technology, according to the World Economic Forum’s 2023 Future of Jobs Report. And, according to MarketWatch, the number of employees using AI doubled between 2022 and 2024, with 70% of workers now using AI.

There are practical and advantageous uses of AI, but students should, of course, first learn and possess the required skills – a grasp of the content they need in order to perform.
Many academics are uncomfortable with AI because they view its use as ‘cheating’, claim it lacks ‘rigor’ or hold personal environmental concerns. Concerns around ‘academic freedom’ are frequently raised. But, let’s be honest: academic freedom has never been fully afforded to many nontraditional scholars – notably Black scholars, queer and gender-nonconforming scholars, disabled scholars, and nontraditional speakers or communicators whose approaches to learning and teaching vary distinctly from campus majorities. And enforcing a prohibition on AI use on the basis of environmental concerns stands in direct contrast with institutions’ other daily operations: investments in fossil fuels, international travel and global recruitment, and existing university governance and technology practices (data storage, integration with AI-supported and AI-led office productivity software, and so on).
There are two issues (of many) worth considering here – both can be resolved with swift attention and a commitment from faculty to embrace a culture that is future-forward, not a relic of the past: (1) AI in teaching and learning and (2) AI in assessment. These issues are interconnected, and it is time academics reflect that in their approach to AI governance and classroom guidelines.
Simply put: this new suite of tools is here to stay. It has become vital to workplaces around the world, helping with tasks such as editing and proofreading, media generation, file organization, presentation accessibility, emotional support, cognitive scaffolding, profile design, data reporting, and coding.
The accessibility functions of AI are well-documented, too. AI increases interactivity and support by bringing it directly to learners; they can use AI (whether a formal chatbot or a free or paid tool) to support them throughout assignments, during revision, as they read and study, and as they prepare for assessments. Using AI tools at home and alongside studying is transformative – helping students increase their confidence, independence, and access. AI can support students’ communication with one another and with the academy, as well as with prospective employers – a boost for people from marginalized communities and for those whose institutions provide very limited, if any, support for transversal skills or personal and professional development.
In the UK, more than half of learners report losing marks to assessments that do not account for or support their accessibility needs. And, shockingly, fewer than 40% of students said that “their agreed adjustments were fully implemented.”
A recent preprint by a UK-trained academic explores how AI can enhance accessibility and inclusion, especially for international and marginalized communities. Envisioning AI as a tool to improve students’ (and staff members’) sense of belonging speaks for itself; one would be hard-pressed to find a well-regarded, high-performing, high-impact academic who disagrees with the key points raised in the article.
But the wave is ongoing. According to Cengage, “Nearly half (45%) of instructors are now using AI in their work, up from 24% in 2023, with positive sentiment about the technology rising from 28% last year to 49% in 2024. However, 3 in 5 (59%) instructors report not having a generative AI (GenAI) policy in place for students or are unsure about such a policy.”
AI in teaching and learning
What does AI look like in the classroom?
- Course/module development. Pop quizzes, classroom-based trivia questions, accessibility support (making materials accessible and engaging for diverse learners), note-taking, module planning, and general classroom presentation design.
- Literature reviews/background and contextual research. Seasoned faculty can use AI to support literature reviews of their module content. (We all know those academics whose program or course syllabi haven’t seen any change in over five years!) Academics can refresh and replace outdated content in significantly less time.
- Audio/visual media for teaching and learning. Podcasts, PowerPoint and AI-produced slides, classroom videos, and interactive games for learners.
- In the humanities and social sciences, academics can use AI with their learners to co-design and co-publish materials, such as books and scholarly reflections on their engagement with the scholarship over the academic year – leveraging AI to produce public histories and support a sense of belonging and ownership for learners. The academy is the place for all – regardless of stage – to leave their mark. Envision a world where academics and learners use AI in the classroom to bring students into their work: co-designing and improving lessons together.
Why are students not showing up to class?
I served as a students’ union sabbatical officer, and I worked closely with students and faculty to gauge issues around low student attendance and low student participation. Year after year, I heard academics complain of low student attendance and general apathy, but too often, the onus is placed on the learner to conform or adjust to meet academics’ participation expectations.
Rather than placing the burden on students to conform, academics must create classrooms that are engaging, relevant, and worth attending.
There are a number of reasons why students do not show up to class: boredom, lack of belonging, poor content delivery, employment, commuting costs, and disinterest – a general sense of apathy towards the course.
Scholars of today are consumers. They are spending thousands (if not tens or hundreds of thousands) for their higher education and the access to opportunity it affords. The onus is on academic staff to create an environment in which learners are excited and passionate about learning. Market pressures and increasingly globalized education demand a higher education sector that is flexible and responsive to social and community needs. A static approach to learners today – the current and future business, academic, and social leaders – is academic malpractice. It is also an egregious abuse of the trust that millions of families place in the academy.
Academics have a moral and social obligation to meet learners’ robust needs; if they cannot deliver that, institutions will — and should — bend to market pressures. This does not jeopardize academic integrity; it demands flexible and innovative approaches to processes.
What would compel today’s average learner – who is increasingly financially independent and self-sustaining – to attend a classroom that neither engages, challenges, nor inspires them? That student can instead work and review the weekly literature assigned, along with any accompanying notes or lecture recordings. Today’s classroom must be participatory and engaging; otherwise, what would compel learners to actively participate?
AI in assessments
Modernizing how we assess and track students’ growth
The emphasis in the humanities and social sciences on summative assessment is antiquated. These assessments tend to have little focus on graduate outcomes and professional opportunities.
A mid-term essay and a final-year essay or dissertation do not provide a sufficient picture of student success or grasp of the material. These summative assessments are traditionally scheduled at stages when there is little or no opportunity for corrective intervention.

A Twitter user (@meatballtimes) raised this very simple question: ‘why do students use AI to cheat?’
In the thread, @meatballtimes chronicles some of the reasons why students use AI: assignments are boring and unrelated to students’ interests or desired outcomes, assignments do not engage with the ways diverse learners learn, and external pressures compete more urgently for students’ attention.
Faculty may use AI to develop assignments that support a diverse and highly engaging learning environment.
Formative assessments such as podcasts, videos, exhibitions, blog writing, photography, graphic design, practical experience, and reflective writing can meaningfully transform student engagement and success. AI models can help students manage their time to best focus their attention on the skills needed to excel post-study. (Imagine a second-year student chatting with an AI bot about which classes and extracurricular activities to prioritize to increase their opportunities post-graduation!)
Beyond these efforts, initiatives that flip the narrative and turn students into teachers in the classroom also act as positive interventions that are (mostly, or can be made) AI-proof whilst still maintaining the academic integrity and rigor needed to demonstrate learners’ development.
Academics can integrate AI in developing, monitoring, and executing these processes by putting students in control.
AI-supported programs such as Google’s Gemini and NotebookLM, Canva, Copilot, OpenAI’s Sora, Jenni, Paperpal, Scite, and more all provide low-cost solutions that academics and students alike can use to improve their experience and reduce some of the pressures of a world that is increasingly digital and online – with no going back. The pressures and demands on learners’ lives are unquestionably greater than ever, with 24/7 claims on their time and market pressures that require them to work ever more hours in order to survive.
Ultimately, however, it is about the teaching and learning experience. The current state of affairs suggests that only one of the two parties is satisfied with the current approach to teaching and learning with respect to artificial intelligence: academic faculty.
Faculty’s reluctance to transform and adopt more innovative practices – amid a multi-billion-dollar transformation of social infrastructure – is condemnable. The academy relies on learners to fund its work and sustain communities of research and learning. Much of this resistance stems not from concern over integrity, but from fear, unfamiliarity, and a rigid commitment to outdated norms. Like learners, the academy must embrace growth and change. The market demands it, and increasing pressure from politicians and social leaders demands it, too. The academy still looks far too much like it did forty years ago: few people of color in leadership, a structure predicated on traditional ways of learning developed and reinforced by and for people racialized as white, and near-total disconnection from the working- and middle-class communities whose lives are most affected by the academy’s research and output.
Business leaders, learners, and social leaders must stand firmly against those in the academy who reject its modernization and the adoption of AI. The academy will soon be forced to make a decision about the future of the arts, humanities, and social sciences; the people who make up these disciplines and departments must either work swiftly to innovate and collaborate on future-forward solutions, or become the relics of history when market and social pressures see departments shut their doors for good.

