[Image: Human hand shaking a digital hand]

Embracing AI in Pharmacology Education

Published by:

Dr Mark Dornan, Dr Martina Galeotti, Dr Mark Owens, Fearghal Lewis, Diane Armstrong, Barry Tucker, Lynne Robinson, Sara Sproule, Niall McKenna

Introduction

Artificial intelligence (AI) is rapidly reshaping healthcare, yet its role in higher education is still emerging. At the School of Nursing and Midwifery, Queen's University Belfast (QUB), we sought to embrace this opportunity by embedding AI directly into the Year 3 Undergraduate Applied Pharmacology module, taken by nursing students across different fields: Adult, Children and Young People (CYP), and Mental Health.

Our aim was not simply to “add AI” to the curriculum, but to use it as a teaching innovation that prepared students for the realities of contemporary practice. The module demonstrated how AI can:

  • Support creativity and responsibility – students used generative AI to create patient cases, but had to fact-check and remain accountable for accuracy.
  • Stimulate ethical debate – tutorials provided space to question the benefits, risks, and responsibilities of AI in medication administration.
  • Prepare for real-world practice – students rehearsed consultations where patients presented AI-generated health information, strengthening their communication skills.

By embedding AI within applied pharmacology learning, we demonstrated how digital tools can enrich traditional teaching while building the critical, ethical, and communicative competences that future nurses need.

The overall learning outcomes of the module were to ensure that students:

  1. Developed critical decision-making skills around shared decision-making and patient partnership, supporting effective communication and collaboration with patients, carers, and colleagues.
  2. Developed ethical and legal reasoning in relation to prescribing and administration of medicines.
  3. Built clinical judgement in managing complex drug regimens, polypharmacy, and co-morbid conditions.

The module began with a practical tutorial on medication history-taking. Students were asked to prepare a short patient case in advance, which they could either create themselves or develop with the assistance of generative AI tools such as ChatGPT.

Their case needed to include:

  • Patient information (age, background, relevant conditions).
  • At least three medications (prescribed, over-the-counter (OTC), and herbal).
  • At least one issue such as non-adherence, side effects, confusion post-discharge, or high-risk medicines.

If students chose to use AI, they were required to validate the information against trusted references such as the British National Formulary (BNF) and National Institute for Health and Care Excellence (NICE) guidance. This emphasised that while AI could be a useful tool for creativity and efficiency, clinical accuracy and responsibility remained with the student.

Students either drew on an example they had encountered during clinical placement or, in the majority of cases, used generative AI to create their scenario. Students rotated roles between nurse, patient, and observer. They practised structured medication history-taking, clarified adherence, identified potential interactions, and documented their findings.

During the practical sessions, tutors invited students to present their cases to their peers and asked clarifying questions about drug names, doses, and indications to confirm students’ understanding. This approach not only helped consolidate pharmacological knowledge but also ensured that students recognised the limitations of AI and the importance of fact-checking. Engagement in this activity was consistently high, and students reported that it strengthened their confidence in both medication history-taking and critical evaluation of digital tools. End-of-tutorial reflection questions guided them to consider strengths, challenges, and how to escalate concerns, for example using SBAR to communicate with a prescriber or pharmacist.

These activities highlighted two key principles:

  • AI is a support tool, not a substitute for clinical verification or patient dialogue.
  • Professional accountability cannot be delegated to technology.

In the next phase, students engaged in structured ethical debates on the use of AI in medication administration across adult, CYP, and mental health settings. These sessions moved beyond pharmacological content to focus on professional judgement, ethics, and accountability.

Students considered a scenario in which a hospital had implemented an AI-powered system to automatically generate medication histories by collating GP and hospital records. While the system promised efficiency, it failed to capture non-prescribed medicines, such as herbal or OTC drugs, and raised pressing ethical questions.

The discussions prompted students to explore:

  • Whether these omissions would be recognised if clinicians relied on AI alone.
  • The dangers of over-reliance on AI, such as missed drug interactions, deskilling of clinicians, and loss of patient-centred care.
  • How to balance efficiency with thoroughness, using AI as decision support but not as a replacement for professional assessment.
  • The safeguards required for safe integration of AI, including human verification and staff training.
  • The issue of accountability: emphasising that the nurse using the tool remained responsible for decisions, not the technology itself.
  • How discrepancies should be escalated and documented, with reference to duty of candour and professional guidance.

The debates drew on professional frameworks emphasising that safe administration rests on human accountability (Royal Pharmaceutical Society & RCN, 2019; Shepherd & Shepherd, 2020). Students concluded that while AI could enhance safety and workflow, its integration had to be accompanied by robust governance and critical oversight.

Feedback showed that students appreciated the realism of the debates. They described them as valuable opportunities to grapple with the ethics of innovation, preparing them to manage the opportunities and risks of AI in practice. This is particularly relevant in Northern Ireland, where electronic health records are newly being implemented.

The final innovation prepared students for the reality that patients increasingly present AI-generated health information in clinical encounters. Students took part in a practical session simulating primary care consultations with patients who brought AI-generated summaries of their medications.

The summaries were created by tutors in advance with intentional mistakes. This ensured the exercise was both realistic and safe, and that students had to critically evaluate the information presented. Working in small groups, students were asked to identify errors, compare the AI content with authoritative sources such as the BNF and NHS decision-support tools, and reflect on the implications if such advice were followed. Tutors facilitated discussion, prompting students to consider not only what was incorrect but also how to respond sensitively to a patient who had trusted this information.

In the second part of the session, students applied the BRAN model (Benefits, Risks, Alternatives, Nothing) (Hoque, 2024) to structure a safe, therapeutic response. In groups, they created short summaries or diagrams that could guide a real consultation, reinforcing collaborative communication and evidence-based practice.

Reflection questions encouraged students to think about how to correct misinformation without undermining patient confidence, how to adapt communication for patients with different levels of health literacy, and how to promote digital health literacy by signposting patients to reliable sources such as NHS.uk and NICE.

This activity strengthened critical appraisal, advanced communication skills, and digital health literacy. Students reported that it gave them greater confidence to manage real-world consultations where patients may arrive with AI-generated information, helping them to balance empathy, partnership, and professional responsibility for safe care.

Conclusion

This module showed how AI can be embedded into the undergraduate curriculum in ways that are practical, responsible, and innovative.

Through generative AI scenario-building, students learned to use technology creatively while taking responsibility for validating their learning. Ethical debates gave them the tools to reflect on the legal and moral challenges of AI in medicines administration. Role-play scenarios equipped them with communication strategies for real-world practice, where patients are increasingly guided by digital tools.

In doing so, we moved pharmacology teaching beyond its traditional, content-heavy reputation and made it a space where students could experiment, debate, and practise safely with the technologies that are already shaping healthcare.

AI in education is not about replacing expertise, but about preparing students to engage with it critically and ethically. This work stands as an example of how universities can embrace AI as a catalyst for innovative teaching while keeping professional accountability and patient partnership at the core.

References

Hoque, F. (2024) ‘Shared decision-making in patient care: advantages, barriers and potential solutions’, Brown Hospital Medicine, 3(4), pp. 13–15. doi:10.56305/001c.122787.

National Institute for Health and Care Excellence (NICE) (2021) Shared decision making. NICE guideline [NG197]. Available at: https://www.nice.org.uk/guidance/ng197/chapter/Recommendations (Accessed: 21 August 2025).

National Institute for Health and Care Excellence (NICE) (2024) British National Formulary (BNF). Available at: https://bnf.nice.org.uk/ (Accessed: 21 August 2025).

Royal Pharmaceutical Society and Royal College of Nursing (2019) Professional guidance on the administration of medicines in healthcare settings. London: RPS.

Shepherd, M. and Shepherd, E. (2020) ‘Medicines administration 2: procedure for administration of oral medicines’, Nursing Times [online], 116(7), pp. 42–44. Available at: https://www.nursingtimes.net/clinical-archive/medicines-management/medicines-administration-2-procedure-for-administration-of-oral-medicines-06-07-2020/ (Accessed: 21 August 2025).