
Opinion: Primary Colour

Helen Salisbury: The role of artificial intelligence in general practice

BMJ 2025; 388 doi: https://doi.org/10.1136/bmj.r336 (Published 18 February 2025) Cite this as: BMJ 2025;388:r336
Helen Salisbury, GP, Oxford
helen.salisbury{at}phc.ox.ac.uk
Follow Helen on Bluesky @helensalisbury.bsky.social

How much will our working lives as GPs be changed by artificial intelligence (AI)—and should we be worried about losing our jobs? Patients will continue to need empathy, physical examination, and human connection, so we’re likely to be employed for a while yet. However, AI is already here in the back offices of many practices, supporting document management and triage systems, and increasingly in the consulting room too.

Some GPs are using “ambient scribing” AI systems that “listen” to the consultation and produce a written record of what they consider to be the medically relevant features. These, according to the sales pitches, free the doctor to focus fully on the patient in front of them, reducing the time spent typing. For myself, I’m wary, and I worry that any time saved would be spent proofreading and editing. For simple consultations the AI-generated summary may be adequate, but such consultations aren’t time consuming to document anyway. With more complex interactions—around mental health, family problems, or drugs and alcohol—the choice of which details should go into the notes is a conscious set of decisions on my part after each consultation. This is particularly important given relentless pressure from the NHS to share our GP record more widely across health and social care.1

In addition to scribing tools, there are suggestions that AI systems listening to the consultation will offer advice, derived from other consultations they’ve encountered on the same topic, prompting the clinician to ask about key symptoms they may have missed and thus improving history taking and diagnostic precision.2 We risk a dystopian future, possibly only a few years away, in which that clinician is someone who has been trained in physical examination and communication skills but actually knows very little science or medicine, relying on the AI system in their headphones to feed them the next question and make sense of the answers.

The healthcare worker will be able to take your temperature, examine your abdomen, and translate the bot’s conclusions into reassuring language, but they’ll lack the knowledge to challenge it when it reaches the wrong conclusions, which it inevitably sometimes will. Indeed, we might expect errors to multiply over time as any consultations conducted by expert primary care physicians form a progressively smaller proportion of the database on which the AI is founded.3

Although one tech company’s claim that AI would replace GPs by 2027 clearly won’t be realised, we should remain on our guard, as the tools sent to help us may deskill us and stop us thinking for ourselves.4 The Additional Roles Reimbursement Scheme promised us human assistants who definitely weren’t going to replace us or fill the role of doctors, yet they did.5 6 There are already marked inequalities in patient access to GPs, and AI is likely to exacerbate this trend.7 In the future, privileged patients may still see a real doctor, but the poorer ones will be served by cheaper and less trained clinicians with a bot in their earpiece.
