1 of 74

How can doctors use AI ethically and safely?

Deen Mirza

2 of 74

Session Outline

  • GPSI in elderly care, high-volume appraiser, self-help author and medical examiner
  • My interest in AI is as an author (not an IT person or an expert)

  • Ethical dilemmas of integrating AI into medicine
  • Practical approaches to incorporating AI into general practice

3 of 74

Conflict of Interest

4 of 74

Content

  • How to use AI for medical tasks safely and ethically – GMC/MPS
  • Random examples of AI use in daily life/ medical work
  • My own conclusions for the non-techie GP
  • Ambient voice technology
  • Using AI for medical literature (insights from the author industry)

5 of 74

Artificial Intelligence (AI)

  • AI refers to computer systems designed to perform tasks that normally require human intelligence. These include recognising patterns, processing language, learning from data, solving problems, and making decisions.

6 of 74

Computer Vision Pitfall — 'Bald Head = Ball'

  • A televised 'AI' camera tracked a linesman's head instead of the football (Scotland, 2023).

https://www.essentiallysports.com/soccer-football-news-ai-automated-camera-backfires-on-scottish-soccer-club-as-technology-mistakes-referees-head-for-soccer-ball/

7 of 74

GP concerns about AI

Resistance to change

Fear of the unknown

Medical liability

Being replaced

Fear of being left behind

8 of 74

Large Language Model (LLM):

  • A type of artificial intelligence system trained on vast amounts of text data to recognise patterns in language. LLMs use deep learning (neural networks with billions of parameters) to generate human-like responses, summarise information, translate text, or answer questions.
  • They don’t “understand” language as humans do; they predict the next most likely word based on patterns (see the toy sketch after this list).
  • Their usefulness comes from scale (huge datasets + computational power).
  • Safety depends on how outputs are validated, governed, and applied in context (e.g., healthcare vs. consumer use).
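
To make "predict the next most likely word" concrete, here is a toy, purely illustrative Python sketch: a hand-written frequency table stands in for the billions of learned parameters, and the "model" simply picks the highest-weighted next word. It is not how any specific product is built.

```python
from typing import Optional

# Toy illustration of next-word prediction (not a real LLM).
# A hand-written table stands in for billions of learned parameters:
# for each word, it lists candidate next words and their weights.
next_word_weights = {
    "chest":  {"pain": 0.6, "infection": 0.3, "x-ray": 0.1},
    "pain":   {"relief": 0.5, "killers": 0.3, "score": 0.2},
    "relief": {"advised": 0.7, "given": 0.3},
}

def predict_next(word: str) -> Optional[str]:
    """Return the most likely next word, if the word is in the table."""
    candidates = next_word_weights.get(word)
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

def generate(start: str, max_words: int = 4) -> str:
    """Greedily chain predictions together, as an LLM chains tokens."""
    words = [start]
    while len(words) < max_words:
        nxt = predict_next(words[-1])
        if nxt is None:
            break
        words.append(nxt)
    return " ".join(words)

print(generate("chest"))  # -> "chest pain relief advised"
```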

9 of 74

Types of Automation in Medicine: 1. Robotic Process Automation (RPA)

Definition: A technology that enables the build, deployment, and management of software robots.

Key Features:

  • Robots emulate human actions and interact with digital systems.
  • Automates basic manual and repetitive tasks.

  • Examples in Medicine:
  • Automating patient appointment scheduling (see the toy sketch below).

Reference: NHS England’s RPA initiative in primary care https://digital.nhs.uk/
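
As a toy illustration of what "basic manual and repetitive tasks" means in practice (not an NHS RPA product), the sketch below drafts reminder messages for tomorrow's appointments from a hypothetical CSV file; the file name and column names are invented for the example.

```python
# Toy rule-based automation in the spirit of RPA: no learning, no judgement,
# just a fixed repetitive task a person would otherwise do by hand.
# "appointments.csv" and its column names are hypothetical.
import csv
from datetime import date, timedelta

def draft_reminders(csv_path: str) -> list:
    tomorrow = (date.today() + timedelta(days=1)).isoformat()  # e.g. "2025-07-01"
    reminders = []
    with open(csv_path, newline="") as f:
        # Expected columns: patient_id, date (YYYY-MM-DD), time, clinician
        for row in csv.DictReader(f):
            if row["date"] == tomorrow:
                reminders.append(
                    f"Reminder for patient {row['patient_id']}: "
                    f"appointment at {row['time']} with {row['clinician']}."
                )
    return reminders

if __name__ == "__main__":
    for message in draft_reminders("appointments.csv"):
        print(message)  # a real deployment would hand these to an SMS gateway
```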

10 of 74

Types of Automation in Medicine: 2. Intelligent Automation (IA)

Definition: Integration of robotic and intelligent systems from various emerging technologies.

Key Features:

- Expands automation beyond rule-based tasks.

- Integrates multiple AI-driven technologies.

  • Examples in Medicine:

- Automating triage systems in general practice.

- AI-powered clinical documentation tools.

Reference: British Medical Journal (BMJ) – AI in healthcare automation

11 of 74

Types of Automation in Medicine: 3. Artificial Intelligence (AI)

Definition: Simulation of human intelligence or cognitive processes

Key Features: Includes problem-solving, visual perception, speech recognition, and decision-making.

  • Provides predictive analytics for disease outbreaks and diagnosis.
  • Examples in Medicine:
  • AI-powered diagnostic imaging (e.g., radiology, pathology).
  • Virtual health assistants and chatbots for patient queries.
  • AI-driven decision support systems for clinicians.

Reference: General Medical Council (GMC) – AI Ethics in UK Healthcare

12 of 74

Expectations vs Reality 2015–2024

  • Predictions of near‑term autonomy (self‑driving cars; fully automated radiology) did not materialise.
  • Most value today is assistive/augmentative under human supervision.

  • But the pace of development has increased very quickly…

13 of 74

Over‑confident messaging can harm trust when systems fail to meet expectations

  • Babylon Healthcare said their chatbot, Babylon AI, had achieved medical exam scores of the same level as or higher than a human doctor. (27.1.2018)

  • "No app or algorithm will be able to do what a GP does”
  • "Every day we deliver care to more than a million people across the UK, taking into account the physical, psychological and social factors that may be impacting on a patient's health.“

Professor Martin Marshall, vice chair of the RCGP

https://www.itpro.com/bots/31401/doctors-reject-ai-chatbot-that-is-more-accurate-than-a-gp

14 of 74

AXI – my health journey

  • AI start up concept
  • Communication-sensitive, voice-directed AI
  • For low priority cases after triage
  • Self help promotion and safety netting
  • NICE CKS/ NHS guidance
  • Ideas, concerns, expectations
  • Trained on RCGP level actors in role plays

15 of 74

What do the medical councils of America, Australia and the UK say about how medical doctors can safely and ethically use AI?

16 of 74

Australian Medical Association (AMA)

  • "Specific professional obligations to consider include accountability, an appropriate understanding of the tool, transparency of its use, informed consent and ethical and legal issues.“

  • "Medical care delivered by human beings should never be replaced with Artificial Intelligence (AI), but AI technology can potentially achieve improved healthcare."

17 of 74

General Medical Council (GMC)

  • "AI can now also support doctors when reading a variety of different imaging types, including X-rays, CT, and MRI scans, including mammograms.”

  • ‘To help keep patients safe you must:
  • report adverse incidents involving medical devices (including software, diagnostic tests, and digital tools) that put the safety of a patient or another person at risk, or have the potential to do so’

18 of 74

"Navigating the frontier of AI technology in medicine: what's next?"

  • “According to the doctors interviewed, AI systems can save valuable clinical time and enhance decision making. At the same time, these frontline users acknowledge potential pitfalls — among them, data security risks and the possible impact on clinical training.”

  • “They also warn that as AI evolves, lines of responsibility and accountability could become less clear, prompting questions about who holds ultimate responsibility for patient outcomes.”

(GMC Blog post, 12 February 2025; Dr Alan Abraham)

19 of 74

“Doctors using AI share thoughts on its growing use in medicine” (GMC news-archive)

  • (Published 6 February 2025)
  • “Most saw benefits to their efficiency when using AI, seeing it as a way to save or make more use of their time. However, some queried this saying they lacked confidence in the accuracy of some diagnostic and decision support systems, and so spent more time checking the results they received.”
  • “Many said they feel confident to override decisions made by AI systems if necessary, and that ultimately the responsibility of patient care remains with them.”

20 of 74

MPS Casebook article “The uncharted territory of AI, medicine, and the law” (June 2025) Peter J Mordecai

  • “At some point in time, it will go from being reasonable to not use AI to being unreasonable — it is just a question of when.”
  • “I do feel there will be a successful claim brought against a clinician for not using AI when it was available.”

  • “Given AI is relatively new to the world of medicine, I cannot emphasise enough how important it is that you take informed consent when using AI … explaining that AI is new, its benefits and limitations, safeguards, and alternatives.”

21 of 74

MPS Casebook article “The uncharted territory of AI, medicine, and the law” (June 2025) Peter J Mordecai

  • “Putting the wrong data into the AI system may lead to the wrong interpretation … one error could lead to a domino effect that causes the patient harm.”
  • “While we have AI, it is merely a tool. It is not a replacement clinician. A clinician’s judgment is still central to the patient’s care, and it is your judgement the patient will be seeking to rely on, not the AI’s.”
  • “This is an area of concern … clinicians risk becoming a ‘liability sink’ where they are held solely responsible for AI-driven decisions, even if the AI itself is flawed.”

22 of 74

MPS Casebook article “The uncharted territory of AI, medicine, and the law” (June 2025) Peter J Mordecai

  • “Clear detailed documentation must be kept at all times … including a clear record of the consent discussion, review of the AI, and any discrepancies between the AI and your opinion.”

23 of 74

How do I use AI?

24 of 74

Random uses for family tasks

  • Writing a CV for a child
  • Finding open days for secondary schools (CGPT vs Gemini cross checking)
  • Thank you letters

  • (Experiencing criticism from Gen Z)

25 of 74

‘please write a thank you email to Ms XXX of XXXX school for the Prague trip she arranged for a-level art students that my son XXXX went on and really enjoyed.’

  • Subject: Thank You for the Prague Trip
  • Dear Ms. XXXXX,
  • I hope you are doing well. I wanted to extend my sincere gratitude for organizing the recent A-Level Art trip to Prague. My son, XXXXX, had an absolutely fantastic time and returned full of enthusiasm, not only for the art he experienced but also for the beautiful city itself.
  • The itinerary you put together was truly impressive, offering such a rich combination of art, history, and cultural exploration. XXXX especially enjoyed visiting the galleries and the opportunity to sketch in such an inspiring environment. It’s clear that a great deal of thought and effort went into making this trip a memorable and enriching experience for all the students.
  • Thank you once again for your dedication and hard work in making this possible. It is truly appreciated!
  • Best regards,
  • [Your Name]
  • Thank You for the Prague Trip
  • Dear Ms. XXXXX,
  • I hope you are doing well. I wanted to extend my gratitude for organizing the recent A-Level Art trip to Prague. My son, XXXX, had an absolutely fantastic time and returned full of enthusiasm, not only for the art he experienced but also for the beautiful city itself.
  • The itinerary you put together was very impressive, with a nice combination of art, history, and culture. XXXX especially enjoyed visiting the bone museum/church and the opportunity to sketch in such an inspiring environment. He also really liked the crystal glass factory/shop. It's clear that a great deal of thought and effort went into making this trip an enriching experience for all the students.
  • Thank you once again for your hard work in making this possible.
  • Best regards,
  • Deen

26 of 74

Simple questions for everyday medical queries

  • Does a child with HFM disease need to stay off school/ nursery- quote NHS/ UK guidelines
  • As a private GP working for X would I refer Raynaud’s phenomenon to rheumatology or vascular surgery?
  • Is propranolol absolutely contraindicated in asthmatics for anxiety according to BNF/ NICE CKS?
  • Can you retrospectively diagnose Lyme disease from a suspected episode several years ago that was treated with antibiotics?

27 of 74

Does working in the printing industry expose you to asbestos?

  • Yes, it is possible for a person working as a printer to have been exposed to asbestos, although it is generally less common compared to industries that traditionally used or dealt with asbestos more directly (such as construction, shipbuilding, or manufacturing).
  • In the past, asbestos was used in a variety of materials and products, some of which could have been used in the printing industry or the equipment used by printers. Some potential sources of asbestos exposure in the printing industry include:
  • 1. Older Printing Equipment:
  • Older printing presses and other machinery may have contained asbestos in certain components such as brake linings, clutch assemblies, or insulation materials used to protect parts of the equipment from heat.
  • In the past, asbestos was also commonly used in gaskets and other heat-resistant components of machinery, which could have been in use at printing facilities.
  • 2. Building Materials:
  • Many printing facilities, particularly older ones, may have been built with asbestos-containing materials in their walls, ceilings, and flooring. If the building materials deteriorated or were disturbed (e.g., during renovations or maintenance), asbestos fibers could become airborne and pose a risk of exposure.
  • 3. Other Equipment and Materials:
  • Printers may have worked around other materials that contained asbestos, such as certain paper coatings or insulating materials used in machinery and electrical equipment.
  • 4. Workplace Renovations or Maintenance:
  • If a printing facility was renovated or underwent maintenance involving older materials, workers could have been exposed to asbestos during these activities if proper precautions were not taken.

28 of 74

‘how quickly after a fall would you expect to see neurological/ GCS deterioration if the fall has caused an acute bleed/ intracranial haemorrhage?’

  • ‘The timing of neurological deterioration or Glasgow Coma Scale (GCS) changes after a fall that causes an acute intracranial hemorrhage (ICH) can vary, but generally, early deterioration typically occurs within the first few hours after the injury.
  • Immediate to 2 hours: In cases of more severe ICH, such as epidural hematoma (EDH) or subdural hematoma (SDH), a rapid deterioration in neurological status can be observed. This is often due to the pressure exerted by the expanding blood clot on the brain. For example, in epidural hematomas, patients may experience a lucid interval (a brief period of normal neurological function) followed by rapid deterioration.
  • 6-12 hours: In less acute cases, such as in subdural hematomas, deterioration may not be as sudden and could progress more gradually, typically worsening over the course of several hours to a day. This is often seen with older patients or those on anticoagulants, where the bleed may develop slowly.
  • Delayed deterioration (up to 24-48 hours) can occur with chronic subdural hematomas or less severe bleeds, where symptoms might progress over a longer period.
  • The timing of deterioration is highly dependent on factors such as the location, size, and type of hemorrhage, as well as the individual's age, medical history, and the presence of any comorbid conditions like anticoagulant use.
  • In general, watchful monitoring is recommended within the first 24 hours after a head injury, and any sudden or progressive changes in the GCS (such as a decrease of 2 or more points) would require immediate medical attention and likely imaging (e.g., CT scan) to assess for potential bleeding or other brain injuries.
  • Key Points:
  • Acute deterioration: Within hours (0-2 hours) for some hemorrhages (e.g., epidural).
  • Gradual deterioration: May take 6-12 hours or more for others (e.g., subdural).
  • Delayed deterioration: Up to 24-48 hours, particularly for chronic bleeds or smaller bleeds.’

29 of 74

‘The Coroner has asked: ‘How the fall and rib fractures led to the development of pneumonia’

  • 'A fall resulting in rib fractures can lead to pneumonia due to several factors that impair the normal respiratory function and increase the risk of infection. Here’s how:
  • Pain and shallow breathing: Rib fractures cause significant pain, particularly when breathing deeply or coughing. As a result, a person may avoid deep breaths to minimize discomfort, leading to shallow breathing. Shallow breathing reduces the ability of the lungs to fully expand, which can prevent effective clearing of mucus and other secretions from the airways. This stagnation of mucus creates an environment conducive to bacterial growth, increasing the risk of infection and pneumonia.
  • Reduced mobility: If the person is unable to move or get out of bed easily due to the pain from rib fractures, they may not be able to cough or change positions often enough. This can further exacerbate the buildup of mucus and secretions in the lungs. Prolonged bed rest also impairs lung function, particularly in the lower parts of the lungs, making it easier for infection to take hold.
  • Increased risk of aspiration: Rib fractures, especially in older adults or those with multiple fractures, can lead to difficulties in swallowing or an increased risk of aspiration (inhaling food, liquids, or saliva into the lungs). Aspiration can introduce bacteria into the lungs, leading to aspiration pneumonia.
  • Impaired immune response: The body’s ability to fight infection may be compromised after a fall, especially in elderly individuals or those with weakened immune systems. This increases susceptibility to infections like pneumonia.
  • Underlying conditions: Falls often occur in older adults or those with chronic illnesses, such as heart disease, diabetes, or respiratory conditions. These pre-existing health issues can further increase the risk of developing pneumonia after a fall with rib fractures.
  • Thus, rib fractures can lead to pneumonia by disrupting normal respiratory mechanics, increasing mucus buildup, and lowering the body’s ability to clear infections from the lungs.'

30 of 74

Using AI in appraisal summary write ups 1

  • Never input other people’s data into CGPT/Gemini/Claude
  • Unethical – always be prepared to justify your process to the GMC
  • Don't use AI to generate your writing – use it to enhance and tidy up your thoughts
  • Don't put information into AI, or generate output from AI, that could identify a person or practice – think: would this be anonymous if hacked/leaked?

31 of 74

Using AI in appraisal summary write ups 2

  1. Read the GP appraisal portfolio (doesn’t involve AI)
  2. Have the face-to-face meeting (with no AI transcription) but take handwritten notes (no names/places)
  3. Use a transcription app AFTER the appraisal to 'talk through' your appraisal summary based on your notes/immediate recollections (all in the second person, with no names)
  4. Paste the raw transcription into both CGPT and Gemini and ask: 'Please tidy this up as a summary of appraisal discussion for a GP according to Good Medical Practice 2024 and the RCGP appraisal guidelines' (a minimal sketch of this step follows below)
  5. Edit the final version and transfer it into the toolkit
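
A minimal sketch of step 4, assuming the official openai Python package, an API key in the environment, and a fully anonymised transcript; the model name is a placeholder. The same prompt would be pasted into Gemini separately, and the output still needs editing at step 5.

```python
# Sketch of step 4: tidy an anonymised dictated summary with one LLM.
# Assumes: `pip install openai`, OPENAI_API_KEY set, and that the transcript
# contains NO names, places, or other identifiable details.
from openai import OpenAI

PROMPT = (
    "Please tidy this up as a summary of appraisal discussion for a GP "
    "according to Good Medical Practice 2024 and the RCGP appraisal guidelines."
)

def tidy_summary(anonymised_transcript: str) -> str:
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You are helping a GP appraiser tidy their own notes."},
            {"role": "user", "content": f"{PROMPT}\n\n{anonymised_transcript}"},
        ],
    )
    return response.choices[0].message.content

# Example: draft = tidy_summary(raw_transcription)  # then edit before pasting into the toolkit
```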

32 of 74

General tips for reports, papers, research

  • Use two AI bots simultaneously – compare and contrast the same prompt (CGPT and Gemini); a minimal sketch follows below
  • Keep specifying and checking: 'Verbatim quotes only – no AI hallucinations' – cross-check with Gemini
  • Load up multiple documents sequentially, saying 'read these ahead of the next upcoming task'
  • Don't give AI too many tasks at one time – break it up step by step
  • Start a report with a header and footer, then input thoughts loosely paragraph by paragraph, saying 'Tidy this up, make it coherent with the rest of the text, make it sound formal yet reasonable'
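
A minimal sketch of the 'two bots, same prompt' habit, assuming the openai and google-generativeai Python packages with API keys configured; model names are placeholders. It simply returns both answers so discrepancies can be checked by hand against the original source.

```python
# Sketch: send the same prompt to two different LLMs and compare by eye.
# Assumes `pip install openai google-generativeai` and API keys in the environment.
import os
from openai import OpenAI
import google.generativeai as genai

def ask_both(prompt: str) -> dict:
    # ChatGPT-style answer
    openai_client = OpenAI()
    gpt_answer = openai_client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content

    # Gemini answer
    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
    gemini_answer = genai.GenerativeModel("gemini-1.5-flash").generate_content(prompt).text

    return {"chatgpt": gpt_answer, "gemini": gemini_answer}

answers = ask_both(
    "Verbatim quotes only - no AI hallucinations. "
    "Quote the NICE CKS guidance on propranolol in asthma, with the source URL."
)
for bot, text in answers.items():
    print(f"--- {bot} ---\n{text}\n")
```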

33 of 74

Conclusions

  • Don't input anyone else's data into AI – e.g. GP appraisals/practice meetings/patient complaints
  • Ask CGPT not to plagiarize, and to reference the source (give them examples of your writing to mimic your style)
  • Restrict CGPT to NICE/NHS/UK/Royal College resources
  • In adopting new tools, check what peers think or do (and ask for the DPIA/DCB0160)
  • Productivity and effectiveness will be enhanced – don't get left behind – but balance being a pioneer with being a guinea pig
  • Don’t fall into bad habits due to ease
  • Be ready to dissect your process to the GMC!

34 of 74

Ambient Voice Technology

35 of 74

Ambient Voice Technology

  • Tools that listen continuously in clinical consultations, then automatically transcribe conversations and pull out key medical terms and facts. These tools often also produce consultation summaries, referral letters, patient follow-ups, and task lists.
  • Microphones (lapel, desktop, room-based) pick up the spoken dialogue between clinician and patient. Speech recognition and AI/ML models process the audio: they identify medical language (diagnoses, symptoms, medications), extract relevant information, and filter noise.
  • The system generates outputs such as transcripts, summaries, letters or notes (a minimal sketch of this pipeline follows below).
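
A minimal sketch of that pipeline (audio in, transcript out, draft note out), assuming the openai Python package and an API key; the file name and model names are placeholders, and this illustrates the architecture rather than any specific AVT product. In a real deployment the clinician reviews the output and the tool sits under the governance described later (DPIA, DCB0160, Clinical Safety Officer).

```python
# Sketch of the ambient-scribe pipeline: audio -> transcript -> draft note.
# Assumes `pip install openai`, OPENAI_API_KEY set; "consultation.wav" is a placeholder.
from openai import OpenAI

client = OpenAI()

# 1. Speech recognition: turn the recorded dialogue into text.
with open("consultation.wav", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",  # placeholder speech-to-text model
        file=audio_file,
    ).text

# 2. Language model: extract the medical content into a structured draft note.
draft_note = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": (
            "Summarise this GP consultation transcript as a clinical note with "
            "History, Examination, Impression and Plan. Do not invent details."
        )},
        {"role": "user", "content": transcript},
    ],
).choices[0].message.content

print(draft_note)  # 3. The clinician reviews and edits before it enters the record
```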

36 of 74

Why is AVT being promoted?

  • Saves time – reduces the need for manual note-taking and repetitive documentation.
  • Supports patient relationships – frees clinicians to give more attention to the patient during the consultation.
  • Improves record quality – captures details consistently, reducing missed information.
  • Promotes consistency – ensures notes and letters are more structured and complete.
  • Boosts efficiency – generates task lists and documents that streamline follow-up work.
  • Eases mental workload – reduces cognitive strain, especially when managing complex or multiple conditions.

37 of 74

Feedback from GPs

  • Quality of detail captured is remarkable
  • Used on a home visit with a rambling patient with dementia – worked well
  • Doesn't save that much time if you already touch type during consultations
  • Don't like to read other clinicians' AI-generated consultation notes: too generic
  • Useful for task list generation
  • Might capture things you don't feel are relevant, or that could be misleading, e.g. 'tight chest', 'headache vs brain fog'
  • AI hallucination: fentanyl patches instead of pain relief patches

38 of 74

“A framework to assess clinical safety and hallucination rates of LLMs for medical text summarisation”

  • LLM outputs showed a 1.47% hallucination rate and 3.45% omission rate when summarising clinical notes.
  • By refining the prompts and workflows, the researchers reduced major error rates below those seen in human note-taking.
  • Human oversight is critical even if error rates are low
  • Best used as support, not as a replacement for clinician judgement.

Asgari, E., Montaña-Brown, N., Dubois, M., et al. “A framework to assess clinical safety and hallucination rates of LLMs for medical text summarisation”, npj Digital Medicine, May 2025. DOI: 10.1038/s41746-025-01670-7

39 of 74

Ambient AI Scribes Study Insight

  • After >2.5 million uses during one year, doctors reported significantly less documentation burden using ambient AI scribes.
  • 56% of patients said the AI scribe positively improved the quality of their visit; the remainder reported no impact; no patient reported a negative impact.
  • Time savings estimated at nearly 1,794 eight-hour workdays over that year across the group.

“Ambient Artificial Intelligence Scribes: Learnings after 1 Year and over 2.5 Million Uses.” NEJM Catalyst Innovations in Care Delivery, Tierney et al., May 2025.

40 of 74

NHS England guidance on ambient scribing

  • Assign a Clinical Safety Officer and identify key risks (e.g. transcription errors; missing or incorrect clinical info; system outages; integration failures).
  • Complete DCB0160 documentation and a Data Protection Impact Assessment (DPIA) before deployment.
  • Plan integration with existing IT/workflows (especially EPR systems).
  • Put in place appropriate controls: regulatory compliance (DSPT, medical device regs, etc.), user training, and review of outputs before taking action.
  • Monitoring framework: audit output accuracy, incident reporting, review performance over time.

https://www.england.nhs.uk/long-read/guidance-on-the-use-of-ai-enabled-ambient-scribing-products-in-health-and-care-settings/

41 of 74

‘How Heidi meets all NHS guidance requirements’

  • ✅ DPIA: We’ve completed a comprehensive DPIA for NHS deployments, including lawful basis, risk mitigation, and data flow transparency. Versions have already been used to support ICB governance approvals.

  • ✅ Named Clinical Safety Officer
  • ✅ Encryption and GDPR Compliance: Heidi uses end-to-end encryption (TLS 1.2+ in transit and AES-256 at rest), is fully compliant with UK GDPR and the DPA 2018, with strict controls for data minimisation, subject access, and purpose limitation.

  • ✅ Appropriate Integration with NHS Clinical Systems: Heidi has the technical capability to support integration with major NHS clinical systems across both primary and secondary care, using FHIR, HL7, or custom APIs where supported by the local environment. This enables NHS clinics, practices and hospitals to choose the appropriate level of integration for them, from stand alone to fully embedded.

  • ✅ MHRA Classification: Heidi is currently registered as a Class I medical device under MHRA guidance for summarisation functionality. We do not generate diagnoses, management plans, or clinical referrals autonomously, and do not infer new clinical decisions. As we explore new and innovative features and functionality for the clinician to extend themselves and their capacity, our regulatory footprint will increase which is why we are progressing what Class II certification would look like for Heidi.

https://www.heidihealth.com/uk/blog/uk-clinicians-using-heidi-satisfy-nhs-guidance

42 of 74

Heidi Health

  • Ambient AI scribe for clinical documentation
  • Customisable templates, adapts to clinician style
  • Supports multiple languages, forms, letters, coding
  • Strong GDPR/NHS compliance, user-friendly UI
  • USP: Highly customisable and clinician-focused

43 of 74

ED transcript, note and advice/plan

  • Are there any medications I can use? Is anything I should be doing? Would it be a good idea to try um uh medication like Viagra, I can buy that over the counter? Do I need to be doing anything else?
  • Reports concerns about erectile dysfunction and enquires about treatment options. Specifically asks about sildenafil (Viagra) availability over the counter and seeks guidance on additional management strategies.

  • Advised that sildenafil is available over the counter from pharmacies following consultation with pharmacist. Discussed lifestyle modifications including regular exercise, healthy diet, stress reduction, and limiting alcohol consumption. Recommended addressing any underlying cardiovascular risk factors. Suggested considering GP consultation if symptoms persist or worsen for further evaluation and potential prescription alternatives.

44 of 74

Exertional SOB transcript

  • Hello doctor, I have found that when I um go upstairs or I walk um incline upwards, I'm starting to get shortness of breath and I get some numbness in my left hand. This has been ongoing for the last few weeks. I also have palpitations from time to time. Um, otherwise before this I've been fitting well.
  • 0:30
  • Exercising on a regular basis. Going for regular walks. And um, I've also I stopped smoking about 10 years ago, so I used to smoke for about 20 years. Um, my um, my concern is whether this is good for my heart or not and um, I also suffer from acid reflux and um, I don't know whether I should start a medication to see if um, I used to be put on um, something called omeprazole before. Don't know if I should start that again. Um, and just wanted to get some advice whether I need to.
  • 1:10
  • Any further investigations or not? Also, I've not been doing much exercise now and just want to get some guidance as to whether I should still be going to the gym and working out.

45 of 74

Exertional SOB clinical note

History:
- Reports shortness of breath when walking upstairs or on inclines, developing over last few weeks
- Associated left hand numbness during exertion
- Intermittent palpitations
- Previously fit and well with regular exercise and walks
- History of acid reflux, previously treated with omeprazole, considering restarting
- Concerns regarding cardiac cause
- Currently avoiding exercise, seeking guidance on gym attendance

- PMH: acid reflux
- DH: previously omeprazole (acid reflux). Allergies:
- FH:
- SH: ex-smoker 20 years (quit 10 years ago), regular exercise and walks prior to symptoms

46 of 74

Engaging Patients in AVT Use

  • Communication (Comms)
  • Explain to patients why the practice is adopting Ambient Voice Technology (AVT).
  • Use waiting room screens, posters, websites, and phone messages to raise awareness.
  • Update privacy notices to reflect AI/AVT use.
  • Transparency
  • Inform patients before consultations that AVT is in use.
  • Clearly explain its purpose: e.g., to free up clinician attention, improve documentation accuracy.
  • Position as a tool that benefits both clinician and patient experience.

47 of 74

Informed Consent and Patient Choice

Consent

  • Provide patients with a clear opt-out choice for AVT during their consultation.
  • Explain what opting out means (e.g., the clinician will type notes manually).
  • Present positively: opting in supports better-quality notes and more attentive clinicians.

Ethical Considerations (NHS Guidance)

  • Consent must be informed, voluntary, and ongoing (NHS England).
  • Patients should understand risks and benefits, including data security.
  • Practices must ensure no disadvantage for patients who opt out.

Best Practice

  • Frame as a partnership: AI supports care, but the clinician remains responsible.
  • Encourage feedback to improve patient trust and system performance.
  • Ensure accessibility: provide explanations in plain language and accessible formats.

48 of 74

Accurx Scribe (Tandem Health)

  • Integrated into Accurx platform used by most GP practices
  • Real-time transcription, summarisation, coding & letters
  • Strong compliance: MHRA registration, DSPT, ISO 27001
  • Seamless integration with EMIS and SystmOne
  • USP: Large-scale adoption, strong NHS integration

49 of 74

Anima & NHS Pilot AVTs

  • Used in NHS pilot projects (BNSSG region)
  • Focused on improving feasibility and feedback
  • Designed for primary care settings
  • Compliance with NHS guidance
  • USP: Pilot-backed insights and early adoption feedback

50 of 74

Tortus and Lexacom

  • Tortus: Focuses on AI medical scribing and workflow automation.
  • Real-time transcription and integration into electronic health records (EHRs).
  • Lexacom Scribe: UK-based provider specialising in dictation and transcription solutions.
  • Long-standing reputation in healthcare dictation with AI augmentation.

51 of 74

Other AVT providers

  • Joy: Patient engagement and communication tool.
  • Human-centred design for improving clinician-patient relational care and reducing admin burden.
  • Kiwipen: Digital documentation and AI note-taking. USP: Emphasis on seamless clinician input capture with a focus on simplicity and reducing mental load.
  • Healthorbit: Offers AI-powered administrative and care coordination solutions. USP: Streamlining complex care pathways and task management.
  • Anathem: AI-driven transcription and documentation startup. USP: Lightweight and clinician-friendly system with adaptive learning models.
  • Xon Health: SCG company product offering healthcare transcription/AI documentation. USP: Scalable enterprise-level solutions with focus on regulatory compliance.
  • Tandem: Part of Accurx ecosystem, focusing on integrated workflows. USP: Collaboration-first model, embedding AI documentation into existing messaging/coordination tools.

52 of 74

Key Considerations Across Providers

  • All require clinician oversight & editing of outputs
  • Must align with NHS England guidance & compliance
  • Named Clinical Safety Officer required
  • Integration with existing EHRs is critical
  • Consider scalability, usability, and trust

53 of 74

DCB0160

  • DCB0160 is an NHS Digital clinical safety standard ('Clinical Risk Management: its Application in the Deployment and Use of Health IT Systems') that applies to health and care organisations deploying digital health technologies.
  • It requires organisations to manage clinical risk when introducing health IT, including producing a clinical safety case and appointing a Clinical Safety Officer, and sits alongside data protection requirements such as the DSP Toolkit and DPIAs.

54 of 74

DCB0160 standard and Data Protection Impact Assessment

  • You need to ask the supplier what happens to the data – every practice needs a DCB0160 clinical safety case and a DPIA; these are legal requirements that must be in place. Get a clear answer from the supplier, not just a mention of these terms.
  • A DPIA (Data Protection Impact Assessment) in the context of AI and medicine refers to a systematic process used to identify, assess, and mitigate the risks related to the processing of personal data, particularly sensitive health data, in applications that use artificial intelligence (AI).
  • It is a key component of privacy and data protection practices, especially in regions governed by laws like the European Union's General Data Protection Regulation (GDPR).

55 of 74

Importance of Data Protection Impact Assessments in AI/Medicine:

  • AI applications in medicine, such as predictive models, diagnostic tools, and treatment planning systems, rely on vast amounts of patient data.
  • Ensuring that these AI systems comply with data protection laws and respect patient privacy is critical, especially in light of the sensitivity of health data.
  • DPIAs help ensure that AI systems in healthcare are designed with privacy and security in mind, protecting individuals from potential harms like data breaches, discrimination, or privacy violations.
  • By conducting a DPIA, medical organizations and developers of AI tools can demonstrate their commitment to responsible data usage, build patient trust, and comply with regulatory requirements like GDPR or HIPAA (in the U.S.).

56 of 74

Writing medical literature using AI

Thoughts on AI from the Creative Penn

57 of 74

Artificial Intelligence (AI) in Publishing with Thad McIlroy and Joanna Penn

  • ​Perception of AI Integration: The publishing industry's apprehension towards AI stems from a feeling of violation, as content is perceived to have been indiscriminately used for AI training. This highlights the need for transparency in data usage.​
  • Quality Assurance: The increasing importance of high-quality outputs underscores the need for meticulous oversight when utilizing AI tools.​
  • AI in Research and Discovery: AI's role in enhancing search and discoverability is acknowledged, suggesting its potential in medical research for literature reviews and data analysis.​

https://www.thecreativepenn.com/2024/08/12/artificial-intelligence-ai-in-publishing-with-thad-mcilroy/

58 of 74

Fair Use, Copyright, and Licensing. AI and the Author Business with Alicia Wright (and Joanna Penn)

  • AI as a Collaborative Tool: AI can serve as a brainstorming partner, aiding in the generation of new ideas and solutions.​
  • Transformative Use and Fair Use Doctrine: The concept of fair use applies when AI creates something transformative, adding new expression or meaning to the original data.​
  • Importance of Quality Inputs: The outputs of AI systems are heavily dependent on the quality of the inputs and prompts provided by the user.​
  • Developing Professional Judgment: While AI can provide valuable insights, it is crucial for professionals to maintain and develop their own judgment and expertise, using AI as a supplementary tool rather than a replacement.

https://www.thecreativepenn.com/2025/01/27/fair-use-copyright-and-licensing-ai-and-the-author-business-with-alicia-wright/

59 of 74

Collaborative Writing With AI With Rachelle Ayala and Joanna Penn

  • Understanding AI Tools: one doesn't need a technical background to effectively utilize AI tools, as they are designed to be user-friendly and accessible.​
  • AI as a Collaborative Partner: view AI as a collaborator in the creative process, assisting in brainstorming and idea generation, rather than replacing human creativity.​
  • Evolving Creative Processes: The integration of AI into creative workflows is dynamic and continually evolving, requiring adaptability and openness to change.​
  • Maintaining Human Creativity: the importance of doubling down on human creativity and passion to produce authentic and meaningful work.​
  • Ethical Considerations: understanding copyright and ethical aspects when incorporating AI into one's work.​

https://www.thecreativepenn.com/2024/06/21/collaborative-writing-with-ai-with-rachelle-ayala/

60 of 74

Generative AI Impact on Creativity and Business in the Music Industry with Tristra Yeager

  • Impact on Content Creation: The proliferation of AI-generated content raises concerns about managing an overwhelming amount of material, making it challenging to discern high-quality, human-created work.​

61 of 74

How AI Tools Are Useful For Writers With Disabilities And Health Issues With S.J. Pajonas

  • Overcoming Cognitive Challenges: AI tools assist individuals experiencing brain fog due to conditions like chemotherapy or brain injuries by facilitating brainstorming and idea organization.​
  • Managing Physical Limitations: For those with physical disabilities that make prolonged computer use difficult, AI enables content creation through voice input or shorter typing sessions, reducing physical strain.​
  • Enhancing Focus for Neurodiverse Individuals: AI supports individuals with ADHD by maintaining engagement through interactive dialogues, helping to keep them on track during tasks.​
  • Boosting Productivity Amidst Low Energy: Individuals with conditions causing low energy levels, such as autoimmune disorders, can use AI to maximize productivity during limited periods of activity.​

62 of 74

9 Ways That Artificial Intelligence (AI) Will Disrupt Authors and the Publishing Industry, with Nick Thacker

  • AI as a Collaborative Tool: AI can assist in various aspects of the creative process, acting as a partner rather than a replacement.​
  • "AI is a tool that can help us create more and better content, but it's not going to replace the human touch." Delegating Routine Tasks: AI can handle repetitive tasks, allowing professionals to focus on areas that require human expertise and creativity.​
  • "We should be focusing on what we do best and let AI handle the rest."
  • Managing Content Overload: With the exponential increase in AI-generated content, tools are needed to help navigate and find valuable information.​

63 of 74

Podcast References

  • https://www.thecreativepenn.com/2025/01/27/fair-use-copyright-and-licensing-ai-and-the-author-business-with-alicia-wright/
  • https://www.thecreativepenn.com/2024/08/12/artificial-intelligence-ai-in-publishing-with-thad-mcilroy/
  • https://www.thecreativepenn.com/2024/06/21/collaborative-writing-with-ai-with-rachelle-ayala/
  • https://www.thecreativepenn.com/2023/08/11/how-ai-tools-are-useful-for-writers-with-disabilities-and-health-issues-with-s-j-pajonas/
  • https://www.thecreativepenn.com/2023/07/03/9-ways-that-artificial-intelligence-ai-will-disrupt-authors-and-the-publishing-industry-an-update-with-joanna-penn-and-nick-thacker/

64 of 74

Thanks for listening

Any questions?

65 of 74

FUTURE-AI: international consensus guideline for trustworthy and deployable artificial intelligence in healthcare

66 of 74

FUTURE-AI: international consensus guideline for trustworthy and deployable artificial intelligence in healthcare

  • Developed by 117 interdisciplinary experts from 50 countries, including AI scientists, clinical researchers, biomedical ethicists, and social scientists.
  • Six guiding principles: fairness, universality, traceability, usability, robustness, and explainability.
  • 30 best practices were defined, addressing technical, clinical, socioethical, and legal dimensions.

https://www.bmj.com/content/388/bmj-2024-081554

https://future-ai.eu/

67 of 74

Health Innovation Network's (HIN) report on the Primary Care AI and Automation Grants Scheme Pilot

  • Objective: The initiative aimed to alleviate administrative burdens on GP practices by automating routine tasks, thereby allowing healthcare professionals to focus more on direct patient care. ​
  • Pilot Overview: Eleven pilot sites received up to £65,000 each to implement AI and automation solutions targeting areas such as pathology, patient registrations, call and recall processes, workforce rota management, clinical safety, and document management. ​
  • Benefits Observed:
  • Time Savings: Automation of routine tasks led to measurable time efficiencies, enabling staff to allocate more time to complex patient interactions.​
  • Consistency and Accuracy: Automated processes reduced human error, ensuring more consistent and reliable outcomes.​
  • Staff Satisfaction: By reducing repetitive tasks, staff experienced increased job satisfaction and could engage in more meaningful work.​

68 of 74

Health Innovation Network's (HIN) report on the Primary Care AI and Automation Grants Scheme Pilot

  • Challenges Encountered:
  • Implementation Delays: Some pilots faced delays due to technical issues and the need for staff training.​
  • Scalability Concerns: Ensuring that AI solutions could be effectively scaled across different practices presented challenges.​
  • Financial Realization: Translating time savings into direct financial benefits was complex and required further development.​
  • Recommendations:
  • Develop National Guidelines: Establish clear frameworks for AI and automation adoption in primary care to ensure ethical and safe implementation.​
  • Engage Stakeholders: Involve all relevant parties, including patients, in the planning and implementation phases to address concerns and enhance acceptance.​
  • Continuous Evaluation: Regularly assess the impact of AI tools to identify areas for improvement and ensure they meet ethical standards.​

69 of 74

Fairness

  • 1: Define sources of bias
  • 2: Collect information on individual and data attributes
  • 3: Evaluate fairness

Universality

  • 1: Define clinical settings
  • 2: Use existing standards
  • 3: Evaluate using external data
  • 4: Evaluate local clinical validity

70 of 74

Traceability

  • 1: Implement risk management
  • 2: Provide documentation
  • 3: Implement continuous quality control
  • 4: Implement periodic auditing and updating
  • 5: Implement AI logging
  • 6: Implement AI governance

Usability

  • 1: Define user requirements
  • 2: Define human-AI interactions and oversight
  • 3: Provide training
  • 4: Evaluate clinical usability
  • 5: Evaluate clinical utility

71 of 74

Robustness

  • 1: Define sources of data variations
  • 2: Train with representative data
  • 3: Evaluate robustness

Explainability

  • 1: Define explainability needs
  • 2: Evaluate explainability

72 of 74

Key Details about DCB0160:

  • Purpose: DCB0160 ('Clinical Risk Management: its Application in the Deployment and Use of Health IT Systems') is the NHS Digital standard that sets out how health and care organisations must manage clinical risk when deploying and using health IT systems, including digital health and AI tools.
  • Scope: It applies to the organisation deploying the system (DCB0129 is the counterpart standard for manufacturers) and is focused on ensuring that digital systems are introduced and used safely, protecting patients from harm arising from software errors or misuse.
  • Criteria: The DCB0160 standard typically requires:
  • A named Clinical Safety Officer
  • A clinical risk management plan and hazard log
  • A clinical safety case and safety case report
  • Incident reporting and management
  • Review whenever the system is changed or updated
  • Implementation: Organisations must meet this standard, alongside data protection requirements such as the DSP Toolkit and a DPIA, before deploying digital health tools, demonstrating that clinical risks have been identified and mitigated.
  • Relevance to AI: For AI-driven systems used in the healthcare sector (e.g., ambient scribes, diagnostic tools, predictive models), DCB0160 provides the framework for assessing and documenting clinical safety before and during use.

73 of 74

Purpose of a Data Protection Impact Assessment in AI/Medicine:

  • Risk Assessment: DPIAs help organizations assess how the processing of personal health data might affect individuals' privacy and data protection rights. This is crucial in AI, where algorithms may process large volumes of sensitive data (e.g., medical records, genetic information, diagnostic data).
  • Transparency: The process helps ensure transparency by requiring clear explanations of how personal data will be collected, used, stored, and shared. This is especially important in medical AI, where data is often sensitive and individuals need to be informed about how their health data will be used.
  • Mitigating Risks: A DPIA helps identify potential risks to patient privacy (e.g., unauthorized access, misuse of data, or incorrect AI predictions) and establishes measures to mitigate those risks. These could include encryption, data anonymization, or regular audits.

74 of 74

Steps Involved in a DPIA:

  • Description of Processing: Outline how AI systems will process personal data, including the types of data used (e.g., health records, diagnostic images) and the purposes of the processing (e.g., diagnosis, treatment recommendations).
  • Assessment of Necessity and Proportionality: Determine whether the processing of sensitive health data is necessary for the intended purpose and whether the data being processed is proportional to the goals.
  • Risk Identification and Evaluation: Identify potential risks to the privacy of individuals whose data is being processed, including the risk of data breaches or harmful impacts due to AI errors in healthcare decisions.
  • Mitigation Measures: Propose strategies to minimize or eliminate these risks, such as anonymizing data, implementing secure AI models, or ensuring human oversight in clinical decision-making.
  • Consultation: If the DPIA identifies high risks that cannot be mitigated, the organization may need to consult with a data protection authority before proceeding with the AI project.