What this service covers

DrGeorgeAI provides educational guidance on using general-purpose AI tools (such as ChatGPT, Claude, and Gemini) for CPD learning and documentation. This includes:

  • Writing CPD reflections and learning summaries
  • Drafting and refining learning plans
  • Summarising and digesting journal articles and guideline updates
  • Preparing for peer discussions and self-assessment
  • Structuring practice improvement documentation
  • Processing conference, webinar, and podcast notes
  • Preparing CPD summaries and annual statements

All guidance is designed around de-identified, non-patient-specific content. The workflows help you engage more deeply with your learning and produce documentation that honestly represents what you took away from it.

What this service does not cover

No clinical decision support. This resource does not teach you how to use AI to diagnose patients, choose treatments, interpret investigations, or make clinical recommendations. That is a different domain with different risks, different regulatory considerations, and different stakes.

No patient-specific work. None of the workflows involve putting patient-identifiable information into an AI tool. Not names, not hospital numbers, not dates of birth, not clinical details that could identify someone. De-identification happens before AI, not during.

No CPD gaming. The purpose is not to generate documentation faster so you can tick a box without learning anything. The workflows are built to deepen your engagement with learning material, and the documentation they help you produce should honestly reflect your professional development.

No regulatory endorsement. This is an independent educational resource. It does not represent the views of, or imply endorsement by, any medical college, CPD home, employer, or regulatory body. As of early 2026, no Australian medical college has published specific guidance on whether AI may be used to assist with CPD documentation.

Privacy and data guidance

When using AI tools for CPD work, follow these principles:

  • De-identify before you begin. Remove all patient-identifiable information before entering any content into an AI tool. This includes names, hospital numbers, dates of birth, locations, and any combination of details that could identify an individual.
  • Use a separate project or conversation. Keep your CPD work in a dedicated AI project or conversation, separate from any other use of the tool. This reduces the risk of unintended data mixing.
  • Check your employer's policies. Your hospital or health service may have specific policies about which AI tools are approved for professional use and what data may be processed. Follow those policies.
  • Understand the tool's data handling. Review the privacy policy and data handling practices of any AI tool you use. Some tools may use your inputs for model training unless you opt out. The guide covers how to check and adjust these settings for major platforms.

Clinician responsibility

AI is a tool. You are the professional.

Every piece of CPD documentation that carries your name is your responsibility. This means:

  • You review and edit every AI output. AI tools generate text that can sound confident and polished while being factually wrong, inappropriately generic, or not reflective of your actual experience. Your clinical judgment is the quality-control layer.
  • You own the final document. If your CPD home audits your portfolio, you need to be able to stand behind what is in it. The reflection, the learning, and the professional growth described must genuinely be yours.
  • You decide what is appropriate. The workflows and prompts in this resource are suggestions. You know your practice context, your CPD home's requirements, and your professional obligations better than any guide can. Use your judgment.

CPD home and college requirements

Different CPD homes and colleges have different documentation requirements, activity categories, and audit expectations. This resource provides general workflows that can be adapted to any Australian CPD program, but it is your responsibility to ensure your documentation meets the specific requirements of your CPD home.

If your CPD home or college publishes guidance on AI use in CPD documentation, follow that guidance. Where no specific guidance exists, the conservative approach in this resource is designed to keep your work defensible: be transparent about your process, ensure the learning is genuine, and make sure the final documentation accurately represents your professional development.

Transparency about AI use

This resource takes a conservative position on transparency. If you use AI tools to assist with your CPD documentation, consider keeping a brief note of how you used them. This does not mean declaring AI assistance on every entry; it means maintaining an awareness of your process and being prepared to explain it if asked.

The goal is honest, defensible CPD practice. AI helps with the structure and writing. The learning, the reflection, and the professional judgment are yours.

The book covers governance guidance for each CPD use case, including de-identification walkthroughs and worked examples.

About the book