Interview

AI Ethics & Mental Health - A Conversation with John Walter

Guest: John Walter · Topic: AI ethics, coaching, and mental health
Format: Online Meeting Q&A · Published: 4/3/2026

Artificial intelligence (AI) is beginning to appear in many professional settings, but its role in psychotherapy remains uncertain and is often debated. In a recent conversation with practising counsellor John Walter, he shared how AI tools are currently used in his work, the ethical considerations surrounding their use, and what future therapists may need to understand when navigating AI-assisted therapy sessions.

How AI Is Used in John's Therapeutic Sessions

Contrary to my assumption, AI is not widely used as a direct component of John's therapy sessions. Instead, AI tends to function as an assistive tool for him, rather than a replacement for interaction between him and his clients. John explained that AI is only occasionally used to organise and refine his case notes. After sessions, he might also dictate notes and use AI tools to tidy the structure or identify themes that may have been missed. AI can also assist in writing articles, structuring educational material for children, or preparing resources for clients.

In some cases, clients themselves introduce AI into their reflective process between sessions. For example, one client interested in dream analysis explored AI-generated interpretations and later discussed these reflections within therapy. For John, this highlights how AI can become part of the broader reflective process without replacing the therapeutic work itself.


AI as an Assistive Tool Rather than a 'Non-Human Therapist'

When I asked John whether AI should function primarily as a tool for therapists or as an interactive system for clients, he described it as a dynamic balance. To John, AI can support therapists in structuring information and identifying patterns, but ultimately, its role depends on how therapists choose to integrate it into their work; the responsibility remains with the therapist to decide when and how AI should be used. John noted that therapists should ensure AI supports the therapeutic process rather than interfering with it.


How Are Confidentiality and Ethical Safeguarding Ensured?

Because psychotherapy involves highly sensitive information, therapists must ensure confidentiality when interacting with AI tools.

John emphasised that identifiable client information should never be entered into AI systems; therapists should use initials or numbers to ensure clients remain non-identifiable. John also suggested that data-training features should be disabled within AI platforms to prevent information from being stored, reused, or leaked.

To John, AI tools should primarily be used for formatting or identifying missing elements within notes, rather than storing client data, which should be kept only in secure clinical systems. For John, maintaining ethical practice fundamentally depends on therapists' professional judgement and their transparency with clients.


Should Clear Ethical Guidelines be Established for Therapists to Follow?

Professional guidance around AI in therapy is still developing. In the United Kingdom, organisations such as the British Association for Counselling and Psychotherapy (BACP) are beginning to address the issue. John noted that different organisations may take different approaches: while some, like the BACP, emphasise stronger regulation, others, such as The National Counselling and Psychotherapy Society (NCPS), adopt a more flexible stance, focusing on therapists' responsibility and client consent. Across these perspectives, John's stance remains clear: AI should not replace professional decision-making within therapy.


Clients' Typical Response to AI in Therapy Sessions

Interestingly, John's clients respond to the use of AI in widely varying ways. Whilst some clients are comfortable discussing or using AI tools between sessions, others feel uncertain or uneasy about the idea. Because of this, John approaches AI cautiously: AI tools are introduced only when they appear helpful for a particular client, and they remain optional within the therapeutic process.


Can AI Replace the Therapeutic Relationship?

Despite the rapid evolution of AI systems, John strongly emphasised that the therapeutic relationship remains irreplaceable. He noted that the bond between therapist and client forms the foundation of many therapeutic approaches. While AI may assist with reflective exercises, it cannot replicate the depth of human connection that has anchored therapeutic practice for decades. He suggested that structured approaches, such as aspects of Cognitive-Behavioural Therapy (CBT), may be more adaptable to AI tools. However, John believes that human oversight will always remain essential.


Limitations of AI in Therapy

One challenge with AI systems, as John suggested, is their tendency to generate responses based on assumptions rather than genuine understanding. AI models often fill in gaps by predicting the most likely continuation of a sentence or an idea. While this can produce useful suggestions, it also means that AI may make assumptions or oversimplify complex psychological experiences. For John, this reinforces the importance of maintaining critical oversight when using AI tools.


John's Advice to Future Therapists Intending to Use AI in Sessions

For students and therapists interested in AI-assisted sessions, John emphasised the importance of developing technical literacy.

Rather than relying on others to set up AI tools, therapists should learn how these systems function and configure them responsibly. Understanding how to train AI systems, manage data privacy settings, and adapt prompts to align with therapeutic approaches will likely become increasingly important skills in the future. In John’s view, AI should be treated like any other professional tool: something that must be understood, managed, and used with care.

Kumi's Reflection from the Discussion

What I found most interesting in this conversation was how cautiously AI is currently being integrated into therapeutic practice.

Before the interview, I had assumed that AI might be used more actively during sessions. Instead, John described it primarily as an assistive tool — something that helps organise thoughts, structure notes, or support reflection between sessions rather than replacing the therapeutic process itself.

This made me reflect on how much of psychotherapy relies on the human interaction between therapist and client. John's stance resonated strongly with me: AI cannot, and will not, replicate the emotional understanding or rapport that develops in a therapeutic relationship. Hearing John emphasise this point reinforced the idea that although technology may support therapy at a surface level, it cannot substitute for the human connection at its core.

At the same time, the discussion raised interesting questions about how AI tools may develop. If AI tools become more common in professional practice, clinical skills may no longer be the only essential competency. In my view, technical literacy, understanding how these systems work and how to use them responsibly, is likely to be a key skill future therapists will need when utilising AI as an assistive tool.

For me, this conversation highlighted how psychology is entering a period in which technology and psychotherapy will slowly become interconnected. Rather than replacing therapists, AI may challenge future therapists to think more critically about how tools can support, rather than undermine, the ethical and relational aspects of therapy.

This article is intended for educational and portfolio purposes only; its content does not constitute professional or clinical advice.

Written by: Kumi Lam (Cheng U) · Psych with Kumi
© 2026 Kumi Lam (Cheng U). All rights reserved.