by Julie-Anne Peake, Clinical Psychologist
More and more people are turning to AI tools to support their mental health. Global estimates suggest millions of users are now engaging with AI chatbots for emotional support, with some platforms reporting usage in the tens of millions. The appeal is clear. AI is available 24/7, responds instantly, and feels private, accessible, and free from judgement.
But this shift raises some important concerns.
Most AI mental health tools are not designed within a clinical framework. They can simulate empathy, but they do not hold responsibility for care. There is no real-time clinical oversight, no continuity of therapeutic relationship, and no duty of care if someone becomes distressed or unsafe. For people already vulnerable, this can create a false sense of being supported while actually being alone with an algorithm.
There are also significant privacy issues. Many widely used AI tools are not compliant with health privacy standards such as HIPAA in the US or the Australian Privacy Principles, and may store or process sensitive conversations in ways users do not fully understand. This means deeply personal information may not be protected in the same way it would be in therapy.
This is where a different model becomes essential.
ANTSA is an integrated platform designed by Australian psychologists for between-session care. The platform offers many features, but its therapist-overseen AI chatbot, JAImee, is probably its most popular tool.
JAImee is not designed to replace therapy. It is built to support it.
• It operates within a clinically informed framework
• It adheres to Australian privacy and confidentiality standards
• It is integrated with the therapeutic process, not separate from it
• Importantly, your psychologist is alerted to messages indicating moderate to severe distress, allowing timely human intervention
This changes the role of AI entirely. It becomes a supported extension of care, rather than a standalone substitute.
AI in mental health is not going away. Used well, it can increase access, provide support in difficult moments, and help bridge the gaps between sessions.
But without clinical oversight, transparency, and safeguards, it also carries real risks.
The difference lies in how it is built, how it is used, and whether there is a human still holding responsibility for care.
If you are an existing client of mine, feel free to ask me about how you can get free access to JAImee as part of your therapy journey.