Photo: The implementation of artificial intelligence technology in correctional healthcare carries significant ethical and operational considerations. | Photo Credit: ChatGPT-5
By Pia Puolakka
Editor’s Note: This article is a follow-up to “The Potential of Artificial Intelligence in Prisons,” which was written by the author and published in the September-October Maintenance & Operations edition of Correctional News.
As artificial intelligence (AI) begins to shape every aspect of correctional systems, healthcare stands out as one of its most promising yet complex frontiers. Prisons face chronic healthcare crises: overcrowding, staff shortages and limited access to specialized treatment. The COVID-19 pandemic accelerated digitalization across correctional systems, making telemedicine, digital self-help and AI-assisted monitoring a permanent part of the landscape (Puolakka, 2025a). Within this new ecosystem, AI is not merely a technical upgrade — it has become an ethical and operational imperative.
The Healthcare Challenge Behind Bars
Prisoners exhibit significantly higher rates of chronic disease, substance abuse and mental health disorders compared with the general population. Access to healthcare professionals is limited, and treatments are often fragmented. The Council of Europe’s October 2024 recommendation on AI in prisons and probation identifies healthcare as one of the key areas where AI may support diagnosis and follow-up medical treatment — provided it does not replace face-to-face care. The principle of “human-centered AI” is therefore critical: technology should enhance medical access and safety, not automate human care.
Applications of AI in Prison Healthcare
AI systems are now used to predict medical risks, assist diagnostic assessment, support mental healthcare and deliver therapeutic interventions through immersive environments such as virtual reality and chatbot-based solutions.
Predictive Analytics and Early Warning Systems
Machine-learning models can identify inmates at risk of self-harm or physical deterioration. Studies from Australia and New Zealand have used interpretable AI models to detect suicidal behavior in correctional settings (Akhtar et al., 2024). Neural networks are also applied to predict mental health decline during imprisonment (Allahyari & Moshtagh, 2021). Integrating such models into Offender Management Systems (OMS) allows healthcare staff to intervene early and manage crises proactively (Puolakka, 2025b).
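To make the idea concrete, the sketch below shows one simple form such an interpretable model can take: a logistic regression whose weights are visible to clinical staff. The feature names, weights and threshold here are invented for illustration only — they do not reflect any model cited above, and a real system would be trained on clinical records under strict data governance.

```python
# Hypothetical sketch of an interpretable risk-flagging model.
# All features, weights and the threshold are illustrative assumptions,
# not taken from the studies cited in this article.
import math

# In an interpretable model, the learned weights are exposed so
# healthcare staff can audit why a case was flagged.
WEIGHTS = {
    "prior_self_harm": 1.8,          # history of self-harm (0 or 1)
    "recent_isolation_days": 0.15,   # days in isolation this month
    "missed_health_visits": 0.4,     # missed clinic appointments
}
BIAS = -3.0

def risk_score(features: dict) -> float:
    """Logistic regression: weighted sum passed through a sigmoid."""
    z = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_review(features: dict, threshold: float = 0.5) -> bool:
    """The model only flags a case; the decision stays with staff."""
    return risk_score(features) >= threshold

elevated = {"prior_self_harm": 1, "recent_isolation_days": 10,
            "missed_health_visits": 2}
baseline = {"prior_self_harm": 0, "recent_isolation_days": 0,
            "missed_health_visits": 0}
```

The point of the design is the last function: consistent with the Council of Europe principle of human review, the model’s output is a prompt for early intervention by healthcare staff, never an automated decision.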
Diagnostic and Clinical Support Tools
AI supports remote diagnosis in cases where on-site medical specialists are unavailable. According to the Council of Europe’s principles, AI may assist in automated and remote medical diagnosis and follow-up treatment, but must never substitute professional care (Council of Europe, 2024). For example, in Finland’s Smart Prisons, prisoners already contact the prison polyclinic through secure digital platforms, and similar self-service systems are improving the daily management of healthcare polyclinics worldwide (Puolakka, 2025a).
Telemedicine and Digital Therapy
Telehealth solutions, accelerated during the pandemic, now include AI-driven chatbots offering cognitive-behavioral support or trauma-informed dialogue in prisons (Eye on Annapolis, 2025). Systems such as Echo and Therapii simulate human interaction, provide psychoeducation, and track well-being through adaptive algorithms (Fitzpatrick et al., 2025). The growing use of such therapy bots demonstrates the potential of AI to deliver scalable, individualized mental-health support when human resources are limited.
Virtual Reality and Avatar-Based Interventions
Virtual reality (VR) has shown strong potential in prison rehabilitation and clinical psychology. Preliminary findings suggest that combining Compassion-Focused Therapy with VR can reduce aggressive behavior and increase empathy among young male offenders (Finnish Institute for Health and Welfare [THL], IMAGINE Project, 2025). VR- and avatar-based interventions (the use of digital or virtual representations of people — avatars — as active elements in the therapeutic process) also offer opportunities for skills training, stress management and well-being promotion. Outside correctional settings, AVATAR therapy allows patients with schizophrenia to externalize internal voices and rehearse social interactions safely (Craig et al., 2018). These approaches merge psychological insight with technological immersion, creating new pathways for behavioral change and mental well-being.
Ethical and Legal Considerations
The ethical implications of AI in correctional healthcare are significant. Data privacy, consent and the potential overreach of surveillance technology require careful governance. The Council of Europe recommendation emphasizes transparency, accountability and the right to human review of AI-assisted decisions. Developing AI literacy among both staff and prisoners is crucial to ensure informed participation. However, there remains a risk that emotional attachment to “therapy bots” or virtual therapists may blur boundaries between reality and simulation, underscoring the need for human oversight.
Future Directions and Conclusion
AI in prison healthcare is shifting from reactive treatment toward preventive, data-driven care. Predictive analytics, telemedicine and immersive digital therapies can transform how correctional health services operate — provided that ethical frameworks and human judgment remain central. Aligned with human rights and clinical integrity, ethical AI can ensure that prisoners — often among society’s most vulnerable — receive dignified, continuous and effective care.
References
Akhtar, K., Yaseen, M. U., Imran, M., Khattak, S. B. A., & Nasralla, M. M. (2024). Predicting inmate suicidal behavior with an interpretable machine learning approach. Australian & New Zealand Journal of Psychiatry. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11232594/
Allahyari, E., & Moshtagh, M. (2021). Predicting mental health of prisoners by artificial neural network. BioMedicine, 11(1), 26–33.
Council of Europe. (2024). Ethical and organisational aspects of the use of Artificial Intelligence and related digital technologies by prison and probation services. Strasbourg: Council of Europe.
Craig, T. K. J., Rus-Calafell, M., Ward, T., Leff, J. P., Huckvale, M., Howarth, E., Emsley, R., & Garety, P. (2018). AVATAR therapy for auditory verbal hallucinations in people with psychosis: A single-blind, randomised controlled trial. The Lancet Psychiatry, 5(1), 31–40.
Eye on Annapolis. (2025, June). Behind bars, beyond limits: AI therapy brings new hope to inmates. https://www.eyeonannapolis.net/2025/06/behind-bars-beyond-limits-ai-therapy-brings-new-hope-to-inmates/
Finnish Institute for Health and Welfare. (2025, May 2). IMAGINE Project: Compassion-focused therapy combined with virtual reality exposure (CFT-VR). https://stnimagine.fi/en/about-the-project/compassion-focused-therapy-combined-with-virtual-reality-exposure-cftvr/
Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2025). Randomized trial of a generative AI chatbot for mental health. New England Journal of Medicine AI, 2(3). https://ai.nejm.org/doi/full/10.1056/AIoa2400802
Puolakka, P. (2025a). Smart Prisons: Digital Rehabilitation and Education. Conference presentation, Universitäre Strafvollzugstage, Graz, Austria.
Puolakka, P. (2025b). Artificial Intelligence and Offender Management Systems – Future Visions. Conference presentation, EuroPris ICT Workshop, Split, Croatia.

