Knowledge, relationship, and innovation: artificial intelligence in clinical and psychoanalytic practice

Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
Published: 30 April 2026

Abstract

This paper analyzes the transformative impact of large language models (LLMs), described as an “alien co-intelligence”, on mental health and clinical practice. Artificial intelligence (AI) poses a radical challenge to human bonds, offering the illusion of connection and support, known as “artificial intimacy”, without the demands of an authentic relationship. This dynamic carries serious risks of emotional dependence and erosion of social skills and, in extreme cases, can lead to fatal consequences or to the aggravation of “AI-associated psychosis”. In the context of psychotherapy, the introduction of AI is examined in a spirit of “critical trust”. The algorithm cannot replace the therapeutic relationship: it is not a desiring subject and cannot be the object of authentic transference. Questions are raised about the risks of AI interference (e.g., in preliminary screening), such as the possibility that it may colonize the clinician’s mind and compromise the initial contact, distorting the foundational process of mutual not-knowing that is crucial to analysis. The most insidious clinical danger is the de-subjectification of the patient, who is reduced to a set of data and patterns to be optimized rather than regarded as a subject of desire. There is an urgent need to develop rigorous ethical guidelines and digital safety plans that preserve the centrality of the human being and the irreplaceable value of authentic encounters in healthcare.



How to Cite
Knowledge, relationship, and innovation: artificial intelligence in clinical and psychoanalytic practice. (2026). Ricerca Psicoanalitica, 37(1). https://doi.org/10.4081/rp.2026.1101