Technology against depression?


The 6th Tutzinger Depression Conference took place yesterday and today at the Evangelical Academy in Tutzing, under the title “Depression 4.0 – Being mentally healthy with your smartphone?”

Digitalization is currently gaining ground in the healthcare system, and this includes the area of mental health. There is a real boom in apps, some of which can already be prescribed. Apps that have been successfully reviewed by the Federal Institute for Drugs and Medical Devices (BfArM) are listed as DiGAs, digital health applications that doctors and psychotherapists can prescribe at the expense of the health insurance companies. Legally, these are medical devices. There are currently around two dozen of them to support mental health, including help with depression.

The advantages often mentioned are: they can bridge the waiting time for a therapy place, they allow support that is flexible in time, the social inhibition threshold is lower, and they are cost-effective. A speaker at the conference put the prescription costs at 192 to 620 euros, with the manufacturers setting the prices themselves in the first year. The development costs are said to be between 3 and 4 million euros. The benefit of a DiGA then has to be demonstrated, although the requirements are lower than for medications.

Some of the DiGAs rely on behavioral data, for example in combination with a smartwatch that counts steps, monitors sleep or records other data. This is a growth market for the industry, and together with the tapping of health care data and the development of AI applications, something of a gold rush atmosphere has arisen. After the brain research hype, now the digital and AI hype?

Whether the DiGAs can really be seen as entirely positive, or whether, like all remedies, they have side effects; whether, like conventional psychosocial aids, they sit in a field of tension between support and control; and whether they represent a mechanization, i.e. a dehumanization, of the psychotherapeutic relationship that merely simulates a human relationship – these were the core topics of the conference in Tutzing.


Some studies show that chatbots can do some things better than human therapists; for example, they are sometimes perceived as more empathetic. But isn’t empathy a very specific quality of interpersonal relationships? How “real” is artificial empathy? Isn’t it just simulating something? And if you say yes, because the AI doesn’t really empathize, is this “digital animism”, as one of the speakers called it, a bad thing? When it comes to mental health, doesn’t it ultimately come down to how patients feel? What else would be the measure of success in depression? Or would it be like homeopathy: if people believe it helps, should the health insurance companies pay for it?

In any case, it means something when human therapists are perceived in studies as less empathetic than an AI. One should not idealize the “human relationship” in therapy too much, especially in psychiatry, and not only with regard to its murderous past.

This is all the more true because up to now, even outside therapeutic relationships, it has always been humans who embodied the absolutely inhuman. More generally: is “technology” really the complete Other of the human? Or is technology perhaps more human than one first thinks, a part of being human itself, with AI merely projecting human abilities into a new form of realization?

Be that as it may, DiGAs and their relatives cannot replace human relationships in their entirety. We’re not Cylons. But they can probably take over some tasks that were previously seen as inseparable from the human relationship between therapist and patient.


It is becoming apparent that DiGAs and other digital products will find their place in the structure of psychosocial care. Over time, it will become clearer what they are suitable for, what they are not suitable for, and what regulatory guardrails are needed to ensure that the benefit to industry is not greater than that to patients. This requires studies, above all industry-independent ones, that examine the benefits and risks in specific fields of application and involve patients and therapists in a participatory manner. It also requires discussion forums like the one in Tutzing, in order to understand and classify the revolutionary technical developments – especially since AI-supported applications, beyond their specific services, also change people’s self-image. Incidentally, you don’t need AI to predict that this was not the last conference on this topic in Tutzing.
