Calculating the Value of Mental Health Chatbots

Posted on August 23, 2021

Two-thirds of those who die by suicide struggle with depression (American Association of Suicidology). Because mental health is a significant public health issue and a growing societal concern during the COVID-19 pandemic, providing preemptive and equitable care is imperative.

The success of any healthcare intervention is measured by the quality of the care provided. High-quality care is accessible, timely, and patient-focused, but meeting that standard often requires significant resources. As demand for mental healthcare increases while access remains vastly inadequate in many areas around the globe, mental health chatbots can reduce barriers to care.

How can chatbot support be as effective as in-person therapy with a human?

To understand how we calculate Tess’ impact, let’s break down how researchers assess in-person clinical models of patient recovery and success.

At the beginning of treatment for anxiety and depression, a psychologist measures the patient's reported symptoms using the Patient Health Questionnaire-9 (PHQ-9). The PHQ-9 is a tool that care providers use to diagnose, measure, and manage a patient's symptoms and track their subsequent reduction. This records the patient's baseline coming into therapy, much like measuring vitals at the beginning of a doctor's visit.

Tess delivers the PHQ-9 assessment during the first conversation with the patient, and again after a few weeks of chatting, to measure the impact of the intervention on the patient's symptoms. Based on the patient's answers to the assessment, they are determined to have no symptoms, mild symptoms, moderate symptoms, or severe symptoms.
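For illustration, here is a minimal sketch of how PHQ-9 answers can be turned into a score and a severity category. It assumes the standard published PHQ-9 cutoffs (minimal, mild, moderate, moderately severe, severe); the exact bands Tess reports may differ.

```python
# Minimal sketch: scoring a PHQ-9 assessment.
# Assumes the standard published cutoffs (0-4 minimal, 5-9 mild,
# 10-14 moderate, 15-19 moderately severe, 20-27 severe); the exact
# bands Tess uses may differ.

def phq9_score(answers):
    """Sum the nine item responses, each rated 0-3."""
    assert len(answers) == 9 and all(0 <= a <= 3 for a in answers)
    return sum(answers)

def phq9_severity(score):
    if score <= 4:
        return "minimal"
    if score <= 9:
        return "mild"
    if score <= 14:
        return "moderate"
    if score <= 19:
        return "moderately severe"
    return "severe"

baseline = phq9_score([2, 2, 1, 2, 1, 1, 2, 1, 0])  # score of 12
print(baseline, phq9_severity(baseline))            # 12 moderate
```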

The goal of effective therapy is symptom reduction. Treatment is considered successful when the patient's symptoms are reduced by 50% or more, which typically takes between 8 and 16 sessions.
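As a worked example of that 50% criterion, the sketch below compares a hypothetical baseline PHQ-9 score with a hypothetical follow-up score and checks whether the reduction clears the threshold.

```python
# Worked example of the 50%-reduction criterion described above.
# The scores are hypothetical; in practice the baseline comes from the
# first PHQ-9 assessment and the follow-up from a later one.

def percent_reduction(baseline, follow_up):
    """Percentage drop from the baseline score to the follow-up score."""
    return (baseline - follow_up) / baseline * 100

def treatment_successful(baseline, follow_up, threshold=50):
    return percent_reduction(baseline, follow_up) >= threshold

print(percent_reduction(12, 5))     # ~58.3% reduction
print(treatment_successful(12, 5))  # True: symptoms were cut by more than half
```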

Knowing this, let's look at the cost breakdown, focusing on the investment required for traditional symptom reduction. The numbers provided below are not indicative of every scenario and will change with varying inputs and variables.

A PhD-level psychologist typically charges $275 per session, with the goal of achieving symptom reduction of 50% or more. At the low end of eight sessions, this amounts to $2,200 per person seeking treatment over roughly two months.

Alternatively, Tess costs just $5 per person per month to implement and use. After chatting with Tess, 22% of patients experience a reduction in the severity of their symptoms by 50%.
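To make the comparison concrete, here is a back-of-the-envelope calculation using only the figures quoted above; the eight-session course and two-month timeline are the low-end assumptions from this post, and real costs will vary.

```python
# Back-of-the-envelope cost comparison using the figures quoted above.
# Session count and treatment length are the low-end assumptions from
# the text; actual courses of therapy vary.

SESSION_FEE = 275   # dollars per session with a PhD-level psychologist
SESSIONS = 8        # low end of the 8-16 sessions cited above
TESS_MONTHLY = 5    # dollars per person per month for Tess
MONTHS = 2          # roughly two months of treatment

traditional_cost = SESSION_FEE * SESSIONS  # $2,200
tess_cost = TESS_MONTHLY * MONTHS          # $10

print(f"Traditional therapy: ${traditional_cost:,}")
print(f"Tess:                ${tess_cost:,}")
print(f"Cost ratio:          {traditional_cost // tess_cost}x")  # 220x
```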

What about high-risk situations that require human intervention?

When any patient expresses suicidal ideation, a mental health professional must act quickly to provide support. Statistically, 5-8% of crisis situations lead to suicide, making it imperative that quick, effective contact is made to intervene and de-escalate the situation.

Since 2016, Tess has identified 4,875 crisis situations and has immediately connected the patient to a counselor who took over the conversation. Tess' on-demand and integrated capabilities allow the AI to screen for these instances and connect the patient to a crisis counselor who steps in instantaneously.

Tess saves valuable time. Within seconds, suicidal speech can be identified and the patient routed to a trained professional who can help ensure that the crisis is averted. The immediate nature of Tess' intervention eliminates the step of picking up the phone and calling a crisis counselor, a step that can be blocked by a patient's state of mind or the stigma around requesting support.
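For illustration only, here is a sketch of the detect-and-escalate pattern described above. The crisis_detected function and the counselor handoff are hypothetical placeholders; Tess' actual screening model and integration are not described in this post.

```python
# Purely illustrative sketch of the detect-and-escalate pattern described
# above. crisis_detected() is a placeholder; a real system would rely on a
# trained screening model, not a keyword list.

from dataclasses import dataclass

@dataclass
class Conversation:
    patient_id: str
    handled_by: str = "tess"  # "tess" until a human counselor takes over

def crisis_detected(message: str) -> bool:
    # Placeholder check standing in for the AI's screening step.
    keywords = ("suicide", "kill myself", "end my life")
    return any(k in message.lower() for k in keywords)

def handle_message(conversation: Conversation, message: str) -> str:
    if conversation.handled_by == "tess" and crisis_detected(message):
        # Hand the conversation to a human crisis counselor immediately,
        # rather than asking the patient to place a separate call.
        conversation.handled_by = "counselor"
        return "Connecting you with a crisis counselor right now."
    # Otherwise the chatbot (or, after escalation, the counselor) replies.
    return "(reply from {})".format(conversation.handled_by)
```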

Another constraint is the time associated with finding a psychologist; in many cases patients are dissuaded by long wait times for an appointment or are placed on a waitlist. Particularly in areas with limited access to care, waiting weeks, or even losing seconds, can be fatal.

By providing a low-cost, on-demand solution that employs proven psychological practices, Tess bridges a profound gap in access to care while delivering both clinical effectiveness and utilization savings.

Interested in learning more about Tess? Schedule a demo today.

