Eating disorders are a group of severe psychological illnesses that involve disordered relationships with eating, and often (but not always) strong dissatisfaction with body size and shape.

[Image: In-person therapy is a traditional component of eating disorders treatment. Source: Christina Morillo/Pexels]

Regardless of type, eating disorders are difficult to treat: fewer than half of individuals with an eating disorder fully recover from their illness (i.e., achieve complete symptom remission).1 This low recovery rate is especially concerning because eating disorders are common in children and adolescents and can, therefore, dramatically interfere with brain and body development.

Treatments for eating disorders vary but usually include some type of psychotherapy (e.g., cognitive behavioral therapy; CBT). Traditionally, psychotherapy for eating disorders has consisted of in-person sessions between a patient and clinician. With increased access to new digital platforms (e.g., Zoom), however, teletherapy is emerging as a viable alternative to in-person therapy.2

Mental Health Teletherapy

Mental health teletherapy is a health service that provides mental health support through non-traditional, remote channels (e.g., text or video). With teletherapy, people seeking mental health assistance can choose to interact with a human (e.g., clinician) or with an artificial intelligence (AI) agent, such as a chatbot.

Chatbots are computer programs that use AI to analyze and respond to information requests; these analyses allow chatbots to “converse” with humans in “human-like” ways.

[Image: Chatbots are computer programs that use artificial intelligence to “interact” with humans. Source: Alexandra Koch/Pixabay]

The human-like conversational abilities of chatbots position them to be useful and affordable agents in mental health therapy, potentially increasing public access to these services.

Despite these promises, however, there are numerous challenges we need to consider before incorporating chatbots into mental health therapy.

One challenge is creating a chatbot that people respond favorably to. Exploring how people respond to chatbots is important because the patient-clinician relationship (i.e., the therapeutic alliance) strongly influences patient adherence to treatment.3 For example, patients whose therapists are perceived as lacking warmth, trustworthiness, honesty, and empathy show lower treatment adherence than patients whose therapists are perceived as more empathic.

Moreover, humans expect artificial agents (e.g., chatbots) to have human-like characteristics and, therefore, respond most favorably to chatbots that meet these expectations.4 Consequently, chatbots that are too robotic might disrupt patient adherence to therapy.

Being Human…But Not Too Human

Considering how important it is for therapists to express “human” characteristics (e.g., empathy) during therapy, it seems equally important for chatbots to display these social characteristics when assisting with mental health therapy.

Based on research by Chaves and Gerosa (2020), the social characteristics that humans value most in chatbots include conversational intelligence (i.e., how a chatbot manages the conversation), social intelligence (i.e., human-like social behavior), and personification (i.e., a human-like personality and identity).5 Chatbots with these characteristics would, seemingly, be the most effective therapeutic artificial agents.

This logic, however, might be flawed, as some people find artificial agents that are too human “creepy”. This dip in favorability, between agents that are clearly artificial and those that are convincingly human, is known as the “uncanny valley”.

The Uncanny Valley

[Image: The uncanny valley effect refers to the “creepiness” humans feel when encountering a nearly human artificial agent. Source: Possessed Photography/Unsplash]

The uncanny valley refers to the uneasiness humans feel when they encounter an artificial agent that is close to being human, but not quite human.4,6

Why the uncanny valley effect occurs is unclear, but one explanation could be that humans find it challenging to categorize an almost human entity (i.e., what is it?). Because artificial agents aren’t quite human, people might struggle to understand the motivations, needs, and “feelings” of these entities (i.e., does an artificial agent have a mind of its own?).4,6

The difficulties humans have with “understanding” artificial agents could be attributed to how humans mentalize their interactions with these entities.4 Mentalizing is the ability to comprehend our own, as well as others’, mental states. Through mentalization, we can better understand the motivations, feelings, and behaviors of ourselves and others.

When encountering nearly human artificial agents, humans find it challenging to determine these entities’ motivations: are they similar to or different from our own?

A scoping review by Vaitonyté et al. (2023) suggests that the human brain does indeed struggle to process interactions with human-like artificial agents.4 Across several studies, certain brain areas involved in mentalizing (e.g., the ventromedial prefrontal cortex; vmPFC) were activated more when humans interacted with artificial agents than when they interacted with other humans. Because the vmPFC processes information that helps humans understand their social interactions, including what threats exist in their environment, these findings reinforce the idea that the human brain works harder to make sense of artificial agents.

Chatbot-Assisted Eating Disorders Therapy?

Currently, no studies have explored whether patients with eating disorders experience the uncanny valley effect during chatbot-assisted therapy. Understanding how these individuals perceive human-like chatbots during therapy is important because human characteristics (e.g., empathy) encourage patient adherence to therapy.3 We, therefore, need to know whether chatbot-assisted eating disorders therapy makes patients uncomfortable.

[Image: Before using chatbots in eating disorders therapy, we need to understand how people with eating disorders perceive chatbots used in therapy. Source: Alexandra Koch/Pixabay]

An important consideration is that people with eating disorders often have difficulties mentalizing.7,8 Consequently, we cannot directly apply previous findings on human-chatbot interactions to this population. We, therefore, need to better understand how people with eating disorders respond to artificial agents before incorporating chatbots into eating disorders therapy. Such research will be difficult because mentalization challenges vary across people with eating disorders.8 Researchers will need to consider how individual differences in social processing within the eating disorders population influence chatbot interactions.

Finally, we need to remember that chatbots aren’t 100% reliable, which could lead to disastrous consequences in mental health therapy. This was evident with the National Eating Disorders Association’s (NEDA) chatbot-assisted helpline. Despite a careful design, NEDA’s chatbot offered weight loss advice to several people using the helpline.9 This early experiment demonstrates how cautious we need to be when using chatbots in eating disorders therapy.

References

1) BEAT. (2024). Statistics for journalists. Retrieved from: https://www.beateatingdisorders.org.uk/media-centre/eating-disorder-sta…

2) Sproch, LE, & Anderson, KP. (2019). Clinician-delivered teletherapy for eating disorders. Psychiatric Clinics of North America, 42, 243-252. https://doi.org/10.1016/j.psc.2019.01.008.

3) Ackerman, SJ, & Hilsenroth, MJ. (2003). A review of therapist characteristics and techniques positively impacting the therapeutic alliance. Clinical Psychology Review, 23, 1-33. https://doi.org/10.1016/S0272-7358(02)00146-0.

4) Vaitonyté, J, Alimardani, M, & Louwerse, MM. (2023). Scoping review of the neural evidence on the uncanny valley. Computers in Human Behavior Reports, 9. https://doi.org/10.1016/j.chbr.2022.100263.

5) Chaves, AP, & Gerosa, MA. (2020). How should my chatbot interact? A survey on social characteristics in human-chatbot interaction design. International Journal of Human-Computer Interaction, 37, 729-758. https://doi.org/10.1080/10447318.2020.1841438.

6) Mori, M. (2012). The uncanny valley. IEEE Robotics & Automation Magazine, 19, 98-100.

7) Luyten, P, Campbell, C, Allison, E, et al. (2020). The mentalizing approach to psychopathology: State of the art and future directions. Annual Review of Clinical Psychology, 16, 297-325. https://doi.org/10.1146/annurev-clinpsy-071919-015355.

8) Gagliardini, G, Gullo, S, Tinozzi, V, et al. (2020). Mentalizing subtypes in eating disorders: A latent profile analysis. Frontiers in Psychology, 11. https://doi.org/10.3389/fpsyg.2020.564291.

9) Wells, K. (2023). An eating disorders chatbot offered dieting advice, raising fears about AI in health. NPR. Retrieved from: https://www.npr.org/sections/health-shots/2023/06/08/1180838096/an-eati…


