by Edith Cowan University

Credit: Matheus Bertelli from Pexels

New Edith Cowan University (ECU) research suggests artificial intelligence chatbots like ChatGPT may help reduce mental health stigma, particularly for people hesitant to seek traditional face-to-face support.

The study, published in Behavioral Sciences, was led by ECU Master of Clinical Psychology student Scott Hannah, with supervision from Professor Joanne Dickson. It is one of the first to examine how using ChatGPT for mental health concerns relates to stigma.

Researchers surveyed 73 people who had used ChatGPT for personal mental health support, examining how their use of the tool and its perceived effectiveness related to stigma.

"The findings suggest that believing the tool is effective plays an important role in reducing concerns about external judgment," Hannah said.

Stigma is a major barrier to seeking mental health help. It can worsen symptoms and discourage people from accessing support.

The study focused on two forms of stigma: anticipated stigma, the fear of being judged or discriminated against, and self-stigma, the internalization of negative stereotypes, which erodes confidence and discourages help-seeking.

The study found that people who felt ChatGPT was effective were more likely to use it and also more likely to report reduced anticipated stigma, meaning less fear of being judged.

As AI tools become more common, people are using chatbots for private, anonymous conversations about their mental health concerns.

"From a sample of almost 400 participants in this study, almost 20% were engaging or had already engaged with ChatGPT for mental health purposes, and almost 30% were open to the idea if faced with a mental health difficulty," Hannah said.

"These results suggest that, despite not being designed for such use, AI tools such as ChatGPT are becoming more widely used for mental health support."

Hannah said anonymous digital tools may offer early support to those reluctant to seek help. "Many people still worry about being judged for struggling with their mental health," he said. "When people feel ChatGPT is helpful, it may ease some of that fear and encourage them to open up.

"However, there are importantethical considerations, as ChatGPT was not designed for therapeutic purposes, and recent research has shown that its responses can sometimes be inappropriate or inaccurate. Therefore, we encourage users to engage with AI-based mental health tools critically and responsibly."

Professor Dickson said AI may provide an accessible bridge for people facing stigma-related barriers. "AI isn't a replacement for professional care, but perceptions of support can help reduce stigma," she said.

Professor Dickson said more work is required to understand how AI can safely complement mental health services. "As AI grows, it's crucial we understand its impact so we can guide best practice."

More information: Scott N. Hannah et al, "As Effective as You Perceive It: The Relationship Between ChatGPT's Perceived Effectiveness and Mental Health Stigma," Behavioral Sciences (2025). DOI: 10.3390/bs15121724