Artificial intelligence, wellness apps alone cannot solve mental health crisis
Introduction
People are turning to generative AI chatbots and wellness apps for emotional support with growing frequency, but these tools currently lack both scientific backing and adequate regulations to protect users, warns a new health advisory from the American Psychological Association.
Limitations of AI Chatbots and Wellness Applications for Mental Health
The APA Health Advisory on the Use of Generative AI Chatbots and Wellness Applications for Mental Health reviewed consumer technologies that individuals are using for mental health guidance and treatment—even when that is not their designed purpose. Their low cost and easy access make them attractive alternatives for those who cannot obtain or afford help from licensed providers.
“We are facing a major mental health crisis that demands systemic solutions, not just technological quick fixes,” stated APA CEO Arthur C. Evans Jr., PhD. “Although chatbots may appear accessible for offering support and validation, their capacity to safely direct someone in crisis remains limited and unreliable.”
The advisory stresses that while technology holds great promise for helping psychologists tackle the mental health crisis, it should not divert attention from the pressing need to repair the core of America’s mental health system.
Recommendations to help users navigate digital spaces
The report provides guidance for the public, policymakers, tech firms, researchers, clinicians, parents, caregivers, and other stakeholders to clarify their roles in a fast-evolving technological environment, so users do not bear alone the risk of navigating unproven and unregulated digital tools. Primary recommendations include:
- Do not replace care from a qualified mental health professional with chatbots or wellness apps, given the unpredictable nature of these technologies.
- Avoid allowing unhealthy attachments or dependencies to form between users and these tools.
- Implement specific protections for children, adolescents, and other vulnerable groups.
“AI development has moved faster than our understanding of its impacts and limitations. As a result, we are learning of serious harm affecting teens and other vulnerable people,” Evans said. “For some, this can be life-threatening, which highlights why psychologists and psychological science must be included throughout the development process.”
Advisory on the Use of AI Chatbots for Mental Health
Even generative AI tools built with sound psychological science and best practices lack sufficient evidence to prove they are effective or safe for mental health care, according to the advisory. Researchers need to assess these chatbots and apps through randomized clinical trials and longitudinal studies tracking outcomes over time. To enable this, tech companies and policymakers must ensure transparency in how these technologies are developed and deployed.
Describing current regulations as insufficient for addressing AI in mental health, the advisory urges policymakers—especially at the federal level—to:
- update regulatory frameworks;
- establish evidence-based standards for each type of digital tool;
- close gaps in FDA oversight;
- support laws that stop AI chatbots from impersonating licensed professionals; and
- pass comprehensive data privacy laws and require “safe-by-default” settings.
Conclusion
The advisory observes that many clinicians are not well-versed in AI and recommends that professional organizations and health systems provide training on AI, bias, data privacy, and responsible use in practice. Clinicians should also adhere to existing ethical guidelines and inquire proactively with patients about their use of AI chatbots and wellness apps.
“AI will be essential to the future of health care, but it cannot deliver on that potential unless we also address persistent mental health system shortcomings,” Evans emphasized. “We must advocate for systemic changes to make care more affordable, accessible, and timely—and to ensure AI supports, rather than replaces, human professionals.”
Copyright 2025 American Psychological Association. All Rights Reserved.