Mental health and mental wellness, though often used interchangeably, cater to distinct markets. Mental health services focus primarily on diagnosing, treating, and managing mental illnesses. As such, labels like "psychologist," "psychiatrist," and "therapist" are protected titles reserved for individuals qualified to deliver specific mental health services; in the U.S., state licensing boards oversee these providers. By contrast, mental wellness services take a more holistic, and altogether less regulated, approach to enhancing overall well-being through stress relief, mindfulness, and lifestyle improvement.
Both industries drive positive social impact, but they are not the same thing. Despite these clear distinctions, the digital world often blurs the lines between mental health and wellness solutions. In popular app stores, both types of apps sit under the same categories, such as "Health & Fitness," without clear differentiation. This conflation can be detrimental to users, many of whom seek clinical results from wellness apps. It raises concerns about the mismanagement of serious health conditions and undermines the impact of better-validated digital tools.
Applying the rigorous standards of healthcare to digital apps would see many fall short on real-world clinical evidence. Some of these apps may even be downright harmful, as the National Eating Disorders Association found when an untested chatbot named Tessa recommended calorie restriction and dieting to users with diagnosed eating disorders. Overnight, many mental health apps would need to reclassify themselves as mental wellness apps, lacking the regulatory approval or clinical evidence to meet the high bar of healthcare services. In particular, I would anticipate that many so-called "mental health" chatbots would hastily update their websites to read "mental wellness" chatbots.
In the regulated healthcare industry, business models are clear and often scrutinised to protect patient privacy and prevent conflicts of interest. Many health and wellness apps, however, rely on business models that include selling data to third parties and generating revenue from advertising, practices that would be unacceptable and unethical in a regulated healthcare context. With stricter regulation, care providers would gain greater confidence in incorporating digital solutions into their treatment plans, knowing that "health" apps meet the necessary standards for quality assurance and safety.
This isn't just about restriction; it's about ensuring quality and safety, giving stakeholders confidence in digital tools and allowing them to realise their benefit in practice. For example, recent research showed how a regulated AI tool improved clinical outcomes and health economics across 250,000 patients in NHS mental health services. Yet, for almost two years, experts have remained unconvinced of the efficacy of wellness apps for mental health, continuing to look for clinical evidence that demonstrates otherwise.
The frequent use of taglines like "an AI psychologist in your pocket" for products with no clinical basis undermines the professionalism and governance of certified mental health providers. In a more regulated digital health environment, the use of protected titles or claims of clinical efficacy by uncertified apps would be curtailed, reinforcing the value of professional mental health services and ensuring users are not misled by inaccurate advertising or promotion.
I firmly believe wellness apps can benefit many people. However, we need to be realistic about what these solutions can and cannot do. We must not allow ourselves to slip into an over-reliance on wellness tools for serious health problems. Healthcare deserves its own tooling, backed by solid evidence and meeting the highest standards. Let’s redefine our current digital landscape to make it clearer for everyone.
- Psychology Today