More than half of top 100 mental health TikToks contain misinformation, study finds

More than half of all the top trending videos offering mental health advice on TikTok contain misinformation, a Guardian investigation has found.
People are increasingly turning to social media for mental health support, yet research has revealed that many influencers are peddling misinformation, including misused therapeutic language, “quick fix” solutions and false claims.
Those seeking help are confronted with dubious advice, such as eating an orange in the shower to reduce anxiety; the promotion of supplements with a limited evidence base for alleviating anxiety, such as saffron, magnesium glycinate and holy basil; methods to heal trauma within an hour; and guidance presenting normal emotional experiences as a sign of borderline personality disorder or abuse.
MPs and experts said the findings that social media platforms were riddled with unhelpful, harmful and sometimes dangerous mental health advice were “damning” and “concerning”, and urged the government to strengthen regulation to protect the public from the spread of misinformation.
The Guardian took the top 100 videos posted under the #mentalhealthtips hashtag on TikTok and shared them with psychologists, psychiatrists and academic experts, who took a view on whether the posts contained misinformation.
The experts established that 52 out of 100 videos offering advice on dealing with trauma, neurodivergence, anxiety, depression and severe mental illness contained some misinformation, and that many others were vague or unhelpful.
David Okai, a consultant neuropsychiatrist and researcher in psychological medicine at King’s College London who reviewed the anxiety- and depression-related videos, said some posts misused therapeutic language, for example using wellbeing, anxiety and mental disorder interchangeably, “which can lead to confusion about what mental illness actually entails”.
Many videos offered general advice based on narrow personal experience and anecdotal evidence, which “may not be universally applicable”, he added.
The posts reflected how “short-form, attention-grabbing soundbites can sometimes overshadow the more nuanced realities of qualified therapeutic work” on social media. The videos also over-emphasised therapy. “While there is strong evidence supporting the effectiveness of therapy, it’s important to emphasise that it’s not magic, a quick fix or a one-size-fits-all solution,” he said.
Dan Poulter, a former health minister and NHS psychiatrist who reviewed the videos about severe mental illness, said some of them “pathologise everyday experiences and emotions, suggesting that they equate to a diagnosis of serious mental illness”.
“This is providing misinformation to impressionable people and can also trivialise the life experiences of people living with serious mental illnesses.”
Amber Johnston, a British Psychological Society-accredited psychologist who reviewed the trauma videos, said that while most videos contained a nugget of truth, they tended to over-generalise while minimising the complexity of post-traumatic stress disorder or trauma symptoms.
“Each video is guilty of suggesting that everyone has the same experience of PTSD with similar symptoms that can easily be explained in a 30-second reel. The truth is that PTSD and trauma symptoms are highly individual experiences that cannot be compared across people and require a trained and accredited clinician to help a person understand the individual nature of their distress,” she said.
“TikTok is spreading misinformation by suggesting that there are secret universal tips and truths that may actually make a viewer feel even worse, like a failure, when these tips don’t simply cure.”
TikTok said videos were taken down if they discouraged people from seeking medical support or promoted dangerous treatments. When people in the UK search for terms linked to mental health conditions, such as depression, anxiety, autism or post-traumatic stress disorder, they are also directed to NHS information.
Chi Onwurah, a Labour MP, said the technology committee she chaired was investigating misinformation on social media. “Significant concerns” had been raised in the inquiry about the effectiveness of the Online Safety Act in “tackling false and/or harmful content online, and the algorithms that recommend it”, she said.
“Content recommender systems used by platforms like TikTok have been found to amplify potentially harmful misinformation, like this misleading or false mental health advice,” she added. “There’s clearly an urgent need to address shortcomings in the OSA to make sure it can protect the public’s online safety and their health.”
The Liberal Democrat MP Victoria Collins agreed the findings were “damning”, and urged the government to act to keep people safe from “harmful misinformation”.
Paulette Hamilton, the Labour MP who chairs the health and social care select committee, said mental health misinformation on social media was “concerning”. “These ‘tips’ on social media should not be relied upon in place of professional, suitably qualified support,” she said.
Prof Bernadka Dubicka, the online safety lead for the Royal College of Psychiatrists, said that although social media could increase awareness, it was important that people were able to access up-to-date, evidence-based health information from trusted sources. Mental illness could only be diagnosed through a “comprehensive assessment from a qualified mental health professional”, she added.
A TikTok spokesperson said: “TikTok is a place where millions of people express themselves, come to share their authentic mental health journeys, and find a supportive community. There are clear limitations to the methodology of this study, which opposes this free expression and suggests that people should not be allowed to share their own stories.
“We proactively work with health experts at the World Health Organization and NHS to promote reliable information on our platform and remove 98% of harmful misinformation before it’s reported to us.”
A government spokesperson said ministers were “taking action to reduce the impact of harmful mis- and disinformation content online” through the Online Safety Act, which requires platforms to tackle such material if it is illegal or harmful to children.