In December, a Wisconsin man who goes by the username the_alpha_k9 on TikTok uploaded a testimonial-style video to the platform, telling his thousands of followers that he wouldn’t be taking a Covid-19 vaccine.
“You’re telling me in 40 years of research there is no vaccine for HIV … for cancer, no vaccine … the common cold, no vaccine,” he said. “Yet in one year we’ve developed a vaccine for COVID-19 and you want me to take that … thanks, but no thanks.”
It was one of the many debunked, run-of-the-mill anti-vaccination talking points that have permeated social platforms during the Covid-19 pandemic. But on TikTok, where users regularly reuse popular audio tracks to make their own videos, it took on a life of its own. More than 4,500 videos featuring the audio have been made and viewed more than 16 million times, according to a report published Monday by the Institute for Strategic Dialogue, a London-based organization that tracks disinformation.
It’s an example of what researchers say is a problem unique to the video platform, which has emerged in recent years as a wildly popular destination for everything from viral dance routines to comedy skits and confessional content.
“People are using TikTok to post and host harmful Covid misinformation, and it’s highly popular,” said Ciaran O’Connor, an analyst for the Institute for Strategic Dialogue and lead author of the report. “This function is being used exactly as TikTok designed it. The audio is being shared and reacted to. But the consequence is that it creates a feedback loop of anti-vaccine narratives.”
After a request for comment, TikTok removed or limited the distribution of the videos and audio shared in the report.
A spokesperson from TikTok said in a statement: “We strive to promote an authentic TikTok experience by limiting the spread of misleading content, including audio, and promoting authoritative information about COVID-19 and vaccines across our app. Misinformation is an industry-wide challenge, and we are grateful for reports that help us take action on violations.”
The Institute for Strategic Dialogue tracked Covid-19 vaccine misinformation spread through TikTok’s sounds feature. It found that anti-vaccination audio tracks have gone viral as a kind of chain message, with the original claims and content often hidden by TikTok. In other words, a TikTok function is being used to post or amplify content that violates TikTok’s policy against Covid-19 misinformation.
The man behind the_alpha_k9, a relatively small account by TikTok standards, with about 28,000 followers, didn’t respond to requests for comment, and the original video has been deleted. The video, a recitation of a popular anti-vaccine meme, would be flagged later in December by multiple platforms for misinformation and debunked by fact-checkers who noted significant differences between the illnesses it cited and Covid-19, as well as a misunderstanding of mRNA vaccine development.
TikTok is the world’s fastest-growing social media app, with about 100 million monthly active U.S. users and 2 billion global downloads, according to the company. The app provides an easy way to make videos to existing backing tracks, and it shows users videos based on a powerful recommendation algorithm.
The Institute for Strategic Dialogue analyzed 124 TikTok videos featuring vaccine misinformation for its report. The videos garnered more than 20 million views and 2 million likes, comments and shares. Only two of the videos featured a label referring users to factual information, a safety feature rolled out in December to combat increasing vaccine misinformation on the platform.
TikTok has promoted its crackdown on Covid-19 misinformation since then as part of a commitment to “keeping TikTok safe for creative expression throughout the pandemic,” according to a company blog post. “We take our responsibility to keep harmful misinformation off TikTok incredibly seriously.”
The report’s findings are likely only the tip of the iceberg.
“TikTok is a bit of a walled garden,” O’Connor said, noting the challenges of tracking content on the platform. “Misinformation is harder to find but also harder for TikTok or fact-checkers to combat.”
The videos are also being used to target specific communities, O’Connor said. Several users had translated the audio into other languages. While those sounds are less common and less popular, some people used the audio to make response videos fact-checking or rebutting the claims.
One video — which used audio from removed content in which a woman who claimed to be a nurse said she suffered from Bell’s palsy after being vaccinated — was indicative of how misinformation seemed to target Black users, according to the report. A fact-check by The Associated Press determined “details from the video do not add up,” including that there was no record of a registered nurse under the woman’s name. Still, the video spread across many platforms. The video, captioned with “they want black people to take it first for a reason,” was removed on TikTok, but the sound is still available, and it has been used to create new anti-vaccination content.
The user behind another sound, who described herself as a mother of three, claimed in a post that TikTok removed her video “for community violations.” In it, she had played a recording purporting to be from the Centers for Disease Control and Prevention, urging certain people not to be vaccinated for a year and spreading false claims of widespread vaccine-caused deaths.
Her original video was removed in April, but the sound is still available, and it has been used in 375 videos, the most popular of them racking up tens of thousands of views. None feature information labels.