Spotify Rushes to Remove Dozens of Fake Podcasts Promoting Online Drug Sales
Spotify is under pressure to clamp down on a wave of fake podcasts promoting the illegal sale of prescription drugs through questionable online pharmacies.
A recent investigation revealed that Spotify’s podcast platform hosted dozens of phony shows advertising the sale of medications such as Adderall, Xanax, oxycodone, and Percocet, often with no prescription required. Many of these listings appeared among the top search results when users looked up drug-related terms, directing listeners to external websites potentially operating outside the law.
Several of these podcasts carried titles that made little attempt to disguise their purpose, such as “My Adderall Store.” Short, AI-generated episodes promoted drug sales and linked to websites claiming to sell controlled substances without prescriptions, a clear violation of U.S. law.
After being alerted by CNN, Spotify swiftly removed at least 26 such podcasts, admitting that they breached the platform’s content policies. “We are constantly working to detect and remove violating content across our service,” a Spotify spokesperson said.
However, even after those removals, new examples surfaced the following day, highlighting the scale and persistence of the issue.
The discovery raises questions about Spotify’s content moderation capabilities, especially as generative AI tools make it easier to flood platforms with convincing, machine-voiced spam. Some podcasts identified in the review had been online for months, suggesting gaps in enforcement despite the company’s rules against spam, illegal content, and unlicensed drug sales.
Spotify’s current guidelines prohibit using the platform to promote illicit goods or spammy services and warn that violations can lead to content removal. Yet enforcement remains challenging. As Spotify continues to champion open access to podcast creation and distribution, the ease with which bad actors can exploit that openness has drawn increasing scrutiny.
The issue has sparked criticism from online safety advocates and tech commentators alike. “Podcasts have a bigger blind spot,” said Katie Paul, director of the Tech Transparency Project. “Voice makes it much more difficult for moderation.”
The problem isn’t new. Tech platforms have faced pressure for years to combat the sale of unregulated or counterfeit medications online. In 2011, Google forfeited $500 million to settle a U.S. Justice Department investigation into ads it ran for Canadian pharmacies illegally targeting American consumers. The FDA has since pushed social media companies to act more decisively to prevent similar abuses.
Despite these warnings, platforms like Spotify remain largely shielded from legal liability under Section 230 of the Communications Decency Act, which protects internet companies from being held responsible for content posted by their users.
Some of the discovered podcasts were disturbingly specific. One, titled “Xtrapharma.com,” offered a robotic pitch for Xanax and oxycodone across eight brief episodes, touting “FDA-approved delivery without prescription.” Another, named “Order Xanax 2 mg Online Big Deal On Christmas Season,” featured a short, text-to-speech voiceover directing users to an external site promoting unverified medication delivery.
Even when shows were taken down after users flagged them or reporters asked about them, dozens remained live for extended periods. Search terms like “Adderall,” “Xanax,” “Valium,” “Vyvanse,” and “Percocet” returned results linking to similar pharmacy websites. Many of the shows were nearly identical apart from the drug being promoted and featured no listener reviews, making it unclear how widely they had been accessed.
The fake podcast surge has reignited concerns about Spotify’s broader responsibility to moderate content. The platform previously faced backlash in 2022 over health misinformation on “The Joe Rogan Experience,” prompting the company to implement advisories on COVID-related content and expand its safety infrastructure.
Spotify has since taken steps to improve moderation, including the formation of a Safety Advisory Council and the acquisition of Kinzen, a firm specializing in audio content risk detection. But critics say the latest drug-selling scheme proves that more proactive safeguards are still needed.
“Anywhere people can post user-generated content, you will find people trying to sell drugs,” said Sarah Gardner, CEO of the Heat Initiative. “That part is, unfortunately, pretty consistent. The real test is in how companies respond.”
As AI-generated content grows more sophisticated, platforms like Spotify may find it increasingly difficult to distinguish legitimate creators from bad actors exploiting their open systems to promote dangerous and illegal services.