Hundred Heroines charity’s Facebook page reinstated after being wrongly flagged for drug content
November 18, 2025
When the UK charity Hundred Heroines had its Facebook group taken down it was accompanied by a message from the social media company that simply said the page “goes against our community standards on drugs”.
Now, after more than a month of appealing, the photography charity is celebrating the reinstatement of its group after the tech company’s AI tools mistook it for an organisation promoting the class-A opioid heroin.
The Gloucestershire-based organisation, which celebrates female photographers, has had its Facebook group taken down twice in 2025 for apparent breaches of community guidelines related to the promotion of drugs.
The latest takedown came in September. After a second appeal in 12 months, the Hundred Heroines: Women in Photography Today page was restored with no explanation or apology last week.
The charity’s founder and former president of the Royal Photographic Society, Dr Del Barrett, said the decision had had a “devastating” effect on an organisation that relies on Facebook for attracting visitors.
“AI technology picks up the word heroin without an ‘e’, so we get banned for breaching community guidelines,” she said. “Then no matter what you do, you can’t get hold of anyone and it really affects us because we rely on Facebook to get our local audience.”
Founded in 2020, the charity has a physical space in Nailsworth, near Stroud, with about 8,000 items in its collection that focuses on the work of female photographers and spans the history of the art form.
In 2024, Meta increased its vigilance of groups related to drugs in light of the opioid crisis in the US, where 80,000 people died from overdoses last year.
Meta says buying and selling drugs is strictly prohibited on its platforms, and claims to have “robust measures” in place to detect and remove such content.
In a statement on its website, Meta says: “We recognise the significance of the drug crisis and are committed to using our platforms to keep people safe … and strict enforcement of our community standards.”
But when its software incorrectly identifies groups that breach its standards, the outcome can be Kafkaesque, according to users, with feedback forms often the only way to flag errors.
Meta says AI tools are “central to [its] content review process”, adding that “AI can detect and remove content that goes against our community standards before anyone reports it”.
Sometimes the technology flags content for its “human review teams”, although Barrett said when Hundred Heroines complained they had no human interaction.
“We thought, ‘should we change our name?’ But why should we? Why have we got to mess with our brand just because of Facebook?” said Barrett, who estimates that about 75% of Hundred Heroines’ visitors come via Facebook.
She added: “It sort of verges on scary and laughable. You think these bots are running the world and they can’t tell the difference between a woman and an opioid. Heaven help us.”
Earlier this year Meta was heavily criticised over the mass banning or suspension of accounts on Facebook and Instagram.
Users blamed its AI moderation tools for the erroneous bans, but while the tech company acknowledged a “technical error” affecting Facebook Groups, it denied an increase in incorrect enforcement of its rules across its platforms.
Meta said it was fixing the issue that emerged in the summer after groups, including one that shared memes about bugs, were allegedly told they did not follow standards on “dangerous organisations or individuals”.
Meta has been approached for comment.