CanTEST uses social media to share warnings about illicit drugs, like when methamphetamine was detected in what was expected to be diet pills. (Supplied: Directions Health Services)
In short:
Australian drug checking services say their Facebook and Instagram posts warning the community about dangerous substances are being removed by Meta.
Organisations such as Canberra’s CanTEST use social media to distribute public health warnings when dangerous drug combinations are circulating.
What’s next?
The organisations are calling on Meta to change its approach to educational posts about illicit drugs and for the eSafety Commissioner to intervene.
Public health workers are warning lives could be at risk due to social media sites automatically censoring educational posts about illicit drugs circulating in the community.
The Australian Injecting and Illicit Drug Users League (AIVL) said social media companies had taken down a string of “critical alerts”, preventing them from reaching the people who needed to see them.
Organisations including Pill Testing Australia, CanTEST and the New Zealand-based service KnowYourStuffNZ have had posts removed and accounts suspended.
In some cases, AIVL said entire pages and personal accounts had been permanently deleted.
AIVL said Meta’s automated content moderation systems were flagging the alerts on Facebook and Instagram as promoting drug use.
Previously unidentified illicit drugs have been discovered via CanTEST. (ABC News: Matt Roberts)
“These services rely on social media to tell people where they are, what’s in circulation and how to stay safe,”
AIVL chief executive John Gobeil said.
“If those messages are blocked, people don’t know the service exists, and they lose the chance to make safer decisions.”
‘Time for them to grow up’
David Caldicott, the clinical lead at CanTEST and Pill Testing Australia who is also an emergency doctor, said posts were sometimes being “withheld, altered, or censored because they contain information about drugs”.
“This is health information, but unfortunately, the transition from a human-mediated system to an AI-mediated system means that it just gets pinged and pulled,” Dr Caldicott said.
David Caldicott says some posts are being “withheld, altered, or censored”. (ABC News: Donal Sheil)
Because Meta is an American-owned company, Dr Caldicott said he believed part of the problem was the United States' different approach to education around drug use.
Organisations such as CanTEST aim not only to warn drug users about dangerous substances that could cause serious illness or death, but also to encourage them to have their drugs tested at the service before using them.
The goal is to educate, rather than punish, drug users and prevent drug-related deaths.
“There’s an imperative to prevent any conversation about illicit drugs, even if it’s health related,” Dr Caldicott said of Meta.
While AI had worsened the problem, Dr Caldicott said social media moderation had been an obstacle from the start.
Meta is being accused of censoring potentially life-saving information about illicit drugs. (Reuters: Dado Ruvic)
“What has happened really is that we have advanced in how we deal medically with drug-related problems and social media,” he said.
“Social media clearly has the technical capacity to keep up, but the moral oversight has not kept up.
“So it’s really time for them to grow up and catch up with what is now available to young people.”
He described the actions of social media companies as “unacceptable”, especially when coupled with the fact that younger people typically got their news and information from social media rather than legacy media.
Services such as CanTEST offer drug checking and provide information about illicit drugs circulating in the community. (ABC News: Matt Roberts)
“We’re providing health-related information to a group of young people who obviously require it, and people without any health qualifications are interfering with that message,” Dr Caldicott said.
“If people can’t get the information about dangerous drugs available on the market, they might consume those drugs uninformed, as they were prior to our introduction of pill testing.
“And if they do so, death is always a possibility.”
Calls for eSafety Commissioner to intervene
Dr Caldicott is calling on social media companies to make a commitment to “engage with healthcare providers and allow proper information to be distributed in a timely fashion”.
The organisations affected are also calling on the eSafety Commissioner to intervene and “compel” Meta to “restore all accounts and content of member organisations that have been removed or suspended on the basis of drug-related community standards violations”.
In the meantime, Pill Testing Australia has developed an app called Night Coach, which doesn’t rely on social media to spread awareness and information.
Protonitazene, a potent synthetic opioid significantly stronger than fentanyl, was found in a counterfeit ‘oxycodone’ pill by CanTEST. (Supplied: CanTEST)
“We need to think more about how we can engage with these social media companies because essentially they’re pulling all the strings as far as the health information is concerned in this field,”
Dr Caldicott said.
Stephanie Stephens from Directions Health Services, which runs CanTEST, said they were continuing to look for ways to resolve the social media issues.
“Our attempt is to raise it with Meta, have that resolved — obviously that’s not been effective to date,”
she said.
“We attempt to repost, reinstate things, we use other platforms like our website, but … the reach of social media in particular is so helpful, so we’ll be exploring avenues to have it better resolved.”
Stephanie Stephens says urgent information is being censored. (ABC News: Greg Nelson ACS)
Ms Stephens said tens of thousands of people followed CanTEST on social media.
She said that in December, a post about a potentially lethal opioid was taken down just before the Spilt Milk festival was held in Canberra.
She urged people looking for alerts and information to go to the CanTEST website.
In a statement, an eSafety spokesperson said that, as the independent regulator for online safety in Australia, eSafety’s role was “to implement and enforce the Online Safety Act”.
“This includes obligations under the Online Safety Codes and Standards which require service providers to take certain steps to reduce the risk of unlawful and highly harmful material on their services,” the spokesperson said.
“They do not require services to remove lawful material such as public health information relating to drugs. The removal of material of this nature is a matter for the services involved and should be raised directly with those services.
“eSafety encourages organisations to take a multi-channel approach when sharing important public health information. Social media platforms set and enforce their own terms of service, which means lawful content may still be subject to moderation or removal.
“Relying on a range of communication channels can help ensure critical information remains accessible to the communities who need it.”
In response to questions from the ABC, Meta declined to comment.