Senators Decry Adtech Failures as Ads Appear On CSAM Site

February 8, 2025


Senators Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT) on Friday launched a probe into the practices of four adtech leaders and two industry accreditation bodies, in response to a new report from research firm Adalytics indicating that global brands may be inadvertently funding child sexual abuse material (CSAM) online.

The probe was launched with the support of adtech watchdog Check My Ads.

Blackburn and Blumenthal on Friday published letters addressed to six major players in the ad ecosystem, including Google CEO Sundar Pichai and Amazon CEO Andy Jassy. Letters were also sent to the CEOs of top brand safety firms Integral Ad Science (IAS) and DoubleVerify, as well as the leaders of the certification bodies Media Rating Council and Trustworthy Accountability Group.

“We write to express our grave concern that Google’s advertising technology has supported the monetization of websites that have been known to host child sexual abuse material (CSAM),” the senators wrote in the letter addressed to Pichai.

The letter continued: “Where digital advertiser networks like Google place advertisements on websites that are known to host such activity, they have in effect created a funding stream that perpetuates criminal operations and irreparable harm to our children.”

The letters follow peer-reviewed research from Adalytics, which finds that since 2017, hundreds of ads for major brands—including Unilever, Sony, PepsiCo, and others—as well as governmental organizations including the U.S. Department of Homeland Security, were placed on image-hosting site ibb.co and its affiliate imgbb.com. The National Center for Missing & Exploited Children (NCMEC) had flagged these sites for hosting CSAM in 2021, 2022, and 2023.

ADWEEK reviewed nine live URLs with an ibb.co domain in which ads appeared alongside explicit sexual content. Brands included Adobe, Amazon, and Arizona State University.

In one case, 197 video ads co-branded by the NFL and FanDuel were served on a page promoting an “online multiplayer sex game.” 

Four buy-side sources who were found transacting on the sites told ADWEEK that the brand safety checks they had paid for failed.

The website, which garners over 40 million page views per month, according to Semrush, includes both explicit adult content and illegal CSAM. Researchers were unable to determine ownership of the sites.

Adtech firms respond, pointing to ‘strict policies’

A significant number of the ad placements in question were facilitated by leading adtech demand-side platforms (DSPs) including Amazon, Google, Microsoft, Criteo, Quantcast, Nexxen, TripleLift, and others—despite the fact that most of these organizations, including Google and Amazon, have strict ad inventory policies that forbid transactions on pages hosting CSAM. 

Although some adtech platforms, including Google, appear to have ceased transacting on these sites as of January, others continue to serve ads, per the report.

“We have zero tolerance when it comes to content promoting child sexual abuse and exploitation and both of the accounts in question are terminated,” a Google spokesperson told ADWEEK in a statement, adding that the company uses both human reviewers and AI-powered enforcement systems to help enforce Google policies globally.

An Amazon spokesperson said: “We regret that this occurred and have swiftly taken action to block these websites from showing our ads. We have strict policies in place against serving ads on content of this nature, and we are taking additional steps to help ensure this does not happen in the future.”

In their letters, lawmakers not only denounced what they view as substantial and dangerous failures in the adtech ecosystem, but also called on leaders of the organizations they addressed to provide responses to a slate of questions about their policies and practices. 

Executives at Google, Amazon, IAS, DoubleVerify, MRC, and TAG have been asked to respond to specific questions by Feb. 14.

Read the full text of the letters below. 

Letter to Google

Dear Mr. Pichai, 

We write to express our grave concern that Google’s advertising technology has supported the monetization of websites that have been known to host child sexual abuse material (CSAM). Recent research indicates that Google, as recently as March 2024, has facilitated the placement of advertising on imgbb.com, a website that has been known to host CSAM since at least 2021, according to transparency reports released by the National Center for Missing & Exploited Children (NCMEC). Just as concerning are reports that the United States government’s own advertising has appeared on this website. The dissemination of CSAM is a heinous crime that inflicts irreparable harm on its victims. Where digital advertiser networks like Google place advertisements on websites that are known to host such activity, they have in effect created a funding stream that perpetuates criminal operations and irreparable harm to our children.

Google’s actions here—or at best, inaction—are problematic for several reasons. First, the instances of ads being served on a website known to host illegal CSAM via Google’s advertising technologies violate Google’s own policies. As you are aware, the production, distribution, sale, and possession of materials depicting CSAM violates federal law. Google’s own publisher policies further prohibit the monetization of content that “is illegal, promotes illegal activity, or infringes on the legal rights of others” and explicitly prohibit “[c]hild sexual abuse and exploitation,” stating that it does not allow content that “[s]exually exploits or abuses children or content that promotes the sexual exploitation or abuse of children…[including] all child sexual abuse materials.” Google’s policies state that the company will take “appropriate action, which may include reporting to the National Center for Missing & Exploited Children and disabling accounts.” It remains unclear, however, whether Google has ceased its relationship with the website identified in this report, and it is deeply troubling that the largest advertising technology company continued to monetize the website for at least three years after NCMEC first identified the website as a purveyor of CSAM.

Additionally, Google has failed to perform due diligence in identifying businesses that conduct illegal activity using its products. The website in question does not publicly disclose its ownership. We have seen previous instances where Google’s apparent failure to perform due diligence on the customers monetizing their websites via Google’s advertising products has resulted in advertising revenue inadvertently funding OFAC-sanctioned websites.

Further, Google executives recently testified in the United States District Court for the Eastern District of Virginia about the company’s extensive investment in vetting publishers and advertisers who use their products. Yet, as Google appears to be funding a website that does not declare its ownership, and has been known to host CSAM, these statements are irreconcilable with the indisputable evidence we have seen. 

Just as troubling, reporting also indicates that advertisers—including the federal government— that use Google products cannot comprehensively track what businesses and content their ad dollars fund. Many advertisers reportedly cannot readily access page URL-level reporting that would allow them to identify which pages their ads have appeared on, including if they had appeared on imgbb.com. Imgbb.com is an anonymous photo sharing website that hosts user-generated content. Without access to the URLs on which their ads appeared, advertisers have no ability to understand whether their ads have appeared on content that violates Google’s policies, their own policies, or federal law. 

It is imperative that your company take immediate and comprehensive action to address this issue and ensure that you are not funding these heinous crimes against children. To better understand how this occurred and to determine appropriate corrective actions, please answer the following questions by February 14, 2025: 

  1. What steps does Google take to perform due diligence on the entities that monetize their websites or content using Google’s advertising technologies?
  2. Since becoming aware that advertising was placed via Google’s products on a website known to host CSAM, what actions have you taken to address or remedy this issue? Please include details on any refunds to advertisers, account suspensions, or broader policy changes implemented in response, including exact figures of how much you have refunded companies or the United States government for all ads served on imgbb.com and ibb.co and when the refunds were issued. 
  3. Why are advertisers unable to readily view the exact URLs of the pages where their advertisements appear through each of Google’s different advertising technologies? If such capability exists, please include documentation for how advertisers can do this across Google products, including DV360, Google Ads, and Google’s Performance Max.
  4. How much advertising revenue has been derived by Google annually in relation to advertising served on websites that are identified by NCMEC as having hosted CSAM?
  5. How much revenue has Google paid to companies that own or operate sites that host CSAM?
  6. How often do you review NCMEC’s transparency reports to ensure that you are not monetizing websites that host CSAM? 
  7. Was Google aware in this particular instance that imgbb.com was hosting CSAM? If so, what processes did Google implement to stop the placements of advertisements on that site? 
  8. How many websites that are known to host CSAM according to NCMEC is Google currently monetizing? Please include details on the process that Google uses to confirm this figure. 
  9. What additional steps will your company take to ensure that advertising dollars do not fund illegal content in the future? Please include a specific timeline for implementing these measures. 

Your cooperation and transparency are essential to safeguarding the safety of our children. Thank you for your attention to this urgent matter. 

Sincerely, 

Marsha Blackburn, United States Senator

Richard Blumenthal, United States Senator 

Letter to Amazon

Dear Mr. Jassy, 

We write to express our profound concern that Amazon’s technology has been used to monetize websites that have been known to host child sexual abuse material (CSAM). Recent research indicates that Amazon has facilitated the placement of advertising on imgbb.com, a website that has been known to host CSAM since at least 2021, according to transparency reports released by the National Center for Missing & Exploited Children (NCMEC). The dissemination of CSAM is a heinous crime that inflicts irreparable harm on its victims. When digital advertising technologies place advertisements on websites that are known to host such activity, they have in effect created a funding stream that perpetuates criminal operations and irreparable harm to our children. 

Amazon’s actions here—or at best, inaction—are problematic for several reasons. First, the instances of advertisements being served on a website known to host illegal CSAM via Amazon’s advertising technologies violate Amazon’s own policies. As you are aware, the production, distribution, sale, and possession of materials depicting CSAM violates federal law. Amazon’s own policies further prohibit ads from appearing on websites that host “illegal content” and “adult and explicit sexual content.” It remains unclear, however, whether Amazon has ceased its relationship with the website identified in this report, and it is deeply troubling that you have continued to monetize the website for at least three years after NCMEC first identified the website as a purveyor of CSAM.

Additionally, Amazon has failed to perform due diligence in identifying businesses that conduct illegal activity using its products. The website in question does not publicly disclose its ownership. Amazon states on its website that “Amazon applies brand safety measures to help deliver your off-Amazon ads to trustworthy placements next to appropriate and relevant content.” Yet Amazon has delivered ads on a website with no publicly disclosed owner that has been known to host CSAM.

Just as troubling, reporting also indicates that advertisers—including government advertisers— that use Amazon products cannot comprehensively track what businesses and content their ad dollars fund. Many advertisers reportedly cannot readily access page URL-level reports that would allow them to identify which pages their ads have appeared on, including if they had appeared on imgbb.com. Imgbb.com is an anonymous photo sharing website that hosts user-generated content. Without access to the URLs on which their ads appeared, advertisers have no ability to understand whether their ads have appeared on content that violates Amazon’s policies, their own policies, or federal law.

It is imperative that your company take immediate and comprehensive action to address this issue and ensure that you are not funding these heinous crimes against children. To better understand how this occurred and to determine appropriate corrective actions, please answer the following questions by February 14, 2025: 

  1. What steps does Amazon take to perform due diligence on the entities that monetize their websites or content using Amazon’s advertising technologies? 
  2. Since becoming aware that advertising was placed via your tools on a website known to host CSAM, what actions have you taken to address or remedy this issue? Please include details on any refunds to advertisers, account suspensions, or broader policy changes implemented in response, including exact figures of how much you have refunded companies or the United States government for all ads served on imgbb.com and ibb.co and when the refunds were issued.
  3. Why are advertisers unable to easily view the exact URLs of the pages where their advertisements appear through Amazon’s advertising technologies? If this capability exists, please provide documentation for how advertisers can do this. 
  4. How much advertising revenue has been derived by Amazon annually in relation to advertising served on websites that are identified by NCMEC as having hosted CSAM?
  5. How much revenue has Amazon paid to companies that own or operate sites that host CSAM?
  6. How often do you review NCMEC’s transparency reports to ensure that you are not monetizing websites that host CSAM? 
  7. How many websites that are known to host CSAM is Amazon currently monetizing? Please include details on the process that Amazon uses to confirm this figure.
  8. What additional steps will your company take to ensure that advertising dollars do not fund illegal content in the future? Please include a timeline for implementing such measures. 

Your cooperation and transparency are essential to safeguarding the safety of our children. Thank you for your attention to this urgent matter.

Sincerely, 

Marsha Blackburn, United States Senator

Richard Blumenthal, United States Senator 

Letter to DoubleVerify

Dear Mr. Zagorski, 

We write to express serious concerns that DoubleVerify’s advertising verification and brand safety products have led advertisers to inadvertently fund websites that are known to host child sexual abuse material (CSAM). Recent research indicates that DoubleVerify has known that advertising has appeared on imgbb.com, a website that has been known to host CSAM since at least 2021.

The dissemination of CSAM is a heinous crime that inflicts irreparable harm on its victims. When digital advertisers place content on websites that are known to host such activity, they have in effect created a funding stream that perpetuates criminal operations and irreparable harm to our children.  

We are particularly concerned that advertisers relying on DoubleVerify’s brand safety and verification technologies have had their ads served on a website known to host CSAM. Many advertisers rely on DoubleVerify’s services to place their ads and operate under the assumption that their ads will not appear adjacent to or fund harmful content and illicit websites. 

DoubleVerify states that its “Universal Content Intelligence” capabilities “provide a holistic approach to content analysis and evaluation,” asserting that “this sophisticated tool leverages AI and relies on DoubleVerify robust and proprietary content policy to provide advertisers with accurate content evaluation, broad coverage and brand suitability protection at scale.” Yet, DoubleVerify advertiser customers paying for its sophisticated and “industry-leading” technology have had their ads served on a website that hosts content involving heinous crimes against children. 

Advertisers who use DoubleVerify’s brand safety and verification products are still unable to verify where their advertising appears and what their dollars are funding. As a vendor whose code appears directly in ads that serve on a given page, DoubleVerify should have visibility into the full-page URL where an ad is rendered. However, we understand that DoubleVerify generally withholds long-term, granular page-level data from its clients. 

It is imperative that your company take immediate and comprehensive action to address this issue and ensure that you are not funding these heinous crimes against children. To better understand how this occurred and to determine appropriate corrective actions, please answer the following questions by February 14, 2025: 

  1. How did advertisements monitored and “verified” by your platform appear on websites hosting CSAM? Please provide a thorough explanation of the mechanisms in place for identifying and blocking unlawful content and why they did not work in this instance. 
  2. Since becoming aware that advertising measured by your Company appeared on a website known to host CSAM, what specific actions has your company taken to remedy this issue? Include details on updates to your verification processes, blocking of offending sites, and outreach to impacted advertisers.
  3. Do you annually review the National Center for Missing & Exploited Children’s (NCMEC) transparency reports to ensure that you are appropriately classifying or blocking websites that host CSAM?
    1. If so, why did you continue to allow client ads to serve on imgbb.com?
    2. If not, why not? 
  4. How many URLs or pages has DoubleVerify reported to NCMEC since 2021?
    1. Did DoubleVerify report the website in question?
    2. If so, on what date(s) and to which authorities? 
  5. How much revenue has DoubleVerify derived from measuring, monitoring, or otherwise deploying your technologies on advertising served on websites known by NCMEC to host CSAM?
    1. What is your policy on revenue derived from monitoring advertising on illicit websites? 

Your cooperation and transparency are essential to safeguarding the safety of our children. Thank you for your attention to this urgent matter.

Sincerely, 

Marsha Blackburn, United States Senator

Richard Blumenthal, United States Senator 

Letter to IAS

Dear Ms. Utzschneider, 

We write to express serious concerns that Integral Ad Science’s advertising verification and brand safety products have led to advertisers inadvertently funding websites known to host child sexual abuse material (CSAM). Recent reports indicate that Integral Ad Science is aware that advertising has appeared on imgbb.com, a website that has been known to host CSAM since at least 2021.

Reports indicating that digital advertising technology companies are placing advertisements on websites known to host CSAM are deeply concerning. The dissemination of CSAM is a heinous crime that inflicts irreparable harm on children, and creating a funding stream that perpetuates criminal activity only worsens such harm. 

We are concerned that many advertising companies rely on your services to place their ads without knowing exactly where those ads are shown. It is important that you provide advertisers with transparency. You have previously told your investors that “marketers trust [Integral Ad Science] to protect, measure, inform, and optimize their brand campaigns” and that “Data science is at the heart of our business strategy. Our AI systems enable models that deliver classifications and analytics at greater speed that are scalable with extremely high precision.” Yet, these reports show that Integral Ad Science advertiser customers paying for this “precise” technology have had ads served on a website that was identified by the National Center for Missing & Exploited Children (NCMEC) on a publicly available list of websites known to host CSAM.

Advertisers who use Integral Ad Science’s brand safety and verification technologies are unable to verify exactly where their advertising appears and what their dollars fund. This is information you have but do not share with your advertiser customers. 

While Integral Ad Science’s failure to prevent advertisers from inadvertently subsidizing a website known to engage in illegal activity is unacceptable, to withhold this data from advertiser customers that would give them more autonomy to prevent their ads from funding illicit activity is inexcusable. 

It is imperative that your company take immediate and comprehensive action to address this issue and ensure that you are not funding these heinous crimes against children. To better understand how this occurred and to determine appropriate corrective actions, please answer the following questions by February 14, 2025: 

  1. How did advertisements monitored and “verified” by your platform appear on websites hosting CSAM?
    1. What mechanisms are in place for identifying and blocking unlawful content, and why did they not work in this instance? 
  2. Since becoming aware that advertising measured by your company appeared on a website known to host CSAM, what specific actions has your company taken to remedy this issue? Include details on updates to your verification processes, blocking of offending sites, and outreach to impacted advertisers. 
  3. Do you annually review NCMEC’s transparency reports to ensure that you are appropriately classifying or blocking websites that host CSAM?
    1. If so, why did you continue to allow client ads to serve on imgbb.com?
    2. If not, why not?
  4. How many URLs or pages has Integral Ad Science reported to NCMEC since 2021?
    1. Did Integral Ad Science report the website in question? If so, to which authorities and on what exact date(s)? 
  5. How much revenue has Integral Ad Science derived from measuring, monitoring, or otherwise deploying your technologies on advertising served on websites known by NCMEC to host CSAM?
    1. What is your policy on revenue derived from monitoring advertising on illicit websites? 
  6. How does your company ensure comprehensive monitoring and vetting of websites in the ad supply chain? Explain the methodologies and tools used to prevent illegal content from being monetized. 
  7. Why were your systems unable to identify and block overtly unlawful websites hosting CSAM? Identify any systemic gaps and steps you are taking to address these vulnerabilities. 
  8. What additional transparency can you provide to advertisers regarding the specific URLs where their ads appear? Outline plans, if any, to improve URL-level visibility for your clients to enhance their ability to evaluate brand safety. 

Your cooperation and transparency are essential to safeguarding the safety of our children. Thank you for your attention to this urgent matter.

Sincerely, 

Marsha Blackburn, United States Senator

Richard Blumenthal, United States Senator 

Letter to the Media Rating Council 

Dear Mr. Ivie, 

We write to express profound concern about Media Rating Council (MRC)-accredited entities that have participated in the delivery of advertising on a website that has been known to host child sexual abuse material (CSAM) since at least 2021, resulting in the inadvertent funding of criminal activity. The dissemination of CSAM is a heinous crime that inflicts irreparable harm on its victims. When digital advertisers place content on websites that are known to host such activity, they have in effect created a funding stream that perpetuates criminal operations and irreparable harm to our children. We urge your organization to strengthen and adequately enforce its standards such that accredited vendors are no longer allowed to support the funding of CSAM and other illegal websites.

The MRC’s actions here—or at best, inaction—have raised several concerns. First, advertisers have unwittingly advertised on a website known to host CSAM, despite relying on technology vendors accredited by the MRC. DoubleVerify and Integral Ad Science are MRC-accredited for various metrics, including for “Ad Verification Processes.” The MRC’s “Supplement to IAB Guidelines for the Conduct of Ad Verification” states that “[p]rocedures related to determining the legality of sources and content should include initial qualification using [i]ndustry and local sources of known illegal entities, as well as ongoing evaluation linked with ad verification results and periodic internal auditing of content sources.” The guidelines further state that “[a]d verification organizations seeking accreditation will be required to provide evidence of source vetting processes where applicable during accreditation audit processes.” 

The MRC supplement also requires that “to the extent that ad verification organizations have identified illegal or illegitimate sources that are either not included in [i]ndustry or local sources of known illegal entities processes should be put in place to routinely communicate these sources to legal authorities, oversight bodies and the industry at large.” Despite these guidelines, reports indicate advertising was served on imgbb.com—a website known to host CSAM since at least 2021—where products from MRC-accredited vendors DoubleVerify and Integral Ad Science were used by advertisers.

Additionally, the MRC has failed to adequately enforce its standards and investigate noncompliance, resulting in years of continued funding of CSAM and other criminal activity. This is not the first time that MRC-accredited vendors were found to have been involved in the delivery of advertising on illicit websites. Nor is this the first instance of MRC-accredited vendors that have violated the MRC’s own standards. Nevertheless, these companies have maintained their accreditation status and remain in good standing. 

MRC-accredited vendors have previously pointed to their accreditation status to evade scrutiny for their failures, including where they have failed to prevent advertising from funding illegal websites. The MRC’s own public statements have spoken to its close ties to the United States government, insinuating a certain level of rigor and esteem associated with accreditation. Yet others have raised concern about the MRC’s failure to investigate or remedy non-compliance by accredited vendors, or to make clear the scope of the MRC’s accreditation.

It is imperative that your company take immediate and comprehensive action to address this issue and ensure that you are not funding these heinous crimes against children. To better understand how this occurred and to determine appropriate corrective actions, please answer the following questions by February 14, 2025: 

  1. What is the MRC’s plan to review the accreditation status of the measurement companies Integral Ad Science and DoubleVerify that have measured and/or verified advertising that has appeared on and funded a website known to host CSAM? What is MRC’s standard for reviewing or revoking an entity’s accreditation(s) where such a company fails to identify or prevent ads from appearing on known CSAM or otherwise unlawful websites? Please detail any immediate corrective actions, including reviews of certified companies and potential revocation of certifications where necessary. 
  2. Have MRC-accredited vendors Integral Ad Science or DoubleVerify reported URLs containing CSAM to the National Center for Missing & Exploited Children (NCMEC) in accordance with MRC’s requirements? If so, how many URLs have been reported since 2021? From 2021 to 2025, how many instances did IAS and DoubleVerify specifically report imgbb.com or ibb.co to NCMEC? 
  3. How has MRC responded to vendors who make false or misleading assertions or mischaracterizations about their accreditation status?
    1. What specific audits, monitoring, or oversight mechanisms does MRC employ to ensure certified companies comply with these standards? How many vendors’ accreditation status has MRC revoked for non-compliance with standards?
    2. Has MRC conducted a review of DoubleVerify’s or IAS’ accreditation status with respect to potential non-compliance? Please provide information regarding any such review that has occurred since 2021. Please provide details of the brand safety accreditation process and any audit, investigation, or review of accreditation status conducted of IAS and DoubleVerify since 2021. 
  4. Why does advertising continue to be served on CSAM-hosting and other illicit websites despite the use of MRC-accredited vendors? Does this represent a shortcoming of the technology vendors or their advertiser customers?
  5. Do the MRC’s guidelines speak to whether accredited vendors are permitted to maintain their revenue share or fees when they measure ads served on CSAM or other illegal websites? 
  6. What additional measures is MRC considering to strengthen its certification and enforcement process and prevent similar failures in the future? 

Please provide a timeline for implementing these measures. Your cooperation and transparency are essential to safeguarding the safety of our children. Thank you for your attention to this urgent matter.

Sincerely, 

Marsha Blackburn, United States Senator

Richard Blumenthal, United States Senator

Letter to the Trustworthy Accountability Group

Dear Mr. Zaneis, 

We write to express profound concern that entities certified by the Trustworthy Accountability Group (TAG) have participated in the delivery of advertising, including by United States government advertisers, on a website that has been known to host child sexual abuse material (CSAM) since at least 2021, according to transparency reports released by the National Center for Missing & Exploited Children (NCMEC). The dissemination of CSAM is a heinous crime that inflicts irreparable harm on its victims. When digital advertisers place content on websites that are known to host such activity, they have in effect created a funding stream that perpetuates criminal operations and irreparable harm to our children. We urge your organization to strengthen and adequately enforce its standards so certified vendors are no longer allowed to support the funding of CSAM and other illegal websites.

TAG’s actions here—or at best, inaction—have raised several concerns. First, advertisers have unwittingly advertised on a website known to host CSAM, despite working with advertising technology vendors accredited by TAG for “brand safety.” TAG’s “Brand Safety Certified” Guidelines identify a “floor” for content that is prohibited from receiving advertising. This “floor” includes “Crime & Harmful Acts to Individuals and Society and Human Right Violations” and “Adult & Explicit Sexual Content.” Nevertheless, reporting indicates that at least nine TAG-certified vendors participated in the placement of advertising on imgbb.com—a website that has been known to host CSAM since 2021.

Additionally, TAG has failed to enforce its standards and investigate non-compliance, resulting in the continued funding of CSAM and other criminal activity for years. While this is not the first instance where TAG-certified vendors have reportedly participated in delivery or measurement of advertising on websites engaged in illegal activity, these vendors have remained in good standing. 

Further, while TAG publicly states that it has “worked closely” with the United States government on several of its efforts to combat digital threats, TAG-certified companies failed to prevent the government’s own advertising from appearing on a website known to host CSAM. 

You have personally referred to TAG as being “like the Good Housekeeping seal of approval for digital advertising.” But, to the contrary, recent witness testimony in the United States District Court for the Eastern District of Virginia identified TAG as “the minimum bar” and pointed to its “lax standards.” Whether TAG holds itself out as the ceiling or the floor, the failure of TAG-certified entities to prevent advertising from appearing on and funding a website known to host illegal CSAM is unacceptable. 

It is imperative that your organization take immediate and comprehensive action to address this issue and ensure that it is not funding these heinous crimes against children. To better understand how this occurred and to determine appropriate corrective actions, please answer the following questions by February 14, 2025:

  1. What is TAG’s plan to review the certification status of vendors that have participated in delivery or measurement of advertising on CSAM-hosting websites?
  2. What is TAG’s standard for reviewing or revoking an entity’s certification for “brand safety” where such a company fails to identify or prevent ads from appearing on known CSAM or otherwise unlawful websites? Please outline any immediate corrective actions, including reviews of certified companies and potential revocation of certifications.
  3. Have TAG-certified vendors reported URLs containing CSAM to NCMEC? How many URLs have been reported since 2021?
  4. How many vendors’ certification statuses has TAG suspended or revoked for non-compliance with its standards? Please outline any such enforcement actions or remediations resulting from TAG’s inquiries or certification standing reviews. 
  5. Has TAG received complaints related to any of the entities identified as having participated in the delivery or measurement of advertising on imgbb.com since 2021?
  6. What specific audits, monitoring, or oversight mechanisms does TAG employ to ensure certified companies comply with these standards?
  7. Why does advertising continue to be served on CSAM-hosting and other illicit websites despite the use of TAG-certified vendors? 
  8. TAG’s Brand Safety guidelines specifically identify certain “content” that is prohibited from monetization, including illegal pirated content. Yet, the guidelines make no mention of standards relating to the monetization of websites engaged in illegal activity, as is the case with imgbb.com.
    1. Please explain TAG’s position as it relates to websites that distribute CSAM or engage in other illegal activity.
    2. Is TAG’s position that it is acceptable for Brand Safety certified members to participate in the delivery of advertising, and thus advertising revenue, to websites engaged in illegal activity, as long as the advertising does not appear next to “content” that is prohibited? 
    3. Do TAG standards address whether certified vendors are permitted to maintain their revenue share or fees when they measure advertisements served on CSAM or other illegal websites? 
  9. What additional measures is TAG considering to strengthen its certification and enforcement process and prevent similar failures in the future? 

Please provide a timeline for implementing such measures. Your cooperation and transparency are essential to safeguarding the safety of our children. Thank you for your attention to this urgent matter. 

Sincerely, 

Marsha Blackburn, United States Senator

Richard Blumenthal, United States Senator
