GumGum’s Study of Ad-Supported Streaming Content Reveals 20 Percent of Ads Streamed During Children’s Programming Are Inappropriate for Younger Audiences

Human Review Study of CTV Exposes Major Gap in CTV Supply Chain

GumGum, a contextual-first, global digital advertising platform, has unveiled the results of a new study that found significant brand safety violations in advertising on kids’ CTV content. The research found that 20 percent of all ad breaks targeted to children contained at least one inappropriate ad, including ads for alcohol, casinos, gambling, adult hygiene, pharmaceuticals, and foods with high sugar/fat content.

GumGum conducted a human review of over 100 children’s shows that aired on a representative sample of leading video streaming apps, including both free and paid streaming services. The study was conducted across multiple states over the span of four months to test what audiences actually see. The types of ads flagged as inappropriate for children were compiled according to the Federal Trade Commission’s (FTC) rules, regulations, and recommendations.

“We are living in a video-first world where basing insight on just a simple keyword or generic metadata description isn’t going to work, not only for avoiding specific content, like children’s shows, but for targeting and placing relevant ads as well,” said GumGum CEO Phil Schraeder. “There is a huge gap in the CTV ecosystem that most advertisers and publishers aren’t aware of, and there is something we can do about it.”

Most advertisers today still rely on basic ad verification and brand safety technologies that only analyze generic metadata descriptions for videos. When content information is shared, it is self-declared and inconsistent across supply sources, leaving advertisers to make decisions based on limited metadata, such as genre or channel name. As a result, ads intended for adults routinely appear alongside children’s content, violating strict regulations in the United States.


Advances in artificial intelligence mean advertisers no longer need to rely on patchy video description data and can instead analyze video content at scale, at a much deeper, more forensic level. GumGum’s accredited contextual intelligence platform, Verity, for example, can evaluate videos at the content level or on a frame-by-frame basis, without relying on the presence of metadata and video descriptions, giving a more precise reading of what the video content is about. GumGum is developing a machine learning model for Verity specifically trained to identify made-for-kids content, which will result in a specialized classifier that predicts whether a web page or video is made for kids.
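The kind of frame-level classification described above can be sketched, in highly simplified form, as follows. This toy example is purely illustrative: the signal lists, scoring rule, and threshold are all assumptions for demonstration, not a description of GumGum’s Verity model, which would use a trained machine learning classifier rather than hand-written rules.

```python
# Toy sketch of made-for-kids classification from frame content rather than
# metadata. Signal sets and threshold are hypothetical, chosen for illustration.

KID_SIGNALS = {"cartoon", "animation", "toys", "puppets", "nursery_rhyme"}
ADULT_SIGNALS = {"alcohol", "casino", "gambling", "pharmaceuticals"}

def frame_score(frame_labels):
    """Score one frame's detected concepts: +1 per kid signal, -1 per adult signal."""
    kid = len(KID_SIGNALS & set(frame_labels))
    adult = len(ADULT_SIGNALS & set(frame_labels))
    return kid - adult

def is_made_for_kids(frames, threshold=0.5):
    """Classify a video as made-for-kids when the fraction of kid-leaning
    frames exceeds the threshold."""
    if not frames:
        return False
    kid_frames = sum(1 for labels in frames if frame_score(labels) > 0)
    return kid_frames / len(frames) > threshold

# Example: a video where 3 of 4 frames carry kid-leaning signals
video = [["cartoon", "toys"], ["animation"], ["cartoon"], ["crowd"]]
print(is_made_for_kids(video))  # True
```

A production system would replace the hand-picked signal sets with learned visual and audio features, but the aggregation idea is the same: per-frame evidence is combined into a video-level decision instead of trusting a channel name or genre tag.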

“There is a major gap in the CTV supply chain and it’s something we can’t ignore,” said Schraeder. “We are working with partners like IRIS.TV, who have developed the IRIS_ID, a content identifier that publishers can use to securely share their content’s video-level data. We are also encouraging advertisers to evaluate the tech they are using to support their growing video strategy. Having technology that can understand all the elements of a video is critical now, and it is also a key component to navigating video and future interactive environments like in-game advertising and the metaverse.”

GumGum recently announced that, following a rigorous testing process, it is the first ad tech provider to earn Media Rating Council (MRC) accreditation for content-level analysis for brand safety, suitability, and contextual analysis of CTV inventory. This accreditation further demonstrates GumGum’s commitment to transparency and to ensuring advertisers can trust the technology they are using to place digital ads across all digital environments.

