Urban One teamed up with Sounder earlier this year to conduct research on a premise that was black and white: brand safety tools, while deployed with good intentions by marketers, are instead limiting the ad dollars spent on shows by Black creators. The results of the study are now in, and they confirm that traditional keyword-based approaches to brand safety and suitability are often inaccurate and can lead to the under-monetization of diverse content.
“This research is a step in the right direction to help advertisers see the bias that exists in legacy brand safety solutions which disproportionately impact Black-owned and Black-targeted media,” said Josh Rahmani, Chief Revenue Officer for Urban One’s Audio Division.
In a white paper released today, Urban One and Sounder say their analysis found several instances of Black English being incorrectly transcribed, including one case in which a single wrong word raised a show’s risk assessment: the word “that” was transcribed as “dead,” causing the segment to be classified as medium risk under the “death, injury and military conflict” label.
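The misclassification described above is easy to reproduce with a toy model. The sketch below is purely illustrative, not the vendors’ actual system: the keyword list, risk levels, and function names are hypothetical, showing only how a keyword matcher with no awareness of context or transcription accuracy behaves when one word is transcribed wrong.

```python
# Illustrative sketch of a naive keyword-based brand safety classifier,
# the kind of legacy approach the study critiques. The keyword list and
# risk tiers here are hypothetical examples, not any vendor's real data.
RISK_KEYWORDS = {
    "death, injury and military conflict": {"dead", "killed", "war"},
}

def classify_segment(transcript: str) -> str:
    """Assign a risk level based purely on keyword matches in the
    transcript, ignoring context and transcription accuracy."""
    words = set(transcript.lower().split())
    for label, keywords in RISK_KEYWORDS.items():
        if words & keywords:
            return f"medium risk ({label})"
    return "low risk"

# A mis-transcription of "that" as "dead" flips the classification:
print(classify_segment("i loved that movie"))   # low risk
print(classify_segment("i loved dead movie"))   # medium risk
```

Because the classifier sees only isolated words, a single transcription error is enough to move an otherwise harmless segment into a blocked category and out of monetizable inventory.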
Current brand safety tools often present an all-or-nothing option to clients, which Black creators say misrepresents their content and short-changes them on ad buys in the process. The report says that when those standard transcription-based brand safety tools were used, an overwhelming 92% of Urban One’s podcast episodes were removed from available inventory, compared to a 63% average block rate for episodes overall. But when Sounder used its semantically and contextually focused artificial intelligence-based model to determine suitability, nearly 82% more Urban One episodes became available to monetize. It is a difference the companies say vastly increases the opportunities for podcasts to be considered suitable for a brand, while also challenging the long-held belief that Black-owned and Black-targeted audiences lack the scale advertisers need.
“Our research uncovered the critical need for Black English to be legitimized through transcription that supports diverse creator voices,” the report says. “As any other language is trained with its own unique dictionary, so too should Black English and other minority creator dialects. Doing so makes transcription and resulting brand safety and suitability data more accurate, reflective, and inclusive of diverse vernacular languages.”
Beyond word choices, the white paper also points to a deeper misunderstanding by podcasting’s gatekeepers and their technology of the cultural nuances that exist. It points out that 11% of episodes in Urban One’s network triggered the categorization for shows that debate sensitive social issues, while just one percent of all shows analyzed by Sounder were labeled for a potentially offensive tone. The report concludes, however, that many of the discussions were instead low and medium risk based on their informational intent. “Racism, and similar topics, are highly debated with passion and intertwined with actions resulting from such passionate conversations,” it says.
Urban One and Sounder say their research findings have implications for the entire podcasting industry. As podcasting continues to grow in popularity, they argue it is essential that the industry find ways to ensure that all voices are represented. One way is through the use of AI-driven tools designed without bias, which the companies believe will allow the podcasting industry to create a more inclusive and equitable space for all creators.
Sounder CEO Kal Amin sees AI and machine learning as a way to improve brand safety and suitability models while also opening up monetization opportunities for diverse creators in podcasting. "This research is a significant step forward in ensuring equal monetization opportunities for diverse creators in podcasting," he said.
Download the white paper HERE.