Misinformation is a growing issue for Americans, with just seven percent of people in a new survey saying they rarely encounter it as part of their media diet. Roughly seven in ten people agree that misinformation is “an issue” and that it is also getting “out of control.” But when it comes to where they are finding it, the study by Magna and Zefr shows most Americans believe audio is far from the problem.
Traditional AM/FM radio ranks dead last among the places where people surveyed say they most often encounter misinformation, and podcasts rank second to last. Across media types, respondents said they encounter misinformation most often on social media (an index of 94), followed by television (an index of 57).
The low numbers for radio and podcasts are especially good news, since rather than reporting misinformation to a publisher, the most common response when people encounter bogus content is simply to boycott the platform or publisher going forward. Among those surveyed, 37% said that is the step they take, compared to 23% who said they send a message alerting the platform. “Platforms cannot rely on user reporting to detect misinformation, as most people either do not message the platform or ignore misinformation altogether,” the report says.
The study also helps to justify why marketers are focused on misinformation as a brand safety issue. That is because Americans spread the blame equally when an ad appears adjacent to bogus news.
The survey results show a majority (53%) put the blame on the publisher or author, but not far behind are the 49% who blame the platform and the 44% who attach some blame to the adjacent advertiser. And nearly two-thirds (63%) say that misinformation has a negative impact on how they see the marketer’s brand.
To gauge the real-world impact of misinformation on brands, researchers asked people about real brands using hypothetical scenarios. They found that half of those surveyed said they were less likely to buy a brand they connected with misinformation. And 53% said they were less likely to have a favorable view of the brand; the same share said they would be less likely to recommend it to others. The data suggests consumers see brands as just as responsible as the perpetrators of misinformation.
“People want brands to be proactive in tackling misinformation,” the report says. The survey results reveal that 87% of people believe brands need to take responsibility when they are associated with misinformation. And 86% said brands need to make every effort to avoid appearing next to misinformation.
“In a fractured news environment, 93% of respondents told us they view misinformation as being ubiquitous, and while they’ve developed skills to identify it, our study revealed that consumers are not reporting it, which puts even more responsibility on brands to be aware of ad environments,” said Elijah Harris, EVP Global Digital Partnerships & Media Responsibility at Magna. “A resounding 87% of respondents told us they expect brands to take responsibility for misinformation, and combatting it is becoming a critical industry issue of our time.”
Magna conducted focus groups with individuals across the U.S. as well as surveyed 2,045 people online for the study. If there is an upside in a divided America, the results show misinformation is an issue everyone can agree on.
The study discovered that political ideology does not necessarily shape opinions on misinformation. Respondents identifying as left-leaning (72%), right-leaning (73%), middle-of-the-road (66%), and apolitical (66%) all agreed at similar rates that misinformation is “out of control.”
The findings are also good news for study partner Zefr, which works with advertisers on brand suitability targeting and measurement. “It’s clear the time is now for brands to demand third-party verification for misinformation and help protect their long-term brand values as well as their short-term business results,” said Zefr Senior VP Christopher Murphy.
Download the full “Voices on Misinformation” study HERE.