Top Advertisers Pull Out of YouTube Over Child Videos, Comments
AT&T is reportedly the latest company to pull its ads from YouTube following reports that pedophiles have latched onto videos of young children, often girls, marking time stamps that show child nudity and objectifying the children in YouTube’s comments section.
An AT&T spokesperson told CNBC: “Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube.” The company originally pulled its entire ad spend from YouTube in 2017 after revelations that its ads were appearing alongside offensive content, including terrorist content, but resumed advertising in January.
On Wednesday, Nestle and “Fortnite” maker Epic Games pulled some advertising. Disney reportedly also paused its ads.
There is no evidence that AT&T ads ran before any of the videos flagged in recent reports. Advertisers such as Grammarly and Peloton, which did see their ads placed alongside the videos, told CNBC they were in conversations with YouTube to resolve the issue.
As of press time, YouTube declined to comment on any specific advertisers, but said in a statement on Wednesday: “Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments.”
On Thursday, AdWeek obtained a memo YouTube sent to advertisers outlining immediate changes the company says it is making to protect its younger audience. CNBC confirmed the memo’s authenticity with one of the brands that received it.
YouTube has revealed that it is suspending comments on millions of videos that “could be subject to predatory comments.” It is also making it harder for “innocent content to attract bad actors” through changes to its discovery algorithms, ensuring ads do not appear on videos that could attract this sort of behavior, and removing accounts “that belonged to bad actors.” YouTube is also alerting authorities as needed.