Is Google Profiting From COVID-19 Conspiracies?

Google, along with many other web and social media platforms, has made great efforts to stamp out wild – and even potentially dangerous – conspiracy theories surrounding COVID-19. This has included pulling down videos, including on Google-owned YouTube, but despite these efforts, just as the misinformation continues to spread, so too do the videos.

This has even been described as an “infodemic” – a worrisome side effect of the ongoing novel coronavirus pandemic.

Even as YouTube cracked down on COVID-19 conspiracy theorists, their videos were still getting millions of views, in many cases because such content can go viral long before YouTube or other platforms react.

What is just as disconcerting is that Google has continued to place advertisements on websites and YouTube channels that are publishing those theories. Bloomberg reported that this allows the sites’ owners to generate revenue and thus continue to operate. And in at least one case, Google even ran ads featuring a promoter of the very conspiracies it had banned.

Bloomberg cited a number of examples that seem downright concerning, such as an ad for telecommunications provider O2 showing up on an article linking the virus to 5G networks – a largely debunked yet still common theory. Another example of ad adjacency involved a Microsoft 365 backup service, whose ad ran on a site suggesting that Microsoft founder Bill Gates’s charitable efforts are part of a world-domination plot.

The ads were placed through Google’s automatic system for matching marketers with websites, but it doesn’t take a conspiracy theory to suggest that there is clearly a problem with the algorithms mismatching ads based simply on metadata and keywords.
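To show how that kind of mismatch can happen in principle, here is a minimal, purely hypothetical sketch of keyword-overlap ad matching. It is not Google’s actual system; the campaign names, keyword sets and overlap threshold are invented for illustration, and real programmatic ad exchanges are far more sophisticated.

```python
# Hypothetical illustration only -- a toy keyword-overlap matcher, not
# Google's actual ad-placement system. It shows how matching on surface
# keywords alone can pair a brand's ad with a page that uses the same
# terms in a conspiratorial context.

AD_KEYWORDS = {
    "o2_mobile_plan": {"5g", "network", "mobile", "coverage"},        # invented campaign
    "cloud_backup_service": {"microsoft", "365", "backup", "cloud"},  # invented campaign
}

def extract_keywords(page_text: str) -> set[str]:
    """Naive tokenizer: lowercase words stripped of punctuation, no context."""
    return {word.strip(".,!?\"'").lower() for word in page_text.split()}

def match_ads(page_text: str, min_overlap: int = 2) -> list[str]:
    """Return campaigns whose keyword sets overlap the page's tokens."""
    page_words = extract_keywords(page_text)
    return [
        campaign
        for campaign, keywords in AD_KEYWORDS.items()
        if len(keywords & page_words) >= min_overlap
    ]

article = (
    "Does the 5G network cause illness? This mobile conspiracy claims "
    "the rollout is linked to the outbreak."
)
print(match_ads(article))  # ['o2_mobile_plan'] -- placed despite the conspiratorial framing
```

A matcher keyed only to terms such as “5G” or “Microsoft 365” has no way of knowing whether the surrounding page is reporting on, debunking or promoting a conspiracy theory, which is exactly the gap the Bloomberg examples expose.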

“It’s difficult to say exactly what is leading to the disparity between Google’s stated policies and these problematic results,” said technology industry analyst Charles King of Pund-IT.

“However, I suspect a couple of issues are probably involved,” King explained. “First is the complexity of online advertising, which many people think of as a definable, manageable entity. Instead, the online marketplace is vastly complicated, involves thousands of markets and regions and hundreds of thousands of companies and interested parties, more than a few of which are actively working to circumvent or subvert Google’s and other advertising platforms’ policies.”

Google’s ad platform works with a plethora of sites – and as King noted, each has its own specific rules. Moreover, Google places ads for large brands on sites that do routinely publish conspiracy theories. But while such content could cross the line into “misinformation” or even “disinformation,” in most cases it is generally harmless, which makes it hard for a platform such as Google’s to distinguish the dangerous from the benign.

A site that suggests the Titanic purposely hit the iceberg, or that President Kennedy was assassinated by a sinister cabal, isn’t exactly peddling harmful “misinformation.” In some cases conspiracy theories are actually entertaining.

That’s why a lot of people view and read these theories: not to be informed, but simply to see a different side of the story, no matter how unbelievable it may be.

“In addition, it’s worth noting that while the hundreds of thousands and millions of dollars that the Global Disinformation Index says are being paid out to these sites are substantial in sum, they are generated by billions of page views across innumerable web sites,” said King.

“In other words, despite the best will and efforts of Google or any other online advertising platform, things are going to fall or knowingly escape through the cracks,” he added. “Which leads to what the company intends to do now that it’s aware of the problem. That’s a question well worth asking and deserving of an answer.”
