The game is rigged: A former marketer shows you how Big Tech’s advertising practices harm us all
by Lisa Macpherson
July 10, 2020
As of this writing, it appears the U.S. Justice Department and a group of state attorneys general likely will file antitrust lawsuits against Alphabet Inc.’s Google for an array of anti-competitive practices in its search and ad technology businesses. At the same time, the Federal Trade Commission, Justice Department and a group of 47 state attorneys general have opened inquiries into Facebook, looking at antitrust behavior as well as possible privacy law violations.
It’s about time.
For 23 years, I held a front-row seat to the advent and evolution of digital marketing, witnessing its inevitable rise and twisted consequences. I specialized in helping companies adapt to the digital revolution but abandoned the field when I realized that the capabilities we marketers supported, like harvesting data, profiling consumers and personalizing content, were damaging society and our democracy. This is why I no longer help brands, including several who made their mark bringing people together, tear our communities apart.
I also left because I realized the game is rigged. The field is tilted, in favor of the two dominant players — Google and Facebook — who set the rules. This is the story of how we got here and what we can do about it.
Advertising abandons context for eyeballs: In the 1990s, advertisers negotiated directly with online publishers to place ads on websites in context that was relevant to their products, just as they had with traditional publishers. This was generally a real negotiation on an even playing field. But as we gained more data about our customers’ behavior, we started placing ads wherever we could reach their eyeballs — regardless of the content or location.
As the number of websites exploded, including thousands of small sites reaching niche audiences, we had to automate the process. By the early 2010s, a “negotiation” had transformed into a blind bidding war facilitated by layers of software programs — and every software provider levied a fee. These layers of advertising technology often meant advertisers did not even know where their ads were running or who was actually seeing them.
Ad costs rise while ad value and consumer benefits plummet: In 2017, a scathing book about digital advertising called “Bad Men” explained how “the amazing world of ‘ad tech’ magically turns a dollar of online advertising into three cents of value.” Fraud provided one explanation for this value loss. Ads could end up on any of thousands upon thousands of small, unknown sites. We were told we were gaining “reach” and finding new audiences. Then we discovered that many of these sites were gaining traffic from bots — software programs designed to repeatedly load webpages — and not actually people. To combat this, we added “verification services” to our ad-tech stack — for another fee.
We also learned that these unknown sites could be toxic. We coined a new phrase, “brand safety,” to refer to managing the risk of advertising on sites that spread conspiracy theories, hate speech and propaganda. Now we needed “white lists” and “black lists” of content and sites where our advertising could or couldn’t appear. And thanks to the opacity and complexity of the supply chain of ad inventory, these lists didn’t always work.
They still don’t. Recently, we’ve seen reports of how Google’s ad tech continues to place advertisements of major, unwitting consumer brands on websites with content pushing coronavirus (and other) conspiracy theories. Advertising also lost value as consumers embraced ad blockers to prevent websites from loading advertisements. Ironically, Google launched its own ad blocker for its Chrome browser, essentially enabling the tech giant to set advertising standards, charge advertisers to place ads in content that may violate those standards and then block those same ads from view.
These issues in the supply chain became so egregious that some of the most prominent marketing executives in the world began calling them out. In 2017, Marc Pritchard, chief brand officer of Procter & Gamble, the world’s largest advertiser with a $7.5 billion annual advertising budget, described the digital display advertising supply chain as “murky at best, fraudulent at worst.” A year later, Keith Weed, the chief marketing officer of Unilever, at the time the world’s second largest advertiser, referred to the digital media supply chain as the “digital swamp.”
Advertising technology had become so complex and fragmented that Luma Partners, a consulting firm, invented the LumaScape chart to help marketers figure out all the companies that sold trading services, social media monitoring, verification services and more. This iconic chart was also used by Luma’s clients to determine which service markets had the greatest opportunities for profitable acquisition or consolidation.
And that happened: Google went on a buying spree, and today its share of the key services in a marketer’s ad tech stack ranges from 40% to 90%. The company also holds a significant portion of the supply of all display ad space (20% is held by its YouTube unit alone). Despite the consolidation — or maybe because of it — things haven’t changed much since “Bad Men.” A recent report from the U.K. found that publishers receive only 51% of the money spent by advertisers to reach readers — leaving ad tech fees to swallow up most of the rest (and about 15% seems to just … disappear). Additionally, each of the 15 advertisers in the study appeared on an average of 40,524 websites, most of them classified as “non-premium” ad inventory.
In other words, now Google is doing what it used to take thousands of ad tech companies to do: take premium ad dollars and somehow magically drain them of most of their value.
Facebook forces marketers to play its game while invading your privacy: That brings me to Facebook, another advertising-fueled business model and one unrivaled in its ability to target consumers at scale. Combined with its holdings Instagram and WhatsApp, Facebook has a 75% share of the social network market, and mostly thanks to Facebook’s 2 billion users, a 50% share of the total display ad supply. For any marketer that needs scale in social media, Facebook is the only game that matters. And since it has 100% control of access and pricing, marketers have to play by Facebook’s rules.
Some of the ways Facebook plays are deeply troubling. For example, due to its reliance on advertising for its business model, the company has perfected user experience design intended to increase “engagement” — that is, keep people scrolling because Facebook’s “inventory” consists of users’ time and attention. People are more likely to “engage” — like, comment on and share — information that creates a visceral emotional response. So highly partisan content, hate speech and conspiracy theories get upranked by Facebook’s algorithms, spreading far more quickly than dry but authoritative information. Ultimately, the ability to microtarget content — including ads — to more precise slices of the user base means the most inflammatory ideas — some designed to foment divisions in society — are virtually invisible to those who might counter them with the usual antidote to false or inflammatory speech: more speech.
Over time, Facebook even took on the creation of our ads: We would provide their ad sales team with hundreds of individual images, headlines and lines of copy, and the algorithm would optimize combinations that made consumers share, click or buy. Since advertisers pay more when a user takes some sort of action, Facebook has strong incentives to predict and then control users’ behavior. Frustratingly, we often had no idea what our “ads” even looked like or who saw them until a campaign was over.
In fact, advertisers have little visibility into or understanding of any of Facebook’s algorithms. We could communicate with our own customers (a “custom audience”), find customers like them (a “look-alike audience”) or describe what kind of customers we wanted to show our ads to. Then the algorithm took over, building on what we had described. Sometimes, the algorithm predicted what kind of customer you were from your activities on (or off) Facebook, even things about you that you might have thought were private.
We came to realize that the optimization combined with the predictive capability created the potential — and the reality — of bias, discrimination and exploitation. (Recently, a list of more than 500 advertisers, including Unilever, Coca-Cola, Levi’s, Starbucks and Ford, announced they would withdraw their advertising from Facebook — some included other social media platforms — for the month of July or longer, responding to a boycott campaign organized by civil rights groups. I see this as a welcome sign that advertisers are finally beginning to apply their leverage. And a series of civil rights audits — the final one published this past week — showed that the company still has a long way to go to address discrimination.)
Facebook’s voracious data collection practices — not only on its own platform but across many platforms and devices — and how it analyzed, packaged and sold access to that data earned them billions of dollars in government penalties for violating users’ privacy rights. Eventually, thanks to Cambridge Analytica and other scandals, consumers came to understand how their privacy was being violated, and how they were losing autonomy and freedom. It took a while, because Facebook’s terms of service — which theoretically spelled all this out — were so hard to find, to read and to understand that it wasn’t really clear. They certainly weren’t clear enough to make you leave Facebook for another big social network that might have treated your privacy with more respect. And soon, there really weren’t any other big social networks.
To compete, advertisers must play by Google and Facebook’s rules: Today, Google and Facebook combined get just over 60% of every U.S. digital marketing dollar. As you might imagine, that kind of power and dominance in a market makes it difficult for advertisers of any size to negotiate.
To be fair, the ad community was complicit in some of this. We chased our customers in their inexorable migration to the internet for information, entertainment and socializing. We loved and needed the cost efficiency of highly targeted digital media and craved more reach. We wanted to influence our customers at every step of their “purchase journey” (a framework we invented to feel good about tracking consumers across devices and platforms).
So it’s been painful to realize that the very same capabilities we helped create — extracting information about users and their online behaviors; segmenting and profiling users for precision targeting; using algorithms to customize content to individual users; and designing user interfaces to increase time and engagement — are being used to distribute and amplify hate speech and propaganda, negate privacy and autonomy, manipulate and exploit behavior, and contribute to social isolation and addiction. No wonder consumers are losing trust in brands.
Regardless of how antitrust law deals with Google and Facebook, there’s one thing that’s crystal clear. Both companies have accrued so much power that they’ve had a lasting negative impact on advertisers, their relationships with their customers and consumers themselves.
We need a strong referee to change the advertising game and protect consumers: How do we solve the problems perpetuated by the dominance of Facebook and Google in the advertising market? It will probably take a system of solutions. The answer to many market problems is more competition and more choice, and I’m hopeful that antitrust enforcement will bring that about. A dedicated U.S. regulatory agency with specialized expertise and the agility it requires could also set and enforce new rules as a referee. Consumers also seem to be waking up to the harms of the digital platforms and may change some of their social behaviors. And given that 98% or more of these companies’ revenues come from advertising, I hope more of my advertising colleagues wake up to their power, too.
Lisa Macpherson is a senior policy fellow at the Washington, D.C.-based nonprofit Public Knowledge. She is a former consumer marketing executive.