Podcast: The Lies of Brand Safety, Season 1, Episode 5

What the Adtech: Brand Safety

HOST: Somer Simpson, Responsible Advertising Advocate
GUEST: Cedar Milazzo, CEO, Nobl
GUEST: Claire Atkin, Co-Founder, Check My Ads Institute / The Adtech Watchdog

Episode Description

In this episode, host Somer Simpson is joined by Cedar Milazzo, CEO of Nobl, and Claire Atkin, co-founder of Check My Ads Institute, to uncover the missteps of big tech, which often start with thinking brand-first instead of customer-first. Responsible media starts with considering customer perspectives and problems and developing solutions specifically for them.


Transcript:

Somer Simpson 

This is What the Adtech: Let’s Talk Responsible Advertising. Over the past few years, consumers have started holding marketers’ feet to the fire, forcing them to be more conscious about ethics in advertising, and intentional about the content they use, the teams behind the campaigns, and overall investments in media. I’m Somer Simpson, and I’ll be having thought-provoking, honest, and raw discussions with some of today’s top marketing minds about the future of ethics in advertising, and what it means for both marketers and consumers today. 

Okay, welcome to What the Adtech. Today we’re gonna be loosely talking about brand safety. It’s an umbrella term that I would say has changed pretty dramatically over the past 10 years or so, given advances in technology and the growth of social, and the changes in our political climate. Now we’re not so much talking these days about your travel ads appearing on airline crash articles, and we’re more talking about: is this good journalism; is it fake news; making sure that your ads don’t appear next to hate speech. I’m joined today by two fantastic people. So Claire Atkin is co-founder at Check My Ads Institute, and Cedar Milazzo, who’s the CEO of Nobl—welcome to you both. So Claire, let’s start with you, introduce yourself and tell us a little bit about Check My Ads.

Claire Atkin 

Hi, Somer. Thanks so much for having me. My name is Claire, and I am indeed the co-founder of Check My Ads Institute. We are working to solve the disinformation crisis, and the way we do that is by bringing attention to the relationship between adtech companies and the publishers that are funding disinformation and hate speech throughout the world. We are taking on the $700 billion advertising industry; we believe that it has to be a lot more responsible than it is right now. And it’s working.

Somer Simpson 

Excellent. Cedar?

Cedar Milazzo 

Great, thanks. Thank you for having me on, Somer. I appreciate it. And great to meet you as well, Claire, looking forward to the rest of the conversation. So I’m Cedar Milazzo, I’m the CEO and founder of Nobl Media. I’ve been in the adtech space for about 10 or 12 years now and have a background in technology. We built Nobl Media specifically to address the incentive structure around content online. Specifically, we’re looking to incentivize high-quality content and disincentivize misinformation and other types of toxic content—whether it’s hate speech, misogyny, whatever it is. We’re really focusing on just making sure that advertisers, who are supporting the free internet and basically making it all available for all of us, are actually spending their money responsibly in a way that helps society but also helps themselves to actually connect with their audiences as well.

Somer Simpson 

Excellent. So I’m actually, I’m excited to have you both on because all three of us, but you two in particular, are trying to address the same problem, and you’re approaching it in different ways. And I like both. I think both are necessary. I want to dig deeply into those things. But maybe first, I’d like to get each of your perspectives on: what is the problem that we’re solving? And how did we get here? So, Claire, we’ll start with you.

Claire Atkin 

What a tiny question. Okay, so the problem right now, from my perspective, is that marketers and advertisers have given up the power over their own campaigns to the adtech vendors who have said, “Don’t worry, we’ve got this.” And the adtech system, which is a $700 billion system, is incredibly powerful; it’s incredibly opaque; it’s getting away with a lot. And it’s because advertisers and even consumers are having a hard time figuring out what the heck is going on inside. The adtech industry has, again and again, said, “Don’t worry: we’re going to fix these problems. Here are brand safety solutions. We’ve got a keyword list; we’ve got context control,” and every time they’ve done this, they’ve ended up defunding small journalism operations and giving white nationalists a hall pass. And so what we’re seeing is we are in a disinformation crisis. The whole world is witnessing the rise of authoritarianism, the rise of fascist myths, the rise of fear mongering, and the adtech industry has not only not solved the problem, but they’ve made it harder to understand and harder to solve.

Cedar Milazzo 

I agree with you in a lot of ways, Claire. I think you’ve hit the nail on the head there. I think, to me, it’s actually a little bit more than brand safety. Somer, you mentioned the word ‘responsibility’ earlier. It’s not just small journalism that’s been defunded. In many cases, you see some of the really large outlets being cut off by brand safety tools as well. We’ve actually talked to several big advertisers who said they just don’t advertise on the news anymore, because they don’t know what’s safe and what’s not; they don’t know what’s real and what’s not. So it’s actually causing an even larger issue, where it’s not just avoiding terms like COVID or global climate change or something like that, it’s also avoiding the entire ecosystem. If we want to have a functioning democracy, we really need that information. People can’t make any sort of decisions without strong reporting and strong journalism. And without the ad dollars out there to support it, there are real problems with being able to fund all that hard work and great work that is done by the journalism outlets.

Somer Simpson 

So it’s interesting: let’s talk a little bit about the tech that’s used today. Like you said, Claire, marketers trust it, right? The vendors say, “Oh, yeah, this solves your problems; it’ll work.” And they kind of trust it blindly, that it’s gonna do the thing that they’ve been sold that it does. Talk a little bit about some of the options that marketers use today. What are some of the inherent problems with those tools? Cedar, do you want to start?

Cedar Milazzo 

Sure, yeah. There’s everything from just whitelists and blacklists, which are just lists of publishers to avoid or to include in your campaigns, all the way through to the AI-powered algorithms that a lot of these brand safety companies are now talking about, which can supposedly detect whether something is brand safe or not. Marketers look at this, I think, in two ways. First of all, it’s giving up that control, yes, but realistically, marketers don’t have the ability to do anything more than that at this point, especially when they’re looking at programmatic. Either they avoid the entire ecosystem, or they contract with one of these brand safety companies that basically has what I call a CYA kind of solution. It’s a way to cover yourself, but it doesn’t necessarily help, right? It’s a way to say, “Look, I’m doing all I can. I’m buying the solution; therefore, I must be doing the right thing.” Obviously there are marketers out there who think that they’re doing everything they can, and doing the best they can. But the fact of the matter is, these technologies are broken, and it’s obvious as soon as you look at where some of these ads are shown. It’s not even in doubt, because Claire and her organization, as well as Sleeping Giants and other organizations like that, are continually finding examples where big marketers are still running ads next to content that is very obviously not safe and not responsible. There was a report out just the other day about Google and how their ad platform is still supporting Russian propaganda, even after a month and a half of people complaining about it and then saying that they’re fixing it. So it’s not just about Google and the big companies, obviously; other adtech companies that are focused on brand safety, specifically, are contributing to the problem just as much by not having solutions that really work.

Somer Simpson 

Claire, what are your thoughts on tech that’s available today that marketers tend to use most often?

Claire Atkin 

Yeah, what Cedar was describing is what I like to call ‘brand safety washing.’ It’s like buying something that is pink because you think some of the money will go toward solving breast cancer. Or buying something that looks like it might be more environmental, because you think it might be helping the planet, but actually is just a continuation of the same old capitalist march. What brand safety washing is doing is telling advertisers that they don’t actually have to do the work to understand where their placements go. And as a marketer, that is wild. Marketers put all of their effort towards understanding their consumer, towards making sure that their messaging is on point. We do focus groups; we talk to influencers; we build partnerships. We are detailed in the way that we write copy and the way that we run our aesthetic; we know what it means to be representative of a brand. And then, at the last minute, to throw that into the hands of an adtech company and to say, “Place these ads wherever you want,” is insane. That’s how the industry has been operating: we as marketers have given up control over the most important thing, which is where our brand is appearing and who our brand is associating with. And that’s why we’re finding that our brands are associating with white nationalism.

Cedar Milazzo 

Just a quick comment on that, if you don’t mind. I think that you’re absolutely right about that. But I don’t think it’s the marketers’ fault.

Claire Atkin 

I agree.

Cedar Milazzo 

I think that at the beginning, marketers probably very much wanted to know where their ads are being placed. But in our experience, when we work with large marketing companies especially, where they’re spending tens of millions of dollars, they’re not even able to get a list of places where their ads were run from their providers. Whether that’s a big adtech company like Google or something like that, or even some of the smaller platforms, they don’t even have the ability to get the list of where their ads were run. And without that, how do you improve it? How do you make it better without having that information? You know, all the things you’re talking about, where you’re identifying exactly the target and the person and everybody you wanted to reach—without that information, you couldn’t do that. So they’re happy to provide information on the audience; they’re happy to track everything that everybody does, both online and offline, and use that to build a profile on that consumer. But they’re absolutely not going to give you information that might potentially embarrass them, or embarrass the marketers themselves, when they see where those ads are being placed.

Claire Atkin 

So true. The disinformation thrives in those high-level reports: here are the impression numbers; here are your CPMs. And if you don’t search through the site list of, you know, could be 200,000, could be 400,000 sites that you are on, then you’re not going to realize that your ads are associating with things that you would never accept in your own office or in your own community. It is absolutely the adtech industry that is responsible for the disinformation crisis that we see today. And I agree with you, Cedar: it’s not the marketers. But then, on the other side, we have consumers, people who are purchasing products, and the revenue from those products is going to these ads. And I think what is missing within this conversation is that, of course, it is a business conversation. But we’re also talking about the rights of consumers to know where their money is going, and to be safe in a society. So what we do at Check My Ads is run email campaigns where we say to everyone: you can write an email to an adtech executive today. Sign up for our ‘defund the insurrectionists’ campaign or whatever campaign we’re running, and you can actually ask adtech executives why—when they say to their clients they would never work with publishers who promote violence—they’re working with Steve Bannon, or Glenn Beck, or Charlie Kirk. And I think that’s actually a very powerful message. It is absolutely the adtech system that is responsible for this mess. But we as a whole group of people, we as anyone—you don’t even have to be technical—can actually work to fix the problem.

Somer Simpson 

Do you think that this is a matter of perspective? Something that I found interesting—I’ve not spent my entire career in adtech; I’ve come out of media—and just really, in the past four years, I’ve really dug deep into building product for an adtech company. I found that the majority of adtech is very reactive, right? It’s like we’re all playing this massive game of Whack-a-Mole. And when you’re reactive, it’s a lot harder to be sure and confident that your perspective and what you want to achieve is represented in the tech. If you flip it on its head and you’re more proactive, and you say, “I, as a marketer, I want to spend my money; I want to advertise my campaign or advertise my product and reach these consumers, and I want to do it in this way, in these specific places.” That’s a much more powerful message in terms of spending your marketing dollars. And it also offers a lot more control. Because you’re saying positively, “this is where I want to appear.” And you don’t have to say “but don’t put me in all of these places” and have to figure out how to define that and chase that mole.

Cedar Milazzo 

Yeah, you’re absolutely right, Somer. That’s actually the philosophy we take at Nobl Media: we look specifically to find really high-quality content, and there’s a lot of it out there, right? I mean, the internet is a huge place. There’s plenty of stuff out there that is high quality that you can advertise on and not have to worry about your reach or other issues, right? And so what we specifically do is we look for that high-quality content and say, “Okay, here’s the stuff that’s really good; why don’t you advertise on that?” And we’re working with the adtech platforms to build that in from the start and filter out all the junk by basically focusing on the high quality, and then you’re not playing Whack-a-Mole anymore. It’s more about: this is something great; let’s go and make sure to support that with our ad dollars. And, you know, you mentioned perspective: you’re absolutely right about perspective. I mean, if you’re a marketer, you don’t feel like you have control because the adtech companies don’t necessarily provide the visibility you need. If you’re an adtech provider, quite often the response we get is: “we have no way of doing this. It’s just difficult when we’re looking at hundreds of millions of pages a day or billions of pages a day: how do we actually verify what’s right and what’s not? Or what’s okay, and what’s not?” And it’s not a matter of, you know, just Steve Bannon or the right-wing conservatives, or anybody like that; it’s also on the left, it’s everywhere, right? You get misinformation, and hate speech, and all things bad across the spectrum. And so this affects everybody: it’s not just about one side being damaged; it’s about everybody in our society being damaged by it.

Claire Atkin 

I agree. And what we look at is: are they truthful? Are they consistently publishing narratives that scapegoat and divide us? Or are they reporting factual information? And it’s important to say that there is a group of people in this world, a very small group of very powerful people who call themselves the alt-right, and who are working loudly and proudly to spread fascist ideas. And those are the Steve Bannons. We don’t play in the gray area at Check My Ads; we’re not focused on misinformation—we are focused on the intentional spreading of lies and deceit. When a publication is leading their readers into an alternate reality, that’s when it becomes so brand unsafe that it is appalling that adtech companies even consider working with them.

Somer Simpson 

What I find interesting… and I think both approaches are very needed, right? Because this trend that we’re seeing is, as I call it, kind of like the new IVT. People aren’t just faking bid requests and faking domains in the bidstream to try to, like, steal money, pretending to be CNN. Now they’re spinning up entirely new websites, faster than you can keep track of them. So I think you need both approaches, because the behavior is insidious. You need that high-level perspective of someone who really, really digs deep and says, “Hey, did you know that this publication is owned by this company, which is owned by this company, which is owned by these people who have these viewpoints,” to give people that information so they can make those decisions. Then you also need to have that ability, to Cedar’s point, to dig into billions and billions of pages and bid requests and impressions and all the things that we’re constantly looking at in programmatic. And you could be in a completely legitimate article, and then down in the comment section, there’s somebody going off, trolling people and saying really, really hateful things. That’s a problem, too. And it is in every sort of nook and cranny and crevice. And how do you deal with that?

Cedar Milazzo 

Well, that’s exactly what we built our technology around at Nobl: being able to analyze things at scale. We look at and evaluate tens of millions of web pages a day, and we’re looking for exactly those sorts of things, those hateful comments, or those kind of agenda-driven statements that just take away from the legitimacy of the actual content that’s there. Whether it’s a news story, or an opinion story, whatever it is, there are certain linguistic clues that we look for. And we look in-depth at about 35 different criteria on every single web page. And we look for these things that are really subtle, but are really telling aspects of whether something’s really high quality or whether it’s really kind of junk, and misinformation, toxic-type stuff. Only when you do that at scale can you at least get a good feel for things—it’s not perfect; no technology is gonna be perfect, of course—but you can at least cut down significantly on the amount. And you mentioned IVT, which is, I think, very interesting, because IVT is actually very closely linked with this type of content, when you have some of the Steve Bannon type stuff. For example, one of the tricks they do to generate more revenue is they’ll actually go buy traffic. So it’s not even humans looking at the pages themselves, right? And when you actually are looking at high-quality content, high-quality journalism, for example, those outlets typically don’t have the need, or the desire, to pay for bots to come and load their ads. So you actually end up with a lot less fraud when you have actual content that humans want to read and that people actually trust and find credible.
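
To make the idea of page-level scoring concrete, here is a toy sketch of evaluating a page against a handful of weighted criteria and aggregating the result. The criteria, weights, and scoring functions below are illustrative assumptions only, not Nobl’s actual signals:

```python
# Toy illustration of multi-criteria page scoring. The criteria, weights,
# and example texts are made up for illustration; a real system evaluates
# dozens of signals per page, at scale.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Criterion:
    name: str
    weight: float                    # positive = quality signal, negative = toxicity signal
    score: Callable[[str], float]    # returns 0.0-1.0 for a page's text


def shouting_ratio(text: str) -> float:
    """Fraction of words written in ALL CAPS (a crude 'agenda-driven' clue)."""
    words = [w for w in text.split() if len(w) > 2]
    return sum(w.isupper() for w in words) / len(words) if words else 0.0


def attribution_density(text: str) -> float:
    """Very rough proxy for sourcing: how often the page attributes claims."""
    cues = ("according to", "said", "reported", "study", "data")
    hits = sum(text.lower().count(c) for c in cues)
    return min(hits / 10.0, 1.0)


CRITERIA = [
    Criterion("sourcing", weight=+1.0, score=attribution_density),
    Criterion("shouting", weight=-1.0, score=shouting_ratio),
    # ...a real system would look at many more such signals per page
]


def page_quality(text: str) -> float:
    """Weighted sum of criterion scores; higher means higher quality."""
    return sum(c.weight * c.score(text) for c in CRITERIA)


if __name__ == "__main__":
    article = "According to researchers, the study reported a steady decline in ad fraud."
    rant = "THEY are LYING to you. WAKE UP. SHARE this NOW before it is DELETED."
    print(page_quality(article), page_quality(rant))  # the article scores higher
```

The point is only that the judgment is made per page, from signals in the content itself, rather than from a domain-level allow or block list.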

Somer Simpson 

So not only are they spreading lies and hate, they’re stealing marketers’ money.

Cedar Milazzo 

Exactly, both at the same time. They’re in it for money, just as much as idealistic reasons, right? I mean, a lot of these people are out there making millions and millions of dollars. And so whatever they can do to spread the word to more people, great. And if they have to buy more traffic via bots, then they’ll do that, too.

Claire Atkin 

Yeah, the adtech industry is a propagandist’s dream. When you get ads, you get legitimacy. Blue Chip brands give legitimacy to the news outlets or the publications that they consort with. When you get money, you can sustain and grow your operations; you can buy more bots; you can buy more writers, and you don’t even have to purchase an editor or a fact checker or a legal team—you don’t care about any of those things. It’s actually very affordable to expand your operations with not even that much money. When you get access to personal identifiable information of Americans, you can better target them in advance of elections. So the adtech industry is actually perfect for a propaganda outlet. Whether or not it’s domestic, it could also be Russia or China or Iran, anyone who is interested in undermining the fabric of American society. If they are on the adtech system, they have the tools to do that.

Cedar Milazzo 

The other thing that’s interesting about that, Claire, is the Ukraine invasion is a perfect example of that, right, when the Russian propaganda machine kind of just went crazy right before and during the invasion itself. And what we found is: there’s a lot of people out there who are just trying to make money, and who don’t have an editor and don’t even have a writer. And they’re just pulling information that they think people might view from other sites and basically copying it or parroting what’s happening out there. So in addition to all the official Russian propaganda channels, we saw thousands of websites out there that were republishing the exact same talking points, and making money off of it. So it becomes a money-making machine, not just because they believe in the propaganda, but also because people are reading it. And they can actually get legitimacy from taking something that was published somewhere else and republishing it.

Somer Simpson 

People are reading it and sharing it blindly, which is how it just continues to spread. Thank you, Facebook newsfeed.

Cedar Milazzo 

Exactly. And you know, bringing up Facebook is really, really important, Somer, because a lot of the spread of this stuff is due to algorithms, right? I mean, the social media algorithms are not focused on spreading truthfulness or credibility; they’re focused on getting as many people to look at something as possible. And they’ve found over the years that the more they can target specific people with specific types of information, the more those people will engage with it and reshare it and comment on it, and so on. And that creates these bubbles, right? I think everybody is pretty aware now of the social media bubbles, where more and more you kind of just end up in an echo chamber, where everything you say is repeated back, and what you already believed gets reinforced with more content. And that’s how the social media algorithms really make their money. 

Somer Simpson 

So we’ve talked about how tech can be, AI can be used for good, and the stuff that Nobl is doing; we’ve talked about how AI can feed AI and create this bubble of lies and everything else. There’s some things that tech can’t solve, right? So, Claire, I’d love to understand a little bit more about how you and your team do your work. There’s a lot of human muscle involved in what you do. Can you kind of share that?

Claire Atkin 

Yeah, our unofficial slogan is ‘you can’t automate your way to brand safety,’ which comes from the brand safety tech companies who claimed that they could. What we do is we look at what is happening in the real world. And that is where we start. Who were the loudest people about an insurrection? It was Steve Bannon, Charlie Kirk, Tim Pool, Glenn Beck, Dan Bongino, and Fox News. Those are the six insurrectionists who made the most money off of propagating the big lie ahead of the insurrection—money that advertisers the world over were providing—and those advertisers are at risk of funding a second insurrection if these publishers do not get kicked off the supply chain. Well, in the last few months, we have defunded Steve Bannon, Charlie Kirk, Glenn Beck, Dan Bongino, and Tim Pool. And we think that what that means is that the adtech industry overall has agreed with us that funding a violent insurrection is not brand safe. So how did we get there? What we did is we worked back from that, and we went to the websites themselves, and we saw in their text files that they were working with specific adtech vendors. And then we went to the adtech vendors’ text files, and we saw that they, too, agreed that they were working with these publishers. So what we did is we set up a website—it’s checkmyads.org/j6—and we had a message there, a video message for the adtech vendors. And we allowed, or welcomed rather, anyone who wanted to write emails to the adtech vendors with us, one after another, to ask them why their supply policy said one thing and they were doing another; why they said they would never put brands next to things that incited violence, and yet were doing exactly that. And that’s how we’ve taken millions out of the disinformation economy and taken millions away from the disinformation that propagated a violent insurrection.
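
The ‘text files’ described here appear to be the industry’s public ads.txt and sellers.json disclosures, which anyone can fetch and cross-reference. A minimal sketch of that check, using placeholder domains, might look like this:

```python
# Sketch of cross-referencing a publisher's ads.txt against each vendor's
# sellers.json, the public disclosures Claire appears to be describing.
# The publisher domain below is a placeholder, not a real target.
import json
import urllib.request


def fetch(url: str) -> str:
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


def ads_txt_entries(publisher_domain: str):
    """Yield (vendor_domain, seller_id, relationship) from the publisher's ads.txt."""
    for line in fetch(f"https://{publisher_domain}/ads.txt").splitlines():
        line = line.split("#", 1)[0].strip()          # drop comments
        parts = [p.strip() for p in line.split(",")]
        if len(parts) >= 3:
            yield parts[0].lower(), parts[1], parts[2].upper()


def vendor_lists_seller(vendor_domain: str, seller_id: str) -> bool:
    """Check whether the vendor's sellers.json acknowledges that seller ID."""
    data = json.loads(fetch(f"https://{vendor_domain}/sellers.json"))
    return any(str(s.get("seller_id")) == seller_id for s in data.get("sellers", []))


if __name__ == "__main__":
    publisher = "example-publisher.com"               # placeholder domain
    for vendor, seller_id, relationship in ads_txt_entries(publisher):
        if vendor_lists_seller(vendor, seller_id):
            print(f"{vendor} confirms it sells inventory for {publisher} ({relationship})")
```

Because both files are published openly, this kind of outside verification does not require any special access to the adtech platforms themselves.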

Somer Simpson 

Very cool.

Claire Atkin 

Thank you.

Somer Simpson 

Let’s talk a little bit more about it. Because there are a lot of tools that marketers use, and there’s a lot of—I don’t know if you’ve both seen the lumascape for digital advertising. There’s a lot of logos in there and a lot of boxes.

Cedar Milazzo 

You have to actually understand it, right, because there’s so many.

Somer Simpson 

Right. And honestly, that’s like a lot of change being taken out of that dollar before it gets to the publisher anyway. Let’s focus in, specifically, on the ad verification and brand safety vendors. So the tools that marketers have in their belt today: they can use allow lists, they can use block lists, they can use brand safety vendors, like IAS, Moat, DoubleVerify, to address IVT or, you know, block their ads from appearing on categories of content. 

Claire Atkin

It’s very effective at blocking the news. 

Somer Simpson 

Yes, it is. It’s very effective at blocking that.

Claire Atkin 

If you want to dismantle democracy, use brand safety technology that defunds the news, by all means.

Somer Simpson 

Exactly! Exactly.

Cedar Milazzo 

100% agree with that, Claire.

Somer Simpson 

Yeah, that’s my whole pitch: I work with Quantcast, but then I’m trying to defend democracy. It takes me a little while to get there. But the one where I’m seeing an increase in usage is contextual. And that is the one that actually worries me the most, because you have human beings putting together lists of words that go into quote, unquote, “algorithms”—I really hate using that term for this technology. But it just kind of blindly says, “I found this word on this page. Therefore, I’m not going to show your ad.” And if you look at some examples of people’s contextual lists, you start to see bias. I won’t go into exact examples, but human beings, not knowing better, introduce bias into the things that they’re blocking.

Claire Atkin 

We need examples.

Cedar Milazzo 

You’re absolutely right. I mean, there’s no way around that, right? It’s just human nature. 

Claire Atkin 

I’ll give you examples. We’ve seen LGBTQ publications be blocked. We’ve seen publications that are owned by black people be blocked. They’re working with Steve Bannon, but they’re blocking or just choosing not to work with publications that are owned by people of color. So one of the biggest criticisms that we get at Check My Ads from mostly technologist people, mostly with sort of a libertarian stance, they say, “Oh, you’re affecting free speech. You’re anti-free speech.” And we say, “No, every time this is about the advertiser’s choice in where they place their ads, and that choice has been taken away from them.” And when you affect the advertiser’s free speech, that’s far worse than what we’re doing, which is calling attention to the fact that their free speech is not being respected.

Cedar Milazzo 

Yeah, I think this whole free speech thing is just a red herring, honestly. None of us—this group here, or our compatriots out there, because there are a number of other groups doing similar types of things—none of us are against free speech. Quite the opposite, right? I mean, what we’re doing is calling attention to the speech that’s out there, and I don’t think anybody would ever say, “You can say whatever you want without consequences.” That’s not free speech. That’s being able to yell “fire” in a crowded theater, which is specifically not protected speech, that’s for sure. And I think when you look at these approaches that are letting massive amounts of toxic content get through, but also incorrectly marking news and other high-quality information that we need as a democracy, it’s attacking our society from both sides, right? It’s letting the junk and the really bad stuff get through and blocking the really good stuff. Now, obviously, it’s not 100%. I’m sure they’re blocking some pretty horrible stuff, and I’m sure they’re letting some good news get through. But the fact that they are having this effect, in general, I think should be concerning to everybody, right? I mean, when we look at how you actually do this in a way that’s not just yelling about it, but actually encouraging people to change their behavior—whether that’s a marketer, whether that’s an adtech company, whether that’s even a publisher—it’s much more effective to go off and say, “Look, here’s the problem: you guys are doing this or that. And instead, you should be doing this other thing, which will actually be helpful to society, and helpful to the publishers, and helpful to the advertisers, and helpful to the consumers and the audiences.” That, I think, is where we as a group need to focus more. I mean, ourselves, Check My Ads, Sleeping Giants, all the other companies and organizations out there really need to focus on these solutions of saying, “Okay, we really can make a difference; we really can fix this issue. And we can do it at scale.” We have to be able to make sure that advertisers are still able to reach the audiences they need to make their money and go on about their happy capitalistic lives. I mean, that’s the business we’re in; we are an adtech company; Somer, I know you work at an adtech company, too. We’ve said a lot of bad things about adtech companies. But the reality is, adtech is not going away; we need to change the incentive structure, as opposed to getting rid of adtech. And to me, it’s about making sure that high-quality content is more valuable, and is viewed as being more valuable, to the marketers and to the adtech companies themselves, than the crap that’s out there. And that’s how we can really fundamentally change where that money is flowing.

Somer Simpson 

And we also need to address the thing that Claire brought up earlier about the opaqueness. We need full bloody transparency from both sides of the—or actually all three sides of this ecosystem, because consumers have a seat at the table too.

Cedar Milazzo 

Well, and meaningful transparency, because when you have a big advertiser, like, let’s say, Procter & Gamble, and they run $100 million in ads, giving them a list of every website that they ran their ad on is pretty much useless, right? I mean, there’s no way anybody’s going to be able to go through a list of millions and millions of pages on a daily basis. So you have to use some technology—I know, there are problems with technology—but you have to use some sort of technology to be able to figure out what it is that’s actually helpful and what’s harmful to our society.

Claire Atkin 

You know, the industry at large is, for lack of a better word, ratifying what they call the brand safety floor. And across the board, the whole industry has said: don’t put our ads anywhere that uses obscenity and profanity, that engages in online piracy, that promotes crime or drug use or arms and ammunition, and don’t put our ads anywhere where sensitive social issues are treated in an insensitive, irresponsible, and harmful way. This is the entire ad industry that has said these words. And so, yes, Cedar is right that to check your ads, you need tools, and you need to be able to get full detailed reports. And on top of that, there are just some websites that should not be on the supply chain, because the entire industry has said they shouldn’t be there. It would be like selling bleach in a deli. It doesn’t belong. It’s toxic to everyone; it is brand unsafe to everyone. And so, yeah, it is a scale issue. But it is also a question of drawing a line and taking personal responsibility. And the adtech executives that we know have lived in relative obscurity, despite their power, for years. And I think that has allowed them to forget that they have a responsibility to society.

Somer Simpson 

You know, back to that lumascape, the boxes. You know who has the box that is the absolute closest to those publishers? It’s the ad servers. Is anybody talking to the ad servers and saying, “Why are you allowing these sites to leverage your technology to participate in this ecosystem?”

Cedar Milazzo 

Well, it’s the ad servers and what they call the supply-side platforms, the SSPs. And absolutely, you know, we at Nobl are talking to the major SSPs and doing everything we can to try and get them on board. I know, Claire, your organization has done… I’ve seen some things out there where you’re doing the same sort of thing. There is pressure on them, and honestly, I was talking to an executive at one of these SSPs pretty recently. And he specifically said, “You know, we’ve had a lot of requests from customers to put something like this in place, and to really filter through and find the junk and get rid of it.” So they are starting to come around. I think that the pressure that’s on them is the only thing that’s going to cause them to really change. So I think everything that’s happening right now—with all the pressure from consumers, from groups like ourselves, from marketers—all this is really going to be the only thing that will force change, because otherwise they’re just raking in cash. So why change anything?

Somer Simpson 

Yep. And what I find interesting is, you know, there’s so many brands, right? Nike, Volkswagen, Skyy Vodka, Subaru, like all of these great brands—they have beliefs, they have opinions, and they have ethics. And they have messages that they want to get to communities, and they want to be able to support those communities. I remember like way back in the day, Skyy Vodka was one of the only brands that would advertise in the LGBT magazines that I would pick up and read. And it’s spread since then. I mean, Subaru eventually became kind of known as like the lesbian car. But that’s only because they supported our community, right? And it’s interesting to me that the only thing that stands between their beliefs and their desire to reach and support those communities is this bit of technology that they’re using to be brand safe, that’s cutting them off from that. If it just got out of the way, or if they were just maybe a little smarter about it, they would be able to directly support those communities.

Claire Atkin 

You’re right, Somer. You’re so right. You know, about 15 months ago, we published a couple of branded newsletters about keyword blocking and semantic analysis. And one of the most outrageous findings that we found there is that the word ‘lesbian’ was being blocked at a very high rate. And it was, it was explained to us at the time that it was because of its connotation to do with pornography. And that is so wrong. I mean, we saw that there was a Refinery29 article about lesbian bed death, and it had two like scary brand unsafe words in it: ‘death’ and ‘lesbian.’ And it was just like cultural commentary, and it was actually very educational. And the fact that advertisers were using this technology that interpreted it as a brand unsafe place is outrageous, and it’s still happening today.

Somer Simpson 

You know, it’s interesting… so, I will give an example that I dodged a few minutes ago. During one of the worst years of our collective lives, you saw the Black Lives Matter movement get into the news more, lots of coverage for many good reasons. I saw a dramatic increase in those contextual block lists of words like—not just the phrase ‘Black Lives Matter,’ but then they broke it out into ‘black’ and ‘black people.’ And I was just, I’m reading through these lists—and of course, at the top, you’ve got the standard anti-pornography stuff—and I had my moment like you did, with, like, “Wait, they’re blocking lesbians, but what’s wrong with us?” Right? But then I got to the Black Lives Matter piece. And I was like, “What, who is making these decisions? This is crazy.” Right? You want to sell McDonald’s; you want to sell your product to the black community. But then you don’t actually want to spend your money on their publications or on articles that they’re actually reading, and support that community?

Cedar Milazzo 

Or anything that even mentions them. It’s more than just not targeting them. It’s any discussion about, like you said, black people, or lesbians for that matter. It doesn’t matter what the context is. And that’s the problem with these keyword blocklists, especially: they have zero context, right? And there are a lot of these brand safety companies out there, specifically, who say that they’re building AI that includes context around it, so it’s not just blocked words and things like that. But the truth is, that’s kind of in its infancy and does not work very well at all.
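
As a rough illustration of the failure mode being described, here is a minimal sketch of context-free keyword blocking with a hypothetical blocklist; real vendor lists and matching rules are more elaborate, but the effect on legitimate coverage is the same:

```python
# Toy illustration of context-free keyword blocking and its false positives.
# The blocklist below is hypothetical; it mirrors the kinds of terms
# discussed in this episode ('lesbian', 'Black Lives Matter', 'death').

BLOCKLIST = {"death", "lesbian", "black lives matter"}   # hypothetical entries


def is_blocked(page_text: str) -> bool:
    """Block the ad if ANY listed term appears anywhere on the page."""
    text = page_text.lower()
    return any(term in text for term in BLOCKLIST)


# A legitimate cultural-commentary article gets blocked exactly like
# genuinely unsafe content, because no surrounding context is considered.
article = "A thoughtful essay on lesbian relationships and long-term intimacy."
print(is_blocked(article))   # True: blocked despite being perfectly brand safe
```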

Claire Atkin 

The fact is that the news is read by an educated, relatively engaged readership. It is a wonderful place for your brand to be; it is a trusted source of information for the most part. And to block it, as Cedar was talking about very early on in the podcast, to block it en masse or to block it at all using these keyword lists, is to take away one of the best segments of your target market. It doesn’t make any sense. Now, if you’re blocking seven words that have to do with your brand, that would be awkward placements, fine. I understand that Samsung doesn’t want to be next to the words ‘lithium ion battery.’ But to block anything more than those few awkward words makes no sense. And that’s why this entire industry is problematic to us. And we’ve told them to their faces that we think that their products are looking for a problem to solve. And actually, they’re creating an incredibly unsafe environment for all of us in the meantime.

Cedar Milazzo 

It’s actually, I think, a little larger than that, too, because we’ve actually worked with some brands who have brought this in-house. They said, “Okay, the brand safety stuff doesn’t work. The agencies are not handling it correctly. We’re just gonna do it all ourselves and take care of it and do the right thing.” And they’re honestly trying. And one big marketer we worked with has done this recently. They had a bunch of faux pas in the past year or two, and so they brought everything in-house; they had a huge budget allocated to building the expertise in-house and everything. And we ran trials with them and ran campaigns with them. And not only did we find that we were advertising on higher-quality content, but their performance on news specifically was much better. Because, like you said, Claire, the people reading the news are more educated, higher income—all the demographics are more in line with people who can actually buy their products, and who want to buy their products. And so they saw 45 to 50% increases in performance, in terms of the number of people who were actually buying from them. But the interesting thing was, at the end of those trials, they came back and said, “Well, we were looking through the report”—because we’re completely transparent about where their ads are shown—and they said, “We’re looking through the report and we noticed that you put our ads next to a bunch of news. And our policy is we don’t advertise next to news,” even though we had just shown them proof that it’s 50% better performance, 45% better performance, whatever it was. They were still so nervous about being next to, “Oh, there might be a story about a murder that happened,” that they just canceled every bit of news and said we will not work with any news outlets. And it’s not just us; there’s been a lot of research out there that shows when you advertise on the news, people do not associate your brand with the story that happened, right? If there’s a murder in Chicago, and you see an advertisement for, I don’t know, some soap, you’re not going to associate that soap with murder; you’re not going to think they’re supporting murder; that’s just not how it works. But for that brand on that page, you’re actually gonna get a positive uplift, because you’re seeing content that you trust; you’re seeing content that is legitimate and providing information that you’re interested in. And so it’s actually an uplift for that brand, not a problem. And so I think it’s an education thing for a lot of marketers, not just for the brand safety companies.

Claire Atkin 

You know, when I was a kid, I was told: the news is the bad news. Like, that’s bad news. Everyone knows that. The ads are the good news.

Cedar Milazzo 

You’re right about that, Claire. I mean, it’s the bright spot on the page, right? There’s nothing wrong with it.

Somer Simpson 

I thought that was the lifestyle section. The New York Times book reviews—ah, those can be bad news sometimes, too.

Cedar Milazzo 

Even entertainment news can sometimes be about that stuff, right?

Somer Simpson 

Yeah, I love this topic. Because, you know, I went to school, and got a degree in journalism. I was trained in how to do this properly, as were thousands of other people across the country and globally, who are incredibly talented people and do a real service for our democracy. It really, really, really makes me angry that any yahoo can go and spin up a blog, and monetize the heck out of it, and play a bunch of games, and say really hateful, shitty things, and make a ton of money.

Cedar Milazzo 

To me, what really makes me angry is the backlash against that sort of information includes the real journalists, too. And then you get people saying, “Oh, the media is terrible. The news media is just full of crap.” When in reality, there’s a big difference between a reporter who went to school like you did, and actually understands how to do actual fact checking and real reporting, versus some guy who threw up a blog and bought a bunch of traffic to get ads to start with, and then kind of legitimized themselves.

Claire Atkin 

So may I say, within advertising companies, and within adtech, we have been in countless discussions, hundreds of discussions, with people who are standing up and saying, “Enough,” who are drawing the line between what is and is not appropriate behavior for their own companies. And that kind of leadership is so inspiring to us; it is what fuels our work at Check My Ads: we know that there are leaders within this industry who are just waiting for the perfect moment to say, “We have to change now.” And the adtech vendors to date have done a really piss poor job of protecting our communities. And I think that we’re going to see that that is changing; we’re going to see that the adtech industry at large is going to have to be more transparent, more responsible. And we’re going to see that some of these adtech vendors will become household names if they don’t fix their problems, because they affect everyone.

Cedar Milazzo 

We’re hoping that some of them become household names because they do fix the problems—that would be positive press for a lot of these companies as well. And Claire, I think I agree with you. I’m actually pretty upbeat about the outlook here. I think that there’s a lot of pressure coming from consumers and audiences, but also from those kinds of leaders within the marketing industry as well, who are really trying to stand up to this. And I also see, even within the adtech space, there are people who are poking their heads up and saying, “You know, I’ve been a part of this industry for a long time. And I don’t like the way it works, and I want to make it better. I want to actually be proud of what I’m doing.” And we’re seeing more and more of those people standing up and saying, “We need to fix things; we need to do better.” So I’m pretty positive about the outlook.

Somer Simpson 

I’m not selling Quantcast on this podcast. But you know, as somebody who is working at an adtech company, it is fundamentally a part of our DNA to push transparency, to push accountability, and do better, because we are in service of our marketers who want to grow their businesses and reach their audiences; we are in service of publishers who fund their content production with ad dollars; and we are in service of the consumers who make it all happen and respecting their privacy. So those are the fundamentals of our internal belief system. So I’m here to make change.

Cedar Milazzo 

Well, I think there are a lot of people in a lot of these companies who feel the same way as you, Somer. A lot of these companies may not have changed, and a lot of them don’t see the need to change yet. But I think as time goes by, as companies like Quantcast that really put a focus on it and actually do something about it and make things better become more successful and get more share in the market, it’s going to force everybody else to follow.

Somer Simpson 

And we’re not perfect. I mean, there’s, this is really complex stuff. But, you know, if you see something, say something, I will do something. And that’s what everybody else has to do. Right?

Claire Atkin 

No problem, Somer.

Somer Simpson 

So, you know, just to wrap up the conversation, if there’s one thing that you could tell a marketer who’s putting money into this ecosystem, what they should change, what would it be?

Claire Atkin 

Check your ads.

Cedar Milazzo 

I was actually gonna say a variation of that, which is: pay attention. You know, it’s all about being responsible with your advertising dollar. And you can’t be responsible if you’re not paying attention to it. If you’re just saying, “Okay, here’s my budget. I’m going to send it to Google. And I hope I get some results out of it”—that’s not sufficient in today’s world. So you really do need to pay attention. And there are plenty of things out there that can help you. There are plenty of tools out there, whether they’re lists of places to avoid or whether they’re technologies doing things like what we’re doing at Nobl; there are plenty of tools if you look around for them.

Somer Simpson 

Excellent. All right. Well, I think maybe as we post this podcast, we should get a list of suggestions. And I know we’re going to link out to your websites and contact information and such as well. But we will post some ideas for people: things to avoid and things to do instead. Thank you both, Cedar and Claire, for being here. This has been a great conversation. I could literally talk for hours about this—and, you know, over drinks with a lot more profanity. But yeah, we should do this again. 

Cedar Milazzo 

Thank you very much, Somer. I appreciate it. It is a passion of mine as well, and I could definitely do that. We’ll take you up on the drink offer at some point. 

Claire Atkin

Thanks, Somer.

Somer Simpson 

Great, thank you both. This podcast is brought to you by Quantcast. Our mission is to radically simplify advertising on the open internet. We are the creators of a new and innovative intelligent audience platform that empowers brands, agencies, and publishers to know and grow their audiences online. The Quantcast Platform, powered by our patented AI and machine learning engine, delivers automated insights, marketing performance, and results at scale to drive business growth responsibly. Our solutions are leveling the playing field for our customers when it comes to effectively reaching audiences online and helping them power a thriving free and open internet for everyone. Connect with us today at quantcast.com.