On Content Standards
We’ve written in the past about the things we do to ensure marketers feel comfortable buying advertising inventory through our platform. Inventory quality and brand safety are sprawling topics, but the purpose of this post is to focus on one area in particular: how we decide whether a site or app should be permitted to sell inventory through our technology.
Before we get into the specifics, it’s important to note two things.
First, there are many sites that list us in their ads.txt file that we’ve never done business with, and would never do business with. Second, everything we do in this area is in service of our buyers. We require all publishers to satisfy an array of checks and requirements. We determine platform-wide content policies in response to buyer input, we block exposure to individual sites on their request, and we signal inventory attributes so buyers can make informed decisions.
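For readers unfamiliar with how ads.txt listings work, here is a minimal sketch of parsing a single ads.txt record into the fields defined by the IAB ads.txt specification: the ad system's domain, the seller's account ID, the relationship (DIRECT or RESELLER), and an optional certification authority ID. The function name and the example record values are illustrative, not taken from any real publisher's file.

```javascript
// Hedged sketch: split one ads.txt record into its IAB-specified fields.
// A site can list any ad system it likes in this file, which is why a
// listing alone doesn't imply a business relationship.
function parseAdsTxtLine(line) {
  const record = line.split('#')[0].trim();          // strip trailing comments
  if (!record || record.includes('=')) return null;  // skip blanks and variable lines
  const [domain, accountId, relationship, certAuthorityId] = record
    .split(',')
    .map((field) => field.trim());
  return { domain, accountId, relationship, certAuthorityId };
}

// Illustrative record (not a real seller entry):
const rec = parseAdsTxtLine('rubiconproject.com, 12345, DIRECT, 0bfd66d529a55807');
```

Because the file is public and self-declared, anyone crawling it can see who a site claims as its authorized sellers, but only the ad systems themselves can confirm whether a relationship actually exists.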
Our Baseline: No Extreme Content
Our content standards have long rested on the following principle: we do not allow “extreme” content across any of our products. Extreme content, as we define it, is relatively easy to recognize. It includes hateful supremacist speech; content that directly calls for violence or harassment; depictions of extreme violence or hardcore pornography; and content that promotes fraud or illegal activities such as piracy.
You might ask, what gives us the right to bar particular publishers from our platform? Isn’t that illegal censorship? The quick answer is no: as a private business, we can decide whom we choose to partner with.
And what about the grey area of content that’s adjacent to what we consider extreme? Isn’t that a gateway to the really ugly stuff? That may be true, and for that reason, we don’t envy the challenges social media platforms or publishers themselves face in setting and enforcing content standards.
But we are neither a social media platform nor a publisher. We don’t host the material in question, and though we’re committed to preventing the monetization of content that’s clearly extreme, we don’t see it as our role to remove content that falls short of that mark but is simply untrue or even offensive.
Different Products, Different Needs
From the outside, Magnite might appear to be one company with one technology platform. This was once true, but today we have several different ways to help publishers monetize their inventory — and several platforms to enable that. We have two core exchanges — one for CTV and another for a diverse set of formats and devices — as well as Demand Manager, which is a set of publisher-facing monetization tools based on Prebid.
The exchanges are quite different from Demand Manager, both in their technical workings and in the business obligations they place on us. Consequently, our content policies for each differ as well. That said, extreme content is not tolerated on any platform, under any circumstances.
In the exchanges, the buyer has no direct relationship with the seller, so as the intermediary we have a greater responsibility to protect them. Therefore, on the exchanges, we go beyond blocking extreme content and also block nudity, incentivized content, and most content focused on firearms, alcohol, and illegal drugs.
Demand Manager, however, is not a marketplace; it’s a set of tools publishers use to manage their inventory. As such, the publisher has the freedom to make its own choices about which (if any) of the above content categories to include, so long as the content isn’t extreme. The publisher also has full control over which bidders they add to their Prebid configuration, which may or may not include one of our exchanges.
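To make the bidder-control point concrete, here is a minimal sketch of a Prebid.js ad unit definition of the kind a publisher manages through tools like Demand Manager. The ad unit code, sizes, and parameter values are illustrative assumptions; the `rubicon` bidder adapter and its `accountId`/`siteId`/`zoneId` parameters follow the shape documented by the Prebid project, but any real configuration would use the publisher's own IDs.

```javascript
// Hedged sketch: a Prebid.js ad unit with two bidders. The publisher
// decides which adapters appear in the bids array; omitting an adapter
// simply excludes that exchange from the auction for this slot.
var adUnits = [{
  code: 'div-banner-1',                         // illustrative slot ID
  mediaTypes: { banner: { sizes: [[300, 250]] } },
  bids: [
    // Magnite exchange via the Rubicon adapter (IDs are placeholders):
    { bidder: 'rubicon', params: { accountId: 1001, siteId: 2002, zoneId: 3003 } },
    // A second, hypothetical bidder the publisher has also enabled:
    { bidder: 'exampleBidder', params: {} }
  ]
}];

// In a real page this would be registered with Prebid, e.g.:
// pbjs.que.push(function () { pbjs.addAdUnits(adUnits); });
```

The design point is that the configuration lives with the publisher: adding or removing a line in the `bids` array is the publisher's choice, not the toolmaker's.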
We’re not perfect. Sometimes things slip through the cracks. Monitoring millions of new pieces of content daily is a challenge for companies far bigger than we are. That’s why we rely on feedback from our buyers, partners, and the broader community.
So if you have a concern, reach out to us at firstname.lastname@example.org. We promise we’ll always investigate claims and questions, and reply with what we’ve done.