LOS ANGELES — Google has debuted a new privacy initiative that it hopes will redefine online advertising.
“Privacy is paramount to us, in everything we do,” Chrome Engineering Director Justin Schuh stated, as he outlined Google’s proposal for a set of open standards it is calling a “Privacy Sandbox.”
“Technology that publishers and advertisers use to make advertising even more relevant to people is now being used far beyond its original design intent — to a point where some data practices don’t match up to user expectations for privacy,” Schuh stated. He noted that while other browser makers have looked at the problem, they have done so without a common set of standards, and as a result their efforts have had unintended consequences.
Schuh explained that large-scale cookie blocking encourages fingerprinting and other opaque tracking techniques.
“With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed, to generate a unique identifier which can then be used to match a user across websites,” Schuh revealed. “Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.”
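To make the technique concrete: a fingerprint is typically just a hash over a handful of browser properties that individually reveal very little. The sketch below is purely illustrative, using only a few well-known signals; it is not Chrome's code or any vendor's actual script, and real fingerprinting libraries combine far more surfaces.

```typescript
// Illustrative sketch only: hashing a few browser properties into a
// stable identifier, the pattern Schuh describes. Real fingerprinting
// scripts combine many more signals (fonts, canvas, WebGL, audio, ...).

async function sha256Hex(input: string): Promise<string> {
  const bytes = new TextEncoder().encode(input);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

async function naiveFingerprint(): Promise<string> {
  // Each signal varies only a little between users, but combined they
  // can narrow a browser down to a very small group, and none of them
  // can be cleared the way a cookie can.
  const signals = [
    navigator.userAgent,
    navigator.language,
    `${screen.width}x${screen.height}`,
    String(new Date().getTimezoneOffset()),
    String(navigator.hardwareConcurrency),
  ];
  return sha256Hex(signals.join("|"));
}
```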
The loss of revenue publishers suffer when consumers block cookies is driving an arms race between browser makers and marketers. Schuh cited an average 52 percent decline in publisher funding when advertising is made less relevant by removing cookies, a drop he said threatens the future of the vibrant web.
“Many publishers have been able to continue to invest in freely accessible content because they can be confident that their advertising will fund their costs. If this funding is cut, we are concerned that we will see much less accessible content for everyone,” Schuh explained. “So, we are doing something different.”
The solution the company envisions is meant to protect user privacy while helping content remain freely available online. It relies on improved classification of cookies, more understandable cookie settings, and aggressive moves to block fingerprinting, all in the interest of improving transparency, choice and control. Google says it will work with the developer community on new privacy standards and is now sharing its ideas for a Privacy Sandbox, which it views as essential to free access to content on the web.
“While Chrome can take action quickly in some areas — for instance, restrictions on fingerprinting — developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time,” Schuh concluded. “They require significant thought, debate and input from many stakeholders, and generally take multiple years.”
The takeaway for advertisers and publishers that rely on fingerprinting is that a change is coming and Google is leading the way.
To kick things off, a companion post on the Chromium Blog, “Potential uses for the Privacy Sandbox,” details the company’s initial thoughts on the process. It covers how user information is currently used in the ad ecosystem and how browsers could let publishers pick relevant content or show a relevant ad while sharing as little information as possible about the user or their browsing history.
“We’re exploring how to deliver ads to large groups of similar people without letting individually identifying data ever leave your browser — building on the Differential Privacy techniques we’ve been using in Chrome for nearly five years to collect anonymous telemetry information,” Schuh explained, adding, “New technologies like Federated Learning show that it’s possible for your browser to avoid revealing that you are a member of a group that likes Beyoncé and sweater vests until it can be sure that group contains thousands of other people.”
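Schuh did not spell out how the browser would enforce that group-size guarantee, but the underlying idea is a k-anonymity threshold: an interest label leaves the browser only once the group behind it is known to be large enough. The sketch below is a guess at that shape; the threshold, the group-size field and every name in it are assumptions for illustration, not part of Google’s proposal.

```typescript
// Hypothetical sketch of a k-anonymity gate: the browser reveals an
// interest-group label only if the group is known to be large enough.
// The threshold and all names here are assumptions for illustration.

const MIN_GROUP_SIZE = 5000; // assumed threshold, not a real Chrome value

interface InterestGroup {
  label: string;        // e.g. "likes Beyoncé and sweater vests"
  reportedSize: number; // group size reported by some aggregation service
}

function revealableLabels(groups: InterestGroup[]): string[] {
  // Only labels whose group clears the threshold ever leave the
  // browser; smaller, more identifying groups stay private.
  return groups
    .filter((g) => g.reportedSize >= MIN_GROUP_SIZE)
    .map((g) => g.label);
}

// Usage: an ad request would only ever see the filtered labels.
const labels = revealableLabels([
  { label: "sweater-vest fans", reportedSize: 12_400 },
  { label: "rare-hobby niche", reportedSize: 37 }, // withheld
]);
console.log(labels); // ["sweater-vest fans"]
```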
Schuh noted that conversion measurement and fraud prevention are also critical areas, as online publishers and advertisers need to know whether their advertising leads to more business.
“If it’s driving sales, it’s clearly relevant to users, and if it’s not, they need to improve the content and personalization to make it more relevant. Users then benefit from ads centered around their interests, and advertisers benefit from more effective advertising,” Schuh stated, adding, “Publishers today often need to detect and prevent fraudulent behavior, for instance, false transactions or attempts to fake ad activity to steal money from advertisers and publishers.”
While many companies work to detect and prevent fraud, Schuh highlighted that there is room to improve the privacy of those practices, pointing to approaches such as the PrivacyPass token introduced by Cloudflare for Tor users.
“Our experience has shown us that removing certain capabilities from the web causes developers to find workarounds to keep their current systems working rather than going down the well-lit path,” Schuh said. In response, the company is proposing what it calls a “privacy budget” that will enable websites “to call APIs until those calls have revealed enough information to narrow a user down to a group sufficiently large enough to maintain anonymity. After that, any further attempts to call APIs that would reveal information will cause the browser to intervene and block further calls.”
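The quote suggests a simple accounting model: every identifying API call spends some amount of identifying information against a per-site allowance, and once the allowance is exhausted the browser steps in. The proposal does not define how that budget would be measured, so the sketch below, including the bit costs, the budget size and the API names, is entirely hypothetical.

```typescript
// Hypothetical sketch of a "privacy budget" accountant. The entropy
// costs, budget size and API names are assumptions; the proposal does
// not specify how the budget would actually be measured or enforced.

const BUDGET_BITS = 10; // assumed per-site allowance of identifying bits

// Rough, made-up entropy cost (in bits) for each surface a site reads.
const API_COST_BITS: Record<string, number> = {
  "navigator.userAgent": 4,
  "screen.resolution": 3,
  "navigator.hardwareConcurrency": 2,
  "canvas.readback": 6,
};

class PrivacyBudget {
  private spent = 0;

  // Returns true if the call may proceed; false means the browser
  // would intervene, blocking or coarsening the response.
  tryCharge(api: string): boolean {
    const cost = API_COST_BITS[api] ?? 1;
    if (this.spent + cost > BUDGET_BITS) {
      return false;
    }
    this.spent += cost;
    return true;
  }
}

// Usage: the second canvas-style read pushes this site over budget.
const budget = new PrivacyBudget();
console.log(budget.tryCharge("navigator.userAgent")); // true  (4 of 10 bits)
console.log(budget.tryCharge("canvas.readback"));     // true  (10 of 10 bits)
console.log(budget.tryCharge("canvas.readback"));     // false (blocked)
```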
"Browser intervention" is a phrase that developers don’t want to hear, but adult advertisers and publishers will need to cope with as they carry forward in the face of Google and Chrome’s newest challenge to the status quo.