Google's trouble started when The Wall Street Journal ran a page-one story on the company's clever coding that allowed it to work around default security settings in the iPhone's main browser, Safari. Safari's default settings block third-party tracking cookies. Safari would only allow such cookies if the user "interacted" with a site, like filling out a form. Google created code that submitted an invisible form to trick Safari into thinking the user was interacting with an ad.
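For readers curious about the mechanics, the technique the Journal described can be sketched in a few lines of JavaScript. This is a hypothetical reconstruction, not Google's actual code: the function name, the ad-server URL, and the DOM stub used to run it outside a browser are all my own inventions.

```javascript
// Hedged sketch of the reported trick: programmatically submitting an
// invisible form so Safari would treat the third-party ad frame as one the
// user had "interacted" with, which unblocked its cookies.
// All names and URLs here are hypothetical.
function submitInvisibleForm(doc, action) {
  const form = doc.createElement("form");
  form.method = "POST";
  form.action = action;          // hypothetical ad-server endpoint
  form.style.display = "none";   // the user never sees the form
  doc.body.appendChild(form);
  form.submit();                 // counted as "interaction" under old Safari rules
  return form;
}

// In a real browser this would run against `document`; a minimal stub lets
// the sketch run anywhere and records what happens:
const events = [];
const stubDocument = {
  createElement: () => ({ style: {}, submit() { events.push("submit"); } }),
  body: { appendChild() { events.push("append"); } },
};
const form = submitInvisibleForm(stubDocument, "https://ads.example.com/i");
```

The key point is that nothing here requires any action from the user: the form is created, hidden, and submitted entirely by script, yet the browser's heuristic treated the submission as a signal of user intent.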
Google issued a statement to the WSJ denying anything nefarious in their practices.
"The Journal mischaracterizes what happened and why. We used known Safari functionality to provide features that signed-in Google users had enabled. It's important to stress that these advertising cookies do not collect personal information."- Google, Inc. Statement in The Wall Street Journal, February 17, 2012
The quid pro quo of the internet is access to information in exchange for privacy. The more information we get, the more privacy we relinquish. In theory, users are supposed to control how much of their privacy they give up, and to whom.
Let me run down the PR challenges Google faces here. The quote above really boils down to:
"We figured out a way around Safari's security settings and didn't tell you. Since we didn't collect personal information you shouldn't worry about it."Later on, Google told the Hill's Technology Blog Hillicon Valley that the tracking was "inadvertent and had been removed." The Los Angeles Times has an extensive piece on the privacy breach.
One of the key tenets of crisis communications (and in case you're wondering, Google, you have a crisis here) is that both your operations and your communications have to work if you're going to navigate a crisis successfully. On the communications front, Google is going to have to choose between this being inadvertent and this being acceptable. So far it has said both of those things.
Operationally, Google created code that exploited a weakness in the Safari browser and placed that code in its ads. This was not done by accident. Even though I'm not a programmer, I'm pretty sure that code didn't write itself.
Google failed operationally by deploying the code, and it is failing PR-wise by not having a coherent, believable story about how and why this happened. The most telling detail for me is that Google disabled the offending code only after The Wall Street Journal raised the issue. Removing the code was the right thing to do, but the way it looks is that the only reason we aren't being secretly tracked by Google is that it got caught.
I'm a PR guy. Google needs a better story. If your goal is "don't be evil," perhaps the truth is a good place to start.
Update: Apparently Google tricks Microsoft's Internet Explorer browser, too. See here for details. It appears Google's actions weren't so much inadvertent as totally deliberate.