Who's watching the watchers?

April 21, 2009

These fillips weave variegated events picked up from the newspapers into a meaningful fabric. The result may not be too pleasing to the eye, as I have become pessimistic about the will of our modern society to protect its members' privacy. But truth does not have to be popular.

So should I look at Maija Palmer's full-page assessment of the state of eprivacy today (*) as competition? It would be a mistake. So dire is the condition of our privacy rights, how could I not welcome the recognition the Financial Times provides to my formula, its message as well as its format? Allow me then the luxury of reversing the process: today I take Maija Palmer's own tapestry as my starting point and pull on one of its threads.

Simon Davies, the head of Privacy International, shares my views on behavioral advertising and is quoted as saying "the regulators have become toothless and inept as watchdogs". But the hero of Maija Palmer's fillip is a company named Phorm (1). Phorm, she writes, "has devised a system of targeting ads without retaining any personal data". Could this be the solution to our problems?

As the author of a competing technology, I labor here under a stark conflict of interest. Thus forewarned, let readers decide whether my review remains objective.

The main point is that Phorm's claim is entirely credible. Despite the contrary practices of Google and others, the right of might is the only reason to accumulate gigabytes of confidential data about the Internet activities of each one of us. As Phorm explains on its site, one only needs to compile this "raw data flow" on the fly into the segments targeted by advertisers and discard it afterwards in real time.
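To make the mechanism concrete, here is a minimal sketch of such on-the-fly compilation, written under my own assumptions rather than taken from Phorm's actual code: each raw event is reduced at once to coarse interest segments and then dropped, so only the aggregate survives. The segment names and keyword lists are invented for illustration.

    SEGMENTS = {
        "travel":  {"flight", "hotel", "holiday"},
        "finance": {"mortgage", "savings", "loan"},
    }

    def update_profile(profile: dict, page_keywords: set) -> None:
        """Increment the counters of the matching segments; keep nothing else."""
        for segment, vocabulary in SEGMENTS.items():
            if vocabulary & page_keywords:
                profile[segment] = profile.get(segment, 0) + 1

    def handle_event(profile: dict, raw_event: dict) -> None:
        keywords = set(raw_event["page_text"].lower().split())
        update_profile(profile, keywords)
        # the raw event is not stored anywhere; only the aggregated segments persist

    profile = {}
    handle_event(profile, {"page_text": "cheap flight and hotel deals"})
    print(profile)   # {'travel': 1}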

Phorm further claims it never uses the IP address of the users it tracks, replacing it with a code specific to its system. This is how it should be. By using a private ID, Phorm avoids the danger of turning the IP address into the kind of universal ID which makes data theft so lucrative.
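What such a private ID may look like is easy to sketch, again under my own assumptions: a random code is minted once per user and reused thereafter, while the IP address observed on each request is never written down. The cookie name is hypothetical.

    import secrets

    def new_private_id() -> str:
        """Return a random identifier bearing no relation to the user's IP address."""
        return secrets.token_hex(16)

    def identify(cookies: dict) -> str:
        # reuse the code already planted in the browser, or mint and plant a new one;
        # the IP address seen on the request is never stored
        return cookies.setdefault("private_uid", new_private_id())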

Phorm has been criticized for asking consumers to opt out rather than opt in, the more conservative approach preferred by European authorities. Most companies, however, turn opt-in on its head by bundling it with some desirable online activity. Isn't requiring an opt-out far less hypocritical than coercing an opt-in?

To counter a common objection to behavioral advertising, Phorm is also careful to state that it censors advertising targets and formally declines to observe consumer activities which bear on morals or politics. Note that a side benefit of personalization is to let this prudent censorship adjust to the country of residence of each consumer.

Phorm therefore is an excellent solution to a vexing problem. Does this mean I endorse it? Do not misunderstand my position. It is not because I have a better solution that I cannot be satisfied by Phorm. It is because I cannot be satisfied by approaches like the one adopted by Phorm that I developed a better solution.

The crux is that Phorm is asking us to trust it to do no evil. We have been there before. Worse, Phorm itself recognizes that our trust must extend to its paying clients, Internet service providers such as BT. History, however, tells otherwise. Speaking of the telecommunications industry, have we already forgotten the scandal at Deutsche Telekom, the egregious errors in judgment at AOL, the human frailty of Pellicano's informers?

In its defense, Phorm will rightly say that it has minimized the risks to consumers by limiting the scope of the information it holds in its database. No IP address, no revealing string of numbers, no history of activities... This is indeed why Phorm is a far better solution than current practices. But it protests too much about storing no personally identifiable information. As time passes, its description of each consumer as an aggregate of targetable interests grows richer and richer, and many such anonymous aggregates become personally identifiable. Grow, baby, grow!

There is also a downside to discarding the raw data flow of user activities. Without it, how can one establish a protest process, a necessary part of any recommendation mechanism, targeted advertising included? Unless Phorm claims to be exempt from the realities of pattern recognition, errors will occur. Research is said to prove that consumers prefer apropos ads to random ads. But have users been asked how they will welcome a wrongheaded stream of wrongfooted ads, say airline promotions to someone scared of flying? Forcing one to opt out is just too blunt a remedy.

The merit of Phorm is to show how far one can really go to protect consumer privacy short of adopting the more radical approach I advocate.

Better to put the ad engine on the consumer's device and throw away its control key. Better to use the available bandwidth to broadcast a raw ad reference stream to each user device than to gather the raw activity stream of each user into some central process. Let the ad engine locally collect user activity, filter the ad stream and, upon request, deliver explanations to the user and accept potential corrections.
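For the sake of clarity, here is a minimal sketch of this local alternative, under my own assumptions rather than ePrio's actual design: the profile never leaves the device, the broadcast ads are filtered against it, and every selection can be explained and corrected. Class and field names are purely illustrative.

    class LocalAdEngine:
        def __init__(self):
            self.profile = {}       # interest -> weight, held on the device only

        def observe(self, interest: str) -> None:
            """Record user activity locally; nothing is sent upstream."""
            self.profile[interest] = self.profile.get(interest, 0) + 1

        def filter(self, ad_stream: list) -> list:
            """Keep only the broadcast ads matching a locally held interest."""
            return [ad for ad in ad_stream if self.profile.get(ad["topic"], 0) > 0]

        def explain(self, ad: dict) -> str:
            count = self.profile.get(ad["topic"], 0)
            return f"shown because of {count} visit(s) about {ad['topic']}"

        def correct(self, interest: str) -> None:
            """Let the user erase a mistaken inference, e.g. a fear of flying."""
            self.profile.pop(interest, None)

    engine = LocalAdEngine()
    engine.observe("travel")
    ads = engine.filter([{"topic": "travel", "text": "airline promotion"},
                         {"topic": "finance", "text": "mortgage offer"}])
    print(ads, engine.explain(ads[0]))
    engine.correct("travel")        # the user turns out to be scared of flying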

Despite what some incompetent experts have written, "trust in the program" is not the same as "trust in the programmer". Like any human being, a programmer can be corrupt. But, once written, a program can be audited, sealed against alterations and downloaded for local execution. Having to corrupt both the programmer and the auditor is far more difficult, especially if the programmer is prevented from picking the auditor, à la Madoff. It is the rule of three.
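What "sealed against alterations" can mean in practice is easily sketched, under my own assumptions: before local execution, the device checks the downloaded program against a digest published by the independent auditor. The file name and the digest value below are placeholders.

    import hashlib

    AUDITED_SHA256 = "digest published by the independent auditor"   # placeholder value

    def is_sealed(program_bytes: bytes) -> bool:
        """Accept the program only if it matches the audited version bit for bit."""
        return hashlib.sha256(program_bytes).hexdigest() == AUDITED_SHA256

    with open("ad_engine.py", "rb") as f:     # hypothetical file name
        if not is_sealed(f.read()):
            raise RuntimeError("program differs from the audited version; refusing to run")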

Utopia? "Ultimately it is our decision what the future will look like", the president of EPIC, Marc Rotenberg, told Maija Palmer. True, but only on condition that we, consumers, unite to watch the watchers.

Philippe Coueignoux

April 2009
Copyright © 2009 ePrio Inc. All rights reserved.