The Future Of Censorship


I have never been one to get in a panic about web privacy. I’m too trusting, ready to believe that of course Company X will use my data in an honest and fundamentally decent way. The fact that they have my data can only improve my lot as a consumer, right? But I do remember the first time I was freaked out by Gmail (and I’ve heard of other people having similar experiences).

I had been emailing my father, I think telling him about my latest home-improvement disaster, and I had mentioned the word “tools.” Sure enough, when I read my father’s reply the next day, the sponsored links on the right-hand side of the page were full of companies trying to sell me tools. Nothing particularly startling there, of course, but it still unnerved me somewhat. It’s the pact I make with Google: you give me decent web-based email for free and I give you my data. Facebook gives me a great platform to keep in touch with friends and family, and I give them my digital life. Forever.

In a recent essay/blog post, “Identity and The Independent Web,” the writer John Battelle discusses the emergence of “two distinct territories across the web landscape.” He calls one the “Dependent Web” and the other the “Independent Web.”

The Dependent Web is dominated by companies that deliver services, content and advertising based on who that service believes you to be: What you see on these sites “depends” on their proprietary model of your identity, including what you’ve done in the past, what you’re doing right now, what “cohorts” you might fall into based on third- or first-party data and algorithms, and any number of other robust signals.

The Independent Web, for the most part, does not shift its content or services based on who you are. However, in the past few years, a large group of these sites have begun to use Dependent Web algorithms and services to deliver advertising based on who you are.

His argument is that the Dependent Web is increasingly colonizing the Independent Web -- that more and more, the content offered to us is tailored to what we have previously done on the web: what we’ve looked at, what we’ve clicked on and how long we’ve lingered on something. The web knows what you want. In fact, in the future, the web will know the things you want before you even know them yourself.

Sure, we navigate around, in control of our experience, but the fact is, the choices provided to us as we navigate are increasingly driven by algorithms modeled on the service’s understanding of our identity. We know this, and we’re cool with the deal -- these services are extremely valuable to us. Of course, when we drop into a friend’s pictures of their kid’s bar mitzvah, we couldn’t care less about the algorithms. But once we’ve finished with those pictures, the fact that we’ve viewed them, the amount of time we spent viewing them, the connection we have to the person whose pictures they are, and any number of other data points are noted by Facebook and added to the infinite secret sauce that predestines the next ad or newsfeed item the service might present to us.
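To make that mechanism concrete, here is a deliberately simplified Python sketch of the kind of scoring such a service might do. The signal names, weights and topics are all invented for illustration; this is not Facebook’s actual system.

```python
# Purely illustrative sketch: fold engagement signals (view time, social closeness)
# into per-topic affinity scores, then pick the candidate ad or feed item whose
# topic the user has engaged with most. All names and weights are hypothetical.

from dataclasses import dataclass

@dataclass
class EngagementSignal:
    item_topic: str        # e.g. "home improvement", "family photos"
    seconds_viewed: float  # how long the user lingered on the item
    tie_strength: float    # 0..1 closeness to the person who posted it

def topic_affinities(signals: list[EngagementSignal]) -> dict[str, float]:
    """Aggregate raw signals into a per-topic affinity score."""
    scores: dict[str, float] = {}
    for s in signals:
        # Longer views from closer connections count for more.
        scores[s.item_topic] = scores.get(s.item_topic, 0.0) + s.seconds_viewed * (0.5 + s.tie_strength)
    return scores

def pick_next_item(candidates: dict[str, str], signals: list[EngagementSignal]) -> str:
    """Choose the candidate keyed by the topic with the highest affinity."""
    affinity = topic_affinities(signals)
    best_topic = max(candidates, key=lambda topic: affinity.get(topic, 0.0))
    return candidates[best_topic]

history = [
    EngagementSignal("family photos", seconds_viewed=180, tie_strength=0.9),
    EngagementSignal("home improvement", seconds_viewed=40, tie_strength=0.2),
]
ads = {"family photos": "Photo-book printing ad", "home improvement": "Power tools ad"}
print(pick_next_item(ads, history))  # -> "Photo-book printing ad"
```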

It got me thinking about censorship and something that the writer and blogger Evgeny Morozov said in a recent interview I did with him. Morozov said that in the future censorship will work much like behavioral advertising, in that it will be highly targeted and customized for each user. Usually, he said, the argument is that censorship isn’t compatible with economic growth and globalization because authoritarian countries need their bankers to have access to “The New York Times” in order to do mergers and acquisitions.

But censorship, just like advertising, can be tailored to individual users. For example, in China bankers or government officials could have different web access from, say, human rights activists. Bankers could probably be trusted with Free Tibet sites (they likely wouldn’t want to visit them anyway); impressionable young students certainly could not. If your browsing history suggested some viewing of “problematic” sites, the content you are allowed to see in the future could be shaped accordingly. In censoring states, those fortunate enough to navigate the “independent web” will be much like the privileged party flunkies who were allowed to travel abroad under communism.

Morozov said there is a great incentive for companies or governments to develop censorship solutions that work in more or less the same way as Google or Facebook: by looking at your previous behavior and serving you relevant content. So instead of filtering by keyword or simply blocking access to pages or sites outright, censorship becomes a customized, per-user experience.
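A minimal Python sketch of the contrast Morozov is drawing: the old model blocks a page for everyone on a keyword match, while the behavioral model makes a per-user decision from a browsing profile. Every category, threshold and profile field here is hypothetical, invented purely to illustrate the idea rather than taken from any real filtering system.

```python
# Illustrative contrast between blanket keyword blocking and per-user
# "behavioral" filtering. All categories, fields and thresholds are invented.

BLOCKED_KEYWORDS = {"free tibet"}  # crude, one-size-fits-all blocklist

def keyword_filter(page_text: str) -> bool:
    """Old model: the page is allowed or blocked identically for every user."""
    text = page_text.lower()
    return not any(keyword in text for keyword in BLOCKED_KEYWORDS)

def behavioral_filter(page_category: str, user_profile: dict[str, float]) -> bool:
    """Newer model: allow or block the same page depending on who is asking.

    user_profile maps interest categories inferred from past browsing to weights.
    A "trusted" profile (finance-heavy, little activist history) sees more.
    """
    risk = user_profile.get("activism", 0.0) + user_profile.get("student_politics", 0.0)
    trust = user_profile.get("finance", 0.0)
    if page_category == "politically_sensitive":
        return trust > 0.5 and risk < 0.2  # bankers pass; activists and students do not
    return True

banker = {"finance": 0.9, "activism": 0.05}
student = {"student_politics": 0.6, "activism": 0.3}

print(keyword_filter("News about the Free Tibet rally"))            # False for everyone
print(behavioral_filter("politically_sensitive", banker))           # True  -- allowed
print(behavioral_filter("politically_sensitive", student))          # False -- blocked
```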

With all that in mind, Google suggesting tool suppliers doesn’t seem so benign. As Morozov said, “If [Google co-founder] Sergey Brin woke up one day and decided that Google should do lots of evil, then Google would be the most powerful censorship intermediary in the world” because they already have all that behavioral data. Now that really might be something to worry about.