Eli Pariser (of MoveOn.org, which I dimly recall was a relevant site when Bush II was on the throne, but which seems to have lost some luster since Obama turned out to be Bush III) writes about the way the internet is becoming personalized, and how that distorts our view of reality. He says, “In polls, a huge majority of us assume search engines are unbiased. But that may be just because they’re increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click” (p. 3).

Not only do vendors like Amazon and Netflix use your purchase history to try to predict what you might want to buy next (which seems legitimate, and no more than an on-the-ball shopkeeper would have done in the brick-and-mortar days), but increasingly, information you don’t even know is being collected about you is in play in ways you don’t realize. In commerce this info is gathered by “data companies like BlueKai and Acxiom,” the latter of which has “accumulated an average of 1,500 pieces of data on each person on its database—which includes 96 percent of Americans” (p. 7). Credit card companies have been profiling us based on what we buy for decades; now cellphones can report on where you go, too. This has some unpleasant implications in terms of direct marketing, and some possibly scary implications in terms of surveillance. But Pariser believes it also has a negative impact on social bonds and even on epistemology.

The filter bubble is Pariser’s name for the feedback loop of information that surrounds us, as even Google searches we believe to be unbiased are increasingly tailored to our personal profiles. Although he admits we’ve always selected media that appeals to our interests and preconceptions (hence the echo-chamber cable news channels), Pariser says the filter bubble is different in three ways.
“First, you’re alone in it.” Each personal news feed or search is tailored specifically to you, so you’re no longer even part of a narrow affinity group. Second, it’s invisible. “Google’s agenda is opaque. Google doesn’t tell you who it thinks you are or why it’s showing you the results you’re seeing. You don’t know if its assumptions about you are right or wrong—and you might not even know it’s making assumptions about you in the first place” (p. 10). Finally, Pariser says that while viewers of politically slanted media are (presumably) aware there is a range of options and that they’ve chosen one of them, as the filter bubble gives us more and more seemingly objective positive reinforcement of our preferences and prejudices, we begin to believe the world is like us.

Because even web searches from commercial sites like Google are increasingly tailored to each user’s profile, Pariser says we are less likely to be exposed to a rich variety of ideas. Politically, this would tend to make us even more obnoxiously American than we already are. But he also claims it will hinder creativity by promoting a more passive style of info gathering, and by narrowing what he calls “the solution horizon, the limit of the conceptual area that we’re operating in” (p. 95). It’s hard to think outside the box, Pariser believes, if the box is narrow and invisible. If innovation comes from the juxtaposition of unrelated ideas and from the kind of creative cross-pollination that happens when people expose themselves to unfamiliar stimuli, then we could be headed toward a generation of accountants. And this change has been noticed even by some techies: “The shift from exploration and discovery to the intent-based search of today was inconceivable,” Pariser quotes an unnamed Yahoo editor as lamenting (p. 103).
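The self-reinforcing loop Pariser describes is easy to see in miniature. Here's a toy simulation — my own sketch, not Google's actual algorithm or anything from the book — of a "search engine" that ranks topics by how often you've clicked them before, while the simulated user always clicks the top result:

```python
# Toy model of a personalization feedback loop (a hypothetical sketch,
# not any real engine's ranking algorithm): the engine ranks topics by
# the user's past clicks, and the user clicks the top-ranked result.
TOPICS = ["politics", "science", "sports", "art", "travel"]

def ranked_results(click_history):
    """Rank topics by past click count, ties broken by fixed order."""
    counts = {t: click_history.count(t) for t in TOPICS}
    return sorted(TOPICS, key=lambda t: -counts[t])

def simulate(steps, results_shown=3):
    """User always clicks the first result; track what ever gets shown."""
    history = ["science"]           # a single initial click seeds the profile
    shown = set()
    for _ in range(steps):
        results = ranked_results(history)[:results_shown]
        shown.update(results)
        history.append(results[0])  # click the top-ranked topic
    return history, shown

history, shown = simulate(20)
print(set(history))  # after 20 searches the user has only ever clicked one topic
print(shown)         # two of the five topics are never shown at all
```

One click is enough: the profile reinforces itself forever after, and part of the topic space simply never appears — which is the invisibility Pariser is worried about.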
Some of the comparisons Pariser makes between Google (which profiles you based on your click history) and Facebook (which profiles you based on what you share) carry less weight than his argument above. You might even call them trivial. But the point is still worth remembering: “If you’re not paying for something, you’re not the customer; you’re the product being sold” (p. 21). Overall, though, I think Pariser overstates the danger of the filter bubble because, just like the techno-evangelists he criticizes, he overestimates the importance of the technology. The box isn’t invisible – the box is the commercial internet. Creative people have no trouble seeing that. The problem, which Pariser gets close to and then misses, is that we’re training a generation of people not to be creative. The average American spends 36 hours a week watching TV. Switch that to surfing the web and you’ve still got the same problem.

It’s a good book and a quick read. Pariser asks some provocative questions, but he doesn’t offer a lot of solutions. A government regulatory agency that supervises these data collectors does not sound like a good idea to me. The only people I want to have my personal info less than salesmen are bureaucrats. Pariser mentions the movie Minority Report – I’m thinking Enemy of the State. RFIDs cost about a nickel apiece, and it’s been nearly fifteen years since I sat in a presentation by a semiconductor manufacturer’s rep (I think from National Semi) who talked about all the ways they were thinking of deploying them.

So what are some ways of getting out of the filter bubble? First, limit the amount of info you’re giving away. Assume you’re always being watched, and act accordingly. Don’t carry a smartphone everywhere. Use cash. Search on something other than Google. Use Tor or some other anonymizing service. Get off Facebook.
Remember that everything you post to any website you don’t personally own probably becomes someone else’s property, and that the stuff you post on your own site can be copied and saved by anybody. Forever. And from the network perspective, it’s never been easier for regular people to communicate, and it doesn’t have to be through the commercial web. WiMAX base stations are cheap, and can connect entire towns and cities into networks that don’t depend on the AT&Ts and Time Warner Cables of the world. Those networks won’t have Netflix or YouTube on them (or much porn, either), but if that’s all we’re really looking for, then it’s already too late.