OPINION
When the Internet Thinks It Knows You
By ELI PARISER
Published: May 22, 2011

Personalized information filters pose a threat to democracy.
Once upon a time, newspaper editors and broadcasters served as gatekeepers, deciding what news reached the public. Then came the Internet, which made it possible to communicate with millions of people at little or no cost. Suddenly anyone with an Internet connection could share ideas with the whole world. A new era of democratized news media dawned.
You may have heard that story before -- maybe from the conservative blogger Glenn Reynolds (blogging is "technology undermining the gatekeepers") or the progressive blogger Markos Moulitsas (his book is called "Crashing the Gate"). It's a beautiful story about the revolutionary power of the medium, and as an early practitioner of online politics, I told it to describe what we did at MoveOn.org. But I'm increasingly convinced that we've got the ending wrong -- perhaps dangerously wrong. There is a new group of gatekeepers in town, and this time, they're not people, they're code.
Today's Internet giants -- Google, Facebook, Yahoo and Microsoft -- see the remarkable rise of available information as an opportunity. If they can provide services that sift through the data and supply us with the most personally relevant and appealing results, they'll get the most users and the most ad views. As a result, they're racing to offer personalized filters that show us the Internet that they think we want to see. These filters, in effect, control and limit the information that reaches our screens.
Like the old gatekeepers, the engineers who write the new gatekeeping code have enormous power to determine what we know about the world. But unlike the best of the old gatekeepers, they don't see themselves as keepers of the public trust. There is no algorithmic equivalent to journalistic ethics.
Mark Zuckerberg, Facebook's chief executive, once told colleagues that "a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." At Facebook, "relevance" is virtually the sole criterion that determines what users see. Focusing on the most personally relevant news -- the squirrel -- is a great business strategy. But it leaves us staring at our front yard instead of reading about suffering, genocide and revolution.

Companies that make use of these algorithms must take this curatorial responsibility far more seriously than they have to date. They need to give us control over what we see -- making it clear when they are personalizing, and allowing us to shape and adjust our own filters. We citizens need to hold up our end, too -- developing the "filter literacy" needed to use these tools well and demanding content that broadens our horizons even when it's uncomfortable.