By Eli Pariser, the president of the board of MoveOn.org and the author of The Filter Bubble: What the Internet Is Hiding From You (The New York Times, May 23, 2011):
Once upon a time, the story goes, we lived in a broadcast society. In that dusty pre-Internet age, the tools for sharing information weren’t widely available. If you wanted to share your thoughts with the masses, you had to own a printing press or a chunk of the airwaves, or have access to someone who did. Controlling the flow of information was an elite class of editors, producers and media moguls who decided what people would see and hear about the world. They were the Gatekeepers.
Then came the Internet, which made it possible to communicate with millions of people at little or no cost. Suddenly anyone with an Internet connection could share ideas with the whole world. A new era of democratized news media dawned.
You may have heard that story before — maybe from the conservative blogger Glenn Reynolds (blogging is “technology undermining the gatekeepers”) or the progressive blogger Markos Moulitsas (his book is called “Crashing the Gate”). It’s a beautiful story about the revolutionary power of the medium, and as an early practitioner of online politics, I told it to describe what we did at MoveOn.org. But I’m increasingly convinced that we’ve got the ending wrong — perhaps dangerously wrong. There is a new group of gatekeepers in town, and this time, they’re not people, they’re code.
Today’s Internet giants — Google, Facebook, Yahoo and Microsoft — see the remarkable rise of available information as an opportunity. If they can provide services that sift through the data and supply us with the most personally relevant and appealing results, they’ll get the most users and the most ad views. As a result, they’re racing to offer personalized filters that show us the Internet that they think we want to see. These filters, in effect, control and limit the information that reaches our screens.
By now, we’re familiar with ads that follow us around online based on our recent clicks on commercial Web sites. But increasingly, and nearly invisibly, our searches for information are being personalized too. Two people who each search on Google for “Egypt” may get significantly different results, based on their past clicks. Both Yahoo News and Google News make adjustments to their home pages for each individual visitor. And just last month, this technology began making inroads on the Web sites of newspapers like The Washington Post and The New York Times.
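To make the mechanism concrete, here is a minimal sketch of that kind of click-history personalization. The topic tags, scoring rule and sample data are hypothetical illustrations, not any search engine's actual ranking logic:

```python
from collections import Counter

def personalized_rank(results, click_history):
    """Re-rank identical search results by a user's past clicks.

    results: list of (title, topics) pairs; click_history: topic strings
    from stories the user clicked before. Purely illustrative.
    """
    affinity = Counter(click_history)  # missing topics count as 0
    return sorted(results,
                  key=lambda r: sum(affinity[t] for t in r[1]),
                  reverse=True)

# Two people search for "Egypt" and get significantly different orderings.
results = [
    ("Protests continue in Tahrir Square", ["politics", "egypt"]),
    ("Nile cruise deals for spring travel", ["travel", "egypt"]),
]
print(personalized_rank(results, ["politics", "world", "politics"])[0][0])
print(personalized_rank(results, ["travel", "hotels", "travel"])[0][0])
```

Even this toy scorer reproduces the effect: the same query yields a different front page for each reader.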
All of this is fairly harmless when information about consumer products is filtered into and out of your personal universe. But when personalization affects not just what you buy but how you think, different issues arise. Democracy depends on the citizen’s ability to engage with multiple viewpoints; the Internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.
Like the old gatekeepers, the engineers who write the new gatekeeping code have enormous power to determine what we know about the world. But unlike the best of the old gatekeepers, they don’t see themselves as keepers of the public trust. There is no algorithmic equivalent to journalistic ethics.
Mark Zuckerberg, Facebook’s chief executive, once told colleagues that “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” At Facebook, “relevance” is virtually the sole criterion that determines what users see. Focusing on the most personally relevant news — the squirrel — is a great business strategy. But it leaves us staring at our front yard instead of reading about suffering, genocide and revolution.
There’s no going back to the old system of gatekeepers, nor should there be. But if algorithms are taking over the editing function and determining what we see, we need to make sure they weigh variables beyond a narrow “relevance.” They need to show us Afghanistan and Libya as well as Apple and Kanye.
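One way to picture that prescription: blend personal relevance with an independent measure of a story's importance, so the trade-off becomes an explicit, inspectable parameter rather than an implicit zero. In this minimal sketch, the `importance` field, the topic affinities and the 0.5 weight are all invented for illustration:

```python
def blended_score(item, user_affinity, importance_weight=0.5):
    """Score a story by personal relevance plus civic importance.

    `topics` and `importance` are hypothetical fields; the 0.5 weight is
    an arbitrary example of making the trade-off explicit.
    """
    relevance = sum(user_affinity.get(t, 0.0) for t in item["topics"])
    return (1 - importance_weight) * relevance + importance_weight * item["importance"]

feed = [
    {"title": "A squirrel dies in your front yard", "topics": ["local"], "importance": 0.05},
    {"title": "Revolution spreads in Libya", "topics": ["world"], "importance": 0.95},
]
affinity = {"local": 0.6, "world": 0.1}  # a reader who mostly clicks local stories
for story in sorted(feed, key=lambda s: blended_score(s, affinity), reverse=True):
    print(story["title"])  # Libya outranks the squirrel once importance counts
```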
Companies that make use of these algorithms must take this curatorial responsibility far more seriously than they have to date. They need to give us control over what we see — making it clear when they are personalizing, and allowing us to shape and adjust our own filters. We citizens need to uphold our end, too — developing the “filter literacy” needed to use these tools well and demanding content that broadens our horizons even when it’s uncomfortable.
It is in our collective interest to ensure that the Internet lives up to its potential as a revolutionary connective medium. This won’t happen if we’re all sealed off in our own personalized online worlds.