“The Filter Bubble” – a tech problem?
This morning, I woke up, reached for my phone, and checked my Twitter and Facebook feeds. Somehow, this has become my morning routine since acquiring a smartphone– I’ve always loved reading in bed, and the device makes it so easy to curl up under the covers while I link up to the world beyond my apartment. I scanned my self-curated Twitter lists and Facebook newsfeed, finding a mix of news about tonight’s presidential debate, food porn, and several links to WKBT News 8 anchor Jennifer Livingston’s response to an email from a man bullying her about her weight.
Welcome to my filter bubble.
In his popular TED talk, Eli Pariser introduces the concept: a unique, personalized information world built around each of us. He describes what he sees as a major shift in how information flows– thanks to personalization on the web, he argues, we aren’t necessarily seeing what we need to see. And that’s a problem.
In days of yore, Pariser reflects, we had gatekeepers: the almighty newspaper editors who oversaw the heights of American journalistic excellence. Then came the interwebs, sweeping the gatekeepers out of the way– except, not really. The gatekeepers, as it turns out, are still there, only now they are algorithms with no ethics embedded in them. Companies like Facebook and Google haven’t coded civic responsibility into how they deliver information to users. And it isn’t just that we don’t see what we need to see: we don’t decide what gets in, and we don’t see what gets edited out.
I sympathize with Pariser. I, too, am concerned about not seeing what I need to see. Which is…well, I’m not sure, exactly. I guess, to quote Justice Stewart, “I know it when I see it.”
This is, in my opinion, one of the core problems with Pariser’s argument, one also raised by Jonathan Stray at the Nieman Journalism Lab. How, exactly, do you code diversity into an algorithm? Who writes the code? With what criteria? Pariser places the responsibility for ensuring a balanced information diet squarely on the shoulders of the technology platforms. For lack of a better metaphor, this, to me, is akin to making McDonald’s responsible for how I eat.
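Just to make the question concrete, here is a toy sketch (in Python, entirely invented, nobody’s actual feed algorithm) of one naive way a platform might “code diversity in”: re-rank a feed so the same source doesn’t dominate consecutive slots. Even this trivial version immediately runs into Pariser’s unanswered questions: who picks the grouping key, and how much variety counts as balanced?

```python
# A toy illustration of "coding diversity into an algorithm": greedily
# re-order a feed so the same source doesn't appear twice in a row when
# avoidable. The items and sources below are made up for illustration.

def diversify(items, key):
    """Re-order items so consecutive entries differ on `key` (e.g. source)."""
    remaining = list(items)
    result = []
    while remaining:
        last = key(result[-1]) if result else None
        # Prefer the highest-ranked remaining item from a different source;
        # fall back to the top item if every remaining item matches.
        pick = next((it for it in remaining if key(it) != last), remaining[0])
        remaining.remove(pick)
        result.append(pick)
    return result

feed = [
    {"title": "Debate preview", "source": "CNN"},
    {"title": "Debate live blog", "source": "CNN"},
    {"title": "Recipe: autumn soup", "source": "FoodBlog"},
    {"title": "Jennifer Livingston responds", "source": "WKBT"},
]

for item in diversify(feed, key=lambda it: it["source"]):
    print(item["source"], "-", item["title"])
```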
Okay, so maybe Pariser is right that without the ethical gatekeepers, we are at a great information disadvantage. But again, I question his assessment of pre-interwebs information flows. As Stray asks, were traditional news media really that good at diversity to begin with? Putting aside for a second any normative evaluation of the quality of traditional news media, we should also consider the empirical evidence of institutionalized inequality of global news flows offered by Mark D. Alleyne. I think it’s unrealistic and impractical to expect technology to solve these issues.
So, where does that leave us? Well, I for one am encouraged by the fact that human ingenuity seems to prevail regardless of structural imbalances. Stray hails the advent of curation as a new form or extension of journalism, highlighting folks like Andy Carvin who make it their business to drink from the firehose and deliver information to readers in a transparently digested fashion. I also love Stray’s suggestion to not just filter, but also map the web— using tools like social network analysis, we can create information network maps that are easier to navigate than, say, Amazon’s list of all books on U.S. politics.
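For the curious, here’s what a first pass at that kind of mapping might look like in code. This is a minimal sketch, assuming Python with the networkx library; the outlets and link structure are invented purely for illustration, not drawn from Stray’s or anyone else’s actual data.

```python
# A minimal sketch of "mapping" an information network with social network
# analysis. Assumes Python 3 with networkx installed (pip install networkx).
# The outlets and links below are invented for illustration only.
import networkx as nx

# Each edge means "source A linked to source B" in some sample of articles.
links = [
    ("BlogA", "NYTimes"),
    ("BlogA", "BlogB"),
    ("BlogB", "NYTimes"),
    ("BlogC", "BlogB"),
    ("NYTimes", "Reuters"),
]

graph = nx.DiGraph(links)

# Centrality scores give a rough map: which sources sit at the hubs of this
# little information network, and which sit at the periphery.
centrality = nx.pagerank(graph)
for source, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{source}: {score:.3f}")
```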
Ultimately, I think Pariser neglects an element of personal responsibility in determining how our brains are fed. If anything, I hope to see a push towards holding educational institutions accountable for teaching media literacy and equipping all of us with the tools necessary to manage our new information environment.