Eli Pariser discusses "The Filter Bubble: What the Internet Is Hiding from You" (1 of 2)
This is Democracy Now!, democracynow.org, the War and Peace Report.
I'm Amy Goodman, with Juan Gonzalez.
When you follow your friends on Facebook or run a search on Google,
what information comes up, and what gets left out?
This is the subject of a book by Eli Pariser called
"The Filter Bubble: What the Internet Is Hiding from You."
According to Pariser the Internet is increasingly becoming an echo chamber
in which websites tailor information
according to the preferences they detect in each viewer
Yahoo News tracks which articles we read,
Zappos registers the type of shoes we prefer,
and Netflix stores data on each movie we select.
The top 50 websites
collect an average of 64 bits of personal information each time we visit,
and then custom-design their sites
to conform to our perceived preferences.
While these websites profit from tailoring
their advertisements to specific visitors, users pay a big price
for living in an information bubble outside of their control.
Instead of gaining wide exposure to diverse information
we're subjected to narrow, online filters.
Eli Pariser is the author of "The Filter Bubble:
What the Internet is Hiding from you."
He is also the board president and former executive director
of the group MoveOn.org.
Eli joins us in the New York studio right now
after a whirlwind tour through the United States.
Welcome Eli.
Thanks for having me on.
So, this may surprise people.
Two of us sitting here, me and Juan,
if we went online, the two of us,
and put into Google "Eli Pariser" -
Right.
We actually might come up
with a totally different set of links,
of search results.
That's right, I was surprised. I didn't know that was
how it was working, until I stumbled across
a little blog post on Google's blog
that said 'Personalized search for everyone',
and as it turns out for the last several years
you know, there is no standard Google.
There's no
sort of 'this is the link that is the best link.'
It's the best link for you,
and the definition of what the best link for you is,
is the thing you're most likely to click.
So it's not necessarily what you need to know;
it's what you want to know, what you are most likely to click.
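The idea Pariser describes, ranking results by what a given user is most likely to click rather than by any one "best" answer, can be illustrated with a toy sketch. This is purely hypothetical: the scoring rule (counting topic overlap with past clicks) and the example data are made-up stand-ins, not Google's actual algorithm.

```python
# Hypothetical sketch of click-likelihood personalization.
# The scoring heuristic and data are illustrative assumptions only.

def rank_for_user(results, clicked_topics):
    """Reorder results so links resembling past clicks come first."""
    def predicted_click_score(result):
        # Naive proxy for "most likely to click":
        # count topic overlap with topics the user clicked before.
        return sum(1 for topic in result["topics"] if topic in clicked_topics)
    return sorted(results, key=predicted_click_score, reverse=True)

# Two hypothetical results for the same query, "Egypt".
results = [
    {"url": "egypt-protests.example", "topics": {"politics", "protest"}},
    {"url": "egypt-pyramids.example", "topics": {"travel", "pyramids"}},
]

# Two users issuing the identical query see different orderings.
politics_reader = {"politics", "news"}
traveler = {"travel", "vacation"}

print([r["url"] for r in rank_for_user(results, politics_reader)])
print([r["url"] for r in rank_for_user(results, traveler)])
```

With the politics reader's history the protest link ranks first; with the traveler's history the pyramids link does, mirroring the Egypt anecdote that follows.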
But isn't that counter to the original thing
that brought so many people to Google:
that the algorithms that Google had developed
were really reaching out to the
best available information that was out there on the web?
Yeah, you know, if you look at how they talked about
the original Google algorithm
they actually talked about it in these explicitly democratic terms.
That the web was kind of voting,
each page was voting on each other page and how credible it was.
And this is really a departure from that.
This is moving more toward
something where each person can get very different results
based on what they click on.
I did this recently with Egypt.
I had two friends Google "Egypt":
one person gets
search results that are full of information about the protests there,
about what's going on politically.
The other person gets
literally nothing about the protests,
only sort of 'travel to see the pyramids' websites.
Now, wait. Explain that again.
I mean that is astounding.
So you go on; the uprising is happening in Egypt.
In fact, today there is a mass protest in Tahrir Square;
they are protesting the military council and other issues.
So, if I look, and somebody who likes to travel looks,
they may not even see a reference to the uprising?
That's right. I mean there was nothing in the top ten links,
and actually the way people use Google,
most people use just those top three links.
So,
if Google isn't showing you sort of the information you need to know
pretty quickly, you can really miss it.
And this isn't just happening at Google; it's happening all across the web.
When I started looking into this,
you know, it's happening on most major websites
and increasingly on news websites.
So, Yahoo News does the exact same thing,
tailoring what you see on Yahoo News
to which articles it thinks you might be interested in.
And what's concerning about this is that it's really happening invisibly.
You know, we don't see this at work;
you can't tell how different the Internet you see is
from the Internet anyone else sees,
but it's getting increasingly different.
What about the
responses of those who run these search engines,
that they are really responding to the
interests and needs of the people who use the system?
Well, you know, I think they say,
'We are just giving people what we want,'
and I say, 'Well, what do you mean by what we want?'
Because I think actually all of us want a lot of different things.
and there is a short-term, sort of compulsive self
that clicks on the celebrity gossip and the, you know,
more trivial articles
and there is a longer-term self that wants to be informed about the world
and be a good citizen.
And those things are in tension all the time.
They're,
you know, we have those two forces inside us,
and the best media helps us,
sort of, helps the long-term self get an edge a little bit.
It gives us some information vegetables and some information dessert,
and you've got to balance the information diet.
This is just like being surrounded by empty calories -
by information junk food.
Eli, talk about your experience going on your Facebook page
So, this was actually the starting point for
looking into this phenomenon
and basically,
after 2008, after I had transitioned from being the executive director of MoveOn,
I went on this little campaign to
meet and befriend people who thought differently from me
I really wanted to hear what conservatives were thinking about,
what they were talking about,
You know, and learn a few things.
So I had added these people as Facebook friends
and I logged on one morning and noticed that they weren't there.
They had disappeared.
And it was very mysterious, you know, where did they go?
And as it turned out, Facebook was tracking my behavior on the site.
It was looking at every click; it was looking at every Facebook 'Like.'
And it was saying, 'Well, Eli, you say that you are interested in these people,
but actually we can tell you are clicking more
on the progressive links than on the conservative links,
so we are going to edit these folks out.'
And they disappeared,
and this gets to some of the danger of this stuff,
which is that
Facebook edited out your friends?
Yeah, no, I really...
I miss them,
and
Your conservative friends?
My conservative friends, —
the friends that, you know, that I might —
and the thing at play here is,
there is this thing called confirmation bias, which is basically
our tendency to
feel good about information that confirms what we already believe.
And you can also see this in the brain:
people get a little dopamine hit
when they are told that they are right, essentially.
And so,
if you were able to construct an algorithm that
could show people whatever you wanted,
and if the only purpose was actually to get people to click more
and to view more pages,
why would you ever show them something that,
you know, makes them feel uncomfortable,
makes them feel like they may not be right,
makes them feel like
there is more to the world than their own little narrow ideas?
Doesn't that in effect reinforce polarization within the society,
in terms of people not being exposed to
and listening to the viewpoints of others that they may disagree with?
Right, I mean, you know,
democracy really requires this idea of discourse
of people hearing different ideas and responding to them
and thinking about them,
and, I come back to this famous
Daniel Patrick Moynihan quote where he says
that everyone is entitled to their own opinion,
but not to their own facts.
It's increasingly possible to live in an online world in which
you do have your own facts:
you Google 'climate change' and you get the climate change links
for you,
and, you know, you don't actually
get exposed necessarily, and you don't even know
what the alternate arguments are.
Now, what are the implications of this as all of these companies,
especially Yahoo and Google, develop their own news sites?
What are the implications in terms of the news that they put out
and the news that people receive?
Well, this is where it gets even more worrisome because
you know, when you are just
basically trying to get people to click things more
and view more pages
there are a lot of things that just aren't going to meet that threshold.
So, take news about the War in Afghanistan.
When you talk to people who run news websites, they'll tell you
stories about the War in Afghanistan
don't perform very well; they don't get a lot of clicks,
people don't flock to them.
And yet this is arguably one of the most important
issues facing the country.
We owe it to the people who are there, at the very least,
to understand what's going on.
But it'll never make it through
these filters.
And especially on Facebook this is a problem, because the
way that information is transmitted on Facebook is with the 'Like' button,
and the 'Like' button has a very particular valence: it's easy to
click 'Like' on,
you know, 'I just ran a marathon' or 'I baked a really awesome cake.'
It's very hard to click 'Like' on,
you know, 'War in Afghanistan enters its tenth year.'
So, information that is likable gets transmitted,
information that is not likable falls out.
We're talking to Eli Pariser who has written a book
"The Filter Bubble: What the Internet is Hiding from you."
Now, Google knows not only what you're asking it to search, right?
They know where you are,
they know the kind of computer you're using.
Tell us how much information they are gathering from us
Well, it's really striking.
If you're logged into Google, then Google obviously has access to all your email,
all your documents that you've uploaded, a lot of information.
But even if you're logged out,
an engineer told me that there are 57 signals that Google tracks.
'Signals' is sort of their word for the variables that they look at:
everything from your computer's IP address,
which is basically its address on the Internet,
to what kind of laptop or computer you are using,
what kind of software you're using,
even things like the font size
or how long you're hovering over a particular link.
And they use that to develop a profile of you,
a sense of what kind of person is this,
and then they use that to tailor the information they show you.
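The signals-to-profile-to-tailoring pipeline just described can be sketched as a toy function. Everything here is an illustrative assumption: the inference rules (Mac ownership, large fonts, hover times) are crude made-up guesses standing in for whatever Google's real 57 signals and models do.

```python
# Hypothetical sketch: raw browser "signals" -> inferred user profile.
# Every rule below is an invented, deliberately crude guess for
# illustration; it does not reflect Google's actual variables or models.

def build_profile(signals):
    """Infer coarse traits from raw signals to tailor content with."""
    profile = {}
    if signals.get("device", "").startswith("Mac"):
        # Crude stereotype used only to show the kind of inference made.
        profile["likely_income"] = "higher"
    if signals.get("font_size", 12) >= 16:
        # Large default fonts taken as a (made-up) age proxy.
        profile["likely_age_group"] = "older"
    hovers = signals.get("hover_seconds", {})
    if hovers:
        # The link category hovered over longest becomes the top interest.
        profile["interest"] = max(hovers, key=hovers.get)
    return profile

signals = {
    "ip": "203.0.113.7",          # address on the Internet
    "device": "MacBook",          # kind of computer
    "font_size": 16,              # things like the font size
    "hover_seconds": {"politics": 4.2, "sports": 0.8},  # hover time per link
}
print(build_profile(signals))
```

A site holding such a profile can then reorder or filter what it shows, which is the tailoring step the interview turns to next.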
And this is happening in a whole bunch of places,
you know, not just sort of the main Google search,
but also on Google News.
And the plan for Google News is that once they sort of perfect this
personalization algorithm,
they are going to offer it to other news websites,
so that all that data can be brought to bear
for any given news website,
that it can tailor itself to you.
You know, there are really important things that are going to fall out
if those algorithms aren't really good,
and what this raises is this sort of larger
problem with how we tend to think about the Internet,
which is that
we tend to think about the Internet as
this sort of medium where anybody can connect to anyone.
It's this very democratic medium, a free-for-all,
and it's so much better than that old society
with the gatekeepers that were
controlling the flows of information.
Really, that's not how it's panning out.
What we are seeing is that
most of the information is flowing through a couple of big companies
that are acting as the new gatekeepers.
These algorithms do the same thing that the human editors do;
they just do it much less visibly and with much less accountability.
What are the options, if there are any, for those who use,
whether it's Google, Yahoo, or Facebook,
to control and keep their personal information?
Well, you know, there aren't perfect opt-out options, because even if you
take a new laptop out of the box,
already it says something about you:
that you bought a Mac and not a PC.
I mean, it's very hard to get entirely out of this.
There's no way to turn it off entirely at Google,
but certainly you can open a private browsing window.
That helps.
I think in the long run there are sort of two things that need to happen here.
One is, you know, we need,
ourselves, to understand better what's happening,
because it's very dangerous when you have these filters operating
and you don't know what they are ruling out that you're not even seeing.
That's where people make bad decisions,
you know, what Donald Rumsfeld called the 'unknown unknowns,' right?
And this creates a lot of 'unknown unknowns';
you don't know how your experience of the world is being edited.
But it's also a matter of
pushing these companies to, sort of,
you know... these companies say that they want to be
good ('Don't be evil' is Google's motto);
they want to change the world.
I think we have to push them to, sort of, live up to their best values
as companies and incorporate into these algorithms
more than just this very narrow idea of what...