"In a time of universal deceit telling the truth is a revolutionary act." -George Orwell

Posts Tagged ‘Filter Bubbles’

Centralization and Sociopathology: Concentrated Power & Wealth Are Intrinsically Sociopathological

In Uncategorized on May 24, 2013 at 8:35 pm

Oldspeak: “It is no measure of health to be well adjusted to a profoundly sick society.” -Jiddu Krishnamurti

“We live in a time of unprecedented inequality, with extreme wealth & power concentration. Concurrently, our civilization is dominated by sociopathic organizations. The two most powerful, the transnational corporate network and governments worldwide, have combined to form an awesome and near-omniscient control grid, a corptalitarian oligarchical collective that is manipulating most humans on this planet. This collective has infected most of humanity with its worldview and belief systems, spawning billions of secondary sociopaths who act as gatekeepers, maintaining & defending the sociopathic systems around which society is organized. We’ve reached a point where these systems, rapidly metastasizing via globalization, are in direct conflict with the only system that matters: the ecosystem. If the ecosystem loses this conflict, we all lose. Localization is the key to saving our ecosystem. Which side are you on?”

By C.D. @ Oftwominds:

Concentrated power and wealth are intrinsically sociopathological.

 
I have long spoken of the dangers inherent to centralization of power and the extreme concentrations of wealth that centralization inevitably creates.
Longtime contributor C.D. recently highlighted another danger of centralization: sociopaths/psychopaths excel in organizations that centralize power, and their ability to flatter, browbeat and manipulate others greases their climb to the top.
In effect, centralization is tailor-made for sociopaths gaining power. Sociopaths seek power over others, and centralization gives them the perfect avenue to gaining control over millions of people or even entire nations.
Even worse (from the view of non-sociopaths), their perverse abilities are tailor-made for excelling in office and national politics via ruthless elimination of rivals and enemies and grandiose appeals to national greatness, ideological purity, etc.
As C.D. points out, the ultimate protection against sociopathology is to minimize the power held in any one agency, organization or institution:

After you watch these films on psychopaths, I think you’ll have an even greater understanding of why your premise of centralization is a key problem of our society. The first film points out that psychopaths generally thrive in the corporate/government top-down organization (I have seen it happen in my agency, unfortunately) and that when they come to power, their values (or lack thereof) tend to pervade the organization to varying degrees. In some cases, they end up creating secondary psychopaths which is kind of like a spiritual/moral disease that infects people.

If we are to believe the premise in the film that there are always psychopaths among us in small numbers, it follows then that we must limit the power of any one institution, whether it’s private or public, so that the damage created by psychopaths is limited.

It is very difficult for many people to fathom that there are people in our society that are that evil, for lack of a better term, and it is even harder for many people in society to accept that people in the higher strata of our society can exhibit these dangerous traits.

The same goes for criminal behavior. From my studies, it’s pretty clear that criminality is fairly constant throughout the different levels of our society and yet, it is the lower classes that are subjected to more scrutiny by law enforcement. The disparity between blue collar and white collar crime is pretty evident when one looks at arrests and sentencing. The total lack of effective enforcement against politically connected banks over the last few years is astounding to me and it sets a dangerous precedent. Corruption and psychopathy go hand in hand.

A less dark reason for avoiding over centralization is that we have to be aware of normal human fallibility. Nobody possesses enough information, experience, ability, lack of bias, etc. to always make the right decisions.

Defense Against the Psychopath (video, 37 minutes; the many photos of political, religious and secular leaders will likely offend many/most; if you look past these outrages, there is useful information here)
The Sociopath Next Door (video, 37 minutes)
As C.D. observes, once sociopaths rule an organization or nation, they create a zombie army of secondary sociopaths beneath them as those who resist are undermined, banished, fired or exterminated. If there is any lesson to be drawn from Iraq, it is how a single sociopath can completely undermine and destroy civil society by empowering secondary sociopaths and eliminating or marginalizing anyone who dares to cling to their humanity, conscience and independence.
“Going along to get along” breeds passive acceptance of sociopathology as “the new normal” and mimicry of the values and techniques of sociopathology as the ambitious and fearful (i.e. almost everyone) scramble to emulate the “successful” leadership.
Organizations can be perverted into institutionalizing sociopathology via sociopathological goals and rules of conduct. Make the metric of success in war a body count of dead “enemy combatants” and you’ll soon have dead civilians stacked like cordwood as proof of every unit’s outstanding success.
Make lowering unemployment the acme of policy success and soon every agency will be gaming and manipulating data to reach that metric of success. Make higher grades the metric of academic success and soon every kid is getting a gold star and an A or B.
Centralization has another dark side: those ensconced in highly concentrated centers of power (for example, The White House) are in another world, and they find it increasingly easy to become isolated from the larger context and to slip into reliance on sycophants, toadies (i.e. budding secondary sociopaths) and “experts” (i.e. apparatchiks and factotums) who are equally influenced by the intense “high” of concentrated power/wealth.
Increasingly out of touch with those outside the circle of power, those within the circle slide into a belief in the superiority of their knowledge, skills and awareness–the very definition of sociopathology.
Even worse (if that is possible), the incestuous nature of the tight circle of power breeds a uniformity of opinion and ideology that creates a feedback loop that marginalizes dissenters and those with open minds. Dissenters are soon dismissed as “not a team player” or trotted out for PR purposes, i.e. as evidence the administration maintains ties to the outside world.
Those few dissenters who resist the siren song of power soon face a choice: either quietly quit “to pursue other opportunities” (the easy way out) or quit in a blast of public refutation of the administration’s policies.
Public dissenters are quickly crucified by those in power, and knowing this fate awaits any dissenter places a powerful disincentive on “going public” about the sociopathology of the inner circle of power.
On rare occasions, an insider has the courage and talent to secure documentation that details the sociopathology of a policy, agency or administration (for example, Daniel Ellsberg and The Pentagon Papers).
Nothing infuriates a sociopath or a sociopathological organization more than the exposure of their sociopathology, and so those in power will stop at nothing to silence, discredit, criminalize or eliminate the heroic whistleblower.
In these ways, centralized power is itself a sociopathologizing force. We cannot understand the present devolution of our civil society, economy and ethics unless we understand that concentrated power and wealth are intrinsically sociopathological.
 
The solution: a culture of decentralization, transparency and open competition, what I call the DATA model (Decentralized, Adaptive, Transparent and Accountable) in my book Why Things Are Falling Apart and What We Can Do About It.

Facebook Censors Prominent Political Critics; Deactivated Accounts In Coordinated Purge

In Uncategorized on December 29, 2012 at 6:42 pm

[Image: Facebook Even Censors ART]

Oldspeak: “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum.” -Noam Chomsky  Facebook is the incarnation of Chomsky’s statement. This is hardly surprising. Meanwhile, the rays of sunlight between the U.S., with its inverted totalitarian kleptocracy, and overt, hardcore totalitarian regimes grow fainter and fainter. It is just as NSA whistleblower William Binney says as he holds his thumb and forefinger close together: “We are, like, that far from a turnkey totalitarian state.” Private, terms-governed electronic social networks masquerade as public, free, all-sharing democratic spaces. While we are constantly encouraged to share our feelings and everything else about ourselves on Facebook, only a very narrow range of reality-control-approved feelings and things is acceptable. All others are removed. This is the genius of this variant of Big Brother. It’s not something to fear or avoid. You love it. You tell it everything. You share everything gleefully. Where you are, who you’re with, when, why, how, how long, etc., etc., etc. It’s the perfect narcissism-cultivating surveillance tool. It is your friend. Big Brother is your friend. “Ignorance Is Strength”

Update: Facebook Yields to Pressure: Reactivates Political Critics’ Accounts

By Washington’s Blog:

We’ve previously documented that the largest social media websites censor government criticism.

For example, Facebook pays low-wage foreign workers to delete certain content based upon a censorship list; it deletes, for instance, accounts created by any Palestinian resistance group.

Today, Facebook deactivated the Facebook accounts of some of the leading American political critics.

For example, former diplomat and U.C. Berkeley Professor Emeritus Peter Dale Scott told us that his Facebook account was suddenly deactivated today without any justification.

So did Richard Gage, founder of Architects and Engineers for 9/11 Truth.

And Michael Rivero, owner of the popular website What Really Happened.

Infowars – one of the world’s most popular alternative media sites – confirms that accounts for the following political commentators have been shut down:

  • Kurt Nimmo, writer for Infowars.com and formerly Counterpunch
  • Aaron Dykes of Infowars
  • Jason from Infowars
  • Infowar Artist

Indeed, Facebook told an Infowars reporter last year not to post anything political:

Be careful making about making political statements on facebook … facebook is about building relationships not a platform for your political viewpoint. Don’t antagonize your base. Be careful and congnizat (sic) of what you are preaching.

And Infowars also confirms that the Facebook account for Natural News – one of the most popular alternative health sites – has been shut down.

Reports are that the Facebook accounts of a number of other political critics were suspended or deactivated today as well, including:

  • Robert M. Bowman, former director of the “Star Wars” defense program under President Ronald Reagan
  • Anthony J. Hilder, popular radio host
  • William Lewis
  • Wacboston
  • Michael Murphy
  • Mike Skuthan
  • Packy Savvenas
  • Sean Wright and Katherine Albrecht

 

“Internet Censorship Affects Everybody”: The Global Struggle For Online Freedom

In Uncategorized on January 18, 2012 at 4:24 pm

Oldspeak: “The reason why these issues are so important for ordinary Americans and really go beyond just sort of a nerdy, geeky technical issue is that in today’s society, we, as citizens, increasingly depend on internet services and platforms, mobile services and platforms, not only for our personal lives and our businesses and our jobs, but also for our political discourse and political activism, getting involved with politics. And so, it’s very important that people who are exercising power, whether they’re corporate or whether they’re government, that are exercising power over what we can see, over what we can access, over what we can publish and transmit through these digital spaces, need to be held accountable, and we need to make sure that power is not being abused in these digital spaces and platforms that we depend on. And so, that’s why this SOPA and PIPA legislation and the fight over it is so important, is who are you empowering to decide what people can and cannot see and do on the internet, and how do you make sure that that power is not going to be abused in ways that could have political consequences. And we’ve actually seen how existing copyright law has sometimes been abused by different actors who want to prevent critics from speaking out.” -Rebecca MacKinnon  Chinese-style internet censorship is coming to America. It may not happen now, but you can bet this won’t be the last effort to do so.

Related Stories:

Understand Today’s Internet Strike: SOPA, PIPA And A Free Internet

Wikipedia, Reddit to Shut Down Sites Wednesday to Protest Proposed Stop Online Piracy Act

Censorship, Capitalism & “Personalization” The Filter Bubble: What The Internet Is Hiding From You

Internet Censorship Bills Up For Vote Dec 5th – “Stop Online Piracy Act” & “Protect IP” Garner Enthusiastic Bi-Partisan Support In Congress

By Amy Goodman @ Democracy Now

AMY GOODMAN: We’re joined by Rebecca MacKinnon in Washington, D.C., author of Consent of the Networked: The Worldwide Struggle for Internet Freedom.

We welcome you to Democracy Now! Rebecca, the internet has been touted as such a tremendous liberating force. When we look at the events of this past year, the uprisings throughout the Middle East, part of the discussion of how that moment came is because of the internet, because of social media. And yet you talk about, more often than not, the internet is being used to spy on, to crack down on—spy on people, crack down on civil liberties. Talk about what you have found and how this relates to the legislation that we’re seeing now being developed in Washington.

REBECCA MacKINNON: Well, thanks very much, Amy, for having me on here today.

And just to connect my book to the issues that you were just discussing in the previous segment about the Protect IP Act and the Stop Online Piracy Act, I think the reason why this—these issues are so important for ordinary Americans and really go beyond just sort of a nerdy, geeky technical issue is that in today’s society, we, as citizens, increasingly depend on internet services and platforms, mobile services and platforms, not only for our personal lives and our businesses and our jobs, but also for our political discourse and political activism, getting involved with politics. And so, it’s very important that people who are exercising power, whether they’re corporate or whether they’re government, that are exercising power over what we can see, over what we can access, over what we can publish and transmit through these digital spaces, need to be held accountable, and we need to make sure that power is not being abused in these digital spaces and platforms that we depend on. And so, that’s why this SOPA and PIPA legislation and the fight over it is so important, is who are you empowering to decide what people can and cannot see and do on the internet, and how do you make sure that that power is not going to be abused in ways that could have political consequences. And we’ve actually seen how existing copyright law has sometimes been abused by different actors who want to prevent critics from speaking out.

But coming back to the Arab Spring, my book is not about whether the good guys or the bad guys are winning on the internet. The internet is empowering everybody. It’s empowering Democrats. It’s empowering dictators. It’s empowering criminals. It’s empowering people who are doing really wonderful and creative things. But the issue really is how do we ensure that the internet evolves in a manner that remains consistent with our democratic values and that continues to support people’s ability to use these technologies for dissent and political organizing. And while the internet was part of the story in the Arab Spring in terms of how people were able to organize, it’s not so clear to what extent it’s going to be part of the story in terms of building stable democracies in countries like Tunisia and Egypt, where the dictators did fall, let alone in a number of other countries.

In Tunisia, for instance, there is a big argument going on, now that they’ve had their set of democratic elections to the Constitutional Assembly, and they’re trying to write their constitution and figure out how to set up a new democracy. And Tunisia, under Ben Ali, was actually one of the most sophisticated Arab countries when it came to censoring and surveillance on the internet. And quite a number of the people who have been democratically elected in Tunisia are calling for a resumption of censorship and surveillance for national security reasons, to maintain public morals and public order. And there’s a huge debate going on about what is the role of censorship and surveillance in a democracy, and how do you make sure that power is not abused.

And they turn and look at the United States, they look at Europe, and censorship laws are proliferating around the democratic world. And there’s not sufficient discussion and consideration for how these laws are going to be abused. And we’ve seen, actually, in Europe, with a number of efforts to censor both copyright infringement as well as child pornography and so on, that a lot of this internet blocking that happens, even in democracies, oftentimes exercises mission creep, so things that weren’t originally intended to be blocked end up getting blocked when the systems are in place. It’s really difficult to make sure that the censorship does not spread beyond its original intent. It’s very hard to control. So, this is one of the issues.

It’s not that the internet isn’t empowering. It’s not that the internet can’t help the good guys—it certainly does. But we’re at a critical point, I think, in history, where the internet is not some force of nature. How it evolves and how it can be used and who it empowers really depends on all of us taking responsibility for making sure it evolves in a direction that’s compatible with democracy, and that it doesn’t empower the most powerful incumbent governments or the most powerful corporations to decide what we can and cannot see and do with our technology.

AMY GOODMAN: Rebecca MacKinnon, talk about the phenomenon, Control 2.0.

REBECCA MacKINNON: Right. So, Control 2.0 is what I refer to in terms of how authoritarian governments are evolving in the internet age. And so, one example I use is China. And China, in many ways, is exhibit A for how an authoritarian state survives the internet. And how do they do that? They have not cut off their population from the internet. In fact, the internet is expanding rapidly in China. They now have over 500 million internet users. And the Chinese government recognizes that being connected to the global internet is really important for its economy, for its education, for its culture, for innovation. Yet, at the same time, they have worked out a way to filter and censor the content overseas that they feel their citizens should not be accessing.

And what’s even more insidious, actually, is the way in which the state uses the private sector to conduct most of its censorship and surveillance. So, actually, what we know as the Great Firewall of China that blocks Twitter and Facebook, that’s only one part of Chinese internet censorship. Actually, most Chinese internet users are using Chinese-language websites that are run by Chinese companies based in China, and those companies are all held responsible for everything their users are doing. And so, they have to hire entire departments of people to monitor their users at the police’s behest and also to not just block, but delete content that the Chinese government believes infringes Chinese law. And, of course, when—in a country where crime is defined very broadly to include political and religious dissent, that involves a great deal of censorship. And it’s being conducted, to a great degree, not by government agents, but by private corporations who are complying with these demands in order to make a profit in China.
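To make this concrete, here is a minimal Python sketch of the kind of in-house keyword deletion MacKinnon describes platforms performing; the blocklist terms, sample posts, and function names are invented for illustration and do not come from any real platform’s moderation system.

```python
# A minimal sketch of keyword-based deletion (illustrative only; not any real
# platform's moderation code). Posts matching a government-mandated blocklist
# are removed before other users ever see them.

BLOCKLIST = {"tiananmen", "falun gong", "june 4"}  # hypothetical terms

def moderate(posts):
    """Split posts into those kept and those deleted because they match the blocklist."""
    kept, deleted = [], []
    for post in posts:
        text = post.lower()
        if any(term in text for term in BLOCKLIST):
            deleted.append(post)   # silently removed
        else:
            kept.append(post)
    return kept, deleted

kept, deleted = moderate(["Great restaurant tonight",
                          "Remembering June 4 with friends"])
print(kept, deleted)
```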

AMY GOODMAN: Rebecca, talk about specifics, like Facebook, Facebook—changes in Facebook features and privacy settings, exposing identities of protesters to police in Egypt, in Iran. Talk about Google. Talk about Apple removing politically controversial apps.

REBECCA MacKINNON: Right. So, for instance, with Facebook, Facebook has its own kind of type of governance, which is why I call private internet companies the “sovereigns of cyberspace.” And so, Facebook has a rule where it requires that its users need to use their real name, their real identity. And while some people violate that rule, that makes them vulnerable to having their account shut down if they are discovered. And so, the reason they do this is that they want people to be accountable for their speech and prevent bullying and so on. And that may make sense in the context of a Western democracy, assuming that you’re not vulnerable in your workplace or anything like that, which is even a question, but it means that you have to be—as an Egyptian activist or as an activist in Syria and so on, you’re more exposed, because you have to be on Facebook using your real name.

And actually, a group of prominent activists in Egypt who were using Facebook to organize an anti-torture movement were doing so, before the regime fell, under fake names, and actually, at a critical point where they were trying to organize a major protest, their Facebook group went down, because they were in violation of the terms of service. And they actually had to find somebody in the U.S. to take over their Facebook page so that they could continue to operate.

And you also have a lot of cases of people in Iran. There have been a number of reports of people being tortured for their Facebook passwords and so on. And the fact that Iranian users are, in most cases, using their real names makes them a great deal more vulnerable.

And as you know, here in the United States, Facebook recently was subject to a fine and had to reach a settlement with the Federal Trade Commission because of the changes in its privacy settings that had been sudden at the end of 2009. People had made assumptions about whether their friends could be seen or not publicly. Suddenly those settings changed, and it exposed a lot of people in ways that, in some cases, were very dangerous.

But also, let’s take some other companies and some of the issues that users face. Apple, in its App Store, it has different versions of its App Store in different parts of the world. And their Chinese App Store censors applications that the Chinese government believes to be controversial. So, for instance, the Dalai Lama app in the Apple Store is not available in China. But Apple employees are also making a lot of other judgments about what content is and isn’t appropriate, that goes according to standards that are much more narrow than our First Amendment rights. So, for instance, an American political cartoonist, Mark Fiore, had an app in which he was making fun of a range of politicians, including President Obama, and Apple App Store nannies decided to censor that app, because they considered it to be too controversial, even though that speech was clearly protected under the First Amendment. So you have companies making these judgments that go well beyond sort of our judicial and constitutional process.

You also have Amazon, for instance, dropping WikiLeaks, even though it had not been accused, let alone, convicted, of any crime, simply because a number of American politicians objected to WikiLeaks. And so, there is this issue of: are companies, in the way in which they operate their services, considering the free expression rights and privacy rights of their users sufficiently to ensure that we’re able to have robust dissent, that people can speak truth to power in a manner that may be making current government officials very, very uncomfortable, but which is clearly protected both under our Constitution and the Universal Declaration of Human Rights?

AMY GOODMAN: Rebecca—

REBECCA MacKINNON: Should we be expecting companies to push back a bit more?

AMY GOODMAN: I wanted to ask you about the newly released government documents that reveal the Department of Homeland Security hired the military contractor General Dynamics to monitor postings of U.S. citizens on dozens of websites. The sites monitored included Facebook and Twitter, as well as several news sites, including the New York Times, Wired and The Huffington Post. General Dynamics was asked to collect reports that dealt with government agencies, including CIA, FEMA, ICE. Your thoughts?

REBECCA MacKINNON: Well, this is exactly the kind of issue that we need to deal with in a democracy. Now, if they have been hired to monitor postings that citizens are putting on a public website, I think that’s a reminder that our public information is public and that it’s being mined and watched by all kinds of people. But it’s also an example of why privacy settings are so important and why—why it’s important that people should be able to be anonymous if they want to be on the internet, if they fear consequences or if they fear misuse of the way in which they’re carrying out political discussions that could be used against them in different ways.

And there’s also a real issue, I think, in the way in which our laws are evolving when it comes to government access to information stored on corporate servers, that is supposed to be private, that we are not intending to be seen in public, which is that, according to the PATRIOT Act and a range of other laws that have been passed in recent years, it’s much easier for government agencies to access your email, to access information about your postings on Twitter, even if they’re anonymous, than it is for government agents to come into your home and search your personal effects. To do that, they need a warrant. There is very clear restriction on the government’s ability to read your mail. Yet, according to current law, if your email is more than 180 days old, the government can access your email, if it’s stored on Gmail or Yahoo! or Hotmail, without any kind of warrant or court order. So, there’s a real erosion of our Fourth Amendment rights, really, to protection from unreasonable search and seizure. And this is going on, I think, to a great degree without a lot of people realizing the extent to which our privacy rights are being eroded.

AMY GOODMAN: Rebecca, we have 30 seconds, but the significance of Wednesday, of tomorrow, of Wikipedia and many other websites going dark in protest of the legislation here in the United States? What do you think is the most important issue people should take away from what’s happening and also from your book, Consent of the Networked?

REBECCA MacKINNON: Well, I think the action tomorrow really demonstrates that internet censorship affects everybody, it’s not just affecting people in China, that this is an issue that we all need to be concerned about, and it can happen in democracies as well as in dictatorships.

And the core message of my book is that if we want democracy to survive in the internet age, we really need to work to make sure that the internet evolves in a manner that is compatible with democracy, and that means exercising our power not only as consumers and internet users and investors, but also as voters, to make sure that our digital lives contain the same kind of protections of our rights that we expect in physical space.

AMY GOODMAN: Rebecca MacKinnon, I want to thank you very much for being with us, senior fellow at the New America Foundation, co-founder of Global Voices Online. Her new book is called Consent of the Networked: The Worldwide Struggle for Internet Freedom.

Censorship, Capitalism & “Personalization” The Filter Bubble: What The Internet Is Hiding From You

In Uncategorized on May 27, 2011 at 8:10 pm

Oldspeak: “WOW. So much for Net Neutrality. At least in Communist China, people are fully aware the internet and online social media are being censored.  :- |  Here in the land of the free U.S.A., internet censorship is practiced without your knowledge, in much more subtle, insidious, and invasive ways. Cyber gatekeepers like Google, Yahoo, Facebook and the other top 50 websites collect an average of 64 bits of personal information each time we visit and then custom-design their sites to conform to our perceived preferences. Marketed as sexy, convenient-sounding “Personalization,” the dominant search engines and social media sites that control much of what you see and read, in their voracious desire to generate ad revenue, actively edit out information that is contrary to what you are perceived to prefer or believe, based on data collected about your viewing habits. So a Google search for “Egypt” on your computer will be different from an identical search I make on my machine. You only see what you’re most likely to click on and thus generate revenue for them. Net Neutrality is functionally a thing of the past. The 21st-century “Ministry of Truth” is invisible, omnipotent and making obscene amounts of money from mining and manipulating your personal preferences and information. The internet, originally conceived as a tool for the free and unencumbered exchange of information and ideas from all points of view, has been privatized. The only ideas and information you’re likely to see are those much like your own. These conditions increase polarization, societal atomization, isolation, apathy, the gap between the public and private spheres and a general ignorance of the full world around us, while reducing actual interpersonal relations and face-to-face contact, social ties, and concern for a “greater good.” “Personalization” is nothing more than a cybernetic and irresistible tool meant to divide and conquer the people. Folks are far easier to control and manipulate when they’re physically disconnected and psychologically balkanized. Far worse, it makes people feel happy and excited to participate in their own enslavement to the modern-day gods of consumption and self-interest. ‘Ignorance is Strength’ and Profit is Paramount. Could the personal computer have morphed into the 21st-century version of the ‘Telescreen’?”

Related Video: Eli Pariser: Beware online “filter bubbles”

By Amy Goodman @ Democracy Now:

The internet is increasingly becoming an echo chamber in which websites tailor information according to the preferences they detect in each viewer. When some users search the word “Egypt,” they may get the latest news about the revolution; others might see only search results about Egyptian vacations. The top 50 websites collect an average of 64 bits of personal information each time we visit—and then custom-design their sites to conform to our perceived preferences. What impact will this online filtering have on the future of democracy? We speak to Eli Pariser, author of The Filter Bubble: What the Internet Is Hiding from You. “Take news about the war in Afghanistan. When you talk to people who run news websites, they’ll tell you stories about the war in Afghanistan don’t perform very well. They don’t get a lot of clicks. People don’t flock to them. And yet, this is arguably one of the most important issues facing the country,” says Pariser. “But it will never make it through these filters. And especially on Facebook this is a problem, because the way that information is transmitted on Facebook is with the ‘like’ button. And the ‘like’ button, it has a very particular valence. It’s easy to click ‘like’ on ‘I just ran a marathon’ or ‘I baked a really awesome cake.’ It’s very hard to click ‘like’ on ‘war in Afghanistan enters its 10th year.’”

Guest:

Eli Pariser, author of the new book, ‘The Filter Bubble: What the Internet Is Hiding from You’. He is also the board president and former executive director of MoveOn.org, which at five million members is one of the largest citizens’ organizations in American politics.

JUAN GONZALEZ: When you follow your friends on Facebook or run a search on Google, what information comes up, and what gets left out? That’s the subject of a new book by Eli Pariser called The Filter Bubble: What the Internet Is Hiding from You. According to Pariser, the internet is increasingly becoming an echo chamber in which websites tailor information according to the preferences they detect in each viewer. Yahoo! News tracks which articles we read. Zappos registers the type of shoes we prefer. And Netflix stores data on each movie we select.

AMY GOODMAN: The top 50 websites collect an average of 64 bits of personal information each time we visit and then custom-design their sites to conform to our perceived preferences. While these websites profit from tailoring their advertisements to specific visitors, users pay a big price for living in an information bubble outside of their control. Instead of gaining wide exposure to diverse information, we’re subjected to narrow online filters.

Eli Pariser is the author of The Filter Bubble: What the Internet Is Hiding from You. He is also the board president and former executive director of the group MoveOn.org. Eli joins us in the New York studio right now after a whirlwind tour through the United States.

Welcome, Eli.

ELI PARISER: Thanks for having me on.

AMY GOODMAN: So, this may surprise people. Two of us sitting here, me and Juan, if we went online, the two of us, and put into Google “Eli Pariser”—

ELI PARISER: Right.

AMY GOODMAN:—we actually might come up with a wholly different set of finds, a totally different set of links, of search results.

ELI PARISER: That’s right. I was surprised. I didn’t know that that was, you know, how it was working, until I stumbled across a little blog post on Google’s blog that said “personalized search for everyone.” And as it turns out, for the last several years, there is no standard Google. There’s no sort of “this is the link that is the best link.” It’s the best link for you. And the definition of what the best link for you is, is the thing that you’re the most likely to click. So, it’s not necessarily what you need to know; it’s what you want to know, what you’re most likely to click.

JUAN GONZALEZ: But isn’t that counter to the original thing that brought so many people to Google, that the algorithms that Google had developed really were reaching out to the best available information that was out there on the web?

ELI PARISER: Yeah. You know, if you look at how they talked about the original Google algorithm, they actually talked about it in these explicitly democratic terms, that the web was kind of voting—each page was voting on each other page in how credible it was. And this is really a departure from that. This is moving more toward, you know, something where each person can get very different results based on what they click on.

And when I did this recently with Egypt—I had two friends google “Egypt”—one person gets search results that are full of information about the protests there, about what’s going on politically; the other person, literally nothing about the protests, only sort of travel to see the Pyramids websites.

AMY GOODMAN: Now, wait, explain that again. I mean, that is astounding. So you go in. The uprising is happening in Egypt.

ELI PARISER: Right.

AMY GOODMAN: In fact, today there’s a mass protest in Tahrir Square. They’re protesting the military council and other issues. So, if I look, and someone who likes to travel look, they may not even see a reference to the uprising?

ELI PARISER: That’s right. I mean, there was nothing in the top 10 links. And, you know, actually, the way that people use Google, most people use just those top three links. So, if Google isn’t showing you sort of the information that you need to know pretty quickly, you can really miss it. And this isn’t just happening at Google; it’s happening all across the web, when I started looking into this. You know, it’s happening on most major websites, and increasingly on news websites. So, Yahoo! News does the exact same thing, tailoring what you see on Yahoo! News to which articles it thinks you might be interested in. And, you know, what’s concerning about this is that it’s really happening invisibly. You know, we don’t see this at work. You can’t tell how different the internet that you see is from the internet that anyone else sees is, but it’s getting increasingly different.
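As a rough illustration of what tailoring to “what you’re most likely to click” can look like in practice, here is a minimal Python sketch of click-history re-ranking; the topics, weights, and function names are hypothetical assumptions, not Google’s or Yahoo!’s actual algorithms.

```python
# A minimal sketch of personalized re-ranking: results whose topic the user has
# clicked before get boosted, so two users see the same query ranked differently.
from collections import Counter

def personalized_rank(results, click_history):
    """Re-order search results by how often the user clicked each result's topic.

    results       -- list of (title, topic) tuples returned for a query
    click_history -- list of topics the user has clicked in the past
    """
    topic_counts = Counter(click_history)

    def score(item):
        position, (title, topic) = item
        base = len(results) - position          # earlier = higher base relevance
        boost = topic_counts.get(topic, 0) * 2  # arbitrary illustrative weight
        return base + boost

    ranked = sorted(enumerate(results), key=score, reverse=True)
    return [title for _, (title, _) in ranked]

# Two users, same "egypt" query, different click histories, different front pages.
results = [("Protests continue in Tahrir Square", "news"),
           ("Top 10 Nile cruises", "travel"),
           ("Egypt's military council under pressure", "news"),
           ("Best time to visit the Pyramids", "travel")]

print(personalized_rank(results, ["news", "news", "politics"]))   # news first
print(personalized_rank(results, ["travel", "travel", "shopping"]))  # travel first
```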

JUAN GONZALEZ: Well, what about the responses of those who run these search engines, that they’re merely responding to the interests and needs of the people who use the system?

ELI PARISER: Well, you know, I think—they say, “We’re just giving people what we want.” And I say, “Well, what do you mean by ‘what we want’?” Because I think, actually, all of us want a lot of different things. And there’s a short-term sort of compulsive self that clicks on the celebrity gossip and the more trivial articles, and there’s a longer-term self that wants to be informed about the world and be a good citizen. And those things are in tension all the time. You know, we have those two forces inside us. And the best media helps us sort of—helps the long-term self get an edge a little bit. It gives us some sort of information vegetables and some information dessert, and you get a balanced information diet. This is like you’re just surrounded by empty calories, by information junk food.

AMY GOODMAN: Eli, talk about your experience going on your own Facebook page.

ELI PARISER: So, this was actually the starting point for looking into this phenomenon. And basically, after 2008 and after I had transitioned out of being the executive director of MoveOn, I went on this little campaign to meet and befriend people who thought differently from me. I really wanted to hear what conservatives were thinking about, what they were talking about, you know, and learn a few things. And so, I had added these people as Facebook friends. And I logged on one morning and noticed that they weren’t there. They had disappeared. And it was very mysterious. You know, where did they go? And as it turned out, Facebook was tracking my behavior on the site. It was looking at every click. It was looking at every, you know, Facebook “like.” And it was saying, “Well, Eli, you say that you’re interested in these people, but actually, we can tell you’re clicking more on the progressive links than on the conservative links, so we’re going to edit it out, edit these folks out.” And they disappeared. And this gets to some of the danger of this stuff, which is that, you know, we have—
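A minimal sketch, assuming a hypothetical interaction counter, of the kind of engagement-based feed filtering Pariser describes here; this is illustrative only and is not Facebook’s actual ranking code.

```python
# A minimal sketch of engagement-based feed filtering: posts from friends the
# user rarely clicks or "likes" are silently dropped from the feed.

def filter_feed(posts, interactions, min_interactions=1):
    """Keep only posts from friends the user engages with.

    posts        -- list of (friend, text) tuples
    interactions -- dict mapping friend -> count of recent clicks/likes on them
    """
    return [(friend, text) for friend, text in posts
            if interactions.get(friend, 0) >= min_interactions]

posts = [("progressive_pal", "New post on healthcare"),
         ("conservative_friend", "New post on tax policy")]
interactions = {"progressive_pal": 12, "conservative_friend": 0}

# The rarely clicked friend silently disappears from the feed.
print(filter_feed(posts, interactions))
```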

JUAN GONZALEZ: But Facebook edited out your friends?

ELI PARISER: Yeah, no. I really—you know, I miss them. And—

AMY GOODMAN: Your conservative friends.

ELI PARISER: My conservative friends, the friends that—you know, that I might—and what’s at play here is this thing called confirmation bias, which is basically our tendency to feel good about information that confirms what we already believe. And, you know, you can actually see this in the brain. People get a little dopamine hit when they’re told that they’re right, essentially. And so, you know, if you were able to construct an algorithm that could show people whatever you wanted, and if the only purpose was actually to get people to click more and to view more pages, why would you ever show them something that makes them feel uncomfortable, makes them feel like they may not be right, makes them feel like there’s more to the world than our own little narrow ideas?

JUAN GONZALEZ: And doesn’t that, in effect, reinforce polarization within the society, in terms of people not being exposed to and listening to the viewpoints of others that they may disagree with?

ELI PARISER: Right. I mean, you know, democracy really requires this idea of discourse, of people hearing different ideas and responding to them and thinking about them. And, you know, I come back to this famous Daniel Patrick Moynihan quote where he says, you know, “Everybody is entitled to their own opinions, but not their own facts.” It’s increasingly possible to live in an online world in which you do have your own facts. And you google “climate change,” and you get the climate change links for you, and you don’t actually get exposed necessarily—you don’t even know what the alternate arguments are.

JUAN GONZALEZ: Now, what about the implications for this, as all of these—especially Google, Yahoo!, developed their own news sites? What are the implications in terms of the news that they put out then and the news that people receive?

ELI PARISER: Well, this is where it gets even more worrisome, because when you’re just basically trying to get people to click things more and view more pages, there are a lot of things that just aren’t going to meet that threshold. So, you know, take news about the war in Afghanistan. When you talk to people who run news websites, they’ll tell you stories about the war in Afghanistan don’t perform very well. They don’t get a lot of clicks. People don’t flock to them. And yet, this is arguably one of the most important issues facing the country. We owe it to the people who are there, at the very least, to understand what’s going on. But it will never make it through these filters. And especially on Facebook this is a problem, because the way that information is transmitted on Facebook is with the “like” button. And the “like” button, it has a very particular valence. It’s easy to click “like” on, you know, “I just ran a marathon” or “I baked a really awesome cake.” It’s very hard to click “like” on, you know, “war in Afghanistan enters its sixth year”—or “10th year,” sorry. You know, so information that is likable gets transmitted; information that’s not likable falls out.

AMY GOODMAN: We’re talking to Eli Pariser, who has written the book The Filter Bubble: What the Internet Is Hiding from You. Now, Google knows not only what you’re asking to search, right? They know where you are. They know the kind of computer you’re using. Tell us how much information they’re gathering from us.

ELI PARISER: Well, it’s really striking. I mean, even if you’re not—if you’re logged in to Google, then Google obviously has access to all of your email, all of your documents that you’ve uploaded, a lot of information. But even if you’re logged out, an engineer told me that there are 57 signals that Google tracks—”signals” is sort of their word for variables that they look at—everything from your computer’s IP address—that’s basically its address on the internet—what kind of laptop you’re using or computer you’re using, what kind of software you’re using, even things like the font size or how long you’re hovering over a particular link. And they use that to develop a profile of you, a sense of what kind of person is this. And then they use that to tailor the information that they show you.
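The sketch below illustrates, with invented signal names and crude rules, how per-request signals of the sort Pariser mentions could be folded into a profile; the real “57 signals” are not public, so every field and inference here is an assumption.

```python
# A minimal sketch of turning per-request signals into a coarse user profile.
# Signal names, values, and inference rules are all hypothetical.

request_signals = {
    "ip_address": "203.0.113.7",                  # rough geolocation
    "user_agent": "Mozilla/5.0 (Macintosh; ...)",
    "platform": "mac",
    "font_size": 16,
    "avg_hover_ms": 420,                          # how long the cursor lingers on links
}

def build_profile(signals):
    """Derive coarse inferences from raw signals (illustrative rules only)."""
    profile = {}
    profile["device_class"] = "laptop" if signals["platform"] in ("mac", "windows") else "mobile"
    profile["likely_affluent"] = signals["platform"] == "mac"     # crude proxy
    profile["deliberate_reader"] = signals["avg_hover_ms"] > 300  # lingers on links
    profile["region"] = signals["ip_address"].rsplit(".", 1)[0]   # coarse network prefix
    return profile

print(build_profile(request_signals))
```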

And this is happening in a whole bunch of places, you know, not just sort of the main Google search, but also on Google News. And the plan for Google News is that once they sort of perfect this personalization algorithm, that they’re going to offer it to other news websites, so that all of that data can be brought to bear for any given news website, that it can tailor itself to you. You know, there are really important things that are going to fall out if those algorithms aren’t really good.

And what this raises is a sort of larger problem with how we tend to think about the internet, which is that we tend to think about the internet as this sort of medium where anybody can connect to anyone, it’s this very democratic medium, it’s a free-for-all, and it’s so much better than that old society with the gatekeepers that were controlling the flows of information. Really, that’s not how it’s panning out. And what we’re seeing is that a couple big companies are really—you know, most of the information is flowing through a couple big companies that are acting as the new gatekeepers. These algorithms do the same thing that the human editors do. They just do it much less visibly and with much less accountability.

JUAN GONZALEZ: And what are the options, the opt-out options, if there are any, for those who use, whether it’s Google or Yahoo! or Facebook? Their ability to control and keep their personal information?

ELI PARISER: Well, you know, there aren’t perfect opt-out options, because even if you take a new laptop out of the box, already it says something about you, that you bought a Mac and not a PC. I mean, it’s very hard to get entirely out of this. There’s no way to turn it off entirely at Google. But certainly, you can open a private browsing window. That helps.

I think, in the long run, you know, there’s sort of two things that need to happen here. One is, we need, ourselves, to understand better what’s happening, because it’s very dangerous when you have these kinds of filters operating and you don’t know what they’re ruling out that you’re not even seeing. That’s sort of a—that’s where people make bad decisions, is, you know, what Donald Rumsfeld called the “unknown unknowns,” right? And this creates a lot of unknown unknowns. You don’t know how your experience of the world is being edited.

But it’s also a matter of pushing these companies to sort of—you know, these companies say that they want to be good. “Don’t be evil” is Google’s motto. They want to change the world. I think we have to push them to sort of live up to their best values as companies and incorporate into these algorithms more than just this very narrow idea of what is important.

AMY GOODMAN: So, what are they saying, the leaders of Google, Facebook, Yahoo!? I mean, are you talking to them?

ELI PARISER: Well, I tried to. You know, I had a brief conversation with Larry Page, in which he said, “Well, I don’t think this is a very interesting problem.” And that was about that. But, you know, further down in Google, there are a bunch of people who are wrestling with this. I think the challenge is—I talked to one Facebook engineer who sort of summed it up quite well, and he said, “Look, what we love doing is sitting around and coming up with new clever ways of getting people to spend more minutes on Facebook, and we’re very good at that. And this is a much more complicated thing that you’re asking us to do, where you’re asking us to think about sort of our social responsibility and our civic responsibility, what kind of information is important. This is a much more complicated problem. We just want to do the easy stuff.” And, you know, I think that’s what’s sort of led us to this current place. I think there are also people who see the flipside of that and say this is one of the big, juicy problems in front of us, is how do we actually take the best of sort of 20th century editorial values and import them into these new systems that are deciding what people see and what people don’t see.

AMY GOODMAN: Talk about how much money is being made off of this. And I mean, just this neutral term of “personalization”—

ELI PARISER: Right.

AMY GOODMAN:—it sounds so benign. In fact, it sounds attractive.

ELI PARISER: It sounds great, yeah.

AMY GOODMAN: It’s geared and tailored for you. What could be better?

ELI PARISER: Right. And it does rely on the sense of a sort of cozy, familiar world online, where your favorite website greets you and goes, “Oh, hey, Eli, we’ve teed up all of these articles for you. Welcome.” It feels very good.

But, you know, what’s driving this is—you know, in some ways, this is the driving struggle on the internet right now between all of these different companies, to accumulate the biggest amounts of data on each of us. And Facebook has its strategy, which is basically ask people to tell Facebook about themselves. Google has its strategy, which is to watch your clicks. Microsoft and Yahoo! have their strategies. And all of this feeds into a database, which can then be used to do three things. It can target ads better, so you get better targeted ads, which honestly, I think, you know, sometimes is fine, if you know that it’s happening. It can target content, which I think is much more problematic. You start to get content that just reflects what it thinks you want to see. And the third thing is, it can make decisions about you.

So, one of the sort of more surprising findings in the book was that banks are beginning to look at people’s Facebook friends and their credit ratings in order to decide to whom to give—to offer credit. And this is based on this fact that, you know, if you look at the credit ratings of people, you can make predictions about the credit ratings of their friends. It’s very creepy, though, because really what you’re saying then is that it would be better not to be Facebook friends with people who have lower credit ratings. It’s not really the kind of society that we want to be building, particularly.
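As a toy illustration of the inference Pariser describes, here is a sketch that blends a population prior with the mean of a person’s friends’ credit scores; the weighting and numbers are invented for illustration and do not reflect any lender’s actual model.

```python
# A minimal sketch of predicting someone's credit score from their friends' scores.
# Purely illustrative; the 50/50 weighting and the 650 prior are arbitrary.

def predict_score_from_friends(friend_scores, prior=650):
    """Blend a population prior with the mean of the friends' known scores."""
    if not friend_scores:
        return prior
    friends_mean = sum(friend_scores) / len(friend_scores)
    return 0.5 * prior + 0.5 * friends_mean

print(predict_score_from_friends([720, 690, 710]))  # high-scoring friends -> higher estimate
print(predict_score_from_friends([560, 580]))       # low-scoring friends  -> lower estimate
```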

JUAN GONZALEZ: Well, even more frightening, obviously, is once all of this information, personal information, is gathered, it saves the government, in its ability to surveil its population, a lot of work, because basically the private companies can gather the information, and all the government has to do is issue the subpoena or make the call that “for national security, we need this information.” So, in essence, it doesn’t have to do the actual surveillance. It just has to be able to use it when it needs to.

ELI PARISER: There’s a funny Onion article that has the headline “CIA Rolls Out Very Successful New Facebook Program,” implying that the CIA started Facebook to gather data. And it’s funny, but there is sort of some truth there, which is that these companies do have these massive databases, and the protections that we have for our data that live on these servers are far—you know, far less protection than if it’s on your home computer. The FBI needs to do much less paperwork in order to ask Google for your data than it does to, you know, come into your home and look at your computer. And so, increasingly—so this is sort of the downside of cloud computing, is that it allows more and more of our data and everything that we do to be available to the government and, you know, for their purposes.

JUAN GONZALEZ: And not only in a democracy, but in an authoritarian state, as well.

ELI PARISER: That’s right. I mean, it’s a natural byproduct of consolidating so much of what we do online in a few big companies that really don’t have a whole lot of accountability, you know, that aren’t being pushed very hard by governments to do this right or do it responsibly. It will naturally lead to abuses.

AMY GOODMAN: Google Inc. announced yesterday that they have launched a bid to dominate a world in which the smartphone replaces the wallet as the container for credit cards, coupons and receipts. The mobile app is called Google Wallet. How does this fit into this picture?

ELI PARISER: Well, it’s just another—I mean, the way that Google thinks is, how can we design products that people will use that allow us to accumulate even more data about them? So, obviously, once you start to have a sense of everything that people are buying flowing through Google’s servers, then you have way more data on which to target ads and target content and do this kind of personalization. You know exactly how to slice and dice people. And again, you know, in some contexts, that’s fine, actually. I don’t mind when I go on Amazon, and it recommends books. They’re obviously not very good recommendations sometimes, but it’s fine. But when it’s happening invisibly and when it’s shaping not just what you buy but what you know about the world, I think, you know, is more of a problem. And if this is going to be sort of the way that the future of the internet looks, then we need to make sure that it’s much more transparent when this is happening, so that we know when things are being targeted to us. And we have to make sure that we have some control as consumers over this, that it’s not just in the hands of these big companies that have very different interests.

AMY GOODMAN: So, you have a powerful force, Eli Pariser. You were the head of MoveOn.org. Now you’re what? The chair of the board—

ELI PARISER: I’m on the board, yeah.

AMY GOODMAN:—of MoveOn.org. So, this, MoveOn, has millions of people it reaches all over the country. What will MoveOn do about this?

ELI PARISER: Well, you know, there’s sort of this dance here, because basically MoveOn takes on the issues that its members want to take up. So I’ve been very—you know, I don’t want to sort of impose by fiat that I wrote a book, and here’s—now we’re going to campaign about this. But, you know, there are campaigns that we’re starting to look at. One of them, I think, that’s very simple but actually would go a significant way is just to, you know, have a basic—have a way of signaling on Facebook that something is important, even if it’s not likable. Obviously this is sort of just one small piece, but actually, if you did have an “important” button, you would start having a lot of different information propagating across Facebook. You’d have people exposed to things that maybe aren’t as smile-inducing, but we really need to know. And Facebook is actually considering adding some new verbs. So, this could be a winnable thing. It’s not—it won’t solve the whole problem, but it would start to indicate—it would start to remind these companies that there are ways that they can start to build in, you know, some more kind of civic values into what they’re doing.

JUAN GONZALEZ: And any sense that in Congress any of the politicians are paying attention to some of these issues?

AMY GOODMAN: Or understand this?

ELI PARISER: Yeah, there are a few that have been really attentive to this. Al Franken, in particular, has been very good on these data and privacy issues and really pushing forward. It’s obviously challenging because a lot of the Democratic congressmen and women are—get a lot of money from these companies, Silicon Valley. You know, certainly the Obama administration and Obama got a lot of support from Silicon Valley. So, they don’t totally want to get on the wrong side of these companies. And they feel like the companies are on the side of good and on the side of sort of pushing the world in the direction that they want it to. It means that we don’t have as good congressional watchdogs as you would hope, but there are a few good ones. And Franken, in particular, has been great on this.

AMY GOODMAN: Well, Eli Pariser, I want to thank you for your work and for writing The Filter Bubble: What the Internet Is Hiding from You, board president and former executive director of MoveOn.org, which at five million members is one of the largest citizens’ organizations in American politics. This is Democracy Now!, democracynow.org, The War and Peace Report. Back in a minute.

ELI PARISER: Thank you.

 
