"In a time of universal deceit telling the truth is a revolutionary act." -George Orwell

Posts Tagged ‘Hierarchical Society’

Why Life in America Can Literally Drive You Insane

In Uncategorized on August 16, 2013 at 5:58 pm

Oldspeak: “Can you imagine the uproar, the national panic, if 1 in every 76 Americans had cancer? Or was a victim of gun violence? Or if 1 in 76 Americans had AIDS? Why then is it acceptable that 1 in 76 Americans is suffering from severe and disabling mental illness? Why is it acceptable that there has been a 35-FOLD INCREASE in mental illness among children?! Pathology has been normalized. Sociopathy is a key trait in the dominant institutions of our civilization (corporations) and, by extension, the people who work within them. 70% of Americans hate their jobs. Lack of empathy and compassion is seen as normal. You see it every day walking down the street: people looking with disdain upon, or just plain actively ignoring, the steadily growing number of homeless and mentally ill who populate our streets. Elder abuse is institutionalized at a time when our population is the greyest it’s ever been. Youth are increasingly disposable, with children serving as fodder for the burgeoning prison-industrial complex; pumped full of powerful, toxic, illness-inducing pills whose long-term health effects no one understands, while pharmaceutical drug dealers’ profit margins explode. Whole cities are failing while the banks and other corporations that greatly contributed to their failure are given unlimited resources. Something is terribly, terribly wrong. Our society is making us crazy. Destroying our planet. How much longer will we go on without having a serious discussion about restructuring our civilization in a way that is healthy, beneficial and sustainable for all?” –OSJ

By Bruce E. Levine @ AlterNet:

In “The Epidemic of Mental Illness: Why [3]?” (New York Review of Books, 2011), Marcia Angell, former editor-in-chief of the New England Journal of Medicine, discusses over-diagnosis of psychiatric disorders, pathologizing of normal behaviors, Big Pharma corruption of psychiatry, and the adverse effects of psychiatric medications. While diagnostic expansionism and Big Pharma certainly deserve a large share of the blame for this epidemic, there is another reason.

A June 2013 Gallup poll [4] revealed that 70% of Americans hate their jobs or have “checked out” of them. Life may or may not suck any more than it did a generation ago, but our belief in “progress” has increased expectations that life should be more satisfying, resulting in mass disappointment. For many of us, society has become increasingly alienating, isolating and insane, and earning a buck means more degrees, compliance, ass-kissing, shit-eating, and inauthenticity. So, we want to rebel. However, many of us feel hopeless about the possibility of either our own escape from societal oppression or that political activism can create societal change. So, many of us, especially young Americans, rebel by what is commonly called mental illness.

While historically some Americans have consciously faked mental illness to rebel from oppressive societal demands (e.g., a young Malcolm X acted crazy to successfully avoid military service), today, the vast majority of Americans who are diagnosed and treated for mental illness are in no way proud malingerers in the fashion of Malcolm X. Many of us, sadly, are ashamed of our inefficiency and nonproductivity and desperately try to fit in. However, try as we might to pay attention, adapt, adjust, and comply with our alienating jobs, boring schools, and sterile society, our humanity gets in the way, and we become anxious, depressed and dysfunctional.

The Mental Illness Epidemic

Severe, disabling mental illness has dramatically increased in the United States. Marcia Angell, in her 2011 New York Review of Books piece, summarizes [3]: “The tally of those who are so disabled by mental disorders that they qualify for Supplemental Security Income (SSI) or Social Security Disability Insurance (SSDI) increased nearly two and a half times between 1987 and 2007—from 1 in 184 Americans to 1 in 76. For children, the rise is even more startling—a thirty-five-fold increase in the same two decades.”

Angell also reports that a large survey of adults conducted between 2001 and 2003 sponsored by the National Institute of Mental Health found that at some point in their lives, 46% of Americans met the criteria established by the American Psychiatric Association for at least one mental illness.

In 1998, Martin Seligman, then president of the American Psychological Association, spoke [5] to the National Press Club about an American depression epidemic: “We discovered two astonishing things about the rate of depression across the century. The first was there is now between ten and twenty times as much of it as there was fifty years ago. And the second is that it has become a young person’s problem. When I first started working in depression thirty years ago. . . the average age at which the first onset of depression occurred was 29.5. . . . Now the average age is between 14 and 15.”

In 2011, the U.S. Centers for Disease Control and Prevention [6] (CDC) reported that antidepressant use in the United States has increased nearly 400% in the last two decades, making antidepressants the most frequently used class of medications by Americans ages 18-44 years. By 2008, 23% of women ages 40–59 years were taking antidepressants.

The CDC, on May 3, 2013, reported [7] that the suicide rate among Americans ages 35–64 years increased 28.4% between 1999 and 2010 (from 13.7 suicides per 100,000 population in 1999 to 17.6 per 100,000 in 2010).

The New York Times [8] reported in 2007 that the number of American children and adolescents treated for bipolar disorder had increased 40-fold between 1994 and 2003. In May 2013, the CDC reported in “Mental Health Surveillance Among Children—United States, 2005–2011 [9],” the following: “A total of 13%–20% of children living in the United States experience a mental disorder in a given year, and surveillance during 1994–2011 has shown the prevalence of these conditions to be increasing.”

Over-Diagnosis, Pathologizing the Normal and Psychiatric Drug Adverse Effects

Even within mainstream psychiatry, few continue to argue that the increase in mental illness is due to previous under-diagnosis of mental disorders. The most common explanations for the mental illness epidemic include recent over-diagnosis of psychiatric disorders, diagnostic expansionism, and psychiatry’s pathologizing of normal behavior.

The first DSM (Diagnostic and Statistical Manual of Mental Disorders), psychiatry’s diagnostic bible, was published by the American Psychiatric Association in 1952 and listed 106 disorders (initially called “reactions”). DSM-2 was published in 1968, and the number of disorders increased to 182. DSM-3 was published in 1980, and though homosexuality was dropped from it, diagnoses were expanded to 265, with several child disorders added that would soon become popular, including oppositional defiant disorder (ODD). DSM-4, published in 1994, contained 365 diagnoses.

DSM-5 was published in May 2013. The journal PLOS Medicine reported [10] in 2012, “69% of the DSM-5 task force members report having ties to the pharmaceutical industry.” DSM-5 did not add as many new diagnoses [11] as had previous revisions. However, DSM-5 has been criticized even by some mainstream psychiatrists, such as Allen Frances, the former chair of the DSM-4 taskforce, for creating more mental patients by making it easier to qualify for a mental illness, especially for depression. (See Frances’ “Last Plea To DSM-5: Save Grief From the Drug Companies [12].”)

In the last two decades, there have been a slew of books written by journalists and mental health professionals about the lack of science behind the DSM, the over-diagnosis of psychiatric disorders, and the pathologizing of normal behaviors. A sample of these books includes: Paula Caplan’s They Say You’re Crazy (1995), Herb Kutchins and Stuart Kirk’s Making Us Crazy (1997), Allan Horwitz and Jerome Wakefield’s The Loss of Sadness: How Psychiatry Transformed Normal Sorrow into Depressive Disorder (2007), Christopher Lane’s Shyness: How Normal Behavior Became a Sickness (2008), Stuart Kirk, Tomi Gomory, and David Cohen’s Mad Science: Psychiatric Coercion, Diagnosis, and Drugs (2013), Gary Greenberg’s The Book of Woe: The DSM and the Unmaking of Psychiatry (2013), and Allen Frances’ Saving Normal (2013).

Even more remarkable than former DSM-4 taskforce chair Allen Frances jumping on the DSM-trashing bandwagon has been the harsh critique [13] of DSM-5 by Thomas Insel, director of the National Institute of Mental Health (NIMH). Insel recently announced that the DSM’s diagnostic categories lack validity, and that “NIMH will be re-orienting its research away from DSM categories.” And psychiatrist Robert Spitzer, former chair of the DSM-3 task force, wrote the foreword to Horwitz and Wakefield’s The Loss of Sadness and is now critical [14] of the DSM’s inattention to the context in which symptoms occur, which, he points out, can medicalize normal experiences.

So, in just two decades, pointing out the pseudoscience of the DSM has gone from being an “extremist slur of radical anti-psychiatrists” to a mainstream proposition from the former chairs of both the DSM-3 and DSM-4 taskforces and the director of NIMH.

Yet another explanation for the epidemic may also be evolving from radical to mainstream, thanks primarily to the efforts of investigative journalist Robert Whitaker and his book Anatomy of An Epidemic [15] (2010). Whitaker argues that the adverse effects of psychiatric medications are the primary cause of the epidemic. He reports that these drugs, for many patients, cause episodic and moderate emotional and behavioral problems to become severe, chronic and disabling ones.

Examining the scientific literature that now extends over 50 years, Whitaker discovered that while some psychiatric medications for some people may be effective over the short term, these drugs increase the likelihood that a person will become chronically ill over the long term. Whitaker reports, “The scientific literature shows that many patients treated for a milder problem will worsen in response to a drug—say have a manic episode after taking an antidepressant—and that can lead to a new and more severe diagnosis like bipolar disorder.”

With respect to the dramatic increase of pediatric bipolar disorder, Whitaker points out that, “Once psychiatrists started putting ‘hyperactive’ children on Ritalin, they started to see prepubertal children with manic symptoms. Same thing happened when psychiatrists started prescribing antidepressants to children and teenagers. A significant percentage had manic or hypomanic reactions to the antidepressants.” These children and teenagers are then put on heavier-duty drugs, including drug cocktails; they often do not respond favorably to treatment and deteriorate. And that, for Whitaker, is a major reason for the 35-fold increase between 1987 and 2007 of children classified as being disabled by mental disorders. (See my 2010 interview with him, “Are Prozac and Other Psychiatric Drugs Causing the Astonishing Rise of Mental Illness in America [16]?”)

Whitaker’s explanation for the epidemic has now entered into the debate, even within mainstream psychiatric institutions; for example, Whitaker was invited by the National Alliance for the Mentally Ill (NAMI) to speak at their 2013 annual convention [17], which took place last June. While Whitaker concludes that psychiatry’s drug-based paradigm of care is the primary cause of the epidemic, he does not rule out the possibility that various cultural factors may also be contributing to the increase in the number of mentally ill.

Mental Illness as Rebellion Against Society

“The most deadly criticism one could make of modern civilization is that apart from its man-made crises and catastrophes, it is not humanly interesting. . . . In the end, such a civilization can produce only a mass man: incapable of spontaneous, self-directed activities: at best patient, docile, disciplined to monotonous work to an almost pathetic degree. . . . Ultimately such a society produces only two groups of men: the conditioners and the conditioned, the active and passive barbarians.” —Lewis Mumford, 1951

Once it was routine for many respected social critics such as Lewis Mumford and Erich Fromm to express concern about the impact of modern civilization on our mental health. But today the idea that the mental illness epidemic is also being caused by a peculiar rebellion against a dehumanizing society has been, for the most part, removed from the mainstream map. When a societal problem grows to become all encompassing, we often no longer even notice it.

We are today disengaged from our jobs and our schooling. Young people are pressured to accrue increasingly large student-loan debt so as to acquire the credentials to get a job, often one which they will have little enthusiasm about. And increasing numbers of us are completely socially isolated, having nobody who cares about us.

Returning to that June 2013 Gallup survey, “The State of the American Workplace: Employee Engagement [18],” only 30% of workers “were engaged, or involved in, enthusiastic about, and committed to their workplace.” In contrast to this “actively engaged group,” 50% were “not engaged,” simply going through the motions to get a paycheck, while 20% were classified as “actively disengaged,” hating going to work and putting energy into undermining their workplace. Those with higher education levels reported more discontent with their workplace.

How engaged are we with our schooling? Another Gallup poll, “The School Cliff: Student Engagement Drops With Each School Year [19]” (released in January 2013), reported that the longer students stay in school, the less engaged they become. The poll surveyed nearly 500,000 students in 37 states in 2012, and found nearly 80% of elementary students reported being engaged with school, but by high school, only 40% reported being engaged. As the pollsters point out, “If we were doing right by our students and our future, these numbers would be the absolute opposite. For each year a student progresses in school, they should be more engaged, not less.”

Life clearly sucks more than it did a generation ago when it comes to student loan debt. According to American Student Assistance’s “Student Loan Debt Statistics [20],” approximately 37 million Americans have student loan debt. The majority of borrowers still paying back their loans are in their 30s or older. Approximately two-thirds of students graduate college with some education debt. Nearly 30% of college students who take out loans drop out of school, and students who drop out of college before earning a degree struggle most with student loans. As of October 2012, the average amount of student loan debt for the Class of 2011 was $26,600, a 5% increase from 2010. Only about 37% of federal student-loan borrowers between 2004 and 2009 managed to make timely payments without postponing payments or becoming delinquent.

In addition to the pain of jobs, school, and debt, there is increasingly more pain of social isolation. A major study reported in the American Sociological Review in 2006, “Social Isolation in America: Changes in Core Discussion Networks Over Two Decades [21],” examined Americans’ core network of confidants (those people in our lives we consider close enough to trust with personal information and whom we rely on as a sounding board). Authors reported that in 1985, 10% of Americans said that they had no confidants in their lives; but by 2004, 25% of Americans stated they had no confidants in their lives. This study confirmed the continuation of trends that came to public attention in sociologist Robert Putnam’s 2000 book Bowling Alone.

Underlying many of psychiatry’s nearly 400 diagnoses is the experience of helplessness, hopelessness, passivity, boredom, fear, isolation, and dehumanization—culminating in a loss of autonomy and community-connectedness. Do our societal institutions promote:

  • Enthusiasm—or passivity?
  • Respectful personal relationships—or manipulative impersonal ones?
  • Community, trust, and confidence—or isolation, fear and paranoia?
  • Empowerment—or helplessness?
  • Autonomy (self-direction)—or heteronomy (institutional-direction)?
  • Participatory democracy—or authoritarian hierarchies?
  • Diversity and stimulation—or homogeneity and boredom?

Research (that I documented in Commonsense Rebellion [22]) shows that those labeled with attention deficit hyperactivity disorder (ADHD) do worst in environments that are boring, repetitive, and externally controlled; and that ADHD-labeled children are indistinguishable from “normals” when they have chosen their learning activities and are interested in them. Thus, the standard classroom could hardly be designed more poorly to meet the learning needs of young people who are labeled with ADHD.

As I discussed last year in AlterNet in “Would We Have Drugged Up Einstein? How Anti-Authoritarianism Is Deemed a Mental Health Problem [23],” there is a fundamental bias among mental health professionals toward interpreting inattention and noncompliance as a mental disorder. Those with extended schooling have lived for many years in a world where everyone pays attention to much that is unstimulating and routinely complies with the demands of authorities. Thus, for many M.D.s and Ph.D.s, people who rebel against this attentional and behavioral compliance appear to be from another world—a diagnosable one.

The reality is that with enough helplessness, hopelessness, passivity, boredom, fear, isolation, and dehumanization, we rebel and refuse to comply. Some of us rebel by becoming inattentive. Others become aggressive. In large numbers we eat, drink and gamble too much. Still others become addicted to drugs, illicit and prescription. Millions work slavishly at dissatisfying jobs, become depressed and passive-aggressive, while no small number of us can’t cut it and become homeless and appear crazy. Feeling misunderstood and uncared for, millions of us ultimately rebel against societal demands; however, given our wherewithal, our rebellions are often passive and disorganized, and routinely futile and self-destructive.

When we have hope, energy and friends, we can choose to rebel against societal oppression with, for example, a wildcat strike or a back-to-the-land commune. But when we lack hope, energy and friends, we routinely rebel without consciousness of rebellion and in a manner in which we today commonly call mental illness.

For some Americans, no doubt, the conscious goal is to get classified as mentally disabled so as to receive disability payments (averaging $700 to $1,400 per month [24]). But isn’t that too a withdrawal of cooperation with society and a rebellion of sorts, based on the judgment that this is the best-paying and least miserable financial option?

Links:
[1] http://alternet.org
[2] http://www.alternet.org/authors/bruce-e-levine
[3] http://www.nybooks.com/articles/archives/2011/jun/23/epidemic-mental-illness-why/?page=1
[4] http://www.latimes.com/business/money/la-fi-mo-employee-engagement-gallup-poll-20130617,0,5878658.story
[5] http://www.nonopp.com/ar/Psicologia/00/epidemic_depersion.htm
[6] http://www.cdc.gov/nchs/data/databriefs/db76.htm
[7] http://www.cdc.gov/mmwr/preview/mmwrhtml/mm6217a1.htm?s_cid=mm6217a1_w
[8] http://www.nytimes.com/2007/09/04/health/04psych.html?_r=0
[9] http://www.cdc.gov/mmwr/preview/mmwrhtml/su6202a1.htm?s_cid=su6202a1_w
[10] http://www.plosmedicine.org/article/info%3Adoi%2F10.1371%2Fjournal.pmed.1001190
[11] http://www.marketwatch.com/story/15-new-mental-illnesses-in-the-dsm-5-2013-05-22
[12] http://www.huffingtonpost.com/allen-frances/saving-grief-from-dsm-5-a_b_2325108.html
[13] http://www.nimh.nih.gov/about/director/2013/transforming-diagnosis.shtml
[14] http://en.wikipedia.org/wiki/Robert_Spitzer_%28psychiatrist%29
[15] http://www.amazon.com/Anatomy-Epidemic-Bullets-Psychiatric-Astonishing/dp/0307452425/ref=sr_1_1?s=books&ie=UTF8&qid=1354546881&sr=1-1&keywords=Anatomy+of+an+Epidemic%3A+Magic+Bullets%2C+Psychiatric+Drugs%2C+and+the+Astonishing+Rise+of+Mental+Illness+in+Ame
[16] http://www.alternet.org/story/146659/are_prozac_and_other_psychiatric_drugs_causing_the_astonishing_rise_of_mental_illness_in_america?paging=off
[17] http://www.peteearley.com/2013/07/01/nami-convention-coverage-robert-whitakers-case-against-anti-psychotics/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+peteearley+%28The+Official+Blog+of+Author+Pete+Earley%29
[18] http://businessjournal.gallup.com/content/162953/tackle-employees-stagnating-engagement.aspx
[19] http://thegallupblog.gallup.com/2013/01/the-school-cliff-student-engagement.html
[20] http://www.asa.org/policy/resources/stats/
[21] http://sites.duke.edu/theatrst130s02s2011mg3/files/2011/05/McPherson-et-al-Soc-Isolation-2006.pdf
[22] http://www.amazon.com/Commonsense-Rebellion-Taking-Shrinks-Corporations/dp/0826414508/ref=pd_sim_b_2
[23] http://www.alternet.org/story/154225/would_we_have_drugged_up_einstein_how_anti-authoritarianism_is_deemed_a_mental_health_problem?paging=off
[24] http://www.nolo.com/legal-encyclopedia/social-security-disability-benefits-29686.html


Survival Of The Nicest? : A New Theory Of Human Origins Says Cooperation—Not Competition—Is Instinctive

In Uncategorized on July 26, 2013 at 7:51 pm
Hugging Salt Shakers photo by Harlan Harris

Oldspeak: “Breaking news from the department of ‘Duh’: How about that. Capitalism, the system that fosters competition, separation, inequality, vertical hierarchy, uniformity, conditional profit-driven cooperation, alienation, and a variety of other maladaptive behaviors, is actually not the best system ever devised, as Dubya famously asserted. In fact, the sacred precepts of Capitalism “aren’t in sync with our evolutionary roots and may not be good for our long-term success as humans.” Meanwhile we literally train our young to act in ways that are contrary to our naturally beneficial predispositions. To serve capitalism. Interesting, isn’t it, that we’re currently experiencing a whole range of threats to our long-term success as humans: irreversible environmental destruction, rapid depletion of non-replenishable resources, mass extinctions, global drought, accelerated decline of food and water production, etc., etc., etc. Until we change this counterproductive extraction-based system, the threats will continue to grow.” –OSJ

By Eric Michael Johnson @ YES Magazine:

A century ago, industrialists like Andrew Carnegie believed that Darwin’s theories justified an economy of vicious competition and inequality. They left us with an ideological legacy that says the corporate economy, in which wealth concentrates in the hands of a few, produces the best for humanity. This was always a distortion of Darwin’s ideas. His 1871 book The Descent of Man argued that the human species had succeeded because of traits like sharing and compassion. “Those communities,” he wrote, “which included the greatest number of the most sympathetic members would flourish best, and rear the greatest number of offspring.” Darwin was no economist, but wealth-sharing and cooperation have always looked more consistent with his observations about human survival than the elitism and hierarchy that dominates contemporary corporate life.

Nearly 150 years later, modern science has verified Darwin’s early insights with direct implications for how we do business in our society. New peer-reviewed research by Michael Tomasello, an American psychologist and co-director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, has synthesized three decades of research to develop a comprehensive evolutionary theory of human cooperation. What can we learn about sharing as a result?

Tomasello holds that there were two key steps that led to humans’ unique form of interdependence. The first was all about who was coming to dinner. Approximately two million years ago, a fledgling species known as Homo habilis emerged on the great plains of Africa. At the same time that these four-foot-tall, bipedal apes appeared, a period of global cooling produced vast, open environments. This climate change event ultimately forced our hominid ancestors to adapt to a new way of life or perish entirely. Since they lacked the ability of the ferocious carnivores of the early Pleistocene to take down large game, the solution they hit upon was scavenging the carcasses of recently killed large mammals. The analysis of fossil bones from this period has revealed evidence of stone-tool cut marks overlaid on top of carnivore teeth marks. The precursors of modern humans had a habit of arriving late to the feast.

However, this survival strategy brought an entirely new set of challenges: Individuals now had to coordinate their behaviors, work together, and learn how to share. For apes living in the dense rainforest, the search for ripe fruit and nuts was largely an individual activity. But on the plains, our ancestors needed to travel in groups to survive, and the act of scavenging from a single animal carcass forced proto-humans to learn to tolerate each other and allow each other a fair share. This resulted in a form of social selection that favored cooperation: “Individuals who attempted to hog all of the food at a scavenged carcass would be actively repelled by others,” writes Tomasello, “and perhaps shunned in other ways as well.”


This evolutionary legacy can be seen in our behavior today, particularly among children who are too young to have been taught such notions of fairness. For example, in a 2011 study published in the journal Nature, anthropologist Katharina Hamann and her colleagues found that 3-year-old children share food more equitably if they gain it through cooperative effort rather than via individual labor or no work at all. In contrast, chimpanzees showed no difference in how they shared food under these different scenarios; they wouldn’t necessarily hoard the food individually, but they placed no value on cooperative efforts either. The implication, according to Tomasello, is that human evolution has predisposed us to work collaboratively and given us an intuitive sense that cooperation deserves equal rewards.

The second step in Tomasello’s theory leads directly into what kinds of businesses and economies are more in line with human evolution. Humans have, of course, uniquely large population sizes—much larger than those of other primates. It was the human penchant for cooperation that allowed groups to grow in number and eventually become tribal societies.

Humans, more than any other primate, developed psychological adaptations that allowed them to quickly recognize members of their own group (through unique behaviors, traditions, or forms of language) and develop a shared cultural identity in the pursuit of a common goal.
“The result,” says Tomasello, “was a new kind of interdependence and group-mindedness that went well beyond the joint intentionality of small-scale cooperation to a kind of collective intentionality at the level of the entire society.”

What does this mean for the different forms of business today? Corporate workplaces probably aren’t in sync with our evolutionary roots and may not be good for our long-term success as humans. Corporate culture imposes uniformity, mandated from the top down, throughout the organization. But the cooperative—the financial model in which a group of members owns a business and makes the rules about how to run it—is a modern institution that has much in common with the collective tribal heritage of our species. Worker-owned cooperatives are regionally distinct and organized around their constituent members. As a result, worker co-ops develop unique cultures that, following Tomasello’s theory, would be expected to better promote a shared identity among all members of the group. This shared identity would give rise to greater trust and collaboration without the need for centralized control.

Moreover, the structure of corporations is a recipe for worker alienation and dissatisfaction. Humans have evolved the ability to quickly form collective intentionality that motivates group members to pursue a shared goal. “Once they have formed a joint goal,” Tomasello says, “humans are committed to it.” Corporations, by law, are required to maximize profits for their investors. The shared goal among corporate employees is not to benefit their own community but rather a distant population of financiers who have no personal connection to their lives or labor.

However, because worker-owned cooperatives focus on maximizing value for their members, the cooperative is operated by and for the local community—a goal much more consistent with our evolutionary heritage. As Darwin concluded in The Descent of Man, “The more enduring social instincts conquer the less persistent instincts.” As worker-owned cooperatives continue to gain prominence around the world, we may ultimately witness the downfall of Carnegie’s “law of competition” and a return to the collaborative environments that the human species has long called home.


Eric Michael Johnson wrote this article for How Cooperatives Are Driving the New Economy, the Spring 2013 issue of YES! Magazine. Eric is a doctoral student in the history of science at the University of British Columbia. His research examines the interplay between evolutionary biology and politics.

Obama Administration Gets Explicit: The ‘War On Terror’ Is Permanent. “Limitless War” To Continue For ‘At Least’ 10 to 20 More Years

In Uncategorized on May 27, 2013 at 4:46 pm

Oldspeak: “The war is not meant to be won, it is meant to be continuous. Hierarchical society is only possible on the basis of poverty and ignorance. This new version is the past and no different past can ever have existed. In principle the war effort is always planned to keep society on the brink of starvation. The war is waged by the ruling group against its own subjects and its object is not the victory over either Eurasia or East Asia, but to keep the very structure of society intact.” -George Orwell

Each year of endless war that passes further normalizes the endless rights erosions justified in its name….Each year that passes, millions of young Americans come of age having spent their entire lives, literally, with these powers and this climate fixed in place: to them, there is nothing radical or aberrational about any of it. The post-9/11 era is all they have been trained to know. That is how a state of permanent war not only devastates its foreign targets but also degrades the population of the nation that prosecutes it.

This war will end only once Americans realize the vast and multi-faceted costs they are bearing so that the nation’s political elites can be empowered and its oligarchs can further prosper. But Washington clearly has no fear that such realizations are imminent. They are moving in the other direction: aggressively planning how to further entrench and expand this war.” –Glenn Greenwald

Today in America, 1 in 2 Americans is low-income or poverty-stricken. Americans are the best entertained and quite likely the least well-informed people in the western world. 39% of people who think the Benghazi embassy attack was America’s biggest scandal can’t find it on a map. Poverty of thought and of life are at historic highs. It is only under conditions like these that 40% of Americans can be OK with a U.S. president asserting the right to act as remote-controlled judge, jury and executioner of anyone he deems a terrorist, including Americans. (The figure jumps to 65% for non-Americans.) Nearly 1 in 5 Americans is on the brink of starvation. War is being waged continuously, secretly, remotely in foreign lands for the sake of “National Security,” to keep our society “intact.” Many of the conditions that existed in Huxley’s and Orwell’s dystopic alternate universes exist right now in the real world. In true Orwellian fashion, we’re being told we’re in a “recovery” while many of these conditions are not even acknowledged to exist. While our leaders crow about the end of wars, the wars continue elsewhere, as plans are made to expand them. U.S. State Department-paid “private military contractors,” a.k.a. mercenaries, replace regular U.S. combat personnel, and get paid 3x as much to do a less accountable job of “force projection,” a.k.a. occupation of foreign lands. 1,000 American bases dot the globe, and there’s rarely, if ever, any talk of closing them. When will the majority start to question whether this is the society we want to remain intact? When will the majority start to seriously consider alternatives to the profoundly corrupt, highly centralized and sociopathic two-party political farce of governance? Lies are truth. Freedom is slavery. War is peace. Ignorance is strength. All these conditions exist in our real world. Transformational change is essential to our survival.”

By Glenn Greenwald @ The U.K. Guardian:

Last October, senior Obama officials anonymously unveiled to the Washington Post their newly minted “disposition matrix”, a complex computer system that will be used to determine how a terrorist suspect will be “disposed of”: indefinite detention, prosecution in a real court, assassination-by-CIA-drones, etc. Their rationale for why this was needed now, a full 12 years after the 9/11 attack:

Among senior Obama administration officials, there is a broad consensus that such operations are likely to be extended at least another decade. Given the way al-Qaida continues to metastasize, some officials said no clear end is in sight. . . . That timeline suggests that the United States has reached only the midpoint of what was once known as the global war on terrorism.”

On Thursday, the Senate Armed Services Committee held a hearing on whether the statutory basis for this “war” – the 2001 Authorization to Use Military Force (AUMF) – should be revised (meaning: expanded). This is how Wired’s Spencer Ackerman (soon to be the Guardian US’s national security editor) described the most significant exchange:

“Asked at a Senate hearing today how long the war on terrorism will last, Michael Sheehan, the assistant secretary of defense for special operations and low-intensity conflict, answered, ‘At least 10 to 20 years.’ . . . A spokeswoman, Army Col. Anne Edgecomb, clarified that Sheehan meant the conflict is likely to last 10 to 20 more years from today – atop the 12 years that the conflict has already lasted. Welcome to America’s Thirty Years War.”

That the Obama administration is now repeatedly declaring that the “war on terror” will last at least another decade (or two) is vastly more significant than all three of this week’s big media controversies (Benghazi, IRS, and AP/DOJ) combined. The military historian Andrew Bacevich has spent years warning that US policy planners have adopted an explicit doctrine of “endless war”. Obama officials, despite repeatedly boasting that they have delivered permanently crippling blows to al-Qaida, are now, as clearly as the English language permits, openly declaring that doctrine to be their own.

It is hard to resist the conclusion that this war has no purpose other than its own eternal perpetuation. This war is not a means to any end but rather is the end in itself. Not only is it the end itself, but it is also its own fuel: it is precisely this endless war – justified in the name of stopping the threat of terrorism – that is the single greatest cause of that threat.

In January, former Pentagon general counsel Jeh Johnson delivered a highly-touted speech suggesting that the war on terror will eventually end; he advocated that outcome, arguing:

‘War’ must be regarded as a finite, extraordinary and unnatural state of affairs. We must not accept the current conflict, and all that it entails, as the ‘new normal.'”

In response, I wrote that the “war on terror” cannot and will not end on its own for two reasons: (1) it is designed by its very terms to be permanent, incapable of ending, since the war itself ironically ensures that there will never come a time when people stop wanting to bring violence back to the US (the operational definition of “terrorism”), and (2) the nation’s most powerful political and economic factions reap a bonanza of benefits from its continuation. Whatever else is true, it is now beyond doubt that ending this war is the last thing on the mind of the 2009 Nobel Peace Prize winner and those who work at the highest levels of his administration. Is there any way they can make that clearer beyond declaring that it will continue for “at least” another 10-20 years?

The genius of America’s endless war machine is that, learning from the unpleasantness of the Vietnam war protests, it has rendered the costs of war largely invisible. That is accomplished by heaping all of the fighting burden on a tiny and mostly economically marginalized faction of the population, by using sterile, mechanized instruments to deliver the violence, and by suppressing any real discussion in establishment media circles of America’s innocent victims and the worldwide anti-American rage it generates.

Though rarely visible, the costs are nonetheless gargantuan. Just in financial terms, as Americans are told they must sacrifice Social Security and Medicare benefits and place their children in a crumbling educational system, the Pentagon remains the world’s largest employer and continues to militarily outspend the rest of the world by a significant margin. The mythology of the Reagan presidency is that he induced the collapse of the Soviet Union by luring it into unsustainable military spending and wars: should there come a point when we think about applying that lesson to ourselves?

Then there are the threats to Americans’ security. Having their government spend decades proudly touting itself as “A Nation at War” and bringing horrific violence to the world is certain to prompt more and more people to want to attack Americans, as the US government itself claims took place just recently in Boston (and as clearly took place multiple other times over the last several years).

And then there’s the most intangible yet most significant cost: each year of endless war that passes further normalizes the endless rights erosions justified in its name. The second term of the Bush administration and first five years of the Obama presidency have been devoted to codifying and institutionalizing the vast and unchecked powers that are typically vested in leaders in the name of war. Those powers of secrecy, indefinite detention, mass surveillance, and due-process-free assassination are not going anywhere. They are now permanent fixtures not only in the US political system but, worse, in American political culture.

Each year that passes, millions of young Americans come of age having spent their entire lives, literally, with these powers and this climate fixed in place: to them, there is nothing radical or aberrational about any of it. The post-9/11 era is all they have been trained to know. That is how a state of permanent war not only devastates its foreign targets but also degrades the population of the nation that prosecutes it.

This war will end only once Americans realize the vast and multi-faceted costs they are bearing so that the nation’s political elites can be empowered and its oligarchs can further prosper. But Washington clearly has no fear that such realizations are imminent. They are moving in the other direction: aggressively planning how to further entrench and expand this war.

One might think that if there is to be a debate over the 12-year-old AUMF, it would be about repealing it. Democratic Congresswoman Barbara Lee, who heroically cast the only vote against it when it was originally enacted, presciently warning of how it would be abused, has been advocating its repeal for some time now in favor of using reasonable security measures to defend against such threats and standard law enforcement measures to punish them (which have proven far more effective than military solutions). But just as happened in 2001, neither she nor her warnings are deemed sufficiently Serious even to consider, let alone embrace.

Instead, the Washington AUMF “debate” recognizes only two positions: (1) Congress should codify expanded powers for the administration to fight a wider war beyond what the 2001 AUMF provides (that’s the argument recently made by the supreme war-cheerleaders-from-a-safe-distance at the Washington Post editorial page and their favorite war-justifying think tank theorists, and the one being made by many Senators from both parties), or (2) the administration does not need any expanded authority because it is already free to wage a global war with very few limits under the warped “interpretation” of the AUMF which both the Bush and Obama DOJs have successfully persuaded courts to accept (that’s the Obama administration’s position). In other words, the shared premise is that the US government must continue to wage unlimited, permanent war, and the only debate is whether that should happen under a new law or the old one.

Just to convey a sense for how degraded is this Washington “debate”: Obama officials at yesterday’s Senate hearing repeatedly insisted that this “war” is already one without geographical limits and without any real conceptual constraints. The AUMF’s war power, they said, “stretches from Boston to the [tribal areas of Pakistan]” and can be used “anywhere around the world, including inside Syria, where the rebel Nusra Front recently allied itself with al-Qaida’s Iraq affiliate, or even what Sen. Lindsey Graham (R-SC) called ‘boots on the ground in Congo'”. The acting general counsel of the Pentagon said it even “authorized war against al-Qaida’s associated forces in Mali, Libya and Syria”. Newly elected independent Sen. Angus King of Maine said after listening to how the Obama administration interprets its war powers under the AUMF:

This is the most astounding and most astoundingly disturbing hearing that I’ve been to since I’ve been here. You guys have essentially rewritten the Constitution today.”

Former Bush DOJ official Jack Goldsmith, who testified at the hearing, summarized what was said after it was over: Obama officials argued that “they had domestic authority to use force in Mali, Syria, Libya, and Congo, against Islamist terrorist threats there”; that “they were actively considering emerging threats and stated that it was possible they would need to return to Congress for new authorities against those threats but did not at present need new authorities”; that “the conflict authorized by the AUMF was not nearly over”; and that “several members of the Committee were surprised by the breadth of DOD’s interpretation of the AUMF.” Conveying the dark irony of America’s war machine, seemingly lifted right out of the Cold War era film Dr. Strangelove, Goldsmith added:

Amazingly, there is a very large question even in the Armed Services Committee about who the United States is at war against and where, and how those determinations are made.”

Nobody really even knows with whom the US is at war, or where. Everyone just knows that it is vital that it continue in unlimited form indefinitely.

In response to that, the only real movement in Congress is to think about how to enact a new law to expand the authorization even further. But it’s a worthless and illusory debate, affecting nothing other than the pretexts and symbols used to justify what will, in all cases, be a permanent and limitless war. The Washington AUMF debate is about nothing other than whether more fig leafs are needed to make it all pretty and legal.

The Obama administration already claims the power to wage endless and boundless war, in virtually total secrecy, and without a single meaningful check or constraint. No institution with any power disputes this. To the contrary, the only ones which exert real influence – Congress, the courts, the establishment media, the plutocratic class – clearly favor its continuation and only think about how further to enable it. That will continue unless and until Americans begin to realize just what a mammoth price they’re paying for this ongoing splurge of war spending and endless aggression.

Related matters

Although I’m no fan of mindless partisan hackery, one must acknowledge, if one is to be honest, that sometimes it produces high comedy of the type few other afflictions are capable of producing.

On a related note: when Attorney General Eric Holder spoke about the DOJ’s subpoenas for AP’s phone records – purportedly issued in order to find the source for AP’s story about a successfully thwarted terror attack from Yemen – he made this claim about the leak they were investigating: “if not the most serious, it is within the top two or three most serious leaks that I have ever seen.” But yesterday, the Washington Post reported that CIA officials gave the go-ahead to AP to report the story, based in part on the fact that the administration itself planned to make a formal announcement boasting of their success in thwarting the plot. Meanwhile, the invaluable Marcy Wheeler today makes a strong case that the Obama administration engaged in a fear-mongering campaign over this plot that they knew at the time was false – all for the purpose of justifying the president’s newly announced “signature drone strikes” in Yemen.

The key lesson from all of this should have been learned long ago: nothing is less reliable than unchecked claims from political officials that their secret conduct is justified by National Security Threats and the desire to Keep Us Safe.