"In a time of universal deceit telling the truth is a revolutionary act." -George Orwell

Posts Tagged ‘History’

Anthropogenic Climate Change Setting The Stage For Worldwide Wars Over Decreasing Food & Water

In Uncategorized on April 2, 2014 at 8:19 pm

Oldspeak: “Gradual warming of the globe may not be noticed by most, but everyone – either directly or indirectly – will be affected to some degree by changes in the frequency and intensity of extreme weather events as greenhouse gases continue to accumulate in the atmosphere.”

Scientists are already cognizant of how badly a warming Arctic impacts subsistence. For example, according to the Arctic Methane Emergency Group: “The weather extremes … are causing real problems for farmers… World food production can be expected to decline, with mass starvation inevitable. The price of food will rise inexorably, producing global unrest and making food security even more of an issue.”

“The nexus between climate change, human migration, and instability constitutes … a transcendent challenge. The conjunction of these undercurrents was most recently visible during the Arab Spring, where food availability, increasing food prices, drought, and poor access to water, as well as urbanization and international migration contributed to the pressures that underpinned the political upheaval.

“For example, Syria suffered devastating droughts in the decade leading up to its rebellion, as the country’s total water resources were cut in half between 2002 and 2008. As a result, drier winters hit Syria, which at the time was the top wheat-growing region of the eastern Mediterranean, exacerbating its crisis.” -Robert Hunziker and Jack Hunziker

“Failing harvests in the US, Ukraine and other countries this year have eroded reserves to their lowest level since 1974. The US now holds in reserve a historically low 6.5% of the maize that it expects to consume in the next year… We’ve not been producing as much as we are consuming. That is why stocks are being run down. Supplies are now very tight across the world and reserves are at a very low level, leaving no room for unexpected events… Global grain consumption has exceeded production in 8 of the last 13 years, leading to a drawdown in reserves. Worldwide, carryover grain stocks—the amount left in the bin when the new harvest begins—stand at 423 million tons, enough to cover 68 days of consumption. This is just 6 days more than the low that preceded the 2007–08 grain crisis, when several countries restricted exports and food riots broke out in dozens of countries because of the spike in prices… Lester Brown, president of the Earth policy research centre in Washington, says that the climate is no longer reliable and the demands for food are growing so fast that a breakdown is inevitable, unless urgent action is taken. “Food shortages undermined earlier civilisations. We are on the same path. Each country is now fending for itself. The world is living one year to the next.” “We are entering a new era of rising food prices and spreading hunger. Food supplies are tightening everywhere and land is becoming the most sought-after commodity as the world shifts from an age of food abundance to one of scarcity,” says Brown. “The geopolitics of food is fast overshadowing the geopolitics of oil.” His warnings come as the UN and world governments reported that extreme heat and drought in the US and other major food-exporting countries had hit harvests badly and sent prices spiralling. “The situation we are in is not temporary. These things will happen all the time. Climate is in a state of flux and there is no normal any more. We are beginning a new chapter. We will see food unrest in many more places.” “Armed aggression is no longer the principal threat to our future. The overriding threats to this century are climate change, population growth, spreading water shortages and rising food prices,” Brown says.” -John Vidal, U.K. Observer.

“Look beyond the propaganda. The “Arab Spring” and the unrest in Venezuela, Ukraine, and dozens of other countries on all continents are not about “freedom,” “democracy,” or “people rising up against dictators.” It’s about food, and the shrinking availability of it as a result of Anthropogenic Climate Change. We are consuming more than we are producing, and with less water available as temperatures rise and droughts and other extreme weather worsen, you can expect food production to continue to fall. With the human population continuing to rise, this is a recipe for disaster. Our food production systems are unsustainable and toxic to the ecology. And they are practically certain to fail as ever-rising food demand far outstrips falling production. Then what?” -OSJ

By Robert Hunziker and Jack Hunziker @ Dissident Voice

The “warming of the Arctic” could become one of the greatest catastrophes in human history, exceeding even the notoriety of Adolf Hitler and Genghis Khan; it will likely impact more people than those brutal leaders combined. In fact, global warming may eventually be categorized as the greatest threat of all time, greater even than the Black Death’s 75-to-200 million dead, circa 1350.

The integrity of Arctic sea ice is essential to prevent the risks of (1) a methane outbreak and/or (2) fierce, damaging weather throughout the Northern Hemisphere. Regrettably, the Arctic “sea ice area” registered a record seasonal low on March 10, 2014, at 12.95 million square kilometers. March is when ‘maximum ice growth’ is usually expected, not an all-time seasonal low immediately preceding the onset of summer.1

Extreme weather events, as a consequence of the warming Arctic, will likely wreak havoc over the entire Northern Hemisphere, causing severe droughts, freezing cold spells, and widespread flooding (some early evidence of this is already at hand).

These combinations of extreme weather events have the potential to rival the damage of the great mythical floods. Already, Eastern Europe had a taste of extreme climate change in 2013 when a once-in-500-year flood hit hard, wiping out vast swaths of cropland.

In the future, when shortages of food and water become more commonplace because of extreme climatic change, it is probable that desperate groups of roughnecks will battle for food and water, similar to the dystopia depicted in Mad Max (Warner Bros., 1979), the story of a breakdown of society in which bandit tribes battle over the last remaining droplets of petroleum.

Over time, climate change is setting the stage for worldwide wars over food & water.

Origin of Food and Water Wars

Research conducted by Jennifer Francis, PhD, Rutgers University – Institute of Marine and Coastal Sciences, shows that Arctic sea ice loss, with its consequent warming, impacts upper-level atmospheric circulation, radically distorting jet streams above 30,000 feet, which adversely affects weather patterns throughout the Northern Hemisphere.2

“Gradual warming of the globe may not be noticed by most, but everyone – either directly or indirectly – will be affected to some degree by changes in the frequency and intensity of extreme weather events as greenhouse gases continue to accumulate in the atmosphere.”2

Scientists are already cognizant of how badly a warming Arctic impacts subsistence. For example, according to the Arctic Methane Emergency Group: “The weather extremes … are causing real problems for farmers… World food production can be expected to decline, with mass starvation inevitable. The price of food will rise inexorably, producing global unrest and making food security even more of an issue.”3

“The nexus between climate change, human migration, and instability constitutes … a transcendent challenge. The conjunction of these undercurrents was most recently visible during the Arab Spring, where food availability, increasing food prices, drought, and poor access to water, as well as urbanization and international migration contributed to the pressures that underpinned the political upheaval.”4

For example, Syria suffered devastating droughts in the decade leading up to its rebellion, as the country’s total water resources were cut in half between 2002 and 2008. As a result, drier winters hit Syria, which at the time was the top wheat-growing region of the eastern Mediterranean, exacerbating its crisis.

In 2009 the UN and the International Federation of Red Cross and Red Crescent Societies reported that more than 800,000 Syrians lost their entire means of livelihood because of drought.5

In the recent past, ferocious weather conditions have struck all across the planet, for example: a once-in-500-year flood in Eastern Europe; a once-in-50-year drought in the U.S. Midwest; the worst drought in 200 years in China, affecting more people than the entire population of North America; the worst flooding in Pakistan in 100 years (a continuous deluge lasting over a month); the most costly flash flood damage in Canada’s modern history; a Syrian drought classified as the worst in the history of the Fertile Crescent; and Brazil’s worst drought in decades. The list goes on, and on, and on.
Merciless weather is lashing out with torrential storms and embedded droughts like never before. No other period of time in modern history comes close.

The reason behind the weather dilemma has everything to do with global warming in the Arctic, which is warming 2-3 times faster than elsewhere on the planet. In turn, the Arctic, which serves as the thermostat for the entire Northern Hemisphere, is disrupting the jet streams, which influences weather patterns throughout the hemisphere, causing droughts and torrential storms to become “embedded or stalled” for long durations, e.g., Colorado’s torrential downpour and massive flooding in 2013, which was as fierce as a drenching coastal tropical storm and not at all like a typical mid-latitude, middle-America storm.

History Repeats

Once food and water shortages become widespread as a result of more extreme and unpredictable climate behavior, it is highly probable that people all across the planet will become so disgusted and distraught that they’ll be looking for blood.

In that regard, history shows that, during such times, desperation overrides prudence. Therefore, hiding behind security gates and armed troops won’t make a difference, similar to the late 18th century French Revolution when masses of citizens used pitchforks, stones, and sticks to overwhelm the king’s formidable armed forces. At the time, France was one of the mightiest forces in the world, but like toy soldiers, its army fell at the hands of its own citizens.

In the end, civilizations cannot, and have not, survived the forces of desperation born of starvation.
In the case of Paris, two years of poor grain harvests caused by bad weather set the stage for revolution. On June 21, 1791, the king, queen, and their attendants fled their Paris residences, whisked away in carriages, as masses of enraged, starving protestors swarmed the city streets.

The forewarnings had been there years beforehand. On August 20, 1786, Finance Minister Calonne informed King Louis XVI that the royal finances were insolvent (because of costly foreign wars, like the U.S. today). Hard times hit (also similar to the U.S. today). Six months later the First Assembly of Notables met, resisting the imposition of taxes and fiscal reforms (similar to the U.S. right wing today). Nearly three years later, on April 27, 1789, the Réveillon riot in Paris, caused by low wages (like U.S. wages today: Wal-Mart, McDonald’s) and food shortages (not in the U.S. yet), led to 25 deaths at the hands of troops.

Thereafter, the public’s anger grew to a fever pitch. On July 14 rioters stormed the most notorious jail for political prisoners in all of France, the Bastille. By July 17 the “Great Fear” had begun to take command of the streets as the peasantry revolted against the socio-economic system.

One of their prime targets was Marie Antoinette, queen consort of the world’s most powerful monarchy, whose last spoken words were delivered to Henri Sanson, her executioner, as she accidentally stepped on his foot while climbing the steps of the scaffold: “Monsieur, I ask your pardon. I did not do it on purpose.” She then lost her head in front of tens of thousands of cheering Parisians screaming “Vive la Nation!”

Flash forward and imagine the backlash in this country if food shortages hit America because of the government’s failure to set policies for converting from fossil fuels to renewable energy sources. The US could have led the entire world in the conversion to renewable energy. As things stand, it is a “missed opportunity.”

In stark contrast to America’s reluctance, Scotland already gets roughly 40% of its electricity from renewables and aims for 100% by 2020.

Food and Civil Disturbances

According to a landmark study, “Food insecurity is both a cause and a consequence of political violence” (Henk-Jan Brinkman and Cullen S. Hendrix, Food Insecurity and Conflict, World Development Report 2011).
The link between high grain prices and riots is well established. For example, according to The Economist (December 2007), when high grain prices sparked riots in 48 countries, the magazine’s food-price index stood at its highest point since its inception in 1845.

As a more recent example, the Arab Spring uprisings of 2011 brought political and economic issues to the forefront, but behind the scenes climate stress played a big role.

According to Marco Lagi of the New England Complex Systems Institute (NECSI), as reported in MIT’s Technology Review (August 2011), the single factor that triggers riots around the world is the price of food. The evidence comes from data gathered by the United Nations that plot the price of food against time, the so-called Food Price Index of the Food and Agriculture Organization of the UN.

On December 13, 2010, four days before Mohamed Bouazizi set himself on fire in Tunisia, sparking the Arab Spring riots, NECSI contacted the U.S. government, warning that global food prices were about to cross the tipping point at which almost anything can trigger riots.

Accordingly, the NECSI study was presented, by invitation, at the World Economic Forum in Davos and was featured as one of the top ten discoveries in science in 2011 by Wired magazine.

“Definitely, it is one of the causes of the Arab Spring,” says Shenggen Fan, director-general of the International Food Policy Research Institute. As well, it is increasingly clear that the climate models that predicted the countries surrounding the Mediterranean would start to dry out are correct.6

As for Syria, it is a prime example of the drama of changing climatic conditions and their consequences. The country’s farmlands north and east of the Euphrates River constitute the breadbasket of the Middle East. Unfortunately, up to 60 percent of Syria’s land experienced one of the worst droughts on record from 2006 to 2011.
In Syria’s northeast and south, nearly 75 percent of farmers suffered total crop failure. Herders in the northeast lost 85 percent of their livestock. According to the UN, 800,000 Syrians had their livelihoods totally wiped out, moving to the cities to find work or to refugee camps, similar to what happened in Paris in the late 18th century.

Furthermore, the drought pushed three million Syrians into extreme poverty. According to Abeer Etefa of the World Food Program, “Food inflation in Syria remains the main issue for citizens,” eerily similar to what occurred in France in the late 18th century just prior to its revolution.

The French Revolution Redux, in America?

As countries like the United States hastily continue their pursuit of “energy independence” by fracking (using extreme pressure to force toxic chemicals underground to extract every last remnant of oil and gas), the warming of the Arctic is elevated and the jet streams become more distorted, resulting in extremely harsh, deadly, and unpredictable weather systems pummeling the entire Northern Hemisphere.

Eventually, the outcome is shortages of food, and, like a flashback to 18th-century France, people starve or fight.

_______________________________________________________________________________________

  1. Source: NSIDC, National Snow and Ice Data Center, Boulder, CO.
  2. Jennifer A. Francis and Stephen J. Vavrus, “Evidence Linking Arctic Amplification to Extreme Weather in Mid-Latitudes,” Geophysical Research Letters, Vol. 39, L06801, 17 March 2012.
  3. Source: Arctic Methane Emergency Group.
  4. Michael Werz and Max Hofman, Climate Change, Migration, and Conflict, The Arab Spring and Climate Change, Climate and Security Correlations Series, Feb. 2013.
  5. Robert F. Worth, “Earth Is Parched Where Syrian Farms Thrived,” New York Times, Oct. 13, 2010.
  6. “Human-Caused Climate Change Already a Major Factor in More Frequent Mediterranean Droughts,” National Oceanic and Atmospheric Administration (NOAA), October 27, 2011.

Robert Hunziker (MA in economic history at DePaul University, Chicago) is a former hedge fund manager and now a professional independent negotiator for worldwide commodity actual transactions and a freelance writer for progressive publications as well as business journals. He can be contacted at: rlhunziker@gmail.com. Jack Hunziker is a composer and critic of music. He attended Crossroads School in Santa Monica and is an on-and-off student at UCLA. Read other articles by Robert Hunziker and Jack Hunziker.

Free Speech Under Siege In The “West”

In Uncategorized on June 27, 2011 at 1:51 pm

Oldspeak: “Democracies stand for free speech; dictatorships suppress it… The censorship of memory, which we once fondly imagined to be the mark of dictatorship, is now a major growth industry in the “free” West. Indeed, official censorship is only the tip of an iceberg of cultural censorship. A public person must be on constant guard against causing offense, whether intentionally or not.” -Robert Skidelsky. How can knowledge, discovery, and intellectual advancement be achieved without free, unfettered inquiry and constant, rigorous questioning of “accepted truths” based in religion, science, or cultural memory? Political correctness cannot ever be allowed to usurp freedom of speech; to do so opens the door to authoritarianism, totalitarianism, and rigidity of thought and society. There should be no such thing as accepted ways of thinking in a free society. The frightening thing is that in the supposedly “free” U.S., much of the population self-censors and acts as thought police toward those who think outside the politically correct and accepted spheres of thought. Phrases like “Conspiracy Theorist,” “Radical,” “Fringe Elements,” or “‘Your name here’ Extremists” are used to dismiss un-PC thought and speech as not worthy of serious, critical consideration, because they fly in the face of generally “accepted truths.” There are fewer and fewer public spheres in which one can introduce ideas that challenge people to actually think and to consider facts that don’t jibe with what they see on corporate media networks and learn from commodified, corporate-controlled, for-profit education systems. This has a chilling effect on those interested in engaging in political protest movements, in dissent, and in challenging and questioning the official narrative of history and objective reality. It’s what leads the Department of Justice to think it’s OK to surveil, harass, and violate the civil liberties of law-abiding citizens who dare to dissent. Is that much different from what goes on in China, Iran, or Israel? If people are discouraged or afraid to engage politically in any way that they wish, state-sanctioned or not, democracy dies.”

By Robert Skidelsky @ Project Syndicate:

Recently, at a literary festival in Britain, I found myself on a panel discussing free speech. For liberals, free speech is a key index of freedom. Democracies stand for free speech; dictatorships suppress it.

When we in the West look outward, this remains our view. We condemn governments that silence, imprison, and even kill writers and journalists. Reporters Sans Frontières keeps a list: 24 journalists have been killed, and 148 imprisoned, just this year. Part of the promise we see in the “Arab Spring” is the liberation of the media from the dictator’s grasp.

Yet freedom of speech in the West is under strain. Traditionally, British law imposed two main limitations on the “right to free speech.” The first prohibited the use of words or expressions likely to disrupt public order; the second was the law against libel. There are good grounds for both – to preserve the peace, and to protect individuals’ reputations from lies. Most free societies accept such limits as reasonable.

But the law has recently become more restrictive. “Incitement to religious and racial hatred” and “incitement to hatred on the basis of sexual orientation” are now illegal in most European countries, independent of any threat to public order. The law has shifted from proscribing language likely to cause violence to prohibiting language intended to give offense.

A blatant example of this is the law against Holocaust denial. To deny or minimize the Holocaust is a crime in 15 European countries and Israel. It may be argued that the Holocaust was a crime so uniquely abhorrent as to qualify as a special case. But special cases have a habit of multiplying.

France has made it illegal to deny any “internationally recognized crimes against humanity.” Whereas in Muslim countries it is illegal to call the Armenian massacres of 1915-1917 “genocide,” in some Western countries it is illegal to say that they were not. Some East European countries specifically prohibit the denial of communist “genocides.”

The censorship of memory, which we once fondly imagined to be the mark of dictatorship, is now a major growth industry in the “free” West. Indeed, official censorship is only the tip of an iceberg of cultural censorship. A public person must be on constant guard against causing offense, whether intentionally or not.

Breaking the cultural code damages a person’s reputation, and perhaps their career. Britain’s Justice Secretary Kenneth Clarke recently had to apologize for saying that some rapes were less serious than others, implying the need for legal discrimination. The parade of gaffes and subsequent groveling apologies has become a regular feature of public life.

In his classic essay On Liberty, John Stuart Mill defended free speech on the ground that free inquiry was necessary to advance knowledge. Restrictions on certain areas of historical inquiry are based on the opposite premise: the truth is known, and it is impious to question it. This is absurd; every historian knows that there is no such thing as final historical truth.

It is not the task of history to defend public order or morals, but to establish what happened. Legally protected history ensures that historians will play safe. To be sure, living by Mill’s principle often requires protecting the rights of unsavory characters. David Irving writes mendacious history, but his prosecution and imprisonment in Austria for “Holocaust denial” would have horrified Mill.

By contrast, the pressure for “political correctness” rests on the argument that the truth is unknowable. Statements about the human condition are essentially matters of opinion.  Because a statement of opinion by some individuals is almost certain to offend others, and since such statements make no contribution to the discovery of truth, their degree of offensiveness becomes the sole criterion for judging their admissibility. Hence the taboo on certain words, phrases, and arguments that imply that certain individuals, groups, or practices are superior or inferior, normal or abnormal; hence the search for ever more neutral ways to label social phenomena, thereby draining language of its vigor and interest.

A classic example is the way that “family” has replaced “marriage” in public discourse, with the implication that all “lifestyles” are equally valuable, despite the fact that most people persist in wanting to get married. It has become taboo to describe homosexuality as a “perversion,” though this was precisely the word used in the 1960’s by the radical philosopher Herbert Marcuse (who was praising homosexuality as an expression of dissent). In today’s atmosphere of what Marcuse would call “repressive tolerance,” such language would be considered “stigmatizing.”

The sociological imperative behind the spread of “political correctness” is the fact that we no longer live in patriarchal, hierarchical, mono-cultural societies, which exhibit general, if unreflective, agreement on basic values. The pathetic efforts to inculcate a common sense of “Britishness” or “Dutchness” in multi-cultural societies, however well-intentioned, attest to the breakdown of a common identity.

Public language has thus become the common currency of cultural exchange, and everyone is on notice to mind one’s manners. The result is a multiplication of weasel words that chill political and moral debate, and that create a widening gap between public language and what many ordinary people think.

The defense of free speech is made no easier by the abuses of the popular press. We need free media to expose abuses of power. But investigative journalism becomes discredited when it is suborned to “expose” the private lives of the famous when no issue of public interest is involved. Entertaining gossip has mutated into an assault on privacy, with newspapers claiming that any attempt to keep them out of people’s bedrooms is an assault on free speech.

You know that a doctrine is in trouble when not even those claiming to defend it understand what it means. By that standard, the classic doctrine of free speech is in crisis. We had better sort it out quickly – legally, morally, and culturally – if we are to retain a proper sense of what it means to live in a free society.

Robert Skidelsky, a member of the British House of Lords, is Professor Emeritus of Political Economy at Warwick University.

 

Zombie Politics, Democracy, And The Threat of Authoritarianism

In Uncategorized on June 10, 2011 at 11:59 am

Oldspeak:”In the minds of the American public, the dominant media, and the accommodating pundits and intellectuals, there is no sense of how authoritarianism in its soft and hard forms can manifest itself as anything other than horrible images of concentration camps, goose-stepping storm troopers, rigid modes of censorship, and chilling spectacles of extremist government repression and violence. That is, there is little understanding of how new modes of authoritarian ideology, policy, values, and social relations might manifest themselves in degrees and gradations so as to create the conditions for a distinctly undemocratic and increasingly cruel and oppressive social order. As the late Susan Sontag suggested in another context, there is a willful ignorance of how emerging registers of power and governance “dissolve politics into pathology.”[10] It is generally believed that in a constitutional democracy, power is in the hands of the people, and that the long legacy of democratic ideals in America, however imperfect, is enough to prevent democracy from being subverted or lost. And yet the lessons of history provide clear examples of how the emergence of reactionary politics, the increasing power of the military, and the power of big business subverted democracy in Argentina, Chile, Germany, and Italy. In spite of these histories, there is no room in the public imagination to entertain what has become the unthinkable—that such an order in its contemporary form might be more nuanced, less theatrical, more cunning, less concerned with repressive modes of control than with manipulative modes of consent—what one might call a mode of authoritarianism with a distinctly American character.” – Henry A. Giroux

By Henry A. Giroux @ Truthout:

Introduction (Part I)

Education is the point at which we decide whether we love the world enough to assume responsibility for it and by the same token save it from ruin which, except for renewal, except for the coming of the new and young, would be inevitable. And education, too, is where we decide whether we love our children enough not to expel them from our world and leave them to their own devices, nor to strike from their hands their chance of undertaking something new, something unforeseen by us, but to prepare them in advance for the task of renewing a common world. -Hannah Arendt [1]

The Rise of Zombie Politics

In the world of popular culture, zombies seem to be everywhere, as evidenced by the relentless slew of books, movies, video games, and comics. From the haunting Night of the Living Dead to the comic movie Zombieland, the figure of the zombie has captured and touched something unique in the contemporary imagination. But the dark and terrifying image of the zombie with missing body parts, oozing body fluids, and an appetite for fresh, living, human brains does more than feed the mass-marketing machines that prey on the spectacle of the violent, grotesque, and ethically comatose. There is more at work in this wave of fascination with the grotesquely walking hyper-dead than a Hollywood appropriation of the dark recesses and unrestrained urges of the human mind. The zombie phenomenon is now on display nightly on television alongside endless examples of destruction unfolding in real-time. Such a cultural fascination with proliferating images of the living hyper-dead and unrelenting human catastrophes that extend from a global economic meltdown to the earthquake in Haiti to the ecological disaster caused by the oil spill in the Gulf of Mexico signals a shift away from the hope that accompanies the living to a politics of cynicism and despair. The macabre double movement between “the dead that walk”[2] and those who are alive but are dying and suffering cannot be understood outside of the casino capitalism that now shapes every aspect of society in its own image. A casino capitalist zombie politics views competition as a form of social combat, celebrates war as an extension of politics, and legitimates a ruthless Social Darwinism in which particular individuals and groups are considered simply redundant, disposable—nothing more than human waste left to stew in their own misfortune—easy prey for the zombies who have a ravenous appetite for chaos and revel in apocalyptic visions filled with destruction, decay, abandoned houses, burned-out cars, gutted landscapes, and trashed gas stations.

The twenty-first-century zombies no longer emerge from the grave; they now inhabit the rich environs of Wall Street and roam the halls of the gilded monuments of greed such as Goldman Sachs. As an editorial in The New York Times points out, the new zombies of free-market fundamentalism turned “the financial system into a casino. Like gambling, the transactions mostly just shifted paper money around the globe. Unlike gambling, they packed an enormous capacity for collective and economic destruction—hobbling banks that made bad bets, freezing credit and economic activity. Society—not the bankers—bore the cost.”[3] In this way, the zombie— the immoral, sub-Nietzschean, id-driven “other” who is “hyper-dead” but still alive as an avatar of death and cruelty—provides an apt metaphor for a new kind of authoritarianism that has a grip on contemporary politics in the United States.[4] This is an authoritarianism in which mindless self-gratification becomes the sanctioned norm and public issues collapse into the realm of privatized anger and rage. The rule of the market offers the hyper-dead an opportunity to exercise unprecedented power in American society, reconstructing civic and political culture almost entirely in the service of a politics that fuels the friend/enemy divide, even as democracy becomes the scandal of casino capitalism—its ultimate humiliation.

But the new zombies are not only wandering around in the banks, investment houses, and death chambers of high finance, they have an ever-increasing presence in the highest reaches of government and in the forefront of mainstream media. The growing numbers of zombies in the mainstream media have huge financial backing from the corporate elite and represent the new face of the culture of cruelty and hatred in the second Gilded Age. Any mention of the social state, putting limits on casino capitalism, and regulating corporate zombies puts Sarah Palin, Glenn Beck,
Rush Limbaugh, and other talking heads into a state of high rage. They disparage any discourse that embraces social justice, social responsibility, and human rights. Appealing to “real” American values such as family, God, and Guns, they are in the forefront of a zombie politics that opposes any legislation or policy designed to lessen human suffering and promote economic and social progress. As Arun Gupta points out, they are insistent in their opposition to “civil rights, school desegregation, women’s rights, labor organizing, the minimum wage, Social Security, LGBT rights, welfare, immigrant rights, public education, reproductive rights, Medicare, [and] Medicaid.”[5] The walking hyper-dead even oppose providing the extension of unemployment benefits to millions of Americans who are out of work, food, and hope. They spectacularize hatred and trade in lies and misinformation. They make populist appeals to the people while legitimating the power of the rich. They appeal to common sense as a way of devaluing a culture of questioning and critical exchange. Unrelenting in their role as archetypes of the hyper-dead, they are misanthropes trading in fear, hatred, and hyper-nationalism.

The human suffering produced by the walking hyper-dead can also be seen in the nativist apoplexy resulting in the racist anti-immigration laws passed in Arizona, the attempts to ban ethnic studies in public schools, the rise of the punishing state, the social dumping of millions of people of color into prisons, and the attempts of Tea Party fanatics and politicians who want to “take back America” from President Barack Obama—described in the new lexicon of right-wing political illiteracy as both an alleged socialist and the new Hitler. Newt Gingrich joins Glenn Beck and other members of the elite squad of the hyper-dead in arguing that Obama is just another version of Joseph Stalin. For Gingrich and the rest of the zombie ideologues, any discourse that advocates for social protections, easing human suffering, or imagining a better future is dismissed by being compared to the horrors of the Nazi holocaust. Dystopian discourse and End Times morbidity rule the collective consciousness of this group.

The “death panels” envisaged by Sarah Palin are not going to emerge from Obama’s health care reform plan but from the toolkits the zombie politicians and talking heads open up every time they are given the opportunity to speak. The death threats, vandalism, and crowds shouting homophobic slurs at openly gay U.S. House Representative Barney Frank already speak to a fixation with images of death, violence, and war that now grips the country. Sarah Palin’s infamous call to a gathering of her followers to “reload” in opposition to President Obama’s policies—soon followed in a nationally televised press conference with a request for the American people to embrace Arizona’s new xenophobic laws—makes her one of the most prominent of the political zombies. Not only has she made less-than-vague endorsements of violence in many of her public speeches, she has cheerfully embraced the new face of white supremacy in her recent unapologetic endorsement of racial profiling, stating in a widely reported speech that “It’s time for Americans across this great country to stand up and say, ‘We’re all Arizonians now.’”[6] The current descent into racism, ignorance, corruption, and mob idiocy makes clear the degree to which politics has become a sport for zombies rather than engaged and thoughtful citizens.[7]

The hyper-dead celebrate talk radio haters such as Rush Limbaugh, whose fanaticism appears to pass without criticism in the mainstream media. Limbaugh echoes the fanatics who whipped up racial hatred in Weimar Germany, the ideological zombies who dissolved the line between reason and distortion-laden propaganda. How else to explain his claim “that environmentalist terrorists might have caused the ecological disaster in the gulf”?[8] The ethically frozen zombies that dominate screen culture believe that only an appeal to self-interest motivates people—a convenient counterpart to a culture of cruelty that rebukes, if not disdains, any appeal to the virtues of a moral and just society. They smile at their audiences while collapsing the distinction between opinions and reasoned arguments. They report on Tea Party rallies while feeding the misplaced ideological frenzy that motivates such gatherings but then refuse to comment on rallies all over the country that do not trade in violence or spectacle. They report uncritically on Islam bashers, such as the radical right-wing radio host Michael Savage, as if his ultra-extremist racist views are a legitimate part of the American mainstream. In the age of zombie politics, there is too little public outrage or informed public anger over the pushing of millions of people out of their homes and jobs, the defunding of schools, and the rising tide of homeless families and destitute communities. Instead of organized, massive protests against casino capitalism, the American public is treated to an endless and arrogant display of wealth, greed, and power. Armies of zombies tune in to gossip-laden entertainment, game, and reality TV shows, transfixed by the empty lure of celebrity culture.

The roaming hordes of celebrity zombie intellectuals work hard to fuel a sense of misguided fear and indignation toward democratic politics, the social state, and immigrants—all of which is spewed out in bitter words and comes terribly close to inciting violence. Zombies love death-dealing institutions, which accounts for why they rarely criticize the bloated military budget and the rise of the punishing state and its expanding prison system. They smile with patriotic glee, anxious to further the demands of empire as automated drones kill innocent civilians—conveniently dismissed as collateral damage—and the torture state rolls inexorably along in Afghanistan, Iraq, and in other hidden and unknown sites. The slaughter that inevitably follows catastrophe is not new, but the current politics of death has reached new heights and threatens to transform a weak democracy into a full-fledged authoritarian state.

A Turn to the Dark Side of Politics

The American media, large segments of the public, and many educators widely believe that authoritarianism is alien to the political landscape of American society. Authoritarianism is generally associated with tyranny and governments that exercise power in violation of the rule of law. A commonly held perception of the American public is that authoritarianism is always elsewhere. It can be found in other allegedly “less developed/civilized countries,” such as contemporary China or Iran, or it belongs to a fixed moment in modern history, often associated with the rise of twentieth-century totalitarianism in its different forms in Germany, Italy, and the Soviet Union under Stalin. Even as the United States became more disposed to modes of tyrannical power under the second Bush administration—demonstrated, for example, by the existence of secret CIA prisons, warrantless spying on Americans, and state-sanctioned kidnaping—mainstream liberals, intellectuals, journalists, and media pundits argued that any suggestion that the United States was becoming an authoritarian society was simply preposterous. For instance, the journalist James Traub repeated the dominant view that whatever problems the United States faced under the Bush administration had nothing to do with a growing authoritarianism or its more extreme form, totalitarianism.[9] On the contrary, according to this position, America was simply beholden to a temporary seizure of power by some extremists, who represented a form of political exceptionalism and an annoying growth on the body politic. In other words, as repugnant as many of Bush’s domestic and foreign policies might have been, they neither threatened nor compromised in any substantial way America’s claim to being a democratic society.

Against the notion that the Bush administration had pushed the United States close to the brink of authoritarianism, some pundits have argued that this dark moment in America’s history, while uncharacteristic of a substantive democracy, had to be understood as temporary perversion of American law and democratic ideals that would end when George W. Bush concluded his second term in the White House. In this view, the regime of George W. Bush and its demonstrated contempt for democracy was explained away as the outgrowth of a random act of politics— a corrupt election and the bad-faith act of a conservative court in 2000 or a poorly run election campaign in 2004 by an uncinematic and boring Democratic candidate. According to this narrative, the Bush-Cheney regime exhibited such extreme modes of governance in its embrace of an imperial presidency, its violation of domestic and international laws, and its disdain for human rights and democratic values that it was hard to view such antidemocratic policies as part of a pervasive shift toward a hidden order of authoritarian politics, which historically has existed at the margins of American society. It would be difficult to label such a government other than as shockingly and uniquely extremist, given a political legacy that included the rise of the security and torture state; the creation of legal illegalities in which civil liberties were trampled; the launching of an unjust war in Iraq legitimated through official lies; the passing of legislative policies that drained the federal surplus by giving away more than a trillion dollars in tax cuts to the rich; the enactment of a shameful policy of preemptive war; the endorsement of an inflated military budget at the expense of much-needed social programs; the selling off of as many government functions as possible to corporate interests; the resurrection of an imperial presidency; an incessant attack against unions; support for a muzzled and increasingly corporate-controlled media; the government production of fake news reports to gain consent for regressive policies; the use of an Orwellian vocabulary for disguising monstrous acts such as torture (“enhanced interrogation techniques”); the furtherance of a racist campaign of legal harassment and incarceration of Arabs, Muslims, and immigrants; the advancement of a prison binge through a repressive policy of criminalization; the establishment of an unregulated and ultimately devastating form of casino capitalism; the arrogant celebration and support for the interests and values of big business at the expense of citizens and the common good; and the dismantling of social services and social safety nets as part of a larger campaign of ushering in the corporate state and the reign of finance capital?

Authoritarianism With a Friendly Face

In the minds of the American public, the dominant media, and the accommodating pundits and intellectuals, there is no sense of how authoritarianism in its soft and hard forms can manifest itself as anything other than horrible images of concentration camps, goose-stepping storm troopers, rigid modes of censorship, and chilling spectacles of extremist government repression and violence. That is, there is little understanding of how new modes of authoritarian ideology, policy, values, and social relations might manifest themselves in degrees and gradations so as to create the conditions for a distinctly undemocratic and increasingly cruel and oppressive social order. As the late Susan Sontag suggested in another context, there is a willful ignorance of how emerging registers of power and governance “dissolve politics into pathology.”[10] It is generally believed that in a constitutional democracy, power is in the hands of the people, and that the long legacy of democratic ideals in America, however imperfect, is enough to prevent democracy from being subverted or lost. And yet the lessons of history provide clear examples of how the emergence of reactionary politics, the increasing power of the military, and the power of big business subverted democracy in Argentina, Chile, Germany, and Italy. In
spite of these histories, there is no room in the public imagination to entertain what has become the unthinkable—that such an order in its contemporary form might be more nuanced, less theatrical, more cunning, less concerned with repressive modes of control than with manipulative modes of consent—what one might call a mode of authoritarianism with a distinctly American character. [11]

Historical conjunctures produce different forms of authoritarianism, though they all share a hatred for democracy, dissent, and civil liberties. It is too easy to believe in a simplistic binary logic that strictly categorizes a country as either authoritarian or democratic, which leaves no room for entertaining the possibility of a mixture of both systems. American politics today suggests a more updated if not a different form of authoritarianism. In this context, it is worth remembering what Huey Long said in response to the question of whether America could ever become fascist: “Yes, but we will call it anti-fascist.”[12] Long’s reply suggests that fascism is not an ideological apparatus frozen in a particular historical period but a complex and often shifting theoretical and political register for understanding how democracy can be subverted, if not destroyed, from within. This notion of soft or friendly fascism was articulated in 1985 in Bertram Gross’s book Friendly Fascism, in which he argued that if fascism came to the United States it would not embody the same characteristics associated with fascist forms in the historical past. There would be no Nuremberg rallies, doctrines of racial superiority, government-sanctioned book burnings, death camps, genocidal purges, or the abrogation of the U.S. Constitution. In short, fascism would not take the form of an ideological grid from the past simply downloaded onto another country under different historical conditions. Gross believed that fascism was an ongoing danger and had the ability to become relevant under new conditions, taking on familiar forms of thought that resonate with nativist traditions, experiences, and political relations.[13] Similarly, in his Anatomy of Fascism, Robert O. Paxton argued that the texture of American fascism would not mimic traditional European forms but would be rooted in the language, symbols, and culture of everyday life. He writes: “No swastikas in an American fascism, but Stars and Stripes (or Stars and Bars) and Christian crosses. No fascist salute, but mass recitations of the Pledge of Allegiance. These symbols contain no whiff of fascism in themselves, of course, but an American fascism would transform them into obligatory litmus tests for detecting the internal enemy.”[14] It is worth noting that Umberto Eco, in his discussion of “eternal fascism,” also argued that any updated version of fascism would not openly assume the mantle of historical fascism; rather, new forms of authoritarianism would appropriate some of its elements, making it virtually unrecognizable from its traditional forms. Like Gross and Paxton, Eco contended that fascism, if it comes to America, will have a different guise, although it will be no less destructive of democracy. He wrote:

Ur-Fascism [Eternal Fascism] is still around us, sometimes in plainclothes. It would be much easier for us if there appeared on the world scene somebody saying, “I want to reopen Auschwitz, I want the Blackshirts to parade again in the Italian squares.” Life is not that simple. Ur-Fascism can come back under the most innocent of disguises. Our duty is to uncover it and to point our finger at any of its new instances—every day, in every part of the world.[15]

The renowned political theorist Sheldon Wolin, in Democracy Incorporated, updates these views and argues persuasively that the United States has produced its own unique form of authoritarianism, which he calls “inverted totalitarianism.”[16] Wolin claims that under traditional forms of totalitarianism, there are usually founding texts such as Mein Kampf, rule by a personal demagogue such as Adolf Hitler, political change enacted by a revolutionary movement such as the Bolsheviks, the constitution rewritten or discarded, the political state’s firm control over corporate interests, and an idealized and all-encompassing ideology used to create a unified and totalizing understanding of society. At the same time, the government uses all the power of its cultural and repressive state apparatuses to fashion followers in its own ideological image and collective identity.

In the United States, Wolin argues that an emerging authoritarianism appears to take on a very different form.[17] Instead of a charismatic leader, the government is now governed through the anonymous and largely remote hand of corporate power and finance capital. Political sovereignty is largely replaced by economic sovereignty as corporate power takes over the reins of governance. The dire consequence, as David Harvey points out, is that “raw money power wielded by the few undermines all semblances of democratic governance. The pharmaceutical companies, health insurance and hospital lobbies, for example, spent more than $133 million in the first three months of 2009 to make sure they got their way on health care reform in the United States.”[18] The more money influences politics, the more corrupt the political culture becomes. Under such circumstances, holding office is largely dependent on having huge amounts of capital at one’s disposal, while laws and policies at all levels of government are mostly fashioned by lobbyists representing big business corporations and commanding financial institutions. Moreover, as the politics of health care reform indicate, such lobbying, as corrupt and unethical as it may be, is now carried out in the open and displayed by insurance and drug companies as a badge of honor—a kind of open testimonial to the disrespect for democratic governance and a celebration of their power. The subversion of democratic governance in the United States by corporate interests is captured succinctly by Chris Hedges in his observation that

Corporations have 35,000 lobbyists in Washington and thousands more in state capitals that dole out corporate money to shape and write legislation. They use their political action committees to solicit employees and shareholders for donations to fund pliable candidates. The financial sector, for example, spent more than $5 billion on political campaigns, influenc[e] peddling and lobbying during the past decade, which resulted in sweeping deregulation, the gouging of consumers, our global financial meltdown and the subsequent looting of the U.S. Treasury. The Pharmaceutical Research and Manufacturers of America spent $26 million last year and drug companies such as Pfizer, Amgen and Eli Lilly kicked in tens of millions more to buy off the two parties. These corporations have made sure our so-called health reform bill will force us to buy their predatory and defective products. The oil and gas industry, the coal industry, defense contractors and telecommunications companies have thwarted the drive for sustainable energy and orchestrated the steady erosion of civil liberties. Politicians do corporate bidding and stage hollow acts of political theater to keep the fiction of the democratic state alive.[19]

Rather than being forced to adhere to a particular state ideology, the general public in the United States is largely depoliticized through the influence of corporations over schools, higher education, and other cultural apparatuses. The deadening of public values, civic consciousness, and critical citizenship is also the result of the work of anti-public intellectuals representing right-wing ideological and financial interests,[20] dominant media that are largely center-right, and a market-driven public pedagogy that reduces the obligations of citizenship to the endless consumption and discarding of commodities. In addition, a pedagogy of social and political amnesia works through celebrity culture and its counterpart in corporate-driven news, television, radio, and entertainment to produce a culture of stupidity, censorship, and diversionary spectacles.

Depoliticizing Freedom and Agency

Agency is now defined by a neoliberal concept of freedom, a notion that is largely organized according to the narrow notions of individual self-interest and limited to the freedom from constraints. Central to this concept is the freedom to pursue one’s self-interests independently of larger social concerns. For individuals in a consumer society, this often means the freedom to shop, own guns, and define rights without regard to the consequences for others or the larger social order. When applied to economic institutions, this notion of freedom often translates into a call for removing government regulation over the market and economic institutions. This notion of a deregulated and privatized freedom is decoupled from the common good and any understanding of individual and social responsibility. It is an unlimited notion of freedom that both refuses to recognize the importance of social costs and social consequences and has no language for an ethic that calls us beyond ourselves, that engages our responsibility to others. Within this discourse of hyper-individualized freedom, individuals are not only “liberated from the constraints imposed by the dense network of social bonds,” but are also “stripped of the protection which had been matter-of-factly offered in the past by that dense network of social bonds.” [21]

Freedom exclusively tied to personal and political rights without also enabling access to economic resources becomes morally empty and politically dysfunctional. The much-heralded notion of choice associated with personal and political freedom is hardly assured when individuals lack the economic resources, knowledge, and social supports to make such choices and freedoms operative and meaningful. As Zygmunt Bauman points out, “The right to vote (and so, obliquely and at least in theory, the right to influence the composition of the ruler and the shape of the rules that bind the ruled) could be meaningfully exercised only by those ‘who possess sufficient economic and cultural resources’ to be ‘safe from the voluntary or involuntary servitude that cuts off any possible autonomy of choice (and/or its delegation) at the root….[Choice] stripped of economic resources and political power hardly assure[s] personal freedoms to the dispossessed, who have no claim on the resources without which personal freedom can neither be won nor in practice enjoyed.”[22] Paul Bigioni has argued that this flawed notion of freedom played a central role in the emerging fascist dictatorships of the early twentieth century. He writes:

It was the liberals of that era who clamored for unfettered personal and economic freedom, no matter what the cost to society. Such untrammeled freedom is not suitable to civilized humans. It is the freedom of the jungle. In other words, the strong have more of it than the weak. It is a notion of freedom that is inherently violent, because it is enjoyed at the expense of others. Such a notion of freedom legitimizes each and every increase in the wealth and power of those who are already powerful, regardless of the misery that will be suffered by others as a result. The use of the state to limit such “freedom” was denounced by the laissez-faire liberals of the early 20th century. The use of the state to protect such “freedom” was fascism. Just as monopoly is the ruin of the free market, fascism is the ultimate degradation of liberal capitalism.[23]

This stripped-down notion of market-based freedom that now dominates American society cancels out any viable notion of individual and social agency. This market-driven notion of freedom emphasizes choice as an economic function defined largely as the right to buy things while at the same time cancelling out any active understanding of freedom and choice as the right to make rational choices concerning the very structure of power and governance in a society. In embracing a passive attitude toward freedom in which power is viewed as a necessary evil, a conservative notion of freedom reduces politics to the empty ritual of voting and is incapable of understanding freedom as a form of collective, productive power that enables “a notion of political agency and freedom that affirms the equal opportunity of all to exercise political power in order to participate in shaping the most important decisions affecting their lives.”[24] This merging of the market-based understanding of freedom as the freedom to consume and the conservative-based view of freedom as a restriction from all constraints refuses to recognize that the conditions for substantive freedom do not lie in personal and political rights alone; on the contrary, real choices and freedom include the individual and collective ability to actively intervene in and shape both the nature of politics and the myriad forces bearing down on everyday life—a notion of freedom that can only be viable when social rights and economic resources are available to individuals. Of course, this notion of freedom and choice is often dismissed either as a vestige of socialism or simply drowned out in a culture that collapses all social considerations and notions of solidarity into the often cruel and swindle-based discourse of instant gratification and individual gain. Under such conditions, democracy is managed through the empty ritual of elections; citizens are largely rendered passive observers as a result of giving undue influence to corporate power in shaping all of the essential elements of political governance and decision making; and manufactured appeals to fear and personal safety legitimate both the suspension of civil liberties and the expanding powers of an imperial presidency and the policing functions of a militaristic state.


I believe that the formative culture necessary to create modes of education, thought, dialogue, critique, and critical agency—the necessary conditions of any aspiring democracy—is largely destroyed through the pacification of intellectuals and the elimination of public spheres capable of creating such a culture. Elements of a depoliticizing and commodifying culture become clear in the shameless propaganda produced by the so-called “embedded” journalists, while a corporate-dominated popular culture largely operates through multiple technologies, screen cultures, and video games that trade endlessly in images of violence, spectacles of consumption, and stultifying modes of (il)literacy. Funded by right-wing ideological, corporate, and militaristic interests, an army of anti-public intellectuals groomed in right-wing think tanks and foundations, such as the American Enterprise Institute and Manhattan Institute, dominate the traditional media, police the universities for any vestige of critical thought and dissent, and endlessly spread their message of privatization, deregulation, and commercialization, exercising a powerful influence in the dismantling of all public spheres not dominated by private and commodifying interests. These “experts in legitimation,” to use Antonio Gramsci’s prescient phrase, peddle civic ignorance just as they renounce any vestige of public accountability for big business, giant media conglomerates, and financial mega corporations. How else to explain that nearly twenty percent of the American people believe incorrectly that Obama is a Muslim!

Under the new authoritarianism, the corporate state and the punishing state merge as economics drives politics, and repression is increasingly used to contain all those individuals and groups caught in an expanding web of destabilizing inequality and powerlessness that touches everything from the need for basic health care, food, and shelter to the promise of a decent education. As the social state is hollowed out under pressure from free-market advocates, right-wing politicians, and conservative ideologues, the United States has increasingly turned its back on any semblance of social justice, civic responsibility, and democracy itself. This might explain the influential journalist Thomas Friedman’s shameless endorsement of military adventurism in the New York Times article in which he argues that “The hidden hand of the market will never work without a hidden fist—McDonald’s cannot flourish without McDonnell Douglas, the designer of the U.S. Air Force F-15. And the hidden fist that keeps the world safe for Silicon Valley’s technologies to flourish is called the U.S. Army, Air Force, Navy and Marine Corps.”[25] Freedom in this discourse is inextricably wedded to state and military violence and is a far cry from any semblance of a claim to democracy.

Zombie Politics and the Culture of Cruelty

Another characteristic of an emerging authoritarianism in the United States is the correlation between the growing atomization of the individual and the rise of a culture of cruelty, a type of zombie politics in which the living dead engage in forms of rapacious behavior that destroy almost every facet of a substantive democratic polity. There is a mode of terror rooted in a neoliberal market-driven society that numbs many people just as it wipes out the creative faculties of imagination, memory, and critical thought. Under a regime of privatized utopias, hyper-individualism, and ego-centered values, human beings slip into a kind of ethical somnolence, indifferent to the plight and suffering of others. Though writing in a different context, the late Frankfurt School theorist Leo Lowenthal captured this mode of terror in his comments on the deeply sedimented elements of authoritarianism rooted in modern civilization. He wrote:

In a system that reduces life to a chain of disconnected reactions to shock, personal communication tends to lose all meaning….The individual under terrorist conditions is never alone and always alone. He becomes numb and rigid not only in relation to his neighbor but also in relation to himself; fear robs him of the power of spontaneous emotional or mental reaction. Thinking becomes a stupid crime; it endangers his life. The inevitable consequence is that stupidity spreads as a contagious disease among the terrorized population. Human beings live in a state of stupor, in a moral coma.[26]

Implicit in Lowenthal’s commentary is the assumption that as democracy becomes a fiction, the moral mechanisms of language, meaning, and ethics collapse, and a cruel indifference takes over diverse modes of communication and exchange, often as a register of the current paucity of democratic values, identities, and social relations. Surely, this is obvious today as all vestiges of the social compact, social responsibility, and modes of solidarity give way to a form of Social Darwinism with its emphasis on ruthlessness, cruelty, war, violence, hyper modes of masculinity, and a disdain for those considered weak, dependent, alien, or economically unproductive. A poverty of civic ideals is matched not only by a poverty of critical agency but also by the disappearance among the public of the importance of moral and social responsibilities. As public life is commercialized and commodified, the pathology of individual entitlement and narcissism erodes those public spaces in which the conditions for conscience, decency, self-respect, and dignity take root. The delusion of endless growth coupled with an “obsession with wealth creation, the cult of privatization [and] uncritical admiration for unfettered markets, and disdain for the public sector” has produced a culture that seems “consumed by locusts” in “an age of pygmies.”[27]

This culture of cruelty is especially evident in the hardships and deprivations now visited upon many young people in the United States. We have 13.3 million homeless children; one child in five lives in poverty; too many are now under the supervision of the criminal justice system, and many more young adults are unemployed and lack any hope for the future.[28] Moreover, we are subjecting more and more children to psychiatric drugs as a way of controlling their alleged unruly behavior while providing huge profits for drug companies. As Evelyn Pringle points out, “in 2006 more money was spent on treating mental disorders in children aged 0 to 17 than for any other medical condition, with a total of $8.9 billion.”[29] Needless to say, the drugging of American children is less about treating genuine mental disorders than it is about punishing so-called unruly children, largely children of the poor, while creating “lifelong patients and repeat customers for Pharma!”[30] Stories abound about poor young people being raped, beaten, and dying in juvenile detention centers, needlessly trafficked into the criminal justice system as part of a profit-making scheme cooked up by corrupt judges and private correction facilities administrators, and being given powerful antipsychotic medicines in schools and other state facilities.[31] Unfortunately, this regression to sheer Economic Darwinism is not only evident in increasing violence against young people, cutthroat reality TV shows, hate radio, and the Internet, it is also on full display in the discourse of government officials and politicians and serves as a register of the prominence of both a kind of political infantilism and a culture of cruelty. For instance, the Secretary of Education, Arne Duncan, recently stated in an interview in February 2010 that “the best thing that happened to the education system in New Orleans was Hurricane Katrina.”[32] Duncan’s point, beyond the incredible inhumanity reflected in such a comment, was that it took a disaster that uprooted thousands of individuals and families and caused enormous amounts of suffering to enable the Obama administration to implement a massive educational system pushing charter schools based on market-driven principles that disdain public values, if not public schooling itself. This is the language of cruelty and zombie politicians, a language indifferent to the ways in which people who suffer great tragedies are expelled from their histories, narratives, and right to be human. Horrible tragedies caused in part by government indifference are now covered up in the discourse and ideals inspired by the logic of the market. This mean and merciless streak was also on display recently when Lieutenant Governor Andre Bauer, who is running for the Republican nomination for governor in South Carolina, stated that giving people government assistance was comparable to “feeding stray animals.” The utterly derogatory and implicitly racist nature of his remark became obvious in the statement that followed: “You know why? Because they breed. You’re facilitating the problem if you give an animal or a person ample food supply. They will reproduce, especially ones that don’t think too much further than that. And so what you’ve got to do is you’ve got to curtail that type of behavior. They don’t know any better.”[33]

Lowenthal’s argument that in an authoritarian society “stupidity spreads as a contagious disease” is evident in a statement made by Michele Bachmann, a Republican congresswoman, who recently argued that “Americans should purchase [health] insurance with their own tax-free money.”[34] That 43 million Americans are without health insurance because they cannot afford it seems lost on Bachmann, whose comments suggest that these uninsured individuals, families, unemployed workers, and children are not simply a disposable surplus but actually invisible and therefore unworthy of any acknowledgment.

The regressive politics and moral stupidity are also evident in the emergence of right-wing extremists now taking over the Republican Party. This new and aggressive political formation calls for decoupling market-driven financial institutions from any vestige of political and governmental constraint, celebrates emotion over reason, treats critical intelligence as a toxin possessed largely by elites, wraps its sophomoric misrepresentations in an air of beyond-interrogation “we’re just folks” insularity, and calls for the restoration of a traditional, white, Christian, male-dominated America.[35] Such calls embody elements of a racial panic that are evident in all authoritarian movements and have increasingly become a defining feature of a Republican Party that has sided with far-right-wing thugs and goon squads intent on disrupting any vestige of the democratic process. This emerging authoritarian element in American political culture is embodied in the wildly popular media presence of Rush Limbaugh and Glenn Beck—right-wing extremists who share a contempt for reason and believe in organizing politics on the model of war, unconditional surrender, personal insults, hyper-masculine spectacles, and the complete destruction of one’s opponent.

The culture of cruelty, violence, and slander was on full display as the Obama administration successfully passed a weak version of health care reform in 2010. With a Republican Party that has either looked away from or in some cases supported the coded language of racism and violence, it was no surprise that there was barely a peep out of Republican Party leaders when racial and homophobic slurs were hurled by Tea Party demonstrators at civil rights legend John Lewis and openly gay Barney Frank, both firm supporters of the Obama health policies. Even worse is the nod that conservatives such as Sarah Palin have given to trigger-happy right-wing advocates of violence in their response to the passage of the health care bill. For instance, Frank Rich argues that

It’s this bill that inspired G.O.P. congressmen on the House floor to egg on disruptive protesters even as they were being evicted from the gallery by the Capitol Police last Sunday. It’s this bill that prompted a congressman to shout “baby killer” at Bart Stupak, a staunch anti-abortion Democrat. It’s this bill that drove a demonstrator to spit on Emanuel Cleaver, a black representative from Missouri. And it’s this “middle-of-the-road” bill, as Obama accurately calls it, that has incited an unglued firestorm of homicidal rhetoric, from “Kill the bill!” to Sarah Palin’s cry for her followers to “reload.” At least four of the House members hit with death threats or vandalism are among the 20 political targets Palin marks with rifle crosshairs on a map on her Facebook page.[36]

There is more at work here than the usual right-wing promotion of bigotry and ignorance; there is the use of violent rhetoric and imagery that mimics the discourse of terrorism reminiscent of Oklahoma bomber Timothy McVeigh, dangerous right-wing militia groups, and other American-style fascists. As Chris Hedges insists, “The language of violence always presages violence”[37] and fuels an authoritarianism that feeds on such excesses and the moral coma that accompanies the inability of a society to both question itself and imagine an alternative democratic order. How else can one read the “homicidal rhetoric” that is growing in America as anything other than an obituary for dialogue, democratic values, and civic courage? What does it mean for a democracy when the general public either supports or is silent in the face of widely publicized events such as black and gay members of Congress being subjected to racist and homophobic taunts, a black congressman being spit on, and the throwing of bricks through the office windows of some legislators who supported the health care bill? What does it mean for a democracy when there is little collective outrage when Sarah Palin, a leading voice in the Republican Party, mimics the tactics of vigilantes by posting a map with crosshairs on the districts of Democrats and urges her supporters on with the shameful slogan “Don’t Retreat. Instead—RELOAD!” Under such circumstances, the brandishing of assault weapons at right-wing political rallies, the posters and signs comparing Obama to Hitler, and the ever-increasing chants to “Take Our Country Back” echo what Frank Rich calls a “small-scale mimicry of Kristallnacht.”[38] Violence and aggression are now openly tolerated and in some cases promoted. The chants, insults, violence, and mob hysteria all portend a dark period in American history—an historical conjuncture in which the death knell for democracy is being written as the media turn such events into spectacles rather than treat them as morally and politically repugnant acts more akin to the legacy of fascism than the ideals of an aspiring democracy. All the while the public yawns or, more troubling, engages in fantasies of reloading.

Unfortunately, the problems now facing the United States are legion and further the erosion of a civic and democratic culture. Some of the most glaring issues are massive unemployment; a rotting infrastructure; the erosion of vital public services; the dismantling of the social safety net; expanding levels of poverty, especially for children; and an imprisonment binge largely affecting poor minorities of color. But such a list barely scratches the surface. In addition, we have witnessed in the last thirty years the restructuring of public education as either a source of profit for corporations or an updated version of control modeled after prison culture coupled with an increasing culture of lying, cruelty, and corruption, all of which belie a democratic vision of America that now seems imaginable only as a nostalgic rendering of the founding ideals of democracy.

NOTES

1. Hannah Arendt, Between Past and Future (1968; New York: Penguin Books, 1993), p. 196.

2. I have taken this term from Stephen Jones, ed., The Dead That Walk (Berkeley, CA: Ulysses Press, 2010).

3. Editorial, “Wall Street Casino [6],” The New York Times (April 28, 2010), p. A24.

4. Some of the ideas come from Richard Greene and K. Silem Mohammad, eds., Zombies, Vampires, and Philosophy: New Life for the Undead (Chicago: Open Court, 2010).

5. Arun Gupta, “Party of No: How Republicans and the Right Have Tried to Thwart All Social Progress [7],” Truthout.org (May 21, 2010).

6. Jonathan J. Cooper, “We’re All Arizonians Now [8],” Huffington Post (May 15, 2010).

7. See the excellent commentary on this issue by Frank Rich, “The Rage Is Not About Health Care,” The New York Times (March 28, 2010), p. WK10. See also Justine Sharrock, “The Oath Keepers: The Militant and Armed Side of the Tea Party Movement [9],” AlterNet (March 6, 2010); and Mark Potok, “Rage on the Right: The Year in Hate and Extremism [10],” Southern Poverty Law Center Intelligence Report 137 (Spring 2010).

8. Paul Krugman, “Going to Extreme,” The New York Times (May 16, 2010), p. A23.

9. James Traub, “The Way We Live Now: Weimar Whiners [11],” The New York Times Magazine (June 1, 2003). For a commentary on such intellectuals, see Tony Judt, “Bush’s Useful Idiots [12],” The London Review of Books 28:18 (September 21, 2006).

10. Cited in Carol Becker, “The Art of Testimony,” Sculpture (March 1997), p. 28.

11. This case for an American version of authoritarianism was updated and made more visible in a number of interesting books and articles. See, for instance, Chris Hedges, American Fascists: The Christian Right and the War on America (New York: Free Press, 2006); Henry A. Giroux, Against the Terror of Neoliberalism: Politics Beyond the Age of Greed (Boulder, CO: Paradigm Publishers, 2008); and Sheldon S. Wolin, Democracy Incorporated: Managed Democracy and the Specter of Inverted Totalitarianism (Princeton: Princeton University Press, 2008).

12. Cited in Paul Bigioni, “Fascism Then, Fascism Now [13],” Toronto Star (November 27, 2005).

13. See Bertram Gross, Friendly Fascism: The New Face of Power in America (Montreal: Black Rose Books, 1985).

14. Robert O. Paxton, The Anatomy of Fascism (New York: Alfred A. Knopf, 2004), p. 202.

15. Umberto Eco, “Eternal Fascism: Fourteen Ways of Looking at a Blackshirt,” New York Review of Books (November–December 1995), p. 15.

16. Wolin, Democracy Incorporated.

17. Along similar theoretical lines, see Stephen Lendman, “A Look Back and Ahead: Police State in America [14],” CounterPunch (December 17, 2007). For an excellent analysis that points to the creeping power of the national security state on American universities, see David Price, “Silent Coup: How the CIA Is Welcoming Itself Back onto American University Campuses,” CounterPunch 17:3 (January 13–31, 2010), pp. 1–5.

18. David Harvey, “Organizing for the Anti-Capitalist Transition [15],” Monthly Review (December 15, 2009).

19. Chris Hedges, “Democracy in America Is a Useful Fiction [16],” TruthDig (January 24, 2010).

20. See Janine R. Wedel, Shadow Elite: How the World’s New Power Brokers Undermine Democracy, Government, and the Free Market (New York: Basic Books, 2010).

21. Zygmunt Bauman, Liquid Times: Living in an Age of Uncertainty (London: Polity Press, 2007), pp. 57–58.

22. Ibid., p. 64.

23. Bigioni, “Fascism Then, Fascism Now.”

24. Cornelius Castoriadis, “The Nature and Value of Equity,” Philosophy, Politics, Autonomy: Essays in Political Philosophy (New York: Oxford University Press, 1991), pp. 124–142.

25. Thomas L. Friedman, “A Manifesto for the Fast World [17],” The New York Times Magazine (March 28, 1999).

26. Leo Lowenthal, “Atomization of Man,” False Prophets: Studies in Authoritarianism (New Brunswick, NJ: Transaction Books, 1987), pp. 182–183.

27. Tony Judt, Ill Fares the Land (New York: Penguin Press, 2010), pp. 2–3.

28. I have taken up this issue in my Youth in a Suspect Society: Democracy or Disposability? (New York: Palgrave, 2009). For a series of brilliant commentaries on youth in America, see the work of Tolu Olorunda in The Black Commentator, Truthout, and other online journals.

29. Evelyn Pringle, “Why Are We Drugging Our Kids?,” Truthout (December 14, 2009), http://www.alternet.org/story/144538 [18].

30. Ibid.

31. See Nicholas Confessore, “New York Finds Extreme Crisis in Youth Prisons,” The New York Times (December 14, 2009), p. A1; Duff Wilson, “Poor Children Likelier to Get Antipsychotics,” The New York Times (December 12, 2009), p. A1; and Amy Goodman, “Jailing Kids for Cash [19],” Truthout (February 17, 2009).

32. Jake Tapper, “Political Punch: Power, Pop, and Probings from ABC News Senior White House Correspondent—Duncan: Katrina Was the ‘Best Thing’ for New Orleans School System [20],” ABC News.com (January 29, 2010).

33. Nathaniel Cary, “GOP Hopeful: People on Public Assistance ‘Like Stray Animals [21],’” Truthout (January 23, 2010).

34. Cited in Frank Rich, “The State of the Union Is Comatose,” The New York Times (January 31, 2010).

35. See, for example, Patrick J. Buchanan, “Traditional Americans Are Losing Their Nation [22],” WorldNetDaily (January 24, 2010).

36. Frank Rich, “The Rage Is Not About Health Care,” The New York Times (March 28, 2010), p. WK10.

37. Chris Hedges, “Is America ‘Yearning for Fascism’? [23],” TruthDig (March 29, 2010).

38. Rich, “The State of the Union Is Comatose,” p. WK10.

Christopher Columbus & His Crimes Against Humanity

In Uncategorized on October 11, 2010 at 10:55 am

Oldspeak: “Motivated by greed, Apocalyptic Christianity and lust for wealth and power… Christopher Columbus went forth into the world and made a mess of millions of innocent people’s lives.”

From Winter Rabbit @ Native American Netroots:

The Christian Crusades had ended in 1291. The Black Death was deliberately blamed on innocent Jews, who, under torture, said what their Christian torturers forced them to say: that they had poisoned the wells and caused the plague.
Of course, the real cause was in the stomachs of fleas, not planetary alignment, earthquakes, or God’s Judgment. Nonetheless, the extermination of European Jews began again in 1348, along with a key, notorious origin of Manifest Destiny.

Source

But no sooner had the plague ceased than we saw the contrary . . . [People] gave themselves up to a more shameful and disordered life than they had led before…. Men thought that, by reason of the fewness of mankind, there should be abundance of all produce of the land; yet, on the contrary, by reason of men’s ingratitude, everything came to unwonted scarcity and remained long thus; nay, in certain countries.

Christopher Columbus was born in 1451, barely over a century later, in the city-state of Genoa, Italy, after this newest Christian campaign to exterminate the European Jews. Columbus educated himself, and his father was a wool merchant (3). Columbus was a map maker and a sailor in his forties; consequently, he knew that the world was round. What were three of the motivations that led him to set sail on August 3, 1492 on the Pinta, the Nina, and the Santa Maria from the “Southern Spanish port of Palos”? Greed for gold, capitalistic greed through the potential wealth of the slave trade, and the religious beliefs of Apocalyptic Christianity were three primary motivations Columbus had for setting sail, motivations that, in consequence, fueled genocide against tens of millions of Indigenous People.

One of Columbus’s motivations was greed for gold, which he acquired on the Gold Coast in the Portuguese colony (3).

Christopher Columbus: The Untold Story

Christopher Columbus:

“Gold is most excellent; gold is treasure, and he who possesses it does all he wishes to in this world.” [2]

Another of Columbus’s motives for making the journey was capitalistic greed: the potential wealth of the slave trade, which, fed by the desire for sugar, resulted in more and more slavery and led to the atrocities of the Middle Passage.

Source

Sugar cane was the number one crop that produced growth for Europe. It was brought to the New World from Spain by Christopher Columbus and later shipped to the rest of Europe. The growing sugar industry called for the use of African slaves; African slave labor and the plantations are what formed the Americas. The work performed on the plantations, which produced large quantities of sugar, created an even greater need for slaves: the enslaved Africans brought to the Atlantic World by the Middle Passage.

The religious beliefs of Apocalyptic Christianity were yet another of Columbus’ motivations for setting sail, and it was the most illogical motivation he possessed. His greed for gold could be coldly construed as a more practical reason, except for all of the Indigenous People he would later have to exterminate to get it, which he probably did not yet know at the time; he had only ventured to the Gold Coast. His use of the slave trade for monetary gain was illogical enough, for it denied the very humanity of the African People and the Indigenous People he would force into slavery; his beliefs regarding Apocalyptic Christianity, however, were projected outward toward the entire world.

Source

During those same long centuries they had further expressed their ruthless intolerance of all persons and things that were non-Christian by conducting pogroms against the Jews who lived among them and whom they regarded as the embodiment of the Antichrist, imposing torture, exile, and mass destruction on those who refused to succumb to evangelical persuasion.

Columbus was possessed by the obsession that Christ would return only if the Gospel was spread far and wide. Apocalyptic Christianity taught him that either a savior in human form would prepare the way for Christ to return in the midst of a war between good and evil and history would end; or that, after the earth suffered dire consequences, evil would increase while love would decrease, and then Christ would return with the Final Judgment and end history; or that a period of peace would precede the Final Judgment. During this “period of peace,” the Jews would be converted, while “the heathens would be either converted or annihilated.” I think the latter best reflects Columbus’s personal view of Apocalyptic Christianity. I will state why after a couple of lesser-known facts, in order to set up a contrast.

The Indigenous People very well may have had a much better future then and history now if Christopher Columbus had perished in the Atlantic on February 14, 1493. For the first European to land in America was Leif Ericson, a Viking seaman from Greenland (see Ericson). The ancient sagas give different accounts of this voyage made in the year 1000.

As for contacts of New World peoples with Europe, the sole early ones involved the Norse who occupied Greenland in very small numbers between A.D. 986 and about 1500. But these Norse visits had no discernible impact on Native American societies. (2)

The Norse left “no discernible impact.” I cannot answer why that is, except to note that Viking voyages decreased and ended during the slow process of the Christianization of Scandinavia. So by contrast, Columbus had an enormous impact that is more far-reaching than he could have imagined. Ironic indeed, since he grossly underestimated the earth’s size prior to setting sail. For example, “He thought that Japan lay only three thousand miles from the southern European Coast (3).” He may then have also grossly underestimated the sheer numbers of the Indigenous population in the lands of the Americas that he did not, in fact, first discover. No matter though, for such “heathens” would either have to be “converted or annihilated.”

To be sure, the real annihilations did not start until the beginning of Columbus’ second voyage to the Americas in 1493 (1). For while he had expressed admiration for the overall generosity of Indigenous People (1) and considered the Tainos to be “Very handsome, gentle, and friendly,” he interpreted all these positive traits as signs of weakness and vulnerability, saying “if devout religious persons knew the Indian Language well, all these people would soon become Christians (3).” As a consequence, he kidnapped some of the Tainos and took them back to Spain.

It would be easy, he asserted, to “subject everyone and make them do what you wished (3).”

Indeed, he did subject everyone he had the power to subject.

Source

On his second voyage, in December 1494, Columbus captured 1,500 Tainos on the island of Hispaniola and herded them to Isabela, where 550 of ”the best males and females” were forced aboard ships bound for the slave markets of Seville.

Under Columbus’s leadership, the Spanish attacked the Taino, sparing neither men, women nor children. Warfare, forced labor, starvation and disease reduced Hispaniola’s Taino population (estimated at one million to two million in 1492) to extinction within 30 years.

Furthermore, Columbus wrote a letter to the Spanish governor of the island of Hispaniola. Columbus asked the governor to cut off the ears and noses of any of the slaves who resisted being subjugated to slavery.

…It is estimated that 100 million Indians from the Caribbean, Central, South, and North America perished at the hands of the European invaders. Sadly, unbelievably, really, much of that wholesale destruction was sanctioned and carried out by the Roman Catholic Church and various Protestant denominations. (1: p. 37)

Greed for gold, capitalistic greed through the potential wealth of the slave trade, and the religious beliefs of Apocalyptic Christianity were three primary motivations Columbus had for setting sail. He was successful in his aims, which fueled genocide against tens of millions of Indigenous People. He was successful in promoting and helping to establish slavery by bringing sugar from Spain to the New World and on to Europe, which created, in the eyes of some of humanity’s greatest criminals, the evil necessity of the Middle Passage, where slaves packed like cargo between decks often had to lie in each other’s feces, urine, and blood.

Columbus’ “successes,” all crimes against humanity, are even more pronounced in these modern times. A day has been observed in his honor since 1971 (4). That’s one success. Here are more of Columbus’ “successes,” from a book I highly recommend buying.

Unlearning the Language of Conquest: Scholars Expose Anti-Indianism in America, edited by Four Arrows (Don Trent Jacobs), p. 237:

As Moyers pointed out, this “mentality” and blind acceptance of biblical inerrancy, which contributed to the genocide of American Indians during Columbus’ time, has, in many ways, continued and continues to inform U.S. foreign policy, including its dealings with its own sovereign Indian Nations.

Christopher Columbus: The Untold Story

“We shall take you and your wives and your children, and shall make slaves of them, and as such shall sell and dispose of them as their Highnesses may command; and we shall take away your goods, and shall do all the harm and damage that we can.” [11]

Source

Mark Twain:

“History doesn’t repeat itself, but it does rhyme.”

http://64.38.12.138/News/2010/…

“They treat us just like guinea pigs when it comes to Indian Health Services.” That’s how one woman on the Cheyenne River Sioux reservation described the birth of her second child. She is not alone. Today, the ACLU and the ACLU of South Dakota filed a Freedom of Information Act (FOIA) lawsuit against Indian Health Services (IHS), seeking information about the provision of reproductive health care services to the women of the Cheyenne River Sioux.

- snip -

Many women report that they are being told to forgo natural labor and delivery, and instead accept medication to induce labor, either on or before their due dates, at a time selected exclusively by their doctor. They are given little or no counseling – indeed, many women say that the first time their doctor spoke to them about induction of labor was on the day they were induced.

Sources:

(1): Kurt Kaltreider, Ph.D. “American Indian Prophecies.” pp. 49-57.

(2): Jared Diamond. “Guns, Germs, And Steel.” pp. 67, 79.

(3): Norton, Katzman, Escott, Chudacoff, Paterson, and Tuttle. “A People & A Nation.” pp. 20-23.

(4): Four Arrows (Don Trent Jacobs). “Unlearning the Language of Conquest.” pp. 20, 236, 31, 275.


 

A Quarter of Americans Have No Idea What We’re Celebrating This Weekend.

In Uncategorized on July 3, 2010 at 1:24 pm

Oldspeak: “Ignorance is Strength.”

From New York Magazine/CBS News:

Even though our national holidays are mostly used as excuses to eat copiously, do something fun outside, and leave work early, is it too much to ask for Americans to at least be aware of even the most basic historical facts about the day they’re celebrating, and, by extension, the country they live in? As July 4 approaches this Sunday, a Marist poll shows that 26 percent of Americans, including 40 percent of 18- to 29-year-olds, don’t know what country the United States won independence from. Incorrect answers included France, China, Japan, Mexico, and Spain. China and Japan, really? Sigh.

The discouraging numbers were broken down further, and the South proved to be the least historically knowledgeable region in the country.

Nearly 1/3 of all Southerners didn’t know the United States won independence from the British. The Midwest was second worst with over ¼ of residents in that part of the country not knowing who America beat to secure its independence.

When pollsters broke the numbers down by economic status, just under 40 percent of people who make less than $50,000 didn’t know that America beat Britain for its independence.

Broken down by age group, 40 percent of respondents between 18 and 29 had no idea America won independence from England.

Finally, 19 percent of men who took part in the study couldn’t name the country America won its independence from, while 1/3 of women surveyed couldn’t name England as the former empire that controlled the American colonies.

Texas Textbook War: “Slavery” or “Atlantic Triangular Trade”?

In Uncategorized on May 20, 2010 at 1:44 pm

Oldspeak:  Among the proposed changes: Students would be required to learn about the “unintended consequences” of Title IX, affirmative action, and the Great Society, and would need to study conservative icons like Phyllis Schlafly, the Heritage Foundation, and the Moral Majority. The slave trade would be renamed the “Atlantic triangular trade,” American “imperialism” changed to “expansionism,” and all references to “capitalism” have been replaced with “free enterprise.”  Behold! The Ministry of Truth 2010.

From Amanda Paulson  @ The Christian Science Monitor:

Thomas Jefferson out, Phyllis Schlafly in?

While the proposed changes to Texas social studies standards aren’t quite so simple (and contrary to some reports, Thomas Jefferson would still be part of the curriculum), the debate over the standards pushed by a conservative majority of the Texas Board of Education – which will be voted on this week – has resulted in a partisan uproar and generated interest far beyond the Lone Star State.

Conservatives say that the changes are a long-overdue correction to a curriculum that too often deemphasizes religion and caters to liberal views. Critics are dismayed at what they see as an attempt to push conservative ideology – even if it flies in the face of scholarship – into textbooks. And with a textbook industry that is often influenced by the standards in the largest states, there is a chance that the changes have influence beyond Texas.

“Decisions that are made in Texas have a ripple effect across the country,” says Phillip VanFossen, head of the Department of Curriculum and Instruction and a professor of social studies education at Purdue University.

Still, he notes, as the pendulum swings toward national standards – which have yet to be developed for social studies – that influence might wane. Just in case, California this week passed a bill out of a Senate committee that would ensure no California textbooks contain any Texas-driven changes.

Conservatives dominate Texas Board of Education

The root of the uproar is a regular process in which the Texas Board of Education revises the state’s standards. Far more than in most states, the elected board is entrusted to write standards itself, rather than merely approve them. With a 10-5 Republican majority, including a coalition of seven social conservatives, the board has pushed what some see as a particularly partisan agenda.

Among the changes: Students would be required to learn about the “unintended consequences” of Title IX, affirmative action, and the Great Society, and would need to study conservative icons like Phyllis Schlafly, the Heritage Foundation, and the Moral Majority.

The slave trade would be renamed the “Atlantic triangular trade,” American “imperialism” changed to “expansionism,” and all references to “capitalism” have been replaced with “free enterprise.”

The role of Thomas Jefferson – who argued for the separation of church and state – is minimized in several places, and the standards would emphasize the degree to which the Founding Fathers were driven by Christian principles.

“In the 18 months that the state board has worked on these standards, they’ve struck a balance that our members feel will give public school students a fuller and stronger appreciation of the religious and cultural roots of American history,” says Brent Connett, a policy analyst with the Texas Conservative Coalition, which released a letter this week calling on the board to approve the standards and to ignore calls for delay.

But others say they are dismayed at the degree to which the standards seem to have been written without regard for scholarship.

Professor VanFossen, for instance, was bothered by a new requirement, written without input from economists, that students analyze the decline in value of the US dollar and the abandonment of the gold standard, and by amendments that would try to put a more positive spin on Sen. Joe McCarthy’s communist witch hunt.

“It’s ideologically driven,” he says, adding that he’s also bothered that many of the most important skills students need to learn – debate and discussion, constructing arguments, reconciling different perspectives – are being lost amid the highly prescriptive and detailed content.

Others say that whether or not national textbooks are ultimately influenced by Texas (the textbook industry has sought to downplay that fear), the furor that this has caused will be detrimental to future attempts to create standards.

‘No one wants to touch social studies’

“No one wants to touch social studies,” says Peggy Altoff, past president of the National Council for the Social Studies and co-chair of the committee that set social studies standards in Colorado.

Ms. Altoff says it doesn’t have to be such a political, partisan process, and cites Colorado’s experience as an example. Since often what stokes people’s anger the most is who is included for study – Cesar Chavez or Newt Gingrich; Thurgood Marshall or Thomas Aquinas – she suggests standards that offer examples, but don’t limit curricula to those figures.

“It doesn’t have to be the Texas debacle,” she says.

Whatever the vote is this week, the conservative influence on the board may be waning.

Don McLeroy, the author of many of the most contentious amendments and a leader of the conservative coalition, was defeated in March in a primary by an opponent who was critical of his approach. Another key social conservative, Cynthia Dunbar, is not seeking reelection, and a more moderate candidate won the GOP primary in her district.

Jackson State Massacre 40 years on: Why 2 died in Mississippi

In Uncategorized on May 14, 2010 at 11:12 am

Oldspeak: One of the most colossal blunders in Mississippi law enforcement history. 2 young men killed, 15 more wounded. A farce of an investigation and trial following the tragedy. All in the name of squashing people’s rights to be free and equal citizens of this country.

Without a doubt, the spring of 1970 was a tense and hot season for American college students. Protests and riots flared up over the escalation of the war in Vietnam, the sending of US troops to Cambodia, the environment, civil rights, and the inclusion of women into many formerly all-male universities and colleges. Certainly Kenyon College was experiencing just such a difficult transitional period. At Jackson State University in Jackson, Mississippi, tensions were particularly high in regards to racism and civil rights.

Since its establishment as a teacher’s college in the late 1800s, Jackson State had been subject to racism. The school moved from its original location because it was too close to an all-white area, and established a new campus in an entirely black neighborhood. Lynch Street, named for Mississippi’s first black congressman, bisected the new campus and linked west Jackson, a white suburb, to the downtown area.

In the early 1960s, a Masonic Temple just down the block from the university on Lynch Street was the headquarters for the Mississippi civil rights movement. Despite the proximity of the headquarters to the school, JSU students participated little in demonstrations and protests. A state school, Jackson could not afford to alienate the all-white board of education.

At Tougaloo College, a nearby private institution, students openly protested and organized sit-ins, and brought a march to Jackson State after being forced from a whites-only library.

Every spring, a mini-riot occurred at Lynch Street and the thoroughfare was temporarily closed, but the city still refused to permanently reroute traffic despite numerous pleas from university officials.

In the spring of 1970, a popular female student was injured by a white motorist while attempting to cross Lynch Street. White motorists had already angered the student population by shouting racial slurs and epithets from their windows while driving past campus, and students had retaliated by throwing rocks and bottles. With the national stress factors already mentioned, and the deaths of four Kent State University students just over a week earlier, Jackson State students had enough to be worried about. Racism and the struggle for civil rights made their situation even more unbearable.

What occurred at Jackson State University was a protest against racism. Unlike Kent State, students had not rallied to protest the war in Vietnam.

On May 13, 1970, students amassed on Lynch Street but did not get out of hand. Governor John Bell Williams ordered the Highway Patrol to establish order on the Jackson State campus, and students did not resist.

The next day, the President of the school twice met with students to listen to their concerns, but tension continued to mount.

Around 9:30 PM on May 14, JSU students heard a rumor that Fayette, Mississippi mayor Charles Evers, brother of murdered civil rights activist Medgar Evers, had been killed along with his wife. Students again gathered on Lynch Street and began rioting.

The ROTC building was set on fire, a street light was broken, and a small bonfire was built, but the riot was still a small one. Several white motorists called police to complain that students had thrown rocks at their passing cars, but eyewitnesses later proved that it was non-students, known as “cornerboys,” who did the rock throwing. Firemen arrived to extinguish the fires, but requested police protection after students harassed them as they worked.

Police arrived, blocked off Lynch Street, and cordoned off a thirty block area surrounding the University. Later police told the media that they had received reports of gunfire for an hour and a half before arriving on campus. On the west end of Lynch Street, National Guardsmen assembled, still on call for rioting of the night before. The guardsmen had weapons but no ammunition.

There were seventy-five city policemen and Mississippi state officers on the Lynch Street side of Stewart Hall, a men’s dormitory, to hold back the crowd as firemen extinguished a blaze. They were armed with carbines, submachine guns, shotguns, service revolvers and some personal weapons.

When the firemen had departed, the police marched together, weapons in hand, down Lynch Street towards Alexander Center, a women’s dormitory, for reasons still unclear today. A crowd of 75 to 100 students massed together in front of the officers at a distance of about 100 feet. There were reports that students shouted obscenities at officers and threw bricks.

Someone either threw or dropped a bottle, and it broke on the pavement with a loud noise. Some say police then advanced, while others insist the officers simply opened fire; still others believe a campus security officer had the students under control. At any rate, police began shooting, and later said they had been fired upon by someone inside the Alexander West dormitory or that a powder flare had been spotted in the third-floor stairwell window. Two television news reporters agreed that a student had fired first, but were unsure as to where, while a radio reporter believed a hand holding a pistol had extended from a window in the women’s dormitory.

At 12:05 AM on May 15, then, police opened fire on Jackson State students and fired for approximately thirty seconds. Students ran for cover, mostly inside one of the doors to Alexander West dormitory. Later police insisted that they had only fired on the dorm, but today bullet holes can still be found in a building façade 180 degrees across the street.

Struggling to get inside, students bottlenecked at the west end door of Alexander West. Some were trampled, while others fell from buckshot pellets and bullets. They were either left on the grass or dragged inside.

Fifty feet from the west end entrance to the dormitory, Phillip Lafayette Gibbs, age 21, lay dead from four gunshot wounds: two in his head, one under his left eye, and one in his left armpit. Gibbs left behind a wife, one child, and another on the way.

Behind the police line across the street, James Earl Green, age 17, was lying dead in front of B. F. Roberts Dining Hall. Green was a senior at Jim Hill High School and on his way home from work at a grocery store when he paused to watch the riot. Police later claimed they had been fired on from the dining hall. Green was killed by one gunshot.

Fifteen other students were wounded, at least one of whom was sitting inside the dormitory lobby.

Each window facing the police in the five story dormitory was shattered. At least 460 rounds were fired on the building, while investigators counted over 160 bullet holes on the outside of the stairwell alone.

Ambulances were not called for the injured students for twenty minutes while officers picked up their shell casings. Police also attempted to remove the shattered glass, but students and even sympathetic whites camped out on the lawn to prevent this until investigators had arrived.

Police and state troopers left the scene, while National Guardsmen remained behind. Jackson city officials later denied that city police had fired, and made no issue of the involvement of highway patrolmen.

“At Kent State there are photographs from the beginning of the incident on,” comments current Jackson State University archivist Juanita Murray. “There are no photographs of that night at Jackson State. There are no photographs of the bodies lying on the ground… there was enough time to cover up what happened before the next newscast.”

Local media coverage was poor and racist, with a few papers reporting that blood tests revealed that Gibbs was legally drunk when he was shot. Even the university newspaper did not report on the tragedy until a special edition one year later.

Members of a grand jury and a jury at a civil trial refused to indict any of the officers involved in the shootings. In 1974, a US Court of Appeals ruled that the officers had overreacted but that they could not be held liable for the two deaths that resulted. In 1982, all but two US Supreme Court Justices refused to hear the case.

On June 13, 1970, President Nixon formed the Commission on Campus Unrest. After the Commission’s first meeting on June 25, public hearings were held for thirteen days at Jackson, Mississippi; Kent State, Ohio; Washington, DC; and Los Angeles, California. Despite the testimony of Jackson State administration, faculty, staff, and students, no arrests or convictions were made.

The Jackson City Council voted to permanently close Lynch Street to through traffic, and added the initials J. R. to street signs to denote John R. Lynch, Mississippi’s first black congressman.

After the closing of Lynch Street, a plaza was constructed. The Gibbs-Green Plaza, commonly referred to as “Plaza,” is a multi-level brick and concrete structure that blocks off J. R. Lynch Street and lies between Alexander Hall and the University Green. Students often meet and spend time here in good weather. Jackson State frequently holds outdoor events on the Plaza, such as dances, concerts, Greek shows, and Homecoming gatherings.

Just north of the Plaza is the Gibbs-Green Monument, which stands outside of Alexander West dormitory.

In 1995, Demetrius Gibbs, son of Phillip Gibbs, received his degree from Jackson State. He says, “If I try to tell people about the shootings at Jackson State, they don’t know about it. They don’t know until I say ‘Kent State.’ For us to even be acknowledged, it had to happen at Kent State first.”

Beyond Cerveza: The Real History of Cinco de Mayo

In Uncategorized on May 5, 2010 at 11:45 am

By: Allison Ford @ Divine Caroline

As Americans, we celebrate plenty of holidays that we don’t fully understand or appreciate. Here’s a pop quiz: What does St. Patrick’s Day actually commemorate? What is the real significance of Mardi Gras? Both festivals have legitimate origins, but nowadays they seem to be just two more excuses to dress up in funny hats and drink copious amounts of alcohol. Many people also love to celebrate Cinco de Mayo, but how many of us know what the holiday really stands for?

Most Americans assume that Cinco de Mayo honors the day of Mexican independence, but that’s not correct. Far from being the Mexican version of our Fourth of July, it’s actually more like the Mexican version of Columbus Day, a holiday acknowledged by a few people but completely ignored by the majority. It commemorates the Battle of Puebla, which took place on May 5, 1862, when Mexican forces, despite being badly outnumbered, defeated the invading French army. The battle would not even be considered historically significant except for the fact that the Mexicans’ underdog win became a great source of national pride, and because it marks the last time any foreign army invaded North American soil.

In Mexico, it’s considered a regional celebration, observed in the state of Puebla but not many other places. It’s not even an official federal holiday. But Americans have embraced Cinco de Mayo and everything that accompanies it—mostly the music, the food, and, of course, the cerveza.

Drink-o de Mayo
The holiday started to gain steam in the United States in the 1960s and ’70s. As an increasing number of Hispanic immigrants found their culture underrepresented in schools and in public life, people looked for ways to acknowledge and celebrate Hispanic heritage, and companies looked for ways to increase sales and promote their products. What started out as smallish gatherings in cities with large Hispanic populations turned into a new national holiday, celebrated with far more vigor in the United States than anywhere in Mexico. Some have suggested that Cinco de Mayo, rather than the real Mexican Independence Day (September 16), was presented as the more palatable of the two holidays to white Americans because it didn’t have the anti-imperialist sentiment that many Independence Day celebrations did.

Just as “everyone’s Irish on St. Patty’s Day,” beer companies encourage everyone to embrace Mexican heritage on this day, which results in consumers of all races and nationalities buying Mexican beer, liquor, and food. As the Hispanic population continues to grow, and more and more people develop a taste for Mexican food and beverages, the holiday becomes more mainstream. Corona Extra is now the number-one imported beer in the United States, and May 5 has become a recognized party holiday, just like St. Patrick’s Day or Mardi Gras, often with a buildup of a month or longer. Other companies have gotten in on the Cinco action, too, including Frito-Lay companies, with their salsa and guacamole products, and even American brewers looking to steal some of the market away from Corona, Tecate, and Dos Equis.

Growing Population, Growing Dilemma
People of Hispanic descent don’t always appreciate the blatant co-opting of their culture for the sake of partying and profits. Just as many Americans are offended by the commercialization of Christmas, some Mexicans feel that it’s disrespectful to use their nation’s history as a cheap excuse to hit up happy hour. According to the San Francisco Chronicle, a growing number want to take back the holiday, returning it to a celebration that focuses on family and culture, instead of on tequila and taquitos. As the holiday’s profile broadens, communities are becoming just as likely to see family-friendly festivals that include traditional music, food, storytelling, and dancing as they are to see out-of-control bar crowds.

Another growing concern is how the holiday’s marketers disproportionally target young Hispanics. The Center on Alcohol Marketing and Youth found that young Latinos and Latinas are exposed to more alcohol advertising than their adult counterparts. Major brewers like Anheuser-Busch and Coors even have entire divisions specializing in marketing to Hispanics. The result is that Hispanic adolescents are more likely to drink and get drunk than their peers of other racial groups, and they’re more likely to start drinking at an early age. The companies encourage young Hispanics to see the celebration of Cinco de Mayo (and its many inebriated revelers) as a matter of ethnic pride, and because Hispanics are the fastest-growing group of immigrants, it’s easy to imagine that this targeted marketing is a strategy to develop customers for life.

St. Patrick’s Day is the feast day for the patron saint of Ireland. Mardi Gras is a celebration that leads up to Ash Wednesday, the beginning of the forty days of Lent. Cinco de Mayo may not be the holiday people thought it was, but it still honors an event of which Mexicans are rightly proud. Every holiday can be turned into an excuse to overindulge, but as you raise your glass in a toast this May 5, don’t forget that the celebration has origins that extend far beyond mariachi music and margaritas.

Kent State Massacre 40 years on: Why four died in Ohio

In Uncategorized on May 4, 2010 at 7:37 pm

Oldspeak: One of the most colossal blunders ever perpetrated by the U.S. military happened 40 years ago today. Thankfully hard lessons have been learned from the tragedy.

From The National Post (Canada)

Four decades ago today, members of the Ohio National Guard shot dead four students at Kent State University. Peter Shawn Taylor looks at the events of that day and what we’ve learned from the deaths of those four protesters.

It’s been 40 years since four died in Ohio.

Today is the 40th anniversary of the shootings at Kent State University. These deaths — William Schroeder, Sandra Scheuer, Jeffrey Miller and Allison Krause — at the hands of the Ohio National Guard on May 4, 1970, persist as a significant historical marker.

The incident is today seen as the moment the free-spirited idealism of the Sixties collided head-on with the state’s deadly coercive powers. A nation’s youth, attempting to exercise their rights to free speech and assembly, were deliberately gunned down by their own government.

That the deaths were immortalized in iconic photos and the song Ohio by the band Crosby, Stills, Nash and Young, which still receives substantial airplay today, has solidified the incident’s stature as a signature moment in modern social and cultural history.

But here’s another way of looking at it. Kent State was one of the biggest blunders ever committed by the American military.

A close examination of the actions of the National Guard on May 4, 1970, reveals numerous and egregious failures of preparation, communication, leadership, equipment and tactics. From this perspective, the shootings were not so much a social earthquake as a series of dramatic mistakes by trained men at arms. And it’s an event that has had a permanent impact on modern crowd control methods. A revolution in mob management — rather than the sentiments of a classic rock anthem — is the true legacy of Kent State. No one, especially not the Army, wants another four dead in Ohio.

The events at Kent State were set in motion with U.S. president Richard Nixon’s April 30, 1970, announcement that he was invading Cambodia as part of the war in Vietnam.

Students across North America rose up against the news. At Kent State, a typically peaceful school about 50 km south of Cleveland, the protests quickly took a dangerous turn. The campus building for the Reserve Officer Training Corps (ROTC) program was set ablaze on Saturday, May 2. When the municipal fire department arrived to put it out, protesters attacked the firemen — slashing their hoses with machetes and tossing bricks at them.

The firemen promptly retreated and the local mayor called in the National Guard. The firetrucks then returned under military escort. By this time the ROTC building had burnt to the ground, but the Guardsmen, mostly young, part-time soldiers, stayed put.

Over the weekend the campus remained generally quiet. On Monday, May 4, student leaders called a noon-hour rally to protest the military presence. This was to occur on the Commons, a field adjacent to the smouldering ruins of the ROTC building. The Guard leadership, however, declared a curfew. Whether this was sanctioned by civil authorities remains a contentious issue today. It appears not.

Regardless of the curfew’s legitimacy, as students began to gather and provocatively ring an historic bell on the Commons, the Guard decided a show of force was necessary.

Around 11:45 a.m., 96 Guardsmen and seven officers assembled on the Commons in a skirmish line led by Brig.-Gen. Robert Canterbury. The troops wore standard-issue fatigues and helmets and were armed with M-1 carbines and bayonets. The soldiers loaded their rifles, fixed bayonets and began marching across the field to disperse the crowd, estimated at perhaps 2,000 students.

The Guardsmen succeeded in their immediate goal of clearing the Commons. Yet the mob proved an elusive foe. It parted to let the soldiers pass but then formed up again around the edges of the Commons. Some students began throwing rocks from a nearby construction site. The FBI would later collect 174 pounds of rocks from the field.

Without any clear purpose other than shooing away the pesky students, Canterbury kept his troops marching. With gas masks on, the Guardsmen launched tear gas at the crowd. The students threw the canisters back, in a kind of tennis match.

The soldiers then went up a small rise beside the Commons called Blanket Hill. They paused briefly at a concrete pagoda. Then, like a dog chasing a squirrel, they took off after the students again, blindly marching down the other side of Blanket Hill.

The trek ended when the soldiers arrived on a practice football field with fences on three sides. Behind them was a student mob at the top of the hill. Without any clear objective in mind, the troops had marched themselves straight into a box.

Such a move should be seen as a tactical error as conclusive as Lt.-Col. George A. Custer’s at Little Bighorn, where he charged recklessly down a river bed with 200 men, only to find several thousand Sioux waiting for him. If it had been a war, Canterbury would have led his men into a bloodbath.

Instead, the event played out like a grim and pointless parade. Students were screaming “pigs off campus” and “Fascist bastards.” The air was thick with bricks. And no one in charge seemed to know what the troops should be doing or where they should be going.

The gas masks also seemed to have a dehumanizing effect on the soldiers, both visually and mentally. On the football field, some soldiers dropped to a knee and pointed their guns menacingly at the students. No one fired.

When these actions didn’t scare the protesters off, the only thing left was to march back up the hill through the gauntlet of rocks and angry students. Canterbury at this point assumed he had run out of tear gas, so he stopped calling for gas grenades. In fact there were plenty left. Once they reached the pagoda again the soldiers once more shouldered their weapons.

This time 61 shots rang out in 13 seconds. Four students died, nine were wounded.

In the numerous investigations that followed, no officer ever admitted to giving a fire order. Whether it was a spontaneous reaction to the harassment, a misheard directive or a deliberate action that was later covered up, someone, either the officers or the enlisted men, lost control.

And they lost control with M-1 carbines, the only weapons they carried that day.

In the days following the shooting, public opinion was actually split. Many Americans thought the students got what they deserved for torching the ROTC building and attacking the Guardsmen. President Nixon’s comments seemed to echo this sentiment. “When dissent turns to violence it invites tragedy,” he said.

Yet any sympathy for the Guardsmen was eventually swamped by the students’ perspective. Close examination by the media, the FBI and the Nixon-appointed Scranton Commission on Campus Unrest, as well as civil lawsuits, made it clear that the students’ actions had not warranted lethal force that day. In the battle for public opinion, the Guard’s defeat was eventually convincing and comprehensive. “Kent State was a national tragedy,” the Scranton Commission concluded. “We must learn from the particular horror of Kent State and ensure it is never repeated.”

Perhaps surprisingly, this rhetoric has largely come to pass. The many mistakes committed at Kent State were not ignored by the military or law enforcement. Today it is seen as a watershed moment in the history of crowd control and non-lethal tactics.

“Kent State has become the standard from which everything else is measured,” says Sid Heal, a crowd control expert who spent 33 years with the Los Angeles Sheriff’s Department before retiring. “Everybody studies it. One hundred years from now, we will still be studying it. It is the perfect example of what not to do.”

Heal commanded police and military personnel during the 1992 Rodney King riots in L.A., served as an advisor with the U.S. Marines in Somalia in 1995 and currently lectures at war colleges on mobs, riots and non-lethal weapon techniques. He has a checklist of necessary steps for minimizing confrontations with large crowds. The Ohio National Guard missed them all.

Today crowd control begins with exhaustive pre-planning. Preparations for major events such as the Vancouver Olympics or the upcoming G20 summit in Toronto begin months or years in advance. Every possible scenario is contemplated. For example, Toronto recently announced an empty east-end movie studio will serve as a massive temporary G20 jail, if required.

There was no pre-planning or basic reconnaissance at Kent State. Canterbury didn’t even know where he was going.

The next step, says Heal, is the need to establish a clear mission. This typically acknowledges the right of citizens to peaceful assembly, but also sets out the objectives of law enforcement. Negotiations with protesters have become a crucial component of this process. “It is important to establish what the protesters want and what they intend to do,” notes Heal. The student rally at Kent State was a legitimate event.

Containment of a mob is now considered more desirable and practical than outright dispersal. Canterbury discovered this the hard way. The creation of designated free speech areas, separate from the security zone, has become standard operating procedure for potentially volatile events. While these “protest pens” are often contentious, they provide a safer outcome for everyone.

The understanding of crowd psychology has also improved dramatically since 1970. Mobs are not homogeneous, observes Heal: “There are provocateurs, supporters and those who are just mildly interested. You have to realize that not everyone is your enemy.” It makes no sense to tear gas innocent students on their way to class, as was the case at Kent State. This risks turning bystanders into participants.

As for crowd control personnel, Heal observes that well-trained, competent front-line officers are vital, and their main job is to keep a close watch on the rank and file. “It is very easy to get emotional in a riot,” he says. “But we can never be seen as the instigator.” The on-field leadership’s inability to control its troops’ emotions was another obvious failure at Kent State.

Showing up with M-1 carbines was the Guard’s final mistake. “This left them with only two choices: shout or shoot,” notes Heal. “You’ve got to have more options than that.”

Since 1970 there have been huge innovations in crowd control equipment. Full-face shields, body armour and batons are now standard riot gear. And an entire armoury of non-lethal weaponry has been invented.

The U.S. military even has a Joint Non-Lethal Weapons Directorate, charged with creating equipment and tactics that don’t kill their enemies. This alone should be considered a revolution.

While investigations into the U.S. race riots of 1967 had earlier recommended better training and equipment for soldiers and police involved in crowd control, it took the Kent State shootings to fully implement such changes. No one moves crowds with rifles and bayonets anymore.

“We have learned that prevailing on the field is not always as important as legitimacy,” notes Heal. The modern take on crowd control is that it’s necessary to show restraint and attempt to resolve situations with minimal force. Anti-globalization riots in Seattle in 1999 and Quebec City in 2001 have certainly tested that commitment, but there has never been another Kent State. The lessons have stuck.

“Kent State was a whole series of errors,” says Heal. “And those mistakes were so profound they cascaded into a calamity. Nobody wants a repetition of that.”

Read more: http://news.nationalpost.com/2010/05/04/kent-state-40-years-on-why-four-died-in-ohio/#ixzz0n0M7V3Bh
