“In a time of universal deceit, telling the truth is a revolutionary act.” -George Orwell

Posts Tagged ‘Global Ecological Crisis’

Pumped Dry: Global Water Crisis Widespread And Worsening With Continued Depletion Of Groundwater

In Uncategorized on December 11, 2015 at 8:11 pm

Kansas farmer Jay Garetson: “Thinking about Jared and the challenges that his generation faces, that’s what leaves you gasping for air. It kind of leaves you at a loss for what to do next,” he said, wiping a tear.

 

Oldspeak: “Expect this existential crisis to intensify as temperatures rise and conditions worsen. Sustainability tipping points have been passed for 21 of Earth’s 37 largest aquifers. I would imagine these figures do not take into account the incalculable and permanent damage being done to our water supplies by our extractive energy and mineral mining practices. This story focuses on the U.S., but the same story is being told worldwide; be sure to click on the link to the original story & check out the stories documenting the carnage in India, Peru and Morocco at the bottom of the article. Hmm, less water for what’s expected to be 9 billion humans, what’s the worst that could happen!?” -OSJ

 

Written By Ian James and Steve Reilly @ The Desert Sun:

SUBLETTE, Kansas – Just before 3 a.m., Jay Garetson’s phone buzzed on the bedside table. He picked it up and read the text: “Low Pressure Alert.”

He felt a jolt of stress and his chest tightened. He dreaded what that automated message probably meant: With the water table dropping, another well on his family’s farm was starting to suck air.

The Garetson family has been farming in the plains of southwestern Kansas for four generations, since 1902. Now they face a hard reality. The groundwater they depend on is disappearing. Their fields could wither. Their farm might not survive for the next generation.

At dawn, Jay was out among the cornfields at the well, trying to diagnose the problem. The pump was humming as it lifted water from nearly 600 feet underground. He turned a valve and let the cool water run into his cupped hands. Just as he had feared, he saw fine bubbles in the water.

“It’s showing signs of weakening,” he said sadly, standing in the shoulder-high corn.

“This’ll last another five or 10 years, but not even at the production rate that we’re at here today,” he said. “It’s just a question of how much time is left.”

Time is running out for portions of the High Plains Aquifer, which lies beneath eight states from South Dakota to Texas and is the lifeblood of one of the world’s most productive farming economies. The aquifer, also known as the Ogallala, makes possible about one-fifth of the country’s output of corn, wheat and cattle. But its levels have been rapidly declining, and with each passing year more wells are going dry.

As less water pours from wells, some farmers are adapting by switching to different crops. Others are shutting down their drained wells and trying to scratch out a living as dryland farmers, relying only on the rains.

In parts of western Kansas, the groundwater has already been exhausted and very little can be extracted for irrigation. In other areas, the remaining water could be mostly used up within a decade.

The severe depletion of the Ogallala Aquifer is symptomatic of a larger crisis in the United States and many parts of the world. Much more water is being pumped from the ground than can be naturally replenished, and groundwater levels are plummeting. It’s happening not only in the High Plains and drought-ravaged California but also in places from the Gulf Coastal Plain to the farmland of the Mississippi River Valley, and from the dry Southwest to the green Southeast.

In a nationwide examination of the problem, USA TODAY and The Desert Sun analyzed two decades of measurements from more than 32,000 wells and found water levels falling in nearly two-thirds of those wells, with heavy pumping causing major declines in many areas. The analysis of U.S. Geological Survey data revealed that:

  • Nationwide, water levels have declined in 64 percent of the wells included in the government database during the past two decades.
  • The average decline among decreasing wells has been more than 10 feet, and in some areas the water table has dropped more than 100 feet during that period – more than 5 feet per year.
  • For 13 counties in Texas, New Mexico, Mississippi, Kansas and Iowa, average water levels have decreased more than 40 feet since 1995.
  • Nationally, the average declines have been larger from 2011-2014 as drought has intensified in the West. But water tables have been falling consistently over the years through both wet and dry periods, and also in relatively wet states such as Florida and Maryland.
  • Across the High Plains, one of the country’s largest depletion zones, the average water levels in more than 4,000 wells are 13.2 feet lower today than they were in 1995. In the southern High Plains, water levels have plunged significantly more – in places over 100 feet in just 20 years.

[Map: Average water level decrease in U.S. counties. In many counties across the United States, groundwater levels have been dropping; the problem is especially severe in the region that relies on the Ogallala Aquifer.]

Aquifers are being drawn down in many areas by pumping for agriculture, which accounts for nearly two-thirds of the nation’s use of fresh groundwater. Water is also being drained for cities, expanding development and industries. Across much of the country, overpumping has become a widespread habit. And while the symptoms have long remained largely invisible to most people, the problem is analogous to gradually squandering the balance of a collective bank account. As the balance drops, there’s less of that resource to draw on when it’s needed.

At the same time, falling groundwater levels are bringing increasing costs for well owners, water utilities and society as a whole. As water levels drop, more energy is required to lift water from wells, and those pumping bills are rising. In areas where aquifers are being severely depleted, new wells are being drilled hundreds of feet into the earth at enormous cost. That trend of going deeper and deeper can only go on so long. When groundwater levels fall to precarious lows and wells are exhausted, farming businesses can suffer. And in particularly hard-hit communities, such as parts of California, homeowners have been left relying on tanker trucks to deliver their water.

Since the beginning of the 20th century, the United States is estimated to have lost more than 1,000 cubic kilometers of water from the nation’s aquifers – about 28 times the amount of water that can be held in Lake Mead, the country’s largest reservoir.

That estimate of water losses from 1900 through 2008, calculated by USGS scientist Leonard Konikow, shows the High Plains has accounted for 35 percent of the country’s total depletion. California’s Central Valley accounted for more than 14 percent, and other parts of the country have depleted the remainder, about half of the total.
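The Lake Mead comparison checks out arithmetically. Here is a minimal sketch of the conversion in Python, assuming Lake Mead’s commonly cited full capacity of about 28.9 million acre-feet (an assumption; the article gives only the multiple):

    # Back-of-the-envelope check on the depletion figures above.
    # Assumption (not from the article): Lake Mead holds ~28.9 million
    # acre-feet when full, i.e. about 36 cubic kilometers.
    KM3_PER_ACRE_FOOT = 1.23348e-6

    lake_mead_km3 = 28.9e6 * KM3_PER_ACRE_FOOT   # ~35.7 km^3
    total_depletion_km3 = 1_000.0                # Konikow's 1900-2008 estimate

    print(f"Depletion in 'Lake Meads': {total_depletion_km3 / lake_mead_km3:.0f}")  # ~28

    # Regional shares cited above: 35% High Plains, ~14% Central Valley.
    remainder = total_depletion_km3 * (1 - 0.35 - 0.14)
    print(f"Rest of the country: ~{remainder:.0f} km^3, about half the total")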

In places, water that seeped underground over tens of thousands of years is being pumped out before many fully appreciate the value of what’s lost. The declines in groundwater in the United States mirror similar decreases in many parts of the world.

NASA satellites have allowed scientists to map the changes underground on a global scale for the first time, putting into stark relief a drawdown that has long remained largely out of sight. The latest satellite data, together with measurements of water levels in wells, reveal widespread declines in places from Europe to India, and from the Middle East to China.

“Groundwater depletion is this incredible global phenomenon,” said Jay Famiglietti, a professor of earth system science at the University of California, Irvine, and the senior water scientist at NASA’s Jet Propulsion Laboratory. “We never really understood it the way we understand it now. It’s pervasive and it’s happening at a rapid clip.”

Famiglietti and his colleagues have found that more than half of the world’s largest aquifers are declining. Those large-scale losses of groundwater are being monitored from space by two satellites as part of the GRACE mission, which stands for Gravity Recovery and Climate Experiment.

Since 2002, the orbiting satellites have been taking detailed measurements of Earth’s gravity field and recording changes in the total amounts of water, both aboveground and underground. Using that data, the researchers have created a global map showing areas of disappearing water as patches of yellow, orange and red. Those “hotspots” mark regions where there is overpumping of water or where drought has taken a toll.

The map shows that, just as scientists have been predicting due to climate change, some areas in the tropics and the higher latitudes have been growing wetter, Famiglietti said, while many dry and semi-arid regions in the mid-latitudes have been growing drier. In those same dry regions, intensive agriculture is drawing heavily on groundwater. And with little rain to recharge the aquifers, their levels are dropping.

“Many of these resources are finite,” Famiglietti said. “It took tens of thousands of years to accumulate this water, and we’re burning through it in a matter of decades.”

In many regions, government agencies and water districts have studied the problem but haven’t taken sufficient steps to manage aquifers or prevent declines.

Alongside climate change, groundwater depletion has become another human-caused crisis that could bring devastating consequences. As aquifers are pushed far beyond their natural limits, water scarcity is battering farms, undermining economies and intensifying disputes over water.

In parts of the southern High Plains, farmers are feeling the effects. Some counties have seen small decreases in population as people have moved away. Local leaders have been expressing concerns about what sorts of businesses can help sustain their economies as water supplies dwindle.

The Kansas Geological Survey has mapped out how much longer the aquifer can support large-scale pumping. It projects that some places still probably have more than a century of water left, but that large patches of western Kansas will go dry in less than 25 years. Some areas will likely run out faster, within a matter of years.


The green circles of center-pivot irrigation systems stand out in areas where farms rely on water from the High Plains Aquifer. (Photo: Ian James, The Desert Sun)

The Ogallala Aquifer’s decline shows what the world can expect in other areas where groundwater is being quickly depleted, Famiglietti said. “The fact that they’re running out of water means that we will no longer be growing food there, and so where will that food come from?”

In Haskell County, Kansas, windswept fields of sorghum and corn stretch to the flat horizon in a swaying sea. The huge farms, many of them in the thousands of acres, still appear lush and productive. But driving along the arrow-straight country roads, Jay Garetson can point out spots where wells have gone dry – both on his family’s land and other farms.

All that’s left at one of his decommissioned wells is a round metal cover on a concrete slab, with a rusty Frigidaire lying on its side next to it. His grandfather once used the refrigerator to store oil for the pump.

Opening the well’s metal lid, Jay dropped in a rock. It pinged off the steel casing. More than five seconds later, there was a faint splash.

“Now the only water it finds is a couple three feet at the very bottom of the well that the pumps can’t effectively access anymore,” Jay said, his voice echoing in the empty well.

He and his brother, Jarvis, drilled this well in the early 2000s when a shallower well failed. It lasted less than a decade, going dry in 2012 and forcing them to drill again – this time 600 feet deep, down to the bedrock at the bottom of the aquifer. It’s hard to say how long that well might last.

If the water keeps dropping about 5 feet per year, he said, it might be finished in as few as 10 years.

“Very simply, we’re running out, and it’s happening far faster than anybody anticipated,” he said. “And as optimistic as I’d like to be about the future, the window for that optimism is closing very quickly.”

He put the cover back on the old well, pointing out a tag that was placed on it by a state regulatory agency.

“We’re documenting very well the demise of the aquifer, but we’re not making the real-world changes in the way we manage the aquifer to really do the serious things that need to happen,” Jay said. “We seem to be unwilling to take the necessary steps to actually reduce water usage.”

Jay is an influential farmer and a longstanding member of the Kansas State Board of Agriculture who has been appointed by both Democratic and Republican governors. He has many ideas about how to extend the life of the aquifer, including mandatory water cutbacks that would be shared by farmers. But he has faced resistance from those who oppose mandatory limits.

Over the past five years, the pumping capacity of the Garetsons’ wells has decreased by about 30 percent as the water table has fallen. They’ve been forced to plant less corn and instead more wheat and sorghum, which use less water and bring in smaller earnings.

When Jay’s grandparents drilled wells in the mid-20th century, they were told the water supply was inexhaustible. They had clung to their land through the hardships of the Dust Bowl, when blowing drifts of soil and grit decimated crops and sent many others packing. In the decades that followed, they built a successful business on the water they pumped from the ground.

Since then, numerous studies have shown that the status quo is far from sustainable. Starting in 1986, Congress directed the USGS to monitor and report on changes in the levels of the Ogallala Aquifer, recognizing its economic importance. An estimated 30 percent of the groundwater used for irrigation in the country is pumped from the aquifer. Researchers have projected that without action to slow the losses, the portion of the aquifer in Kansas will be nearly 70 percent depleted within 50 years.

“What frustrates me is with all this knowledge and all this information, we still collectively refuse to act,” Jay said. “I don’t understand how we can all be so lacking in courage when we all can clearly see this is a train wreck happening in slow motion.”

The costs of inaction are visible just down the road, at a farmhouse where Jay lived as a young boy. Today the white house is abandoned. Weeds have grown around the front steps. Scraps of wood lie in a pile on the porch like logs on a campfire.

When the well went dry two years ago, a farm employee was forced to move out. The Garetsons drilled test holes but found no more water to tap.

In the yard, Jay pointed out the spot beneath a dying elm tree where he used to play on the swings. “It’s probably seen its last swing set in the yard,” he said wistfully.

“It’s something I used to read about and study, you know, the Dust Bowl. And you would see these abandoned farmsteads, and now I’m actually seeing it in my own lifetime,” he said. “Now we’re kind of at the end of the tracks here, and the only thing left to do is decide whether we should go ahead and push the house in and burn it, or probably the most painful option in my mind is to stand back and watch time just slowly melt it down.”

The worst-case scenario, he said, is that within a decade many more homes in the area could look just like this one – dry and deserted.

Jay Garetson checks on a well that is starting to weaken as the Ogallala Aquifer declines. (Steve Elfers and Ian James)

The United States, along with India and China, is one of the largest users of groundwater in the world.

The federal government has estimated that in 2010, the country used 76 billion gallons of fresh groundwater per day. That’s 117,000 cubic feet per second, roughly comparable to Niagara Falls. Wells across the country are pumping out as much water as – even slightly more than – the average flow of approximately 100,000 cubic feet per second that tourists see plunging from the top of Niagara Falls.
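That comparison is a straightforward unit conversion from the government’s figure. A minimal sketch of the arithmetic in Python, using only the numbers quoted above:

    # Convert 76 billion gallons/day of groundwater pumping to cubic feet
    # per second and compare it with Niagara Falls' average flow.
    GALLONS_PER_CUBIC_FOOT = 7.48052
    SECONDS_PER_DAY = 86_400

    pumping_gal_per_day = 76e9
    pumping_cfs = pumping_gal_per_day / GALLONS_PER_CUBIC_FOOT / SECONDS_PER_DAY
    print(f"Pumping: {pumping_cfs:,.0f} cubic feet per second")  # ~117,600

    niagara_cfs = 100_000  # approximate average flow over the falls
    print(f"Ratio to Niagara: {pumping_cfs / niagara_cfs:.2f}")  # ~1.18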

When groundwater is pumped from wells, some of it is soaked up by plants, some evaporates, some courses through pipes to cities, and some soaks back into the ground. Part of it ends up flowing into the oceans, adding to the global problem of rising seas as glaciers and ice sheets melt.

Most of the planet’s available freshwater lies underground. Aquifers store water like sponges, holding it in the spaces between rocks, sand, gravel and clay. So much water is now being sucked from some aquifers that those underground spaces are collapsing and the surface of the Earth has been permanently altered.

The ground has sunk in parts of California, Texas, Arizona and Nevada, cracking the foundations of houses, leaving fissures in the ground, and damaging roads, canals and bridges. As layers of aquifers gradually subside, their water-storing capacity is irreversibly decreasing.

Groundwater levels have changed relatively little in some of the country’s wetter areas, as rainfall and snowmelt have offset the amounts pumped out. But even in pockets of the Northeast and upper Midwest, there have been significant declines. Average water levels in Cumberland County, N.J., for instance, decreased nearly 6 feet over the past two decades. In Outagamie County, Wis., there was a decline of 6.1 feet.

Elsewhere, there has been significant depletion across entire regions, largely driven by agriculture. Average water levels fell by 5.7 feet across the Mississippi River Valley aquifer system, by 12.6 feet in the Columbia Plateau basaltic rock aquifers of the Pacific Northwest, and by 17.8 feet in some of the Snake River Plain’s aquifers of southern Idaho.

As the nation’s population grows, expanding cities and suburban development are also having an effect. Total U.S. water use has decreased in recent years due to improvements in efficiency and conservation, but the cumulative strains on groundwater have continued to build.

Big drops in water tables have occurred in many parts of the country. The U.S. Geological Survey’s data show that individual monitoring wells with water level decreases of more than 100 feet in the past two decades are located in a long list of states: California, Nevada, New Mexico, Texas, Maryland, Washington, Oregon, Kansas, Iowa, Arkansas, Idaho, Arizona, Louisiana, Colorado, Wyoming and Mississippi.

Saltwater has been seeping into declining aquifers along portions of the Atlantic coast in places such as Hilton Head, S.C., and Savannah, Ga., and beneath coastal cities in Florida such as Jacksonville, Miami and Tampa. When saltwater intrusion taints supplies of drinking water, it can force water districts to use different wells or invest in other costly solutions.

In parts of the desert Southwest and the Great Plains, natural springs that used to gush from the ground have dried up.

There have also been long-term declines in groundwater levels around urban areas including Chicago, Milwaukee, Wis., Long Island, N.Y., Baton Rouge, La., Memphis, Tenn., and Houston.

In each state, the use of groundwater falls under different laws. In many areas, though, the agencies charged with managing water supplies have allowed aquifers to fall into a state of perpetual overdraft, with water levels receding deeper by the year. Even where groundwater regulations exist, pumping often remains largely unchecked.

“Like your bank account, you can’t keep depleting it forever. That’s a non-sustainable condition,” the USGS scientist Konikow said. “Society will have to do something about it. Some areas, they are doing things about it. Other areas, it’s going to kind of slap them in the face at some point as a wake-up call.”

In the farm country of Grant County, Kansas, where grain silos tower over fields that stretch out to a flat horizon, the chamber of commerce hosts an annual dinner that has been a tradition for 53 years. Hundreds of people line up while volunteers dish out local food: barbecued beef, sweet corn, candied squash and prized doughnuts made with milo, another name for sorghum.

The dinner consistently attracts top state politicians. This September, when Lt. Gov. Jeff Colyer gave a speech to a packed auditorium, he emphasized the importance of water.

“We all know here that the lifeblood of our land is that Ogallala Aquifer below us,” Colyer said. “We’ve got to rely on that water.”

He said that’s why Gov. Sam Brownback recently launched an effort to develop a “50-year water vision” for the state. Colyer said southwestern Kansas is working to preserve its water, and he pointed to the large cattle industry and the fast-growing dairy business as signs of a bright economic future.

Those applauding at the long tables included Jay Garetson, his wife, Jill, and two teenage sons. But while Jay credits the state government with doing more than ever to focus on water, he’s concerned the consensus-building approach and the voluntary measures being promoted aren’t enough.

In his office, he rolled out a map to explain why. The map is marked with patches of orange and red denoting areas that have relatively little water left. In one of those spots, “right in the bull’s eye,” he pointed to the family’s hometown of Sublette.

The biggest problem, he said, is that no one can slow down the decline alone. And those who try to use less water will have the aquifer pumped out from beneath them by neighbors.

“Everybody’s got a straw in the same soda,” Jay said. “When you have a common resource, and the individual motivations are to accelerate the use rather than to stretch it out over a period of time, the net result is everybody loses.”

The economics of the profit-driven status quo are driving the depletion, he said, and that points to a need for the state and the regional groundwater district to intervene – like a referee in a sporting event that has deteriorated into a free-for-all. He said the referee should “call a timeout.”

Then, he said, “we need to sit down and think about changing the rules.”

Wells have been drawing out less water and going dry in places from eastern Colorado to the Texas Panhandle. Northern portions of the aquifer in Nebraska still have more water remaining, but parts of the southern High Plains have been left with parched fields.

In areas where little water remains, people have been turning to dryland farming, relying on the rains to grow wheat and other crops. That switch leads to sharply reduced earnings per acre. It requires farmers to use much bigger acreages to turn a profit. It means the land will support far fewer farms, and that could bring hard economic times.

Jay’s brother Jarvis explained how profound those changes could be, pausing from his work after changing a flat tire on a center-pivot irrigation system.

“It’s tough to think about what’s been in my family for well over a hundred years not being here in 20. It may mean that my kids or my nephews don’t come back, may not even have a chance if that’s their desire,” he said, his voice quavering. “It’s just tough to think about it not being there. I mean, it’s a way of life.”

Trying to make the aquifer last longer, some farmers have been adopting water-saving irrigation systems. A sign on one highway reads: “Make Every Drop of Water Count.”

Marieta Hauser, a dryland farmer who is director of the Grant County Chamber of Commerce, said she’s concerned about what sorts of businesses could take the place of irrigated farming, which drives the economy.

“Ideally we all want the aquifer to last forever. It’s not going to. We realize that. So what’s the best way to go forward and maintain the viability of our communities and our businesses?” Hauser said. “Those are the discussions that I hear more than anything, is ‘What’s going to happen to our communities when irrigation is not viable?’”

Some towns, such as Ulysses and Johnson City, have been buying water rights from farmers to secure enough drinking water supplies to keep the taps flowing.

One experiment aimed at slashing water use on farms is underway in Sheridan County, in northwestern Kansas, where the state’s first “Local Enhanced Management Area,” or LEMA, was established in 2013. Through that five-year plan, farmers are trying to keep within a “budget” that calls for a 20 percent reduction in water use.

Even as that strategy is showing signs of working, water managers acknowledge it’s not coming close to halting declines in the aquifer. It’s simply buying a bit more time.

Mark Rude, executive director of the Southwest Kansas Groundwater Management District No. 3, can put a specific number on the gap between the amounts of water pumped and the quantities of rainfall that recharge the aquifer in an average year: “We’re only about 9 percent sustainable.”

In other words, the people of southwestern Kansas are pumping out 11 times more than the aquifer’s natural recharge. People are barred from adding new wells in the area. If a new well is drilled, it needs to replace another well that is shut down.
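Rude’s figure and its restatement are the same ratio viewed two ways, as a quick sketch in Python shows:

    # "9 percent sustainable" and roughly "11 times the recharge" are
    # equivalent statements of the overdraft.
    sustainable_fraction = 0.09            # recharge as a share of pumping

    print(f"Pumping vs. natural recharge: {1 / sustainable_fraction:.1f}x")  # ~11.1x

    # For every 100 acre-feet pumped, only ~9 come back as recharge:
    pumped_af = 100
    print(f"Recharged: {pumped_af * sustainable_fraction:.0f} of {pumped_af} acre-feet")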

In practice, the water rights system doesn’t limit pumping at all. In fact, farmers are using much less than they would be permitted under the system of appropriated groundwater rights, which was established decades ago when water seemed plentiful and flood irrigation was the norm.

“Ultimately, I think, budgeting the aquifer is where any area has to start. How much do you have and how much are you willing to see consumed? That’s always a difficult step,” Rude said. When the water district has held meetings and asked farmers whether they’re in favor of developing a water budget, some have been apprehensive about restrictions or mandates.

While some keep pumping, others are leaving. Within the traditional Mennonite community, elders have begun sending away young couples to settle in other areas such as the Snake River Plain in Idaho, where even though aquifer levels are declining, more water remains. They’re leaving, Rude said, “because as the water supply leaves, the intensity of agriculture leaves, and the job opportunities also leave.”

For every acre that runs out of irrigation water and starts being dry-farmed, the state estimates the economy loses nearly $4,000 a year.

The difference between irrigated fields and dryland farms appears starkly on a large satellite photo on the wall of the water district’s office in Garden City. Patches of brown border the green circles of center-pivot irrigation systems.

Moving a hand across the map, Rude pointed out spots where springs and streams have dried up. One spring was a popular swimming hole half a century ago, he said, and it doesn’t flow anymore.

Decades ago, the Arkansas River used to flow between Garden City and Dodge City. Now all that’s left are scattered patches of reeds in the dry riverbed.

Jay Garetson’s wife Jill, who is a teacher, lived near the flowing river as a child. She has watched it disappear, drained by diversions upstream and the declining water table. As a girl, she used to follow her father into cornfields while he fixed sprinklers. Now he’s out of water and relying on several oil wells for income.

“I don’t think we can continue to do things the way we’re doing them,” she said. “Some serious action has to be taken quickly.”

The Garetsons’ 17-year-old son, Jared, is cautiously assessing the future and thinks it may be difficult to return home to farm after college.

Every year he helps out during the corn harvest, and as a hobby he flies a drone to film the harvester mowing down golden rows. But he said the aquifer now seems like a gas tank with its gauge approaching “E.”

“If we lose the aquifer, we lose probably 80 percent of our crops out here,” Jared said. “If our water supply is shut off, that’s a huge amount of food that we’re going to have to find elsewhere.”

They are a close-knit family, and stories of their farming history are woven into conversations around the kitchen table. It’s a legacy that may be slipping away for Jared.

“I’ve thought, why don’t we just pack up, sell the farm and leave? And we’ll find somewhere else that’s got water and that’s going to continue to have water, where we can build?” Jared said. But that’s a difficult idea for his parents and grandparents to accept. “It’s been our home for 113 years now, and for all that to go away and just stop that, that hundred-year-old investment, and that’d be really hard to just pack up and say goodbye to everything.”

As for Jared’s future, he said in order to make long-term investments in farming, it would be crucial to secure enough water for the next 40 years.

“Until we’ve got our water issue taken care of, then I basically have no future here,” Jared said. “It’s kind of sad, but it’s the harsh reality.”

Large rice farms in the Mississippi River Valley depend heavily on water pumped from wells. So do fields of cotton, soybeans and corn across portions of Mississippi, Louisiana, Arkansas and Missouri. The farms are drawing out significantly more than is naturally replenished, and the valley’s alluvial aquifer system has been declining.

“Here, we actually get a lot of rain so you tend to not think of it as being in danger of running low on water,” said Brian Clark, a USGS hydrologist in Little Rock, Arkansas. “But just the sheer amount of use kind of poses that issue.”

Officials in Arkansas, which is the country’s top rice-producing state, are updating the state’s water plan with proposals for coping with a growing “groundwater gap” in the eastern portion of the state. They’ve recommended building infrastructure to make surface water the primary irrigation source for areas that now depend on a declining supply of groundwater.

Other proposed regulatory changes aimed at addressing strains on groundwater are being debated elsewhere, in wet regions as well as dry regions of the country.

In Arizona, state lawmakers have been under increasing pressure to consider groundwater regulations for some of the same rural areas that fought off restrictions about 35 years ago. Some farmers and residents in southeastern Arizona are concerned that unregulated pumping is drawing down groundwater levels, and have been pushing the legislature for action to limit the expansion of irrigated farmlands and begin charging fees for groundwater use.

In Wisconsin, where some people are concerned about farms’ wells drawing down streams and lakes, a bill pending in the legislature would allow state regulators to establish “groundwater protection areas” where there would be tighter permitting rules for new high-capacity wells in order to prevent environmental impacts. The proposed measures would also ease the permitting process for redrilling or repairing existing wells.

In Iowa, growing demands are being placed on the Jordan Aquifer as water is pumped for cities, farms, and industries such as ethanol plants. In June, the Iowa Environmental Protection Commission approved a new rule aimed at limiting pumping. The measures divide wells into tiers based on how much water levels have declined, and lay out procedures for reductions in water use in areas where the aquifer has dropped significantly.

Florida has also faced problems with groundwater declines as expanding development has strained water supplies. As the vast Floridan Aquifer has been drawn down, the amounts of water flowing from some of the state’s natural springs have decreased significantly, altering the sensitive environments where fish, turtles and other wildlife have long flourished.

“We have springs that are going silent because they’re not bubbling with the artesian pressure that they did in the past,” said Robert Knight, president of the Gainesville-based Florida Springs Institute, which advocates reducing the extraction of groundwater to safeguard the natural springs. He pointed out that much of the water pumped from wells is being sprayed on lawns.

As freshwater is pumped out, more seawater has been moving inland underground. And water managers across Florida have been tracking the problem and investing in remedies, including more desalination plants.

The Tampa Bay area built a seawater desalination plant that can churn out 25 million gallons of drinking water a day. The Tampa Bay Water plant, which has been operating since 2008, has helped reduce the stresses on the area’s groundwater supplies. But that has come at a price, with the cost of construction alone totaling $158 million.
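Those two figures imply a capital cost of a little over $6 per gallon-per-day of capacity, one common yardstick for comparing water plants. A minimal sketch using only the numbers above:

    # Unit capital cost of the Tampa Bay desalination plant.
    capacity_gal_per_day = 25e6     # drinking water output per day
    construction_cost_usd = 158e6   # construction cost alone

    cost_per_gpd = construction_cost_usd / capacity_gal_per_day
    print(f"Capital cost: ${cost_per_gpd:.2f} per gallon-per-day of capacity")  # ~$6.32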

As the Ogallala Aquifer has declined beneath their land, Jay and Jarvis Garetson have been locked in a bitter dispute with a neighboring landowner over water.

They’re suing the company American Warrior, which owns adjacent farmland, in a case that could set a legal precedent in Kansas.

The case revolves around one of the Garetsons’ wells. They own a vested water right that is one of the oldest in the area, and they have priority under the state’s “first-in-time, first-in-right” system. They’ve claimed “impairment” of that well by two of the company’s nearby wells.

American Warrior holds junior water rights, and a judge issued an injunction temporarily barring the company from using the wells while the case proceeds.

Mike O’Brate, vice president of the family-owned American Warrior, accused the Garetsons of suing out of “greed” and said a lawsuit isn’t the right way to settle the dispute. He said if the Garetsons win, it will set a bad precedent and more suits will follow.

“Everybody will want to file these to shut off their neighbors,” O’Brate said. “Attorneys are going to get filthy rich in a fight over water. It’s not a good thing.”

The Garetson brothers said the 2012 lawsuit was necessary to defend their family’s livelihood.

“The fact of the matter is, we have a vested right that is senior to their junior rights, and Kansas water law is very clear,” Jarvis said. “And the sad thing is we had to get the courts involved to make it happen.”

Jay said that in addition to pressing the state to enforce its laws, they hope to call attention to the urgent need for action to preserve the aquifer.

“I guess our family’s decided we’d rather call a question and force everybody to make an informed decision one way or the other than to be complicit in the death of something that didn’t have to go out this way,” he said.

After the lawsuit was filed, the Garetsons faced hostility – even death threats.

As aquifers decline, more legal conflicts are likely to flare up in places across the country. Many disputes have already ended up in the courts.

Mississippi, for instance, is in a long-running legal battle with Tennessee and the city of Memphis, claiming the neighboring state is taking groundwater that belongs to Mississippi. In California, where many aquifers have been divvied up by courts, the Agua Caliente Band of Cahuilla Indians is suing two Coachella Valley water districts in a fight over rights to groundwater.

In Kansas, the state Water Office and the U.S. Army Corps of Engineers have studied a proposal to build an aqueduct that would carry water from the Missouri River to the High Plains, and have estimated the cost at $18 billion.

When Jay and his family start talking about water, the conversation touches on mega-fixes, such as the idea of building a wind-powered pipeline from the Mississippi River.

In the meantime, Jay holds out hope there is still time to save what’s left and extend the use of the aquifer. “But it’s going to take immediate action and it’s going to take mandatory action, and that’s something that is hard for most of us out here, who are pretty individualistic and self-reliant, to contemplate.”

Any imposed cutbacks would be painful for everyone, though the pain could be spread around, he said. And water credits could be traded, creating a market that would help deal with scarcity and put the limited water toward high-value uses.

Jay sometimes wonders if roadside billboards would help increase the sense of urgency. He envisions signs with cross-section drawings of the aquifer “that show the reservoir declining and force people to admit at least, if we’re not going to act, that it was an informed decision not to act.”

Driving down a dirt road through farmland, Jay talked about what losing the aquifer would mean for his family.

“Thinking about Jared and the challenges that his generation faces, that’s what leaves you gasping for air. It kind of leaves you at a loss for what to do next,” he said, wiping a tear.

Jay said he and his brother keep trying to gain five or 10 years by using a new crop or new irrigation technologies. He said their father, Jesse, encourages them to “keep pushing” and keep praying.

“We’ll succeed somewhere. I just always thought it would be here,” he said as he pulled into his gravel driveway next to a cornfield.

He stood beside the mud-splattered pickup, petting his dog.

“In spite of everything I do and we do, it’s still not enough,” he said, sniffling softly. “My boys and my nephews will never have the … they won’t have the same opportunity.”

He paused, keeping his composure.

“If they stay here, it’ll be a salvage operation. It won’t be an expansion or a growth or an improvement. It’ll be a salvage operation,” he said. “That’s the mentality they’ll have to have – unless everybody can come together. The problem is everybody won’t come together, in my experience, until it’s too late.”

As he began to cry, he walked away.

Ian James reported from Kansas and Steve Reilly reported from McLean, Virginia.

Steve Elfers of USA TODAY, Caitlin McGlade of The Arizona Republic and Chad Gillis of The News-Press in Fort Myers, Fla., contributed to this report.

This special report was produced with a grant from the Pulitzer Center on Crisis Reporting. 

Thirsty Yet? Global Urban Water Crisis Growing: These Eight Major World Cities Are Running Out Of Water

In Uncategorized on July 9, 2015 at 4:10 pm

A woman in India walks atop a water main on her way to collect water. (Photo: Meena Kadri/Flickr)

Oldspeak: “Behold! The fruits of Industrial Civilization! It’s just physics, really. When a system of infinite growth and consumption is operated on a planet with finite biocapacity, irreplaceably essential resources will eventually run out. Once-mighty rivers are drying up or are terminally polluted. Reservoirs are at critical levels. Aquifers are drying up. What are we doing? Popping out babies. Curating our artificially flavored “lives”. Being bombarded with messages to consume more and more food, alcohol and stuff. Driven by insatiable sense-pleasures. Self-medicating at unprecedented levels in an ever-growing variety of ways, to avoid feeling the base-level pain and grief and sadness of existing in our well-appointed thought prisons; of bearing witness to the Great Dying we’re a part of and experiencing whether we choose to recognize it or not. Ignoring the reality of our dying world with an insidious and seductive strain of pathological anthropocentricity. Yes. Humans are running out of water. Ecological overshoot is getting harder to ignore. The water wars have already begun, but they are, ultimately, fruitless uses of energy. Before long, as population increases and techno-fixes fail, there will be no more water to sustain us. Only Love remains.” -OSJ

Written By Marc Herman @ Take Part:

The amount of rainfall a place gets isn’t the only factor in how much water is available to it. These major urban areas show how dire the coming global freshwater shortage could get.

Earlier this year, an obscure United Nations document, the World Water Development Report, unexpectedly made headlines around the world. The report made the startling claim that the world could face a 40 percent shortfall in freshwater in as little as 15 years. Crops would fail. Businesses dependent on water would fail. Illness would spread. A financial crash was likely, as was deepening poverty for those just getting by.

The U.N. also concluded that the forces destroying the world’s freshwater supply were not strictly meteorological, but largely the result of human activity. That means that with some changes in how water is managed, there is still time—very little, but enough—for children born this year to graduate from high school with the same access to clean water their parents enjoyed.

Though the U.N. looked at the issue across the globe, the solutions it recommended—capturing rainwater, recycling wastewater, improving sewage and plumbing, and more—need to be implemented locally. Some of the greatest challenges will come in cities, where bursting populations strain systems designed to supply far fewer people and much of the clean water available is lost to waste and shoddy, centuries-old infrastructure.

We’ve looked at eight cities facing different though representative challenges. The amount of water in the earth’s atmosphere is more or less fixed, meaning that as populations and economies grow, what we have needs to be clean, available, and conserved. Economies, infrastructure, river systems, and climates vary from place to place, and the solutions will have to as well. Here is how eight of the world’s major cities are running out of water, and trying to save it.

TOKYO

The roof of Ryogoku Kokugikan arena in Tokyo collects rainwater to be used in the building’s toilets. The inset shows a similar system for residential use. (Photo: Facebook)

Tokyo shouldn’t have a water problem: Japan’s capital enjoys average precipitation similar to that of Seattle or London. But all that rainfall is compressed into just four months of the year, in two short seasons of monsoon and typhoon. Capturing and storing so much water in such a short period in an area four times as dense as California would be a challenge anywhere. One weak rainy season means droughts—and those are now coming about once every decade.

Betting on the rain will be a precarious strategy for the world’s most populous city and its suburbs, home to more than 30 million people. When the four rivers feeding Tokyo run low, crisis conditions arrive fast. Though the system is efficient, 70 percent of Tokyo’s 16,000-mile-long plumbing network depends on surface water (rivers, lakes, and distant snowpack). With only 30 percent of the city’s water coming from underground aquifers and wells, there are not enough alternative sources to tap during these new cyclical droughts.

The Japanese government has so far proved forward-thinking, developing one of the world’s most aggressive programs for capturing rainwater. In Sumida, a Tokyo district that often faces water shortages, the 90,000-square-foot roof of Ryogoku Kokugikan arena is designed to channel rainfall to a tank, where it’s pumped inside the stadium for nonpotable use.
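For a rough sense of what a roof that size can harvest, here is a minimal sketch in Python. The rainfall figure (about 1,500 millimeters a year for Tokyo) and the 80 percent capture efficiency are illustrative assumptions, not numbers from the article:

    # Rough annual rainwater yield for the 90,000-square-foot arena roof.
    SQ_M_PER_SQ_FT = 0.092903

    roof_area_m2 = 90_000 * SQ_M_PER_SQ_FT   # ~8,360 m^2
    annual_rain_m = 1.5                      # assumed Tokyo precipitation, m/year
    capture_efficiency = 0.8                 # assumed fraction reaching the tank

    yield_m3 = roof_area_m2 * annual_rain_m * capture_efficiency
    print(f"Approximate harvest: {yield_m3:,.0f} cubic meters a year")  # ~10,000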

Somewhat more desperate-seeming is a plan to seed clouds, prodding the environment to do what it isn’t doing naturally. Though tested in 2013 with success, the geo-engineering hack is a source of controversy; scientists debate whether the technique could produce enough rain to make much of a difference for such a large population.

MIAMI

As a result of a 20th-century project to drain nearby swamps, water from the Atlantic Ocean began seeping into the Biscayne Aquifer, Miami’s main source of freshwater. (Infographic: YouTube)

Though most Americans’ concern with water shortage in the U.S. is firmly focused on California at the moment, a crisis is brewing in the last place you’d figure: South Florida, which annually gets four times as much rain, on average, as Los Angeles and about three times as much as San Francisco.

But according to the U.S. Geological Survey, the essential Biscayne Aquifer, which provides water to the Miami–Dade County area, is falling victim to saltwater intrusion from the Atlantic Ocean. Heavy rains replenish the aquifer year-round, but if enough saltwater enters, all of it will become unusable.

The problem arose in the early 20th century, after swamps surrounding the city were drained. Osmosis essentially created a giant sucking effect, drawing the Atlantic into the coastal soils. Measures to hold the ocean back began as early as the 1930s, but seawater is now bypassing the control structures that were installed and leaking into the aquifer. The USGS has made progress mapping the sea water intrusion, but ameliorating it seems a ways off. “As sea level continues to rise and the demand for freshwater increases, the measures required to prevent this intrusion may become more difficult [to implement],” the USGS noted in a press release.

LONDON

A view of the River Thames in London. In just a decade from now, the city’s water infrastructure will be unable to provide for its growing population. (Photo: IDS Photos/Flickr)

London faces a rapidly growing population wringing every last drop out of centuries-old plumbing. Water managers estimate they can meet the city’s needs for the next decade but must find new sources by 2025—even sooner than the rest of the world, by the U.N.’s measure. London’s utility, Thames Water, looked into recycled water—aka “toilet-to-tap”—but, being English, found it necessary first to politely ask people if they’d mind.

At least four urban districts in California use recycled water, which is treated, re-treated, and treated again to be cleaner than conventional supplies before being pumped into groundwater or other supply sources. The so-called “yuck factor” could be an impediment to this solution spreading to London and elsewhere.

CAIRO

The Nile Delta. Ninety-seven percent of Egypt’s water comes from the Nile; 85 percent goes to agriculture, and towns upriver from Cairo dump untreated agricultural and municipal waste into the river. (Photo: Wikipedia)

Five thousand years ago, an ample water supply and a fertile delta at the mouth of the Nile supported the growth of one of the world’s great civilizations. Today, while 97 percent of Egypt’s water comes from the great river, Cairo finds itself downstream from at least 50 poorly regulated factories, as well as the agricultural runoff and municipal sewage systems that drain into it.

Though Cairo gets most of the attention, a UNICEF–World Health Organization study released earlier this year found that rural areas to the city’s south, where more than half of Egyptians live, depend on the river not just for irrigation and drinking water but also for waste disposal. Engineer Ayman Ramadan Mohamed Ayad has noted that while most wastewater discharged into the Nile upriver from Cairo is untreated, the river’s enormous size has historically been sufficient to dilute the waste to safe levels (and Cairo’s municipal system treats the water it draws from the river). Ayad argues, however, that as the load increases—with 20 million people now discharging their wastes to the Nile—this will no longer be possible. The African Development Bank recently funded programs to chlorinate wastewater before it’s dumped in the river, but more will need to be done.

On the demand side, more than 80 percent of the water taken from the Nile each year is used for irrigation, mostly the inefficient method of just flooding fields, which loses significant amounts to evaporation. Two years ago, initial steps were taken to modernize irrigation techniques upriver. Those programs have yet to show much progress, however.

SÃO PAULO

The Cantareira reservoir is one of the main water reservoirs that supplies the state of São Paulo, Brazil. The water level of the whole Cantareira System has recently fallen to 6 percent of total capacity. (Photo: Victor Moriyama/Getty Images)

When it rains in Brazil, it pours. In São Paulo, where in an average year it rains more than it does in the U.S. Pacific Northwest, drains can’t handle the onslaught, and what could be the resource of desperately needed drinking water becomes instead the menace of urban floodwater.

With the worst drought in a century now in its second year, São Paulo’s reservoirs are at barely a quarter of capacity, down from 40 percent a year ago. Yet the city still sees heavy rainstorms. The reservoirs outside the city, though, are often polluted and too small even at capacity to supply the metropolitan area of 20 million. Asphalt covering the city and poor drainage lead to heavy floods on city streets after as little as a quarter-inch of rain. It’s hard to believe a drought is under way if your house is ankle-deep in water, so consumers haven’t been strident about conservation. The apparent paradox of flooded streets and empty reservoirs will likely fuel an ongoing debate over proposed rationing.

BEIJING

The Jingmi diversion canal, shown here under maintenance, transports freshwater from Miyun reservoir, Beijing’s main water source, 127 kilometers to the city. (Photo: Xiao Lu Chu/Getty Images)

Poor air quality isn’t the only thing impinging on Beijing residents’ ability to enjoy a safe environment. The city’s second-largest reservoir, shut down in 1997 because of pollution from factories and agriculture, has not been returned to use.

Ensuring the cleanliness of its water is even more crucial in China than elsewhere, as there is little it can afford to lose: With 21 percent of the world’s population, China has only 6 percent of its freshwater—a situation that’s only going to get worse, as it’s raining less in northern China than it was a century ago, and glaciers in Tibet, once the largest system outside the Antarctic and Greenland and a key source of drinking water in the country’s south and west, are receding even faster than predicted. The U.N. Environment Programme estimates that nationally, Chinese citizens can rely on getting just one-quarter to one-third of the amount of clean water the rest of the world uses daily.
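The U.N. estimate follows directly from those two percentages: a per-person share of 6/21, a bit under a third of the world average. A quick sketch in Python:

    # Per-capita freshwater share implied by the figures above.
    population_share = 0.21   # China's share of world population
    freshwater_share = 0.06   # China's share of world freshwater

    ratio = freshwater_share / population_share
    print(f"Per-person water share: {ratio:.0%} of the world average")  # ~29%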


Hope emerged, however, from a 2013 study from Montreal’s McGill University, which found that an experimental program targeting farmers outside the capital showed promising results over nearly two decades. The vast Miyun reservoir, 100 miles outside Beijing, had seen its reserves reduced by nearly two-thirds because of increasing irrigation demands—while becoming polluted by agricultural runoff. Revenue from a tax on major water users in Beijing was spent paying farmers upstream from Miyun to grow corn instead of rice, which requires more water and creates more runoff.

Over the following 15 years, the study authors wrote, “fertilizer runoff declined sharply while the quantity of water available to downstream users in Beijing and surrounding areas increased.” Farmer income was not significantly affected, and cleaner water downstream led to higher earnings for consumers in the city despite the tax.

BANGALORE, India

Rendering of an apartment complex under development in Bangalore, India, and (inset) its construction. New housing is going up in the city faster than the utility can expand and repair the decaying water system. (Photos: Courtesy PrestigeConstructions.com)

Earlier this year, a report by India’s comptroller and auditor general found that the southern city was losing more than half its drinking water to waste through antiquated plumbing systems. Big losses from leaks aren’t uncommon—Los Angeles loses between 15 and 20 percent—but the situation in Bangalore is more complicated. A technology boom has attracted new residents, leading to new housing construction. Entire apartment blocks are going up faster than local officials can update the plumbing to handle additional strain on the water and sewage systems.

Bangalore’s clean-water challenges illustrate a dynamic that’s repeating itself across the world’s second-largest nation. India’s urban population will grow from 340 million to 590 million by 2030, according to a 2010 McKinsey study. To meet the clean-water needs of all the new city dwellers, the global consulting firm found, the government will have to spend $196 billion—more than 10 percent of the nation’s annual GDP. (McKinsey has a potential financial interest in India’s infrastructure, so its numbers may be inflated.)
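The GDP comparison is easy to sanity-check. Assuming India’s GDP around 2010 was roughly $1.7 trillion (an assumption; the article states only the percentage), $196 billion works out to a bit over 11 percent:

    # Sanity check on the McKinsey spending figure quoted above.
    spending_usd = 196e9
    india_gdp_2010_usd = 1.7e12   # assumed GDP, not from the article

    print(f"Share of GDP: {spending_usd / india_gdp_2010_usd:.1%}")  # ~11.5%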

In Bangalore, they’re already behind schedule. The newspaper The Hindu reported in March that a 2002 plan to repair the existing system and recover the missing half of Bangalore’s freshwater had yet to be implemented.

MEXICO CITY

A worker fills tanks from a water truck in a poor neighborhood in Mexico City. The city’s water utility estimates that it loses 260 gallons—enough to supply a family of four for a day—per second to leaky pipes in the system. (Photo: Reuters/Eliana Aponte)

Gravity always wins. At more than 7,000 feet above sea level, Mexico City gets nearly all its drinking water by pumping it laboriously uphill from aquifers as far as 150 miles away. The engineering challenge of hauling that much water into the sky adds to the difficulty of supplying more than 20 million residents through an aging system. Mexico City’s public works loses enough water every second—an estimated 260 gallons—to supply a family of four for a day, according to CONAGUA, Mexico’s national water commission. CONAGUA estimates that between 30 and 40 percent of the capital’s potable water is lost to leaks and spills. The good news is that leaks can be fixed.
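The per-second figure scales up fast: one family-day of water lost every second adds up to 86,400 family-days, roughly 22 million gallons, every day. A sketch using the article’s numbers:

    # Scale of Mexico City's distribution losses, per CONAGUA's estimate.
    loss_gal_per_sec = 260     # enough for a family of four for a day
    SECONDS_PER_DAY = 86_400

    daily_loss_gal = loss_gal_per_sec * SECONDS_PER_DAY
    print(f"Daily loss: {daily_loss_gal / 1e6:.1f} million gallons")  # ~22.5 million
    print(f"Family-days of supply lost per day: {SECONDS_PER_DAY:,}")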

Water quality remains a worry, however. Unsurprisingly, companies selling bottled water have done very well in Mexico. The economy growing around the lack of potable water has attracted companies such as Coca-Cola and France’s Danone, whose Bonafont (“good spring”) brand is advertised in Mexico as a weight-loss aid. (Toting a bottle will help you “feel thinner anywhere,” according to a popular television ad.)

Meanwhile, disputes over who will get access to underground supplies have turned violent: In February 2014, residents of the town of San Bartolo Atepehuacan, on Mexico City’s outskirts, clashed with police over a waterworks project they feared would divert local springs to the city’s business district. At least 100 people were injured and five arrested as the disturbances continued for more than three months.

Fukushima – A Global Threat That Requires A Global Response

In Uncategorized on October 28, 2013 at 2:30 pm

Workers take soil samples in Ukedo, Japan, which was evacuated after the Fukushima nuclear disaster, August 30, 2013. Two and a half years after the Fukushima Daiichi plant belched plumes of radioactive materials over northeast Japan, the almost 83,000 refugees evacuated from the worst-hit areas are still unable to go home. (Photo: Tomas Munita / The New York Times)

Oldspeak: “The history of TEPCO shows we cannot trust this company and its mistreated workforce to handle the complex challenges faced at Fukushima. The crisis at Fukushima is a global one, requiring a global solution….

The problems at Fukushima are in large part about facing reality – seeing the challenges, risks and potential harms from the incident. It is about TEPCO and Japan facing the reality that they are not equipped to handle the challenges of Fukushima and need the world to join the effort. 

Facing reality is a common problem throughout the nuclear industry and those who continue to push for nuclear energy. Indeed, it is a problem with many energy issues. We must face the reality of the long-term damage being done to the planet and the people by the carbon-nuclear based energy economy.” –Kevin Zeese & Margaret Flowers

“That’s really all it boils down to, isn’t it? “We cannot change anything until we accept it. Condemnation does not liberate, it oppresses.” –Carl Jung. We have to accept reality. Our energy sources and the systems of extraction and exploitation they require are unsustainable, incalculably toxic and dangerous. This is beyond dispute. Coal is not “Clean”. Diesel Gas is not “Clean”. Fracked methane gas is not “Clean” or “Natural”. Nuclear energy is not worth the gargantuan risks it poses to, well, everything that lives. We can’t waste time covering up, blame-shifting or condemning past actions at this point. This incident is an ongoing, ever-expanding and uncontrolled release of massive quantities of radioactive material that threatens the planet. It is on a scale far beyond the capabilities of any one nation or corporation to stop or contain, and may very well be beyond the capabilities of all nations. But we can’t keep extending and pretending that the Japanese are handling the disaster. An urgent and globally coordinated response is needed.” -OSJ

Related Story:

Fukushima Far From Over

Radioactive Rainwater Overwhelms Fukushima Nuclear Plant

By Kevin Zeese & Margaret Flowers @ Truthout:

The story of Fukushima should be on the front pages of every newspaper. Instead, it is rarely mentioned. The problems at Fukushima are unprecedented in human experience and involve a high risk of radiation events larger than any that the global community has ever experienced. It is going to take the best engineering minds in the world to solve these problems and to diminish their global impact.

When we researched the realities of Fukushima in preparation for this article, words like apocalyptic, cataclysmic and Earth-threatening came to mind. But, when we say such things, people react as if we were Chicken Little screaming “the sky is falling,” and the reports are ignored. So, we’re going to present what is known in this article, and you can decide whether we are facing a potentially cataclysmic event.

Either way, it is clear that the problems at Fukushima demand that the world’s best nuclear engineers and other experts advise and assist in the efforts to solve them. Nuclear engineer Arnie Gundersen of Fairewinds.org and an international team of scientists created a 15-point plan to address the crises at Fukushima.

A subcommittee of the Green Shadow Cabinet (of which we are members), which includes long-time nuclear activist Harvey Wasserman, is circulating a sign-on letter and a petition calling on the United Nations and the Japanese government to put the Gundersen et al. plan in place and to provide 24-hour media access to information about the crises at Fukushima. There is also a call for international days of action on the weekend of November 9 and 10. The letter and petitions will be delivered to the UN on November 11, which is both Armistice Day and 32 months to the day since the earthquake and tsunami that caused the Fukushima nuclear disaster.

The Problems of Fukushima

There are three major problems at Fukushima: (1) three reactor cores are missing; (2) radioactive water has been leaking from the plant in massive quantities for 2.5 years; and (3) 11,000 spent nuclear fuel rods, perhaps the most dangerous things ever created by humans, are stored at the plant and need to be removed, and 1,533 of those are in a very precarious and dangerous position. Each of these three could result in dramatic radiation events, unlike any radiation exposure humans have ever experienced. We’ll discuss them in order, saving the most dangerous for last.

Missing reactor cores: Since the accident at Fukushima on March 11, 2011, three reactor cores have gone missing in an unprecedented triple meltdown. These melted cores, called corium lavas, are thought to have passed through the basements of reactor buildings 1, 2 and 3, and to be somewhere in the ground underneath.

Harvey Wasserman, who has been working on nuclear energy issues for over 40 years, tells us that during those four decades no one ever talked about the possibility of a multiple meltdown, but that is what occurred at Fukushima.

It is an unprecedented situation to not know where these cores are. TEPCO is pouring water where they think the cores are, but they are not sure. There are occasional steam eruptions coming from the grounds of the reactors, so the cores are thought to still be hot.

The concern is that the corium lavas will enter or may have already entered the aquifer below the plant. That would contaminate a much larger area with radioactive elements. Some suggest that it would require the area surrounding Tokyo, 40 million people, to be evacuated. Another concern is that if the corium lavas enter the aquifer, they could create a “super-heated pressurized steam reaction beneath a layer of caprock causing a major ‘hydrovolcanic’ explosion.”

A further concern is that a large reserve of groundwater which is coming in contact with the corium lavas is migrating towards the ocean at the rate of four meters per month. This could release greater amounts of radiation than were released in the early days of the disaster.

Radioactive water leaking into the Pacific Ocean: TEPCO did not admit that leaks of radioactive water were occurring until July of this year. Shunichi Tanaka, the head of Japan’s Nuclear Regulation Authority, finally told reporters this July that radioactive water has been leaking into the Pacific Ocean since the disaster hit over two years ago. This is the largest single contribution of radionuclides to the marine environment ever observed, according to a report by the French Institute for Radiological Protection and Nuclear Safety. The Japanese government finally admitted that the situation was urgent this September – an emergency it did not acknowledge until 2.5 years after the water problem began.

How much radioactive water is leaking into the ocean? An estimated 300 tons (71,895 gallons/272,152 liters) of contaminated water is flowing into the ocean every day.  The first radioactive ocean plume released by the Fukushima nuclear power plant disaster will take three years to reach the shores of the United States.  This means, according to a new study from the University of New South Wales, the United States will experience the first radioactive water coming to its shores sometime in early 2014.
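A quick unit check suggests those parenthetical conversions were done in US short tons rather than metric tonnes (a sketch; the short-ton reading is an inference from the arithmetic, not something the article states):

```python
# Checking the parenthetical conversions above. They line up with US short
# tons (907.18 kg each), not metric tonnes -- an inference from the math.
KG_PER_SHORT_TON = 907.18474
LITERS_PER_US_GALLON = 3.785411784

tons = 300
liters = tons * KG_PER_SHORT_TON          # 1 liter of water weighs ~1 kg
gallons = liters / LITERS_PER_US_GALLON
print(f"{tons} short tons = {liters:,.0f} L = {gallons:,.0f} gal")
# -> 300 short tons = 272,155 L = 71,897 gal, closely matching the article
```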

One month after Fukushima, the FDA announced it was going to stop testing fish in the Pacific Ocean for radiation.  But, independent research is showing that every bluefin tuna tested in the waters off California has been contaminated with radiation that originated in Fukushima. Daniel Madigan, the marine ecologist who led the Stanford University study from May of 2012 was quoted in the Wall Street Journal saying, “The tuna packaged it up (the radiation) and brought it across the world’s largest ocean. We were definitely surprised to see it at all and even more surprised to see it in every one we measured.” Marine biologist Nicholas Fisher of Stony Brook University in New York State, another member of the study group, said: “We found that absolutely every one of them had comparable concentrations of cesium 134 and cesium 137.”

In addition, Science reports that fish near Fukushima are being found to have high levels of the radioactive isotope, cesium-134. The levels found in these fish are not decreasing,  which indicates that radiation-polluted water continues to leak into the ocean. At least 42 fish species from the area around the plant are considered unsafe.  South Korea has banned Japanese fish as a result of the ongoing leaks.

The half-life (the time it takes for half of the element to decay) of cesium-134 is 2.0652 years. For cesium-137, the half-life is 30.17 years. Cesium does not sink to the ocean floor, so fish swim through it. What are the human impacts of cesium?

When contact with radioactive cesium occurs (which is highly unlikely), a person can experience cell damage from the radiation the cesium particles emit. Effects such as nausea, vomiting, diarrhea and bleeding may occur. When the exposure lasts a long time, people may even lose consciousness; coma or even death may then follow. How serious the effects are depends upon the resistance of the individual person and on the duration and concentration of the exposure, experts say.
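For readers who want the decay arithmetic behind those half-lives, the standard formula makes the contrast between the two isotopes concrete (a minimal sketch using only the half-lives quoted above):

```python
# Remaining fraction of a radioisotope after t years, from its half-life:
# fraction = 0.5 ** (t / half_life)
def remaining_fraction(t_years: float, half_life_years: float) -> float:
    return 0.5 ** (t_years / half_life_years)

for t in (2, 10, 30):
    cs134 = remaining_fraction(t, 2.0652)   # half-life quoted above
    cs137 = remaining_fraction(t, 30.17)    # half-life quoted above
    print(f"after {t:2d} years: Cs-134 {cs134:6.1%}, Cs-137 {cs137:6.1%}")
# Cs-134 is mostly gone within a decade; Cs-137 persists for generations.
```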

There is no end in sight to the leakage of radioactive water into the Pacific from Fukushima. Harvey Wasserman questions whether fishing in the Pacific Ocean will be safe after years of leakage from Fukushima. The World Health Organization (WHO) claims that this will have limited effect on human health, with concentrations predicted to be below WHO safety levels. However, experts seriously question the WHO’s claims.

The United Nations Scientific Committee on the Effects of Atomic Radiation is in the process of writing a report to assess the radiation doses and associated effects on health and the environment. When finalized, it will be the most comprehensive scientific analysis of the information available to date, examining how much radioactive material was released, how it was dispersed over land and water, how Fukushima compares to previous accidents, and what the impacts are on the environment, food and human health.

Wasserman warns that “dilution is no solution.”  The fact that the Pacific Ocean is large does not change the fact that these radioactive elements have long half-lives.  Radiation in water is taken up by vegetation, then smaller fish eat the vegetation, larger fish eat the smaller fish and at the top of the food chain we will find fish like tuna, dolphin and whales with concentrated levels of radiation. Humans at the top of the food chain could be eating these contaminated fish.

As bad as the ongoing leakage of radioactive water into the Pacific is, that is not the largest part of the water problem. The Asia-Pacific Journal reported last month that TEPCO has 330,000 tons of water stored in 1,000 above-ground tanks and an undetermined amount in underground storage tanks. Every day, 400 tons of water comes to the site from the mountains; 300 tons of that is the source of the contaminated water leaking into the Pacific daily. It is not clear where the rest of this water goes.

Each day TEPCO injects 400 tons of water into the destroyed facilities to keep them cool; about half is recycled, and the rest goes into the above-ground tanks. They are constantly building new storage tanks for this radioactive water. The tanks being used for storage were put together rapidly and are already leaking. They expect to have 800,000 tons of radioactive water stored on the site by 2016.  Harvey Wasserman warns that these unstable tanks are at risk of rupture if there is another earthquake or storm that hits Fukushima. The Asia-Pacific Journal concludes: “So at present there is no real solution to the water problem.”
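The storage arithmetic is easy to check (a rough sketch; the 200-tons-per-day net rate is an assumption derived from “about half is recycled,” and the article’s own 2016 figure implies the true rate is higher):

```python
# Rough projection of tank accumulation from the figures reported above.
# Net inflow to storage is taken as ~200 tons/day ("about half" of the
# 400 tons injected daily is recycled); groundwater ingress, not counted
# here, is presumably how TEPCO arrives at its faster 2016 estimate.
stored_tons = 330_000           # reported in above-ground tanks
net_inflow_tons_per_day = 200   # assumption, see comment above

def days_until(target_tons: float) -> float:
    return (target_tons - stored_tons) / net_inflow_tons_per_day

print(f"Days to reach 800,000 tons: {days_until(800_000):,.0f}")  # ~2,350
# At 200 tons/day that is ~6.4 years out; hitting 800,000 tons by 2016
# implies a net accumulation rate closer to 400 tons/day.
```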

The most recent news on the water problem at Fukushima adds to the concerns. On October 11, 2013, TEPCO disclosed that the radioactivity level spiked 6,500 times at a Fukushima well.  “TEPCO said the findings show that radioactive substances like strontium have reached the groundwater. High levels of tritium, which transfers much easier in water than strontium, had already been detected.”

Spent fuel rods: As bad as the problems of radioactive water and missing cores are, the biggest problem at Fukushima comes from the spent fuel rods. The plant had been in operation for 40 years, and as a result 11,000 spent fuel rods are stored on the grounds of the Fukushima plant. These fuel rods are composed of highly radioactive materials such as plutonium and uranium. They are about the width of a thumb and about 15 feet long.

The biggest and most immediate challenge is the 1,533 spent fuel rods packed tightly in a pool four floors above Reactor 4. Before the earthquake and tsunami hit, those rods had been removed for routine maintenance of the reactor. But now they are stored 100 feet in the air in damaged racks. They weigh a total of 400 tons and contain radiation equivalent to 14,000 times the amount released by the Hiroshima atomic bomb.

The building in which these rods are stored has been damaged. TEPCO reinforced it with a steel frame, but the building itself is buckling and sagging, vulnerable to collapse if another earthquake or storm hits the area. Additionally, the ground under and around the building is becoming saturated with water, which further undermines the integrity of the structure and could cause it to tilt.

How dangerous are these fuel rods? Wasserman explains that the fuel rods are clad in zirconium, which can ignite if they lose coolant. They could also ignite or explode if rods break or hit each other. Wasserman reports that some say this could result in a fission explosion like an atomic bomb; others say that is not what would happen but agree it would be “a reaction like we have never seen before, a nuclear fire releasing incredible amounts of radiation.”

These are not the only spent fuel rods at the plant; they are just the most precarious. There are 11,000 fuel rods scattered around the plant, including 6,000 in a cooling pool less than 50 meters from the sagging Reactor 4. If a fire erupts in the spent fuel pool at Reactor 4, it could ignite the rods in the cooling pool and lead to an even greater release of radiation. It could set off a chain reaction that could not be stopped.

What would happen? Wasserman reports that the plant would have to be evacuated. The workers who are essential to preventing damage at the plant would leave, and a critical safeguard would be lost. In addition, the computers would not work because of the intense radiation, so we would be blind – the world would have to sit and wait to see what happened. Not only Fukushima but the entire population in and around Tokyo might have to be evacuated, reports Wasserman.

There is no question that the 1,533 spent fuel rods need to be removed.  But Arnie Gundersen, a veteran nuclear engineer and director of Fairewinds Energy Education, who used to build fuel assemblies, told Reuters “They are going to have difficulty in removing a significant number of the rods.” He described the problem in a radio interview:

“If you think of a nuclear fuel rack as a pack of cigarettes, if you pull a cigarette straight up it will come out — but these racks have been distorted. Now when they go to pull the cigarette straight out, it’s going to likely break and release radioactive cesium and other gases, xenon and krypton, into the air. I suspect come November, December, January we’re going to hear that the building’s been evacuated, they’ve broke a fuel rod, the fuel rod is off-gassing.”

Wasserman builds on the analogy, telling us it is “worse than pulling cigarettes out of a crumbled cigarette pack.” It is likely they used salt water as a coolant out of desperation, which would cause corrosion because the rods were never meant to be in salt water. The condition of the rods is unknown. There is debris in the coolant, so there has been some crumbling from somewhere. Gundersen adds, “The roof has fallen in, which further distorted the racks,” noting that if a fuel rod snaps, it will release radioactive gas, requiring at a minimum the evacuation of the plant. They would release those gases into the atmosphere and try again.

The Japan Times writes: “The consequences could be far more severe than any nuclear accident the world has ever seen. If a fuel rod is dropped, breaks or becomes entangled while being removed, possible worst case scenarios include a big explosion, a meltdown in the pool, or a large fire. Any of these situations could lead to massive releases of deadly radionuclides into the atmosphere, putting much of Japan — including Tokyo and Yokohama — and even neighboring countries at serious risk.”

This is not the usual moving of fuel rods.  TEPCO has been saying this is routine, but in fact it is unique – a feat of engineering never done before.  As Gundersen says:

“Tokyo Electric is portraying this as easy. In a normal nuclear reactor, all of this is done with computers. Everything gets pulled perfectly vertically. Well nothing is vertical anymore, the fuel racks are distorted, it’s all going to have to be done manually. The net effect is it’s a really difficult job. It wouldn’t surprise me if they snapped some of the fuel and they can’t remove it.”

Gregory Jaczko, former chairman of the U.S. Nuclear Regulatory Commission, concurs with Gundersen, describing the removal of the spent fuel rods as “a very significant activity, and . . . very, very unprecedented.”

Wasserman sums the challenge up: “We are doing something never done before – bent, crumbling, brittle fuel rods being removed from a pool that is compromised, in a building that is sinking, sagging and buckling, and it all must be done under manual control, not with computers.” And the potential damage from failure would affect hundreds of millions of people.

The Solutions

The three major problems at Fukushima are all unprecedented, each unique in its own way and each with the potential for major damage to humans and the environment. There are no clear solutions, but there are steps that need to be taken urgently to get the Fukushima clean-up and de-commissioning on track and to minimize the risks.

The first thing that is needed is to end the media blackout.  The global public needs to be informed about the issues the world faces from Fukushima.  The impacts of Fukushima could affect almost everyone on the planet, so we all have a stake in the outcome.  If the public is informed about this problem, the political will to resolve it will rapidly develop.

The nuclear industry, which wants to continue to expand, fears Fukushima being widely discussed because it undermines their already weak economic potential.  But, the profits of the nuclear industry are of minor concern compared to the risks of the triple Fukushima challenges.

The second thing that must be faced is the incompetence of TEPCO. The company is not capable of handling this complex triple crisis. TEPCO “is already Japan’s most distrusted firm” and has been exposed as “dangerously incompetent.” A poll found that 91 percent of the Japanese public wants the government to intervene at Fukushima.

TEPCO’s management of the stricken power plant has been described as a comedy of errors. The constant stream of mistakes has been made worse by repeated false denials and efforts to minimize major problems. Indeed, the entire Fukushima catastrophe could have been avoided:

“Tepco at first blamed the accident on ‘an unforeseen massive tsunami’ triggered by the Great East Japan Earthquake on March 11, 2011. Then it admitted it had in fact foreseen just such a scenario but hadn’t done anything about it.”

The reality is that Fukushima was plagued by human error from the outset. An official Japanese government investigation concluded that the Fukushima accident was a “man-made” disaster, caused by “collusion” between government and TEPCO and by bad reactor design. On this point, TEPCO is not alone; this is an industry-wide problem. Many US nuclear plants have serious problems, are being operated beyond their life spans, have the same design problems and are near earthquake faults. Regulatory officials in both the US and Japan are too corruptly tied to the industry.

Then, the meltdown itself was denied for months, with TEPCO claiming it had not been confirmed.  Japan Times reports that “in December 2011, the government announced that the plant had reached ‘a state of cold shutdown.’ Normally, that means radiation releases are under control and the temperature of its nuclear fuel is consistently below boiling point.”  Unfortunately, the statement was false – the reactors continue to need water to keep them cool, the fuel rods need to be kept cool – there has been no cold shutdown.

TEPCO has done a terrible job of cleaning up the plant.  Japan Times describes some of the problems:

“The plant is being run on makeshift equipment and breakdowns are endemic. Among nearly a dozen serious problems since April this year there have been successive power outages, leaks of highly radioactive water from underground water pools — and a rat that chewed enough wires to short-circuit a switchboard, causing a power outage that interrupted cooling for nearly 30 hours. Later, the cooling system for a fuel-storage pool had to be switched off for safety checks when two dead rats were found in a transformer box.”

TEPCO has been constantly cutting financial corners and not spending enough to meet the challenges of the Fukushima disaster, resulting in shoddy practices that cause environmental damage. Washington’s Blog reports that the Japanese government is spreading radioactivity throughout Japan – and other countries – by burning radioactive waste in incinerators not built to handle such toxic substances. Workers have expressed concerns and even apologized for following orders regarding the ‘clean-up.’

Indeed, the workers are another serious concern. The Guardian reported in October 2013 on plummeting worker morale and on problems of alcohol abuse, anxiety, loneliness, post-traumatic stress disorder and depression. TEPCO cut the pay of its workers by 20 percent in 2011 to save money, even though these workers are doing very difficult work and face constant problems. Outside of work, many were traumatized by being forced to evacuate their homes after the tsunami, and they have no idea how much radiation they have been exposed to or what health consequences they will suffer. Contractors are hired based on the lowest bid, resulting in low wages for workers. According to the Guardian, Japan’s top nuclear regulator, Shunichi Tanaka, told reporters: “Mistakes are often linked to morale. People usually don’t make silly, careless mistakes when they’re motivated and working in a positive environment. The lack of it, I think, may be related to the recent problems.”

The history of TEPCO shows we cannot trust this company and its mistreated workforce to handle the complex challenges faced at Fukushima. The crisis at Fukushima is a global one, requiring a global solution.

In an open letter to the United Nations, 16 top nuclear experts urged the government of Japan to transfer responsibility for the Fukushima reactor site to a worldwide engineering group, overseen by a civil society panel and an international group of nuclear experts independent from TEPCO and the International Atomic Energy Agency (IAEA). They urge that the stabilization, clean-up and de-commissioning of the plant be well funded. They make this request with “urgency” because the situation at the Fukushima plant is “progressively deteriorating, not stabilizing.”

Beyond the clean-up, they are also critical of the World Health Organization’s and IAEA’s estimates of the health and environmental damage caused by the Fukushima disaster, and they recommend more accurate methods of accounting, as well as the gathering of data to ensure more accurate estimates. They also want to see the people displaced by Fukushima treated better, and they urge that the views of the indigenous people who never wanted uranium removed from their lands – views that would have prevented this disaster – be respected in the future.

Facing Reality

The problems at Fukushima are in large part about facing reality – seeing the challenges, risks and potential harms from the incident. It is about TEPCO and Japan facing the reality that they are not equipped to handle the challenges of Fukushima and need the world to join the effort.

Facing reality is a common problem throughout the nuclear industry and those who continue to push for nuclear energy. Indeed, it is a problem with many energy issues. We must face the reality of the long-term damage being done to the planet and the people by the carbon-nuclear based energy economy.

Another reality the nuclear industry must face is that the United States is turning away from nuclear energy, and the world will do the same. As Gregory Jaczko, who chaired the US Nuclear Regulatory Commission at the time of the Fukushima incident, says: “I’ve never seen a movie that’s set 200 years in the future and the planet is being powered by fission reactors—that’s nobody’s vision of the future. This is not a future technology.” He sees US nuclear reactors as aging, many in operation beyond their original lifespans. The economics of nuclear energy are increasingly difficult, as it is a very expensive source of energy. Further, there is no money or desire to finance new nuclear plants. “The industry is going away,” he said bluntly.

Ralph Nader describes nuclear energy as “unnecessary, uneconomic, uninsurable, unevacuable and, most importantly, unsafe.” He argues it only continues to exist because the nuclear lobby pushes politicians to protect it. The point made by Nader about the inability to evacuate after a nuclear accident is worth underlining. Wasserman points out that there are nuclear plants in the US near earthquake faults, among them plants near Los Angeles, New York City and Washington, DC. And Fukushima was based on a General Electric design that was also used to build 23 reactors in the US.

If we faced reality, public officials would be organizing evacuation drills in those cities. If such drills were held, Americans would quickly learn that in a serious nuclear accident, US cities could not be evacuated. Activists making the reasonable demand for evacuation drills may be a very good strategy for ending nuclear power.

Wasserman emphasizes that as bad as Fukushima is, it is not the worst-case scenario for a nuclear disaster. Fukushima was 120 kilometers (75 miles) from the center of the earthquake. If that distance had been 20 kilometers (12 miles), the plant would have been reduced to rubble and caused an immediate nuclear catastrophe.

Another reality we need to face is a very positive one. Wasserman points out that “all of our world’s energy needs could be met by solar, wind, thermal, ocean technology.” His point is repeated by many top energy experts; in fact, a carbon-free, nuclear-free energy economy is not only possible, it is inevitable. The only question is how long it will take for us to get there, and how much damage will be done before we end the “all-of-the-above” energy strategy that emphasizes carbon and nuclear energy sources.

Naoto Kan, prime minister of Japan when the disaster began, recently told an audience that he had been a supporter of nuclear power, but after the Fukushima accident, “I changed my thinking 180-degrees, completely.” He realized that “no other accident or disaster” besides a nuclear plant disaster can “affect 50 million people . . . no other accident could cause such a tragedy.” He pointed out that all 54 nuclear plants in Japan have now been closed and expressed confidence that “without nuclear power plants we can absolutely provide the energy to meet our demands.” In fact, since the disaster Japan has tripled its use of solar energy, to the equivalent of three nuclear plants. He believes: “If humanity really would work together . . . we could generate all our energy through renewable energy.”

To learn more, click here.

Related articles by Margaret Flowers and Kevin Zeese:

Carbon-Free, Nuclear-Free Energy Economy Is Inevitable

Vibrant Movement for Green Energy Economy

Gang Green or Fresh Greens?

US Climate Bomb is Ticking: What the Gas Industry Doesn’t Want You to Know

America’s Secret Fukushima Poisoning the Bread Basket of the World

The Rule of Law in Times of Ecological Collapse – Truthout

Dirty Energy’s Dirty Tactics: Boulder on the Front Lines of the Renewable Energy Future

To hear Kevin Zeese and Margaret Flowers interview with Harvey Wasserman of NukeFree.org Fukushima – A Global Threat That Requires a Global Response click here.

West Coast Of North America To Be Hit Hard By Fukushima Radiation That Could Be 10 Times Higher Than In Japan

In Uncategorized on August 27, 2013 at 5:29 pm

https://i0.wp.com/iprc.soest.hawaii.edu/news/marine_and_tsunami_debris/2011/11_04_maximenko_tsunami_debris/map_of_trajectory_med.jpg

Oldspeak: “This ongoing and uncontrolled ecological catastrophe continues, with no end in sight. Untold billions of tons of water are being dumped into the Pacific Ocean by the Japanese. The contamination is expected to get worse as time passes and to impact Baja California and other North American west coast hotspots. And no one knows how to fix it. “Last year, scientists from the National Oceanic and Atmospheric Administration’s (NOAA) Pacific Marine Environmental Laboratory and 3 scientists from the GEOMAR Research Center for Marine Geosciences showed that radiation on the West Coast of North America could end up being 10 times higher than in Japan“. These are reputable research scientists saying this. Yet, universal silence in state media. Citizens from Hawaii to Alaska to Baja should be demanding information about what’s going on and what’s being done in response to the threat. The civil war in Syria is infinitesimally less threatening to the millions of people to be affected.” –OSJ

By Washington’s Blog:

Radiation Levels Will Concentrate in Pockets In Baja California and Other West Coast Locations

An ocean current called the North Pacific Gyre is bringing Japanese radiation to the West Coast of North America:

(Map: North Pacific Subtropical Convergence Zone)

The leg of the Gyre closest to Japan – the Kuroshio current – begins right next to Fukushima:

(Map: Kuroshio Current – colour shows water speed; blue slowest, red fastest)

While many people assume that the ocean will dilute the Fukushima radiation, a previously-secret 1955 U.S. government report concluded that the ocean may not adequately dilute radiation from nuclear accidents, and there could be “pockets” and “streams” of highly-concentrated radiation.

The University of Hawaii’s International Pacific Research Center created a graphic showing the projected dispersion of debris from Japan (see pic at top)

Last year, scientists from the National Oceanic and Atmospheric Administration’s (NOAA) Pacific Marine Environmental Laboratory and 3 scientists from the GEOMAR Research Center for Marine Geosciences showed that radiation on the West Coast of North America could end up being 10 times higher than in Japan:

After 10 years the concentrations become nearly homogeneous over the whole Pacific, with higher values in the east, extending along the North American coast with a maximum (~1 × 10⁻⁴) off Baja California.

***

With caution given to the various idealizations (unknown actual oceanic state during release, unknown release area, no biological effects included, see section 3.4), the following conclusions may be drawn. (i) Dilution due to swift horizontal and vertical dispersion in the vicinity of the energetic Kuroshio regime leads to a rapid decrease of radioactivity levels during the first 2 years, with a decline of near-surface peak concentrations to values around 10 Bq m⁻³ (based on a total input of 10 PBq). The strong lateral dispersion, related to the vigorous eddy fields in the mid-latitude western Pacific, appears significantly under-estimated in the non-eddying (0.5°) model version. (ii) The subsequent pace of dilution is strongly reduced, owing to the eastward advection of the main tracer cloud towards the much less energetic areas of the central and eastern North Pacific. (iii) The magnitude of additional peak radioactivity should drop to values comparable to the pre-Fukushima levels after 6–9 years (i.e. total peak concentrations would then have declined below twice pre-Fukushima levels). (iv) By then the tracer cloud will span almost the entire North Pacific, with peak concentrations off the North American coast an order-of-magnitude higher than in the western Pacific.

(“Order-of-magnitude” is a scientific term which means 10 times higher.  The “Western Pacific” means Japan’s East Coast.)

In May, a team of scientists from Spain, Australia and France concluded that the radioactive cesium would look more like this:

And a team of top Chinese scientists has just published a study in the Science China Earth Sciences journal showing that the radioactive plume crosses the ocean in a nearly straight line toward North America, and that it appears to stay together with little dispersion:

On March 30, 2011, the Japan Central News Agency reported monitored radioactive pollution 4,000 times higher than the standard level. Whether or not these nuclear pollutants will be transported to the Pacific-neighboring countries through oceanic circulation has become a worldwide concern.

***

The time scale of the nuclear pollutants reaching the west coast of America is 3.2 years if it is estimated using the surface drifting buoys and 3.9 years if it is estimated using the nuclear pollutant particulate tracers.

***

The half-life of cesium-137 is so long that it produces more damage to humans. Figure 4 gives examples of the distribution of the impact strength of cesium-137 at year 1.5 (panel (a)), year 3.5 (panel (b)), and year 4 (panel (c)).

***

It is worth noting that because the current near the shore cannot be well reconstructed by the global ocean reanalysis, some nuclear pollutant particulate tracers may come to rest in the near-shore area, which may result in additional uncertainty in the estimation of the impact strength.

***

Since the major transport mechanism of nuclear pollutants for the west coast of America is the Kuroshio-extension currents, after four years, the impact strength of Cesium-137 in the west coast area of America is as high as 4%.

Note: Even low levels of radiation can harm health.

Some Credible Scientists Believe Humanity Is Irreparably Close To Destruction

In Uncategorized on August 23, 2013 at 7:34 pm

Oldspeak: “Scientists are putting out the warning call that rapid, life-threatening climate change lies ahead in our near future—but most are drowned out by the political arguments and denialist rhetoric of climate change skeptics. …. The general public seems to be getting ready for some sort of societal collapse… there are some clarion calls coming from…well-respected scientists and journalists who have come to some scarily-sane sounding conclusions about the threat of human-induced climate change on the survival of the human species… Recent data seems to suggest that we may have already tripped several irrevocable, non-linear, positive feedback loops (melting of permafrost, methane hydrates, and arctic sea ice) that make an average global temperature increase of only 2°C by 2100 seem like a fairy tale. Instead, we’re talking 4°C, 6°C, 10°C, 16°C (????????) here….The link between rapid climate change and human extinction is basically this: the planet becomes uninhabitable by humans if the average temperature goes up by 4-6°C. It doesn’t sound like a lot because we’re used to the temperature changing 15°C overnight, but the thing that is not mentioned enough is that even a 2-3°C average increase would give us temperatures that regularly surpass 40°C (104°F) in North America and Europe, and soar even higher near the equator. Human bodies start to break down after six hours at a wet-bulb (100% humidity) temperature of 35°C (95°F). This makes the 2003 heat wave in Europe that killed over 70,000 people seem like not a very big deal…Factoring in the increase we’re already seeing in heat waves, droughts, wildfires, massive storms, food and water shortages, deforestation, ocean acidification, and sea level rise some are seeing the writing on the wall: We’re all gonna die! -Nathan Curry

“Tick, Tick, Tick, Tick, Tick….. Meanwhile, news outlets are focusing on the civil war in Syria, Chelsea Manning’s sexual transition, the 50th anniversary of the March On Washington. Don’t get that shit AT ALL. You have lifelong scientists quitting their jobs to become environmental activists. Something ain’t right.” –OSJ

By Nathan Curry @ Vice Magazine:

If you were to zoom out and take a comparative look back at our planet during the 1950s from some sort of cosmic time-travelling orbiter cube, you would probably first notice that millions of pieces of space trash had disappeared from orbit.

The moon would appear six and a half feet closer to Earth, and the continents of Europe and North America would be four feet closer together. Zooming in, you would be able to spot some of the industrial clambering of the Golden Age of Capitalism in the West and some of the stilted attempts at the Great Leap Forward in the East. Lasers, bar codes, contraceptives, hydrogen bombs, microchips, credit cards, synthesizers, superglue, Barbie dolls, pharmaceuticals, factory farming, and distortion pedals would just be coming into existence.

There would be two thirds fewer humans on the planet than there are now. Over a million different species of plants and animals would exist that have since gone extinct. There would be 90 percent more fish, a billion fewer tons of plastic, and 40 percent more phytoplankton (producers of half the planet’s oxygen) in the oceans. There would be twice as many trees covering the land and about three times more drinking water available from ancient aquifers. There would be about 80 percent more ice covering the northern pole during the summer season and 30 percent less carbon dioxide and methane in the atmosphere. The list goes on…

Most educated and semi-concerned people know that these sorts of sordid details make up the backdrop of our retina-screened, ethylene-ripened story of progress, but what happens when you start stringing them all together?

If Doomsday Preppers, the highest-rated show on the National Geographic Channel, is any indication, the general public seems to be getting ready for some sort of societal collapse. There have always been doomsday prophets and cults around, and everyone has their own personal view of how the apocalypse will probably go down (ascension of pure souls, zombie crows), but in the midst of all of the Mayan Calendar/Timewave Zero/Rapture babble, there are some clarion calls coming from a crowd that’s less into bugout bags and eschatology: well-respected scientists and journalists who have come to some scarily sane-sounding conclusions about the threat of human-induced climate change on the survival of the human species.

Recent data seems to suggest that we may have already tripped several irrevocable, non-linear, positive feedback loops (melting of permafrost, methane hydrates, and arctic sea ice) that make an average global temperature increase of only 2°C by 2100 seem like a fairy tale. Instead, we’re talking 4°C, 6°C, 10°C, 16°C (????????) here.

The link between rapid climate change and human extinction is basically this: the planet becomes uninhabitable by humans if the average temperature goes up by 4-6°C. It doesn’t sound like a lot because we’re used to the temperature changing 15°C overnight, but the thing that is not mentioned enough is that even a 2-3°C average increase would give us temperatures that regularly surpass 40°C (104°F) in North America and Europe, and soar even higher near the equator. Human bodies start to break down after six hours at a wet-bulb (100% humidity) temperature of 35°C (95°F). This makes the 2003 heat wave in Europe that killed over 70,000 people seem like not a very big deal.
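That wet-bulb threshold can be made concrete with Stull’s empirical approximation, which estimates wet-bulb temperature from air temperature and relative humidity (a sketch; the formula is a published 2011 approximation, not something from this article, and is valid only for roughly 5–99% humidity):

```python
import math

# Stull's (2011) empirical formula for wet-bulb temperature Tw (deg C)
# from air temperature T (deg C) and relative humidity RH (%). It is an
# approximation; at 100% RH the wet-bulb temperature simply equals T,
# which is why a sustained 35C at full humidity is the survivability
# threshold cited above.
def wet_bulb_stull(T: float, RH: float) -> float:
    return (T * math.atan(0.151977 * math.sqrt(RH + 8.313659))
            + math.atan(T + RH) - math.atan(RH - 1.676331)
            + 0.00391838 * RH ** 1.5 * math.atan(0.023101 * RH)
            - 4.686035)

for T, RH in [(40, 50), (40, 75), (46, 50)]:
    print(f"T={T}C, RH={RH}% -> wet-bulb ~ {wet_bulb_stull(T, RH):.1f}C")
# 40C at 75% humidity already pushes past the 35C wet-bulb threshold.
```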

Factoring in the increase we’re already seeing in heat waves, droughts, wildfires, massive storms, food and water shortages, deforestation, ocean acidification, and sea level rise, some are seeing the writing on the wall:

We’re all gonna die!

If you want to freak yourself the fuck out, spend a few hours trying to refute the mounting evidence of our impending doom heralded by the man who gave the Near Term Extinction movement its name, Guy McPherson, on his blog Nature Bats Last. McPherson is a former Professor Emeritus of Natural Resources and Ecology and Evolutionary Biology at the University of Arizona, who left his cushy tenured academic career and now lives in a straw-bale house on a sustainable commune in rural New Mexico in an attempt to “walk away from Empire.” There are a lot of interviews and videos available of Dr. McPherson talking about NTE if you want to boost your pessimism about the future to suicidal/ruin-any-dinner-party levels.


If you are in need of an ultimate mind-fuck, there is a long essay on McPherson’s site entitled “The Irreconcilable Acceptance of Near Term Extinction” written by a lifelong environmental activist named Daniel Drumright. He writes about trying to come to terms with what it means to be on a clear path toward extinction now that it’s probably too late to do anything about it (hint: suicide or shrooms). As Drumright points out, the entirety of human philosophy, religion, and politics doesn’t really provide a framework for processing the psychological terror of all of humanity not existing in the near future.

Outside of the official NTE enclave, there are a lot of scientists and journalists who would probably try to avoid being labeled as NTE proponents, but are still making the same sort of dire predictions about our collective fate. They may not believe that humans will ALL be gone by mid-century, but massive, catastrophic “population decline” due to human-induced rapid climate change is not out of the picture.

James Hansen, the former head of NASA’s Goddard Institute for Space Studies and one of the world’s leading climatologists, recently retired from his position after 43 years in order to concentrate on climate-change activism. He predicts that without full de-carbonization by 2030, global CO2 emissions will be 16 times higher than in 1950, guaranteeing catastrophic climate change. In an essay published in April of this year, Hansen states:

“If we should ‘succeed’ in digging up and burning all fossil fuels, some parts of the planet would become literally uninhabitable, with some times during the year having wet bulb temperatures exceeding 35°C. At such temperatures, for reasons of physiology and physics, humans cannot survive… it is physically impossible for the environment to carry away the 100W of metabolic heat that a human body generates when it is at rest. Thus even a person lying quietly naked in hurricane force winds would be unable to survive.”

Bill McKibben, prominent green journalist, author, distinguished scholar, and one of the founders of 350.org—the movement that aims to reduce atmospheric CO2 levels to 350ppm in the hopes of avoiding runaway climate change—wrote a book in 2011 called Eaarth: Making a Life on a Tough New Planet. In it he highlights current environmental changes that have put us past the predictions that had previously been reserved for the end of the 21st century. He emphasizes that the popular political rhetoric that we need to do something about climate change for our “grandchildren” is sorely out of touch with reality. This is happening now. We’re already living on a sci-fi planet from a parallel universe:

“The Arctic ice cap is melting, the great glacier above Greenland is thinning, both with disconcerting and unexpected speed. The oceans are distinctly more acid and their level is rising…The greatest storms on our planet, hurricanes and cyclones, have become more powerful…The great rain forest of the Amazon is drying on its margins…The great boreal forest of North America is dying in a matter of years… [This] new planet looks more or less like our own but clearly isn’t… This is the biggest thing that’s ever happened.”


Climate Change protesters in Melbourne. via Flickr.

Peter Ward is a paleontologist and author whose 2007 book Under a Green Sky: Global Warming, the Mass Extinctions of the Past, and What they Can Tell Us About the Future, provides evidence that all but one of the major global extinction events (dinosaurs) occurred due to rapid climate change caused by increased atmospheric carbon dioxide levels. This time around, the carbon dioxide increase happens to be coming from humans figuring out how to dig billions of tons of carbon out of the ground—and releasing it into the air. Ward states that during the last 10,000 years in which human civilization has emerged, our carbon dioxide levels and climate have remained anomalously stable, but the future doesn’t look so good:

“The average global temperature has changed as much as 18°F [8°C] in a few decades. The average global temperature is 59°F [15°C]. Imagine that it shot to 75°F [24°C] or dropped to 40°F [4°C], in a century or less. We have no experience of such a world… at minimum, such sudden changes would create catastrophic storms of unbelievable magnitude and fury…lashing the continents not once a decade or century but several times each year…For most of the last 100,000 years, an abruptly changing climate was the rule, not the exception.”

Far from being a Mother Earth lover, Ward has also developed an anti-Gaia hypothesis that he calls the “Medea Hypothesis” in which complex life, instead of being in symbiotic harmony with the environment, is actually a horrible nuisance. In this hypothesis, the planet and microbial life have worked together multiple times to trigger mass extinction events that have almost succeeded in returning the earth to its microbe-dominant state. In other words, Mother Earth might be Microbe Earth and she might be trying to kill her kids.

Scientists are putting out the warning call that rapid, life-threatening climate change lies ahead in our near future—but most are drowned out by the political arguments and denialist rhetoric of climate change skeptics. The well-funded effort by free market think tanks, energy lobbyists, and industry advocates to blur the public perception of climate science should come as no surprise. The thermodynamic forcing effects of an ice-free Arctic by 2015 don’t seem so threatening if you stand to gain billions of dollars by sending drill bits into the potentially huge oil reservoirs there.

It may not be the case that the southwest US will be uninhabitable by 2035, or that all of human life will be extinguished in a generation, but we should probably start to acknowledge and internalize what some of the people who have given their lives to better understand this planet are saying about it. It’s depressing to think that humans, in our current state, could be the Omega Point of consciousness. Maybe sentience and the knowledge of our inevitable death have given us a sort of survival vertigo that we can’t overcome. As the separate paths of environmental exploitation quickly and quietly converge around us, we might just tumble off the precipice, drunk on fossil fuels, making duck faces into black mirrors.

Earth’s Oceans: A Heat Sink For Energy

In Uncategorized on August 16, 2013 at 7:28 pm

Oldspeak: “90% of the Earth’s heat is contained in the oceans. The graph starts in 1960, and ever since the late 1970s, its slope looks eerily similar to Mount Everest. Starting in the late 1970s, and accelerating in the 1980s, the graph slopes steeply upwards commensurate with China discovering state capitalism and spewing enormous amounts of CO2 into the atmosphere…. The heat imbalance of the planet… compared… to the equivalent of 400,000 Hiroshima atomic bombs per day, which is nearly impossible to fathom. But, it is how much heat the Earth absorbs per day due to global warming…. The uptake of heat by the oceans, serving as a giant ‘sink’, may account for the recent hiatus in land temperature, as its rate of warming slowed; however, the totality of the earth’s heat is what counts, not just the land temperature, and according to a research paper written by Scott Doney (Woods Hole Oceanographic Institution)1 :

The ocean slows climate change by storing excess heat and by removing CO2 from the atmosphere… [however] The ocean CO2 sink may become less effective in the future due to warming, increased vertical stratification, and altered ocean circulation, which would act to accelerate climate change.” -Robert Hunziker

“Oh great, polar ice is melting faster from the bottom, and the oceans are rapidly acidifying while warming throughout the water column. At some point all the ocean wildlife will die and the oceans will turn into big toxic radioactive dead zones. Annnnnnd climate change will accelerate. Enjoy your seafood while you can kids. The oceans can only absorb so much of our waste.” -OSJ

By Robert Hunziker @ Dissident Voice:

Over the past 30 years, the Earth has absorbed unbelievably huge amounts of heat… substantially more than in prior decades. Now, scientists have discovered the whereabouts of this excess heat… deep in the oceans, the Earth’s Big Heat Sink! As time passes, the ocean heat sink may one day overflow, in turn prompting global warming to accelerate rapidly, very rapidly.

A little over one year ago, Dr. James Hansen, one of the world’s foremost climate scientists and former Head of NASA Goddard Institute for Space Studies, spoke at the TED Conference in Long Beach, California, explaining the heat imbalance of the planet, and he compared the imbalance to the equivalent of 400,000 Hiroshima atomic bombs per day, which is nearly impossible to fathom. But, it is how much heat the Earth absorbs per day due to global warming. According to Dr. Hansen, the imbalance means we must reduce CO2 from approximately 400 ppm, which is a new 3-million-year record, back to less than 350 ppm to restore the planet’s energy balance.
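Hansen’s comparison is straightforward to reproduce (a sketch; the 0.6 W/m² imbalance and the 15-kiloton bomb yield are assumed round values close to those Hansen has used, and neither number appears in the article):

```python
import math

# Reproducing Hansen's "400,000 Hiroshima bombs per day" comparison.
# The 0.6 W/m^2 imbalance and 15-kiloton yield are assumed round values
# close to those Hansen has used; neither appears in the article itself.
IMBALANCE_W_PER_M2 = 0.6
EARTH_RADIUS_M = 6.371e6
JOULES_PER_HIROSHIMA = 15_000 * 4.184e9   # 15 kt TNT at 4.184e9 J/ton

earth_area_m2 = 4 * math.pi * EARTH_RADIUS_M ** 2      # ~5.1e14 m^2
joules_per_day = IMBALANCE_W_PER_M2 * earth_area_m2 * 86_400
print(f"Excess heat: {joules_per_day:.2e} J/day "
      f"= {joules_per_day / JOULES_PER_HIROSHIMA:,.0f} bombs/day")
# -> roughly 420,000 bombs/day, consistent with Hansen's figure
```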

But, unfortunately, CO2 continues rising year by year, and there are no signs of tapering. In fact, the rate of increase is increasing. According to the Mauna Loa Observatory, Hawaii, as of July 9, 2013, “The concentrations of CO2 in the atmosphere are increasing at an accelerating rate from decade to decade. The latest atmospheric CO2 data is consistent with a continuation of this long-standing trend” – per CO2Now.org, and confirmed by the National Oceanic & Atmospheric Administration.

Thirty-two years ago, Hansen published an article in Science magazine that changed the world’s perception of climate, and it was reported on the front page of the New York Times. The article concluded that the observed warming of 0.4 degrees C over the prior century was consistent with the greenhouse effect of increasing CO2; that Earth would likely warm in the 1980s; and that the 21st century would see shifting climate zones, creation of drought-prone regions in North America and Asia, erosion of ice sheets, rising sea levels and the opening of the fabled Northwest Passage — all impacts that have since happened or are well under way.

Hansen’s paper resulted in his testifying to Congress in the 1980s. His testimony emphasized that global warming intensifies both extremes of the Earth’s water cycle: heat waves and droughts on the one hand, directly from the warming; and, because a warmer atmosphere holds more water, more extreme rainfall, stronger storms and greater flooding on the other.

Three decades later, the climate is proving him correct… on all counts.

Today, he is more concerned than ever before.

The Distribution of the Heat Content of Earth

According to the Journal of Geophysical Research, the total heat content of the Earth is contained within the land and the atmosphere and the oceans. The journal publishes a graph of this relationship, which shows 90% of the Earth’s heat is contained in the oceans. The graph starts in 1960, and ever since the late 1970s, its slope looks eerily similar to Mount Everest. Starting in the late 1970s, and accelerating in the 1980s, the graph slopes steeply upwards commensurate with China discovering state capitalism and spewing enormous amounts of CO2 into the atmosphere. [To get a fuller overview, one should take into account, inter alia, per capita CO2 emissions; China is ranked relatively low. — DV Ed.]

As well, the National Oceanic and Atmospheric Administration (“NOAA”) has numerous charts that show the oceans rapidly heating during this same time frame, and it is expected that, over time, the ocean heat will come back up, which is one reason why climatologists predict a looming climate shift to rapid acceleration of surface warming. As well, the enormous uptake of heat by the oceans may offer an additional explanation for why the Arctic Ocean is melting at such a rapid rate with a great deal of the ice melting from underneath.

Ocean Heat Measurement Techniques

The ocean temperature is measured by Argo floats, about 3,000 of which are deployed, spaced roughly every 3 degrees (300 km), in oceans around the world. Every 10 days each float descends to a target depth, typically 2,000m (1.24 miles), and then, over a period of six hours, rises to the surface while measuring temperature and salinity. Once back at the surface, the float relays its data to satellites via an international collaboration with the Jason Satellite Altimeter Mission. (Argo is named after Jason’s ship in Greek mythology.)
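To see how such profiles become the heat-content figures discussed in this piece, the standard calculation simply integrates the temperature anomaly over depth (a minimal sketch; the profile values are invented for illustration, and real Argo data comes from the program’s public archives):

```python
# How a temperature profile of the kind Argo floats report becomes an
# ocean-heat-content figure: OHC per square meter of sea surface is
# rho * c_p * sum(temperature anomaly * layer thickness). The profile
# below is invented for illustration only.
RHO = 1025.0    # seawater density, kg/m^3
C_P = 3850.0    # specific heat of seawater, J/(kg K)

# (layer thickness in m, temperature anomaly in K), surface to 2,000 m
profile = [(100, 0.30), (200, 0.15), (700, 0.05), (1000, 0.01)]

ohc = sum(RHO * C_P * dT * dz for dz, dT in profile)
print(f"Heat content anomaly: {ohc:.2e} J per m^2 of ocean")  # ~4.1e8
```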

The Payback – Acceleration of Global Warming

The uptake of heat by the oceans, serving as a giant ‘sink’, may account for the recent hiatus in land temperature, as its rate of warming slowed; however, the totality of the earth’s heat is what counts, not just the land temperature, and according to a research paper written by Scott Doney (Woods Hole Oceanographic Institution) [1]:

The ocean slows climate change by storing excess heat and by removing CO2 from the atmosphere… [however] The ocean CO2 sink may become less effective in the future due to warming, increased vertical stratification, and altered ocean circulation, which would act to accelerate climate change.

Additionally, according to the “Ocean Carbon Biogeochemistry and U.S. CLIVAR Joint Meeting Summary” [2]:

Atmospheric emissions of CO2 not only contribute to warming our climate, but are expected to have a significant impact on ocean circulation, biogeochemistry and ecosystem structure. Those changes will then feed back onto the atmosphere… resulting in a decrease in the rates at which the ocean takes up and stores atmospheric carbon dioxide, further enhancing global warming.

As well, the Catalan Institute of Climate Sciences in Barcelona [3], analyzing the slowdown in rising surface temperatures during the first decade of this century, concluded that most of the excess energy was absorbed in the top 700m of the ocean at the onset of the warming pause, with 65% of it confined to the tropical Pacific and Atlantic oceans. The uptake by the oceans, according to the lead scientist, hid heat from the surface, but that heat may return to the atmosphere over the coming decade, stoking global warming.

The Earth’s total heat content since 1960, as illustrated by the Journal of Geophysical Research graph, shows where the Earth’s heat has been going. Go to the Institute of Climate Studies, USA, to see the graphic display. (The heading of the graph is “Earth’s Total Heat Content Anomaly.”)

As mentioned earlier, it is interesting to note the dramatic liftoff in the chart, nearly perpendicular since 1970-80, as the world’s oceans have absorbed extraordinary levels of heat, ever since China discovered state capitalism (1970s-80s) and began powering CO2 into the atmosphere like there is no tomorrow, and as a result, there may not be a tomorrow… as we know it.

Postscript: “You come back impressed, once you’ve been up there, with how thin our little atmosphere is that supports all life here on Earth. So if we foul it up, there’s no coming back from something like that.” (John Glenn, who in 1962 became the first American to orbit the Earth, and former U.S. Senator)

  [1] U.S. CLIVAR Variations (U.S. Climate Variability and Predictability Research Program, Washington, DC), Summer 2012, Vol. 10, No. 1.
  [2] Annalisa Bracco, Georgia Institute of Technology, and Ken Johnson, Monterey Bay Aquarium Research Institute, U.S. CLIVAR Variations, Summer 2012, Vol. 10, No. 1.
  [3] “Retrospective Prediction of the Global Warming Slowdown in the Past Decade,” Nature Climate Change, April 7, 2013, by Virginie Guemas, Francisco J. Doblas-Reyes, Isabel Andreu-Burillo and Muhammad Asif.

Robert Hunziker (MA in economic history at DePaul University, Chicago) is a former hedge fund manager and now a professional independent negotiator for worldwide commodity actual transactions and a freelance writer for progressive publications as well as business journals. He can be contacted at: rlhunziker@gmail.com. Read other articles by Robert.

Welcome To The “Era Of Persistent Conflict”: Pentagon Bracing For Public Dissent Over Climate & Energy Shocks

In Uncategorized on July 22, 2013 at 8:48 pm

https://i0.wp.com/www.davidicke.com/oi/extras/09/september/18_northcom.jpg

Oldspeak: “Why have Western security agencies developed such an unprecedented capacity to spy on their own domestic populations? Since the 2008 economic crash, security agencies have increasingly spied on political activists, especially environmental groups, on behalf of corporate interests. This activity is linked to the last decade of US defence planning, which has been increasingly concerned by the risk of civil unrest at home triggered by catastrophic events linked to climate change, energy shocks or economic crisis – or all three.” –Dr. Nafeez Ahmed

“This is why your rights to dissent are being constricted. This is why your rights to assemble and petition your government for redress are being done away with. This is why the entire planet is being watched. This is why investigative journalists are being assailed, intimidated and subpoenaed. This is why whistleblowers are persecuted, hunted and silenced, zealously. This is why law-abiding citizens are being designated as “domestic terrorists”. This is why more prisons than schools are being built. This is why the armed forces are training to operate in the homeland. The elites know what’s coming. They know there won’t be enough food, water, energy and living space for everyone. They know there will be vast areas of the planet rendered uninhabitable. This is why The Transnational Corporate Network and Governments are merging via a series of largely secret “treaties” and “trade agreements”. They know the people will not stand for it. They know there will be mass and persistent protest and unrest as the world as we know it crumbles, just as we are seeing in many other nations. They know we will need to be policed, controlled, repressed and imprisoned. War is coming. And you are the enemy.” –OSJ

By Dr. Nafeez Ahmed @ The U.K. Guardian:

Top secret US National Security Agency (NSA) documents disclosed by the Guardian have shocked the world with revelations of a comprehensive US-based surveillance system with direct access to Facebook, Apple, Google, Microsoft and other tech giants. New Zealand court records suggest that data harvested by the NSA’s Prism system has been fed into the Five Eyes intelligence alliance whose members also include the UK, Canada, Australia and New Zealand.

But why have Western security agencies developed such an unprecedented capacity to spy on their own domestic populations? Since the 2008 economic crash, security agencies have increasingly spied on political activists, especially environmental groups, on behalf of corporate interests. This activity is linked to the last decade of US defence planning, which has been increasingly concerned by the risk of civil unrest at home triggered by catastrophic events linked to climate change, energy shocks or economic crisis – or all three.

Just last month, unilateral changes to US military laws formally granted the Pentagon extraordinary powers to intervene in a domestic “emergency” or “civil disturbance”:

“Federal military commanders have the authority, in extraordinary emergency circumstances where prior authorization by the President is impossible and duly constituted local authorities are unable to control the situation, to engage temporarily in activities that are necessary to quell large-scale, unexpected civil disturbances.”

Other documents show that the “extraordinary emergencies” the Pentagon is worried about include a range of environmental and related disasters.

In 2006, the US National Security Strategy warned that:

“Environmental destruction, whether caused by human behavior or cataclysmic mega-disasters such as floods, hurricanes, earthquakes, or tsunamis. Problems of this scope may overwhelm the capacity of local authorities to respond, and may even overtax national militaries, requiring a larger international response.”

Two years later, the Department of Defense’s (DoD) Army Modernisation Strategy described the arrival of a new “era of persistent conflict” due to competition for “depleting natural resources and overseas markets” fuelling “future resource wars over water, food and energy.” The report predicted a resurgence of:

“… anti-government and radical ideologies that potentially threaten government stability.”

In the same year, a report by the US Army’s Strategic Studies Institute warned that a series of domestic crises could provoke large-scale civil unrest. The path to “disruptive domestic shock” could include traditional threats such as deployment of WMDs, alongside “catastrophic natural and human disasters” or “pervasive public health emergencies” coinciding with “unforeseen economic collapse.” Such crises could lead to “loss of functioning political and legal order” leading to “purposeful domestic resistance or insurgency…

“DoD might be forced by circumstances to put its broad resources at the disposal of civil authorities to contain and reverse violent threats to domestic tranquility. Under the most extreme circumstances, this might include use of military force against hostile groups inside the United States. Further, DoD would be, by necessity, an essential enabling hub for the continuity of political authority in a multi-state or nationwide civil conflict or disturbance.”

That year, the Pentagon had begun developing a 20,000-strong troop force that would be on hand to respond to “domestic catastrophes” and civil unrest – the programme was reportedly based on a 2005 homeland security strategy which emphasised “preparing for multiple, simultaneous mass casualty incidents.”

The following year, a US Army-funded RAND Corp study called for a US force presence specifically to deal with civil unrest.

Such fears were further solidified in a detailed 2010 study by the US Joint Forces Command – designed to inform “joint concept development and experimentation throughout the Department of Defense” – setting out the US military’s definitive vision for future trends and potential global threats. Climate change, the study said, would lead to increased risk of:

“… tsunamis, typhoons, hurricanes, tornadoes, earthquakes and other natural catastrophes… Furthermore, if such a catastrophe occurs within the United States itself – particularly when the nation’s economy is in a fragile state or where US military bases or key civilian infrastructure are broadly affected – the damage to US security could be considerable.”

The study also warned of a possible shortfall in global oil output by 2015:

“A severe energy crunch is inevitable without a massive expansion of production and refining capacity. While it is difficult to predict precisely what economic, political, and strategic effects such a shortfall might produce, it surely would reduce the prospects for growth in both the developing and developed worlds. Such an economic slowdown would exacerbate other unresolved tensions.”

That year the DoD’s Quadrennial Defense Review seconded such concerns, while recognising that “climate change, energy security, and economic stability are inextricably linked.”

Also in 2010, the Pentagon ran war games to explore the implications of “large scale economic breakdown” in the US impacting on food supplies and other essential services, as well as how to maintain “domestic order amid civil unrest.”

Speaking about the group’s conclusions at giant US defence contractor Booz Allen Hamilton’s conference facility in Virginia, Lt Col. Mark Elfendahl – then chief of the Joint and Army Concepts Division – highlighted homeland operations as a way to legitimise the US military budget:

“An increased focus on domestic activities might be a way of justifying whatever Army force structure the country can still afford.”

Two months earlier, Elfendahl explained in a DoD roundtable that future planning was needed:

“Because technology is changing so rapidly, because there’s so much uncertainty in the world, both economically and politically, and because the threats are so adaptive and networked, because they live within the populations in many cases.”

The 2010 exercises were part of the US Army’s annual Unified Quest programme which more recently, based on expert input from across the Pentagon, has explored the prospect that “ecological disasters and a weak economy” (as the “recovery won’t take root until 2020”) will fuel migration to urban areas, ramping up social tensions in the US homeland as well as within and between “resource-starved nations.”

NSA whistleblower Edward Snowden was a computer systems administrator for Booz Allen Hamilton, where he directly handled the NSA’s IT systems, including the Prism surveillance system. According to Booz Allen’s 2011 Annual Report, the corporation has overseen Unified Quest “for more than a decade” to help “military and civilian leaders envision the future.”

The latest war games, the report reveals, focused on “detailed, realistic scenarios with hypothetical ‘roads to crisis'”, including “homeland operations” resulting from “a high-magnitude natural disaster” among other scenarios, in the context of:

“… converging global trends [which] may change the current security landscape and future operating environment… At the end of the two-day event, senior leaders were better prepared to understand new required capabilities and force design requirements to make homeland operations more effective.”

It is therefore not surprising that the increasing privatisation of intelligence has coincided with the proliferation of domestic surveillance operations against political activists, particularly those linked to environmental and social justice protest groups.

Department of Homeland Security documents released in April prove a “systematic effort” by the agency “to surveil and disrupt peaceful demonstrations” linked to Occupy Wall Street, according to the Partnership for Civil Justice Fund (PCJF).

Similarly, FBI documents confirmed “a strategic partnership between the FBI, the Department of Homeland Security and the private sector” designed to produce intelligence on behalf of “the corporate security community.” A PCJF spokesperson remarked that the documents show “federal agencies functioning as a de facto intelligence arm of Wall Street and Corporate America.”

In particular, domestic surveillance has systematically targeted peaceful environment activists including anti-fracking activists across the US, such as the Gas Drilling Awareness Coalition, Rising Tide North America, the People’s Oil & Gas Collaborative, and Greenpeace. Similar trends are at play in the UK, where the case of undercover policeman Mark Kennedy revealed the extent of the state’s involvement in monitoring the environmental direct action movement.

A University of Bath study citing the Kennedy case, and based on confidential sources, found that a whole range of corporations – such as McDonald’s, Nestle and the oil major Shell – “use covert methods to gather intelligence on activist groups, counter criticism of their strategies and practices, and evade accountability.”

Indeed, Kennedy’s case was just the tip of the iceberg – internal police documents obtained by the Guardian in 2009 revealed that environment activists had been routinely categorised as “domestic extremists” targeting “national infrastructure” as part of a wider strategy tracking protest groups and protestors.

Superintendent Steve Pearl, then head of the National Extremism Tactical Coordination Unit (Nectu), confirmed at that time how his unit worked with thousands of companies in the private sector. Nectu, according to Pearl, was set up by the Home Office because it was “getting really pressured by big business – pharmaceuticals in particular, and the banks.” He added that environmental protestors were being brought “more on the radar.” The programme continues today, despite police acknowledgements that environmentalists have not been involved in “violent acts.”

The Pentagon knows that environmental, economic and other crises could provoke widespread public anger toward government and corporations in coming years. The revelations on the NSA’s global surveillance programmes are just the latest indication that as business as usual creates instability at home and abroad, and as disillusionment with the status quo escalates, Western publics are being increasingly viewed as potential enemies that must be policed by the state.

Dr Nafeez Ahmed is executive director of the Institute for Policy Research & Development and author of A User’s Guide to the Crisis of Civilisation: And How to Save It among other books. Follow him on Twitter @nafeezahmed


Put the Environment At The Center Of The Global Economy: An Argument For The Eco-Currency

In Uncategorized on May 21, 2013 at 4:50 pm

Oldspeak: “The solution to both the problem of currency and of climate change is obvious: we must hardwire the health of the ecosystem directly to the standard measurements of economic health so that the state of the environment is immediately visible in all economic transactions. Global finance, trade and investment must all be conducted within a system that makes the preservation of the climate, rather than profit, the highest priority. One possible approach is the introduction of a global “eco-currency.” The international community would establish an international currency, an “eco-currency,” whose value is linked directly to the state of the climate (both globally and locally) and that currency would serve either as a universal currency within which international transactions take place, or it could be a factor that significantly impacts all the global currencies.” –Emanuel Pastreich. YES! Brilliant! Tie our monetary measures of health to the health of our planet! Discard extractive, profit-driven, imaginary, computer-generated “market-based” economic systems and replace them with naturally regenerative, resource- and ecosystem-based economic systems. An ecosystem-based society… Focusing on carbon emissions alone directs attention to a single aspect of the ecosystem while our bodies, oceans, streams, lands, fellow lifeforms and food are poisoned. This system requires consideration of the ecosystem as a whole. Species extinction would have to be accounted for and avoided… Waste would have to be minimally toxic or non-toxic, biodegradable and recyclable. Extraction would be highly regulated, done in a manner that would require replenishment or conservation. All kinds of wonderful side-effects would arise: reduced pollution, healthier food, cleaner water, reduced poverty, reduced inequality, greater bio-diversity, etc., etc., etc. – the possibilities are endless! Barefoot Economics par excellence!

By Emanuel Pastreich @ Truthout:

The environmental challenges we face today, from spreading deserts to rising oceans, compel us to reconsider the conventional concepts of growth and recognize that they cannot easily be reconciled with the dangerous implications of runaway consumption and unlimited development.

Above all, we must get away from a speculative economy born of an irrational dependence on finance, which has become increasingly unstable as digital technology accelerates and financial transactions take place without any objective review. We must return to a stable and long-term economy. In part, that process concerns the restoration of regulation of the banking system, but the change must also involve the very conception of finance and banking. Finance must be aimed at stable, long-term projects which have relevance for ordinary people.

Nothing could possibly be more helpful in this process than large-scale projects to restore the environment and address the damage done to the climate by human activity. These projects are absolutely necessary for human survival and they will take decades, if not centuries, to complete. By grounding the economy in adaptation and mitigation, we can return to a predictable system in which green bonds funding 30-50 year projects directly related to our well-being are dominant and we can escape from the flighty digital economy of thousands-per-second transactions.

In addition to the development of a “green bonds” system for funding long-term meaningful projects to address the climate crisis, we should also consider the role of currency itself. We are engaged in a dangerous race to devalue currency around the world in the expectation of increasing advantage in trade. This activity is profoundly destabilizing for our economy and at a higher level also causes chaos in the process by which we assign value in general.

The solution to both the problem of currency and of climate change is obvious: we must hardwire the health of the ecosystem directly to the standard measurements of economic health so that the state of the environment is immediately visible in all economic transactions. Global finance, trade and investment must all be conducted within a system that makes the preservation of the climate, rather than profit, the highest priority.

One possible approach is the introduction of a global “eco-currency.” The international community would establish an international currency, an “eco-currency,” whose value is linked directly to the state of the climate (both globally and locally) and that currency would serve either as a universal currency within which international transactions take place, or it could be a factor that significantly impacts all the global currencies.

Such an eco-currency would require a calculation of the state of the environment on which its value would be based. First, we need to come up with a system for evaluating the state of the environment in real time that could be converted into a set of figures for the calculation of the total state of the global and local ecosystems. That set of figures would then be the basis for the eco-currency’s value. Such a system would be complex and far from perfect, but it would be a massive improvement over the current factors employed in calculating gross domestic product, which are limited to consumption and production and exclude the state of the environment entirely.

There exist indices, such as Yale’s Environmental Performance Index, that do part of that work, but so far there is no agreed-upon comprehensive standard for evaluating the total environment that could be used to measure its state periodically in a manner that could be employed as a universal reference. Only then could the amount, or the value, of the eco-currency possessed by a nation reflect an objective evaluation of the health of the climate.
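To make the idea concrete, here is a minimal sketch, in Python, of how such a composite environmental index might be computed from normalized indicators. Every indicator name, value and weight below is a hypothetical illustration for this sketch, not a figure from the article or from Yale’s EPI:

# Hypothetical normalized indicators for one region, each scaled 0.0-1.0,
# where 1.0 represents a healthy baseline state.
indicators = {
    "air_quality": 0.72,
    "freshwater_reserves": 0.55,
    "soil_health": 0.61,
    "biodiversity": 0.48,
    "carbon_balance": 0.33,
}

# Hypothetical weights reflecting each indicator's assumed importance;
# they sum to 1.0 so the composite index stays on the 0-1 scale.
weights = {
    "air_quality": 0.15,
    "freshwater_reserves": 0.25,
    "soil_health": 0.20,
    "biodiversity": 0.20,
    "carbon_balance": 0.20,
}

def eco_index(indicators, weights):
    """Weighted average of normalized environmental indicators."""
    return sum(indicators[k] * weights[k] for k in indicators)

print(f"Composite eco-index: {eco_index(indicators, weights):.2f}")  # prints 0.53 here

The hard part, as the article suggests, is not the arithmetic but agreeing on which indicators to include, how to normalize them and how to weight them.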

If the eco-currency were to serve as one of several factors impacting all global currencies, it might serve as an instrument akin to the SDR (special drawing rights) system currently employed by the International Monetary Fund. According to the IMF website, member [nations] with sufficiently strong external positions are designated by the Fund to buy SDRs with freely usable currencies up to certain amounts from members with weak external positions. In the case of the eco-currency, that strong external position could be redefined so as to consist primarily, or entirely, of environmental criteria.

The eco-currency could alternatively serve as a gold standard for all nations of the world, permitting each nation to increase its money supply in direct proportion to the environmental credits it has accumulated through wise and effective policies that reduce emissions and preserve water and soil.

After all, the previous gold standard was based on a mineral that was exceptionally rare and valuable, so it is a logical extension of that concept to argue that a healthy ecosystem is the most valuable commodity available. The ecosystem is far more valuable to human society than is gold. Each nation would continue to have sovereignty with regard to its own currency, but the calculation of each currency’s exchange rate would take into account the environmental status of the country and its share of a calculated total of environmental credits for the entire world.

Whether it serves as a universal currency, or as a factor impacting all hard currencies, the total amount of eco-currency available would be calculated as equal to a global sum of the total value of the ecosystem. Those credits would be assigned to a country based on an evaluation of how good a job that nation does reducing harmful emissions, preserving undeveloped lands, caring for its water supplies and otherwise implementing policies that have a positive effect on the environment.
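The proportional allocation described here is straightforward to sketch. In the following illustration the nation names, credit figures and the size of the global pool are all invented for the example; only the proportional rule comes from the article:

# Hypothetical size of the global eco-currency pool.
GLOBAL_ECO_POOL = 1_000_000.0

# Hypothetical environmental credits earned through emissions cuts,
# land preservation, water stewardship and similar policies.
credits = {"Nation A": 420.0, "Nation B": 310.0, "Nation C": 270.0}

# Each nation's allocation is its share of the world's total credits.
total_credits = sum(credits.values())
allocation = {
    nation: GLOBAL_ECO_POOL * c / total_credits
    for nation, c in credits.items()
}

for nation, amount in allocation.items():
    print(f"{nation}: {amount:,.0f} eco-units")
# Nation A: 420,000; Nation B: 310,000; Nation C: 270,000

Under such a rule, a nation that lets its environment degrade would see its share of the pool, and hence its monetary headroom, shrink automatically.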

The serious problems faced by the European emissions trading system suggest that we need to move bravely to a new approach to putting the environment itself, and not merely carbon emissions, at the center of our calculations of the economy. An international currency program based on environmental credits as part of a total biosphere would make the environmental crisis visible in the financial world. The eco-currency could be the first step towards forcing those making fiscal and developmental policy at the national and international level to engage in a serious debate on the implications of their policies for the climate. No longer would it be possible to think separately about monetary policy and environmental policy; the two would be effectively yoked together.


Survey Finds 97% Of Climate Science Literature Agrees Human Activity Is Driving Global Warming

In Uncategorized on May 21, 2013 at 2:53 pm

Porters carry cores of ancient glacial ice down from the 6542m summit of Mt Sajama in Bolivia. 97% of scientific papers taking a position on climate change say it is man-made. Photograph: George Steinmetz/Corbis

Oldspeak: “In the wake of the most recent tragic and devastating American natural disaster, my heart goes out to the victims. As predicted for decades, natural disasters are becoming more frequent, more intense and less predictable. This is yet ANOTHER sign of what’s to come: while we continue to ignore the devastating impact our species is having on our environment, our environment is having a devastating impact on us. It’s simple physics. Every action has an equal and opposite reaction. “32.4 million people were forced to flee their homes last year due to natural disasters such as floods, storms and earthquakes. Climate change is believed to play an increasingly significant role in global disasters. …According to the 2012 Special Report from the Intergovernmental Panel on Climate Change, 98% of those uprooted were displaced by climate- and weather-related events.” Untold habitats, and the life they support, are being wiped out by human activity. In tandem, human habitats and the life they support are being wiped out as well. Yet still, in the face of all this devastation, locally and globally, our response has been as it usually is. Reactive. Tepid. Nibbling around the edges. Continuing to dump vital resources into the extractive-energy systems that are causing the problems, while failing to ramp up and largely ignoring the regenerative and sustainable energy systems which could help mitigate them… The research is clear. Man is causing these calamities. Yet there is little questioning or debating of the efficacy of the systems in place that are accelerating global ecological destruction. Our leaders are barely speaking about climate change and are taking inconsequential actions to counter it. Obama will soon be giving a major speech about the U.S. drone assassination program. While important, it is inconsequential when compared to the health of our planet. He has promised to “respond to the threat of climate change, knowing that the failure to do so would betray our children and future generations. Some may still deny the overwhelming judgment of science, but none can avoid the devastating impact of raging fires, and crippling drought, and more powerful storms. The path towards sustainable energy sources will be long and sometimes difficult. But America cannot resist this transition; we must lead it.” Continuing to subsidize extractive energy sources like oil, gas and radioactive fuels is not leading. It is not a logical response. Some argue he has already given up on dealing with climate change, with no definitive climate-related actions outlined and a 3.5% cut to the Environmental Protection Agency in his latest budget. A leading, logical response would be a major policy speech to herald America’s transition from dirty energy to clean energy on a national scale. Expending the same effort that was expended in response to World War 2, because, make no mistake, this is an actual war to save our world. We need to start acting like it. Changing whole industries to support the war effort. Requiring all polluters to drastically reduce their toxic emissions and waste. Banning petrochemical-based products. Localizing food and energy production. Recycling EVERYTHING. Converting all gasoline-powered car plants to produce clean-energy-powered vehicles. Using all available idle, underutilized and obsolete energy-producing factories to produce solar panels, wind turbines, geothermal power plants and other regenerative energy systems. Putting solar panels on top of every building in the country. Embedding them in every paved road. Retrofitting all extractive-energy-using systems for regenerative energy use… Etc., etc., etc. Greening our infrastructure. Transformative and radically different policy is what we need. Not nibbling. The time for grand action is now.”

By John Abraham & Dana Nuccitelli @ The U.K. Guardian:

Our team of citizen science volunteers at Skeptical Science has published a new survey in the journal Environmental Research Letters of over 12,000 peer-reviewed climate science papers, as the Guardian reports today. This is the most comprehensive survey of its kind, and the inspiration of this blog’s name: Climate Consensus – the 97%.

The survey

In 2004, Naomi Oreskes performed a survey of 928 peer-reviewed climate papers published between 1993 and 2003, finding none that rejected the human cause of global warming. We decided that it was time to expand upon Oreskes’ work by performing a keyword search of peer-reviewed scientific journal publications for the terms ‘global warming’ and ‘global climate change’ between the years 1991 and 2011.

Our team agreed on definitions of the categories into which to place the papers – explicit or implicit endorsement of human-caused global warming, no opinion, and implicit or explicit rejection or minimization of the human influence – and began the long process of rating over 12,000 abstracts.

We decided from the start to take a conservative approach in our ratings. For example, a study which takes it for granted that global warming will continue for the foreseeable future could easily be put into the implicit endorsement category; there is no reason to expect global warming to continue indefinitely unless humans are causing it. However, unless an abstract included language about the cause of the warming, we categorized it as ‘no opinion’.

Each paper was rated by at least two people, and a dozen volunteers completed most of the 24,000 ratings. The volunteers were a very internationally diverse group. Team members’ home countries included Australia, USA, Canada, UK, New Zealand, Germany, Finland, and Italy.

We also decided that asking the scientists to rate their own papers would be the ideal way to check our results. Who knows what the papers say better than the authors who wrote them? We received responses from 1,200 scientists who rated a total of over 2,100 papers. Unlike our team’s ratings that only considered the summary of each paper presented in the abstract, the scientists considered the entire paper in the self-ratings.

The results

Based on our abstract ratings, we found that just over 4,000 papers took a position on the cause of global warming, 97.1% of which endorsed human-caused global warming. In the scientist self-ratings, nearly 1,400 papers were rated as taking a position, 97.2% of which endorsed human-caused global warming. Many papers captured in our literature search simply investigated an issue related to climate change without taking a position on its cause.
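The consensus arithmetic itself is simple. As a check on the headline figure, here is a short Python sketch using the abstract-rating counts reported in the published survey (Cook et al., 2013, Environmental Research Letters); treat the exact counts as belonging to the paper, since the point here is only the calculation:

# Abstract-rating counts as reported in the published survey.
total_abstracts = 11_944        # all abstracts found in the keyword search
endorse = 3_896                 # rated as endorsing human-caused warming
reject_or_minimize = 78         # rated as rejecting or minimizing it
uncertain = 40                  # rated as explicitly uncertain on the cause

# Papers "taking a position" are those not rated 'no opinion'.
position_takers = endorse + reject_or_minimize + uncertain
consensus_pct = 100 * endorse / position_takers

print(f"{position_takers} papers took a position")    # just over 4,000
print(f"{consensus_pct:.1f}% endorsed the consensus")  # 97.1%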

Our survey found that the consensus has grown slowly over time, and reached about 98% as of 2011. Our results are also consistent with several previous surveys finding a 97% consensus amongst climate experts on the human cause of global warming.

Consensus growth over time: the growth of the scientific consensus on human-caused global warming in the peer-reviewed literature from 1991 to 2011.

Why is this important?

Several studies have shown that people who are aware of scientific consensus on human-caused global warming are more likely to support government action to curb greenhouse gas emissions. This was most recently shown by a paper just published in the journal Climatic Change. People will generally defer to the judgment of experts, and they trust climate scientists on the subject of global warming.

However, vested interests have long realized this and engaged in a campaign to misinform the public about the scientific consensus. For example, a memo from communications strategist Frank Luntz leaked in 2002 advised Republicans,

“Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly. Therefore, you need to continue to make the lack of scientific certainty a primary issue in the debate.”

This campaign has been successful. A 2012 poll from US Pew Research Center found less than half of Americans thought scientists agreed humans were causing global warming. The media has assisted in this public misconception, with most climate stories “balanced” with a “skeptic” perspective. However, this results in making the 2–3% seem like 50%. In trying to achieve “balance”, the media has actually created a very unbalanced perception of reality. As a result, people believe scientists are still split about what’s causing global warming, and therefore there is not nearly enough public support or motivation to solve the problem.

Check our results for yourself

We chose to submit our paper to Environmental Research Letters because it is a well-respected, high-impact journal, but also because it offers the option of making a paper open access, free for anyone to download.

We have also set up a public ratings system at Skeptical Science where anybody can duplicate our survey. Read and rate as many abstracts as you like, and see what level of consensus you find. You can compare your results to our abstract ratings, and to the author self-ratings.

Human-caused global warming

We fully anticipate that climate contrarians will respond by saying “we don’t dispute that humans cause some global warming.” First, there are a lot of people who do dispute that humans cause any global warming. Our paper shows that their position is not supported in the scientific literature.

Most papers don’t quantify the human contribution to global warming, because it doesn’t take tens of thousands of papers to establish that reality. However, as noted above, if a paper minimized the human contribution, we classified that as a ‘rejection’. For example, if a paper were to say “the sun caused most of the global warming over the past century,” that would be included in the less than 3% of papers rejecting or minimizing human-caused global warming.

Many studies simply defer to the expert summary of climate science research put together by the Intergovernmental Panel on Climate Change (IPCC), which says that most of the global warming since the mid-20th century has been caused by humans. And according to recent research, that statement is actually too conservative. Of the papers which specifically examine the contributors to global warming, they virtually all conclude that humans are the dominant cause over the past 50 to 100 years.

Summary of results from eight studies of the causes of global warming.

Most studies simply accept this fact and go on to examine the consequences of this human-caused global warming and associated climate change.

Another important point is that once you accept that humans are causing global warming, you must also accept that global warming is still happening. We cause global warming by increasing the greenhouse effect, and our greenhouse gas emissions just keep accelerating. This ties in to the fact that, as recent research has shown, global warming is accelerating. If you accept that humans are causing global warming, as over 97% of peer-reviewed scientific papers do, then this conclusion should not be at all controversial. Global warming cannot have suddenly stopped.

Spread the word

Given the importance of the scientific consensus on human-caused global warming in people’s decisions about whether to support action to reduce greenhouse gas emissions, and the public’s lack of awareness of the consensus, we need to make people aware of these results. To that end, the design and advertising firm SJI Associates generously created a website pro bono, centered on the results of our survey. The website can be viewed at TheConsensusProject.com, and it includes a page where consensus graphics can be shared via social media or email. Skeptical Science also has a new page of consensus graphics.

Quite possibly the most important thing to communicate about climate change is that there is a 97% consensus amongst the scientific experts and scientific research that humans are causing global warming. Let’s spread the word and close the consensus gap.

Global Carbon Dioxide Levels In Atmosphere Identical To Those Last Seen In Prehistoric Pliocene Era

In Uncategorized on May 12, 2013 at 7:05 pm

MAUNA LOA OBSERVATORY

Oldspeak: “We are creating a prehistoric climate in which human societies will face huge and potentially catastrophic risks, only by urgently reducing global emissions will we be able to avoid the full consequences of turning back the climate clock by 3 million years.” –Bob Ward, policy director at the Grantham Research Institute on Climate Change at the London School of Economics. Yep. It’s that serious. Huge and catastrophic risks to all life on this planet are upon us. Meanwhile the U.S.’s selected officials are holding congressional inquiries into what happened in Benghazi, Libya, last year. Creating an “immigration reform” bill with mandates for national biometric identification databases, containing information about all adult Americans, buried in it. Inflating bubbles to create imaginary economic growth with computer-generated fiat money. Why are we paying so much attention to what happened in the past and what is yet to happen in the future, while ignoring the clear and present dangers? Why are so many resources being devoted to manufactured scandals and crises, social control plans and billionaires stealing fake money, while infinitely fewer resources are devoted to the preeminent problem of our time? There are plans to extract dead fossil energy from the seas under the soon-to-be-completely-melted polar ice caps, which will generate more toxic emissions and quicken climate change. All plans to slow climate change are ‘market-based’ and profit-driven. We know how the corporatocracy is preparing for our dystopic future: private armies and gated communities, while cutting resources to the poor, sick and elderly. The mass of people and the planet are not a priority. The people need to demand immediate, coherent, decisive, sustainable and drastically different energy policy.”

By Damian Carrington @ The UK Guardian:

For the first time in human history, the concentration of climate-warming carbon dioxide in the atmosphere has passed the milestone level of 400 parts per million (ppm). The last time so much greenhouse gas was in the air was several million years ago, when the Arctic was ice-free, savannah spread across the Sahara desert and sea level was up to 40 metres higher than today.

These conditions are expected to return in time, with devastating consequences for civilisation, unless emissions of CO2 from the burning of coal, gas and oil are rapidly curtailed. But despite increasingly severe warnings from scientists and a major economic recession, global emissions have continued to soar unchecked.

“It is symbolic, a point to pause and think about where we have been and where we are going,” said Professor Ralph Keeling, who oversees the measurements on a Hawaiian volcano, which were begun by his father in 1958. “It’s like turning 50: it’s a wake-up to what has been building up in front of us all along.”

“The passing of this milestone is a significant reminder of the rapid rate at which – and the extent to which – we have increased the concentration of greenhouse gases in the atmosphere,” said Prof Rajendra Pachauri, chair of the Intergovernmental Panel on Climate Change, which serves as science adviser to the world’s governments. “At the beginning of industrialisation the concentration of CO2 was just 280ppm. We must hope that the world crossing this milestone will bring about awareness of the scientific reality of climate change and how human society should deal with the challenge.”

The world’s governments have agreed to keep the rise in global average temperature, which has already increased by over 1C, to 2C, the level beyond which catastrophic warming is thought to become unstoppable. But the International Energy Agency warned in 2012 that on current emissions trends the world will see 6C of warming, a level scientists warn would lead to chaos. With no slowing of emissions seen to date, there is already mounting pressure on the UN summit in Paris in 2015, which is the deadline set for settling a binding international treaty to curb emissions.

Edward Davey, the UK’s energy and climate change secretary, said: “This isn’t just a symbolic milestone, it’s yet another piece of clear scientific evidence of the effect human activity is having on our planet. I’ve made clear I will not let up on efforts to secure the legally binding deal the world needs by 2015 to avoid the worst effects of climate change.”

Two CO2 monitoring stations high on the Hawaiian volcano of Mauna Loa are run by the US National Oceanic and Atmospheric Administration and the Scripps Institution of Oceanography and provide the global benchmark measurement. Data released on Friday shows the daily average has passed 400ppm for the first time in its half century of recording. The level peaks in May each year as the CO2 released by decaying vegetation is taken up by renewed plant growth in the northern hemisphere, where the bulk of plants grow.
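This seasonal rhythm is why the first crossings of a round number happen at the spring maximum: the daily readings ride an annual cycle on top of the long-term growth. The following toy sketch in Python is not the NOAA or Scripps methodology; the trend slope and seasonal amplitude are rough assumed values, included only to show the shape of the curve:

import math

def co2_toy_estimate(year):
    """Toy Keeling-curve model: linear trend plus a seasonal cycle.

    316 ppm is roughly the 1958 starting level; the 1.5 ppm/year slope
    and 3 ppm seasonal amplitude are rough assumptions, not fitted data.
    """
    trend = 316.0 + 1.5 * (year - 1958.0)
    # Cosine peaking around May (month 5 of 12) each year.
    seasonal = 3.0 * math.cos(2 * math.pi * ((year % 1.0) - 5.0 / 12.0))
    return trend + seasonal

# May 2013 sits at the seasonal maximum, lifting the toy curve past the
# long-term trend line, much as the real daily readings first topped
# 400ppm at the May peak.
print(f"{co2_toy_estimate(2013 + 5 / 12):.0f} ppm")  # ~402 ppm at the toy May peak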

Analysis of fossil air trapped in ancient ice and other data indicate that this level has not been seen on Earth for 3-5 million years, a period called the Pliocene. At that time, global average temperatures were 3 or 4C higher than today’s and 8C warmer at the poles. Reef corals suffered a major extinction while forests grew up to the northern edge of the Arctic Ocean, a region which is today bare tundra.

“I think it is likely that all these ecosystem changes could recur,” said Richard Norris, a colleague of Keeling’s at Scripps. The Earth’s climate system takes time to adjust to the increased heat being trapped by high greenhouse gas levels, and it may take hundreds of years for the great ice caps in Antarctica and Greenland to melt to the small size of the Pliocene and for the sea level to rise far above many of the world’s major cities.

But the extreme speed at which CO2 is now rising – perhaps 75 times faster than in pre-industrial time – has never been seen in geological records, and some effects of climate change are already being seen, with extreme heatwaves and flooding now more likely. Recent wet and cold summer weather in Europe has been linked to changes in the high-level jetstream winds, in turn linked to the rapidly melting sea ice in the Arctic, which shrank to its lowest recorded level in September.

“We are creating a prehistoric climate in which human societies will face huge and potentially catastrophic risks,” said Bob Ward, policy director at the Grantham Research Institute on Climate Change at the London School of Economics. “Only by urgently reducing global emissions will we be able to avoid the full consequences of turning back the climate clock by 3 million years.”

“The 400ppm threshold is a sobering milestone and should serve as a wake up call for all of us to support clean energy technology and reduce emissions of greenhouse gases, before it’s too late for our children and grandchildren,” said Tim Lueker, a carbon cycle scientist at Scripps.

Professor Bob Watson, former IPCC chair and UK government chief scientific adviser, said: “Passing 400ppm of carbon dioxide in the atmosphere is indeed a landmark and the rate of increase is faster than ever and shows no sign of abating due to a lack of political commitment to address the urgent issue of climate change – the world is now most likely committed to an increase in surface temperature of 3C-5C compared to pre-industrial times.”

The graph of the rising CO2 at Mauna Loa is known as the Keeling curve, after the late Dave Keeling, the scientist who began the measurements in March 1958. The isolated Hawaiian island is a good location for measurements as it is far from the main sources of CO2, meaning it represents a good global average.