"In a time of universal deceit telling the truth is a revolutionary act." -George Orwell

Posts Tagged ‘Environmental Pollution’

Scientific Analysis Finds Rise In Rapid, Catastrophic Animal Die-Offs Over The Past 70 Years

In Uncategorized on January 22, 2015 at 5:45 pm

Oldspeak: “Hmm. Curious. The oceans are dying. Scientists have found greater proportions of these die-offs among birds, fish, and invertebrates, which just happen to spend most of their time in the oceans. This data also correlates closely with what many scientists call “The Great Acceleration”: the period from around 1950 to the present, characterized by relentless and ever-increasing levels of human consumption, population and GDP growth. Coincidence? Probably not. All is intimately connected.” -OSJ

By Sarah Yang @ Science Daily:

An analysis of 727 mass die-offs of nearly 2,500 animal species from the past 70 years has found that such events are increasing among birds, fish and marine invertebrates. At the same time, the number of individuals killed appears to be decreasing for reptiles and amphibians, and unchanged for mammals.

Such mass mortality events occur when a large percentage of a population dies in a short time frame. While the die-offs are rare and fall short of extinction, they can pack a devastating punch, potentially killing more than 90 percent of a population in one shot. However, until this study, there had been no quantitative analysis of the patterns of mass mortality events among animals, the study authors noted.

“This is the first attempt to quantify patterns in the frequency, magnitude and cause of such mass kill events,” said study senior author Stephanie Carlson, an associate professor at the University of California, Berkeley’s Department of Environmental Science, Policy and Management.

The study, published Monday, Jan. 12 in the Proceedings of the National Academy of Sciences, was led by researchers at UC Berkeley, the University of San Diego and Yale University.

The researchers reviewed incidents of mass kills documented in scientific literature. Although they came across some sporadic studies dating back to the 1800s, the analysis focused on the period from 1940 to the present. The researchers acknowledged that some of their findings may be due to an increase in the reporting of mass die-offs in recent decades. But they noted that even after accounting for some of this reporting bias, there was still an increase in mass die-offs for certain animals.

Overall, disease was the primary culprit, accounting for 26 percent of the mass die-offs. Direct effects tied to humans, such as environmental contamination, caused 19 percent of the mass kills. Biotoxicity triggered by events such as algae blooms accounted for a significant proportion of deaths, and processes directly influenced by climate — including weather extremes, thermal stress, oxygen stress or starvation — collectively contributed to about 25 percent of mass mortality events.

The most severe events were those with multiple causes, the study found.

Carlson, a fish ecologist, and her UC Berkeley graduate students had observed such die-offs in their studies of fish in California streams and estuaries, originally piquing their interest in the topic.

“The catastrophic nature of sudden, mass die-offs of animal populations inherently captures human attention,” said Carlson. “In our studies, we have come across mass kills of federal fish species during the summer drought season as small streams dry up. The majority of studies we reviewed were of fish. When oxygen levels are depressed in the water column, the impact can affect a variety of species.”

The study found that the number of mass mortality events has been increasing by about one event per year over the 70 years the study covered.

“While this might not seem like much, one additional mass mortality event per year over 70 years translates into a considerable increase in the number of these events being reported each year,” said study co-lead author Adam Siepielski, an assistant professor of biology at the University of San Diego. “Going from one event to 70 each year is a substantial increase, especially given the increased magnitudes of mass mortality events for some of these organisms.”

This study suggests that in addition to monitoring physical changes such as changes in temperature and precipitation patterns, it is important to document the biological response to regional and global environmental change. The researchers highlighted ways to improve documentation of such events in the future, including the possible use of citizen science to record mass mortality events in real time.

“The initial patterns are a bit surprising, in terms of the documented changes to frequencies of occurrences, magnitudes of each event and the causes of mass mortality,” said study co-lead author Samuel Fey, a postdoctoral fellow in ecology and evolutionary biology at Yale. “Yet these data show that we have a lot of room to improve how we document and study these types of rare events.”

Funding from the Environmental Protection Agency and the National Science Foundation helped support this research.

 

Why Good News For The Ozone Layer Is Bad News For The Climate

In Uncategorized on September 29, 2014 at 12:19 am

Oldspeak: “The “good news” arrived via the Associated Press on September 11: Thanks to the Montreal Protocol, atmospheric ozone is recovering. Scientists have been monitoring atmospheric ozone since 1989, the year the Montreal Protocol on Substances that Deplete the Ozone Layer (a protocol to the Vienna Convention for the Protection of the Ozone Layer) came into effect (it was negotiated in 1987). The scientists released their latest assessment on September 10, the subject of the Associated Press report….According to NASA scientist Paul A. Newman, ozone levels climbed 4 percent in mid-northern latitudes at about 30 miles up from 2000 to 2013… The very slight thickening of the ozone layer is, as claimed, due to the phase-out of CFCs and other bad ozone actors. But it’s also due to the increased concentration of carbon dioxide and other heat-trapping gases in the atmosphere. Greenhouse gases cool the upper stratosphere. As that region of the heavens cools, ozone is rebuilt. The good ozone news is thus bad climate news….Among the most powerful greenhouse gases are HFCs, the non-ozone-destroying substitute for CFCs. Some HFCs have a global warming potential (GWP) 10,000 times that of carbon dioxide (the most commonly used, R-134a, has a GWP of 1430). The growth in their use is clear… without global action, HFC use is expected to increase significantly over the next three or four decades with dire consequences for the climate…Pretending that minuscule improvement in atmospheric ozone levels is cause for celebration is not that big of a deal. The more serious problem is continuing to suggest that the Montreal Protocol is a model for international action on climate change. Dealing with CFCs and their problematic substitutes was, and is, infinitely easier than confronting climate chaos. Banning gases with especially high global warming potentials (GWPs) is necessary, but nowhere near sufficient. Carbon emissions are the lifeblood of the global economy, of affluent lifestyles lived by the few but aspired to by the many. A vigorous climate convention requires far-reaching shifts in virtually every corner of daily life in the developed world.” -Stephen Breyman

“This is what it’s come to in our sad state of affairs. Manufacturing a “victory” and “one of the great success stories of international collective action in addressing a global environmental change phenomenon” out of something that actually signifies defeat and failure in addressing the global environmental change phenomenon. The reality is that the chemicals used to replace the chemicals found to deplete the ozone layer are thousands of times more potent and harmful than carbon dioxide, the greenhouse gas most of our attention is focused on. The use of these chemicals is expected to increase significantly over the next 3 to 4 decades. How can this reality be couched as good news? Only in a reality where words, artfully and duplicitously woven together, mean their complete opposite. An Orwellian world, where “War is Peace”, “Freedom is Slavery”, and “Ignorance is Strength”. No matter how we choose to perceive reality, Earth’s 6th mass extinction keeps rolling along.” –OSJ

 

By Stephen Breyman @ Truthout:

We live in a world hungry for good environmental news. But that’s no excuse for journalistic or scientific spin passing as an unvarnished victory for the environment, nor for exaggeration of the value of a narrowly focused environmental treaty as a model for a universal agreement.

The “good news” arrived via the Associated Press on September 11: Thanks to the Montreal Protocol, atmospheric ozone is recovering. Scientists have been monitoring atmospheric ozone since 1989, the year the Montreal Protocol on Substances that Deplete the Ozone Layer (a protocol to the Vienna Convention for the Protection of the Ozone Layer) came into effect (it was negotiated in 1987). The scientists released their latest assessment on September 10, the subject of the Associated Press report.

Some background is in order. The Montreal Protocol is important on its own merits. A world of thinning atmospheric ozone is a world of increased skin cancer, eye problems and reduced agricultural yields and phytoplankton production. Every member state of the United Nations ratified the Protocol. But it is as a model for climate change negotiations and agreement that it takes on greater importance. The successful negotiation of the Montreal Protocol required agreement among policymakers, scientists and corporations, as will the replacement for the Kyoto Protocol.

The original Montreal Protocol achieved iconic status – Kofi Annan called it “perhaps the single most effective international agreement to date” – because it phased out production of five chlorofluorocarbons (CFCs) known to destroy atmospheric ozone. CFCs were most widely used as refrigerants, solvents, blowing agents and fire extinguishers, as are their substitutes today. There have been five effectiveness-improving amendments to the original Protocol.

The Protocol and its amendments were possible for five reasons:

First, given the phase-in of the phase-out (zero production and use of the five CFCs was not required until 1996), DuPont, the dominant firm in the business, had time to research and manufacture the economical and less destructive substitute hydrochlorofluorocarbons (HCFCs), and the nondestructive hydrofluorocarbons (HFCs), even though it had to be pushed hard to do so. Lacking a chlorine atom, HFCs do not attack the ozone layer. HFCs and HCFCs are also less persistent in the atmosphere than CFCs: two to 40 years for the former, versus up to 150 years for the latter.

Second, CFCs were going off patent, so it was in DuPont’s interest to protect the multibillion-dollar market by developing HCFCs and HFCs.

Third, the science was clear on the Antarctic ozone hole, with but a handful of companies, led by DuPont, working to deny it.

Fourth, other ozone killers – several halons and some other CFCs – were not phased out until 2010.

Fifth, mandated phaseout of HCFCs does not begin until 2015, with zero production and consumption required by 2030.

The Montreal Protocol came to be because it posed a minor challenge to the profits of but a few firms, allowed time for new substitutes to come to market, and permitted use of less dangerous ozone-destroying chemicals, or those posing no threat at all.

Now back to the alleged good news report: According to NASA scientist Paul A. Newman, ozone levels climbed 4 percent in mid-northern latitudes at about 30 miles up from 2000 to 2013. (The tiny change for the better explains why it is hard to see much if any improvement between 1989 and 2010, or between 2006 and 2010, in the photos above.) The Associated Press does not tell us about ozone concentrations at other latitudes or other altitudes (except for 50 miles up, but no specific improvement figure is reported; this probably means the improvement was less than 4 percent elsewhere in the upper atmosphere).

The improvement is a “victory for diplomacy and for science, and for the fact that we were able to work together,” said Nobel Prize chemist Mario Molina, one of the scientists who first made the connection between certain chemicals and ozone depletion. Achim Steiner, executive director of the UN Environment Program, hailed the slight recovery of atmospheric ozone as “one of the great success stories of international collective action in addressing a global environmental change phenomenon.” Political scientist Paul Wapner said the latest findings were “good news in an often-dark landscape.”

The very slight thickening of the ozone layer is, as claimed, due to the phase-out of CFCs and other bad ozone actors. But it’s also due to the increased concentration of carbon dioxide and other heat-trapping gases in the atmosphere. Greenhouse gases cool the upper stratosphere. As that region of the heavens cools, ozone is rebuilt. The good ozone news is thus bad climate news.

Among the most powerful greenhouse gases are HFCs, the non-ozone-destroying substitute for CFCs. Some HFCs have a global warming potential (GWP) 10,000 times that of carbon dioxide (the most commonly used, R-134a, has a GWP of 1430). The growth in their use is clear in the graph below; without global action, HFC use is expected to increase significantly over the next three or four decades with dire consequences for the climate, according to MIT atmospheric scientist Susan Solomon. (Source: TEAP/EPA/UNEP)
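For readers who want to see how a GWP figure is actually used, the arithmetic is simply a multiplier: a release of a gas is converted to a carbon-dioxide-equivalent mass by multiplying by its GWP. Here is a minimal sketch of that bookkeeping in Python, using the R-134a value quoted above; the 0.5 kg leak is an invented example, not a figure from the article.

```python
# CO2-equivalent bookkeeping: mass of gas released multiplied by its 100-year GWP.
# The R-134a GWP comes from the article; the 0.5 kg leak is a made-up illustration.
GWP_100YR = {"CO2": 1, "R-134a": 1430}

def co2_equivalent_kg(gas: str, mass_kg: float) -> float:
    """Return the CO2-equivalent mass (kg) for a release of the given gas."""
    return mass_kg * GWP_100YR[gas]

# A hypothetical 0.5 kg leak from a car air conditioner charged with R-134a:
print(co2_equivalent_kg("R-134a", 0.5))  # 715.0 kg CO2e
```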

Ready for more double-edged good news? The Obama administration appears intent on phasing out HFCs (just in time for the UN gathering and Peoples Climate March in NYC), and a chemical that is nondestructive to ozone, with only four times the global warming potential of carbon dioxide – the hydrofluoroolefin HFO-1234YF, also known as 2,3,3,3-Tetrafluoropropene – is ready to go as the latest substitute for CFCs.

The plan (as under the Montreal Protocol) is to give giant producers (including DuPont and Honeywell, which own most of the patents) and massive users (including Coca Cola, Pepsi Cola, Target and Kroger’s) time to phase in HFO-1234YF. The European Union directive that automotive air conditioners use refrigerants with global warming potentials (GWPs) of 150 or lower led most European carmakers to begin shifting to HFO-1234YF in 2011 (a total ban on more powerful climate-changing chemicals comes in 2017). General Motors has been using HFO-1234YF in Chevys, Buicks, GMCs and Cadillacs since 2013. Chrysler reportedly plans to transition to HFO-1234YF as well.

Given the history of CFCs and their substitutes, at least some adverse effects from HFO-1234YF production and use, and some glitches in the transition, are likely. German automakers worry that HFO-1234YF is both too expensive and too flammable (they’re investigating the use of carbon dioxide). In case of fire following a collision, HFO-1234YF releases highly corrosive and toxic hydrogen fluoride gas. One report had it that Daimler Benz engineers witnessed combustion in two-thirds of simulated head-on crashes. Considering the requirement that auto repair shops retool their air conditioning service equipment to use HFO-1234YF, it’s likely they’ll stick with the HFC R-134a as long as possible. India is so far uninterested in moving toward replacing R-134a with HFO-1234YF (China is working with the United States to jointly reduce emissions of HFCs). Canada, Mexico and the United States intend to propose amendments to the Montreal Protocol to mandate the phase-out of HFC production.

Pretending that minuscule improvement in atmospheric ozone levels is cause for celebration is not that big of a deal. The more serious problem is continuing to suggest that the Montreal Protocol is a model for international action on climate change. Dealing with CFCs and their problematic substitutes was, and is, infinitely easier than confronting climate chaos. Banning gases with especially high global warming potentials (GWPs) is necessary, but nowhere near sufficient. Carbon emissions are the lifeblood of the global economy, of affluent lifestyles lived by the few but aspired to by the many. A vigorous climate convention requires far-reaching shifts in virtually every corner of daily life in the developed world.

Confronting ozone depletion permitted business as usual with but the smallest of tweaks that went unnoticed by most. Overcoming the ozone depletion denial industry was a trivial challenge compared to that posed by the forces arrayed to muddle climate science and stymie strong action.

Again: a climate change agreement that includes robust mitigation, a serious campaign to build resilience against a destabilized climate, and a foundation on the principle of climate justice requires genuine and widespread change.

Preventing catastrophic and irreversible climate change compels conversion of the complex systems of transportation, agriculture, generation of electricity, cooling and heating, waste management, manufacturing, technological innovation and more. It also requires transformation in developed countries’ sense of responsibility for past and future emissions. This is why we have yet to see one. Military budgets must be slashed and war machines stopped to free up the funds necessary for building clean green economies and to stop exacerbating the problem. How likely is that as the United States returns to Iraq for the third time in as many decades?

 

“Limits To Growth” Published In 1972 Proved Correct: New Research Indicates We’re Nearing Global Collapse

In Uncategorized on September 14, 2014 at 7:32 pm

Piles of crushed cars at a metal recycling site in Belfast, Northern Ireland.

Oldspeak: “If the present growth trends in world population, industrialisation, pollution, food production, and resource depletion continue unchanged, the limits to growth on this planet will be reached sometime within the next one hundred years. The most probable result will be a rather sudden and uncontrollable decline in both population and industrial capacity.” –“Limits To Growth”, 1972

Consider that statement in the context of the present reality:

Humanity’s annual demand on the natural world has exceeded what the Earth can renew in a year since the 1970s. This “ecological overshoot” has continued to grow over the years, reaching a 50 per cent deficit in 2008. This means that it takes 1.5 years for the Earth to regenerate the renewable resources that people use, and absorb the CO2 waste they produce, in that same year.  How can this be possible when there is only one Earth? Just as it is possible to withdraw money from a bank account faster than to wait for the interest this money generates, renewable resources can be harvested faster than they can be re-grown. But just like overdrawing from a bank account, eventually the resource will be depleted. At present, people are often able to shift their sourcing when this happens; however at current consumption rates, these sources will eventually run out of resources too – and some ecosystems will collapse even before the resource is completely gone. The consequences of excess greenhouse gases in the atmosphere are also already being seen, like climate change and ocean acidification. These place additional stresses on biodiversity and ecosystems. The decline in biocapacity per capita is primarily due to an increase in global population. More people have to share the Earth’s resources. The increase in the Earth’s productivity is not enough to compensate for the demands of this growing population.” –World Wildlife Foundation, 2014

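The “ecological overshoot” figure above is simply a ratio of humanity’s demand (ecological footprint) to what the planet can regenerate in a year (biocapacity). A rough sketch of that calculation follows, using approximate per-person figures in the neighborhood of what the WWF reports for 2008; the exact numbers here are assumptions for illustration, not the WWF’s accounts.

```python
# Ecological overshoot as a ratio of demand to regeneration.
# The per-person figures below are approximate illustrations, not the WWF's exact 2008 data.
footprint_gha_per_person = 2.7     # global hectares of demand per person (assumed)
biocapacity_gha_per_person = 1.8   # global hectares the Earth can regenerate per person (assumed)

earths_needed = footprint_gha_per_person / biocapacity_gha_per_person
print(f"Earths needed: {earths_needed:.1f}")            # ~1.5
print(f"Overshoot: {(earths_needed - 1) * 100:.0f}%")    # ~50% deficit
```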
“So ignore all the chatter about “climate action” and “environmental activism” and the hoopla about the upcoming “People’s Climate March” and UN Summit on Climate Change. It’s all meaningless and masturbatory theater. Our fate was sealed 40 years ago. We are running up against the physical limits of the ecology and have shown no ability or will to stop. Continued growth in population and resource consumption all but guarantees collapse of the ecology and by extension industrial civilization. We’re witnessing the early stages of global collapse right now. We’re seeing the decline in industrial outputs predicted in Limits To Growth to start in 2015, now. The mounting pollution bringing about agricultural and food production failures, in addition to cuts to health and education services, predicted to start in 2020, is happening now. Infinite growth is impossible on a finite planet. The Anthropocene epoch is nearing its end. Tick, tick, tick, tick, tick, tick….” -OSJ

By Graham Turner and Cathy Alexander @ The U.K. Guardian:

The 1972 book Limits to Growth, which predicted our civilisation would probably collapse some time this century, has been criticised as doomsday fantasy since it was published. Back in 2002, self-styled environmental expert Bjorn Lomborg consigned it to the “dustbin of history”.

It doesn’t belong there. Research from the University of Melbourne has found the book’s forecasts are accurate, 40 years on. If we continue to track in line with the book’s scenario, expect the early stages of global collapse to start appearing soon.

Limits to Growth was commissioned by a think tank called the Club of Rome. Researchers working out of the Massachusetts Institute of Technology, including husband-and-wife team Donella and Dennis Meadows, built a computer model to track the world’s economy and environment. Called World3, this computer model was cutting edge.

The task was very ambitious. The team tracked industrialisation, population, food, use of resources, and pollution. They modelled data up to 1970, then developed a range of scenarios out to 2100, depending on whether humanity took serious action on environmental and resource issues. If that didn’t happen, the model predicted “overshoot and collapse” – in the economy, environment and population – before 2070. This was called the “business-as-usual” scenario.
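World3 is a system-dynamics model: a set of coupled stocks (population, capital, resources, pollution) updated year by year through feedback relationships. The toy loop below is not World3 and its coefficients are invented; it only illustrates the general “overshoot and collapse” mechanism the book describes, in which growth draws down a finite resource until rising extraction costs drag output, and then population, back down.

```python
# A toy stock-and-flow loop illustrating "overshoot and collapse".
# This is NOT the MIT World3 model; structure and coefficients are invented for illustration.
population = 3.7e9         # roughly the 1970 world population
resources = 1.0            # normalized stock of non-renewable resources
output_per_capita = 1.0    # normalized industrial output per person

for year in range(1970, 2101):
    extraction_cost = 1.0 / max(resources, 1e-3)             # scarcer resources absorb more capital
    output_per_capita = max(output_per_capita + 0.02 - 0.01 * extraction_cost, 0.0)
    growth_rate = 0.015 if output_per_capita > 0.5 else -0.01  # crude welfare-dependent population growth
    population *= 1 + growth_rate
    resources = max(resources - 1e-12 * population * output_per_capita, 0.0)
    if year % 20 == 0:
        print(year, f"pop={population / 1e9:.1f}B",
              f"resources={resources:.2f}", f"output/capita={output_per_capita:.2f}")
```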

The book’s central point, much criticised since, is that “the earth is finite” and the quest for unlimited growth in population, material goods etc would eventually lead to a crash.

So were they right? We decided to check in with those scenarios after 40 years. Dr Graham Turner gathered data from the UN (its department of economic and social affairs, Unesco, the food and agriculture organisation, and the UN statistics yearbook). He also checked in with the US national oceanic and atmospheric administration, the BP statistical review, and elsewhere. That data was plotted alongside the Limits to Growth scenarios.

The results show that the world is tracking pretty closely to the Limits to Growth “business-as-usual” scenario. The data doesn’t match up with other scenarios.

These graphs show real-world data (first from the MIT work, then from our research), plotted in a solid line. The dotted line shows the Limits to Growth “business-as-usual” scenario out to 2100. Up to 2010, the data is strikingly similar to the book’s forecasts.

Graphs (three panels): Solid line: MIT data, with new research in bold. Dotted line: Limits to Growth ‘business-as-usual’ scenario. Photograph: Supplied

As the MIT researchers explained in 1972, under the scenario, growing population and demands for material wealth would lead to more industrial output and pollution. The graphs show this is indeed happening. Resources are being used up at a rapid rate, pollution is rising, and industrial output and food per capita are rising. The population is rising quickly.

So far, Limits to Growth checks out with reality. So what happens next?

According to the book, to feed the continued growth in industrial output there must be ever-increasing use of resources. But resources become more expensive to obtain as they are used up. As more and more capital goes towards resource extraction, industrial output per capita starts to fall – in the book, from about 2015.

As pollution mounts and industrial input into agriculture falls, food production per capita falls. Health and education services are cut back, and that combines to bring about a rise in the death rate from about 2020. Global population begins to fall from about 2030, by about half a billion people per decade. Living conditions fall to levels similar to the early 1900s.

It’s essentially resource constraints that bring about global collapse in the book. However, Limits to Growth does factor in the fallout from increasing pollution, including climate change. The book warned carbon dioxide emissions would have a “climatological effect” via “warming the atmosphere”.

As the graphs show, the University of Melbourne research has not found proof of collapse as of 2010 (although growth has already stalled in some areas). But in Limits to Growth those effects only start to bite around 2015-2030.

The first stages of decline may already have started. The Global Financial Crisis of 2007-08 and ongoing economic malaise may be a harbinger of the fallout from resource constraints. The pursuit of material wealth contributed to unsustainable levels of debt, with suddenly higher prices for food and oil contributing to defaults – and the GFC.

The issue of peak oil is critical. Many independent researchers conclude that “easy” conventional oil production has already peaked. Even the conservative International Energy Agency has warned about peak oil.

Peak oil could be the catalyst for global collapse. Some see new fossil fuel sources like shale oil, tar sands and coal seam gas as saviours, but the issue is how fast these resources can be extracted, for how long, and at what cost. If they soak up too much capital to extract, the fallout would be widespread.

Our research does not indicate that collapse of the world economy, environment and population is a certainty. Nor do we claim the future will unfold exactly as the MIT researchers predicted back in 1972. Wars could break out; so could genuine global environmental leadership. Either could dramatically affect the trajectory.

But our findings should sound an alarm bell. It seems unlikely that the quest for ever-increasing growth can continue unchecked to 2100 without causing serious negative effects – and those effects might come sooner than we think.

It may be too late to convince the world’s politicians and wealthy elites to chart a different course. So to the rest of us, maybe it’s time to think about how we protect ourselves as we head into an uncertain future.

As Limits to Growth concluded in 1972:

If the present growth trends in world population, industrialisation, pollution, food production, and resource depletion continue unchanged, the limits to growth on this planet will be reached sometime within the next one hundred years. The most probable result will be a rather sudden and uncontrollable decline in both population and industrial capacity.

So far, there’s little to indicate they got that wrong.

 

The Pacific Ocean Has Become Acidic Enough to Dissolve Sea Snails’ Shells: Acidification Is Happening Sooner & On A Larger Scale Than Scientists Predicted; Coastal Biomes Under Threat

In Uncategorized on May 5, 2014 at 11:28 am

First evidence of marine snails from the natural environment along the U.S. West Coast with signs that shells are dissolving. (Credit: NOAA)

Oldspeak: “A new study, among the first to examine how the process called ocean acidification impacts marine life, has confirmed that about half of all the pteropods off the west coast are fighting off the acid burn. It builds on previous work that has shown pteropods dissolving in other waters; it’s a disturbing trend, considering they’re a key link in the oceanic food chain…. The research determined that “large portions of the shelf waters are corrosive to pteropods in the natural environment”…This is worrisome, not just because it’s kind of horrifying on a micro-level—imagine the air that surrounds you slowly eroding, say, your cartilage—but because these sea snails are a major food source for other important species like salmon, herring, and mackerel. Their disappearance would radically transform the coastal biome.” -Brian Merchant

“It’s happening now. I’m not speculating about the distant future. The first crack in our global life support system is widening now and we are about to experience our first major systems failure….We are on the threshold of the first major eco-system collapse of the Homocene…What the great majority of people do not understand is this: unless we stop the degradation of our oceans, marine ecological systems will begin collapsing and when enough of them fail, the oceans will die… And if the oceans die, then civilization collapses and we all die… It’s as simple as that.”  -Captain Paul Watson

“It really is that simple. The degradation of our oceans is not stopping, it is in fact accelerating. The Pacific Ocean will continue to be transformed into a radioactive acid bath. Marine ecological systems will continue to collapse, and that will be that.  We’re fucked. There is no fixing this. There is no avoiding extinction.” -OSJ

 

Related Story:

NOAA-led researchers discover ocean acidity is dissolving shells of tiny snails off the U.S. West Coast

 

By Brian Merchant @ Vice Magazine:

Meet the tiny, translucent “sea butterfly,” whose home is currently being transformed into an acid bath. Off the US’s west coast, there are anywhere between 100 and 15,000 of these free-swimming sea snails per square meter. And the oceans are beginning to dissolve the tiny shells right off their backs.

A new study, among the first to examine how the process called ocean acidification impacts marine life, has confirmed that about half of all the pteropods off the west coast are fighting off the acid burn. It builds on previous work that has shown pteropods dissolving in other waters; it’s a disturbing trend, considering they’re a key link in the oceanic food chain.

The world’s oceans have absorbed a third of humans’ carbon emissions, a process that increases their acidity. Scientists have long noted the changing chemistry of the waters, and voiced concern that this leaves calcium-based creatures, like coral and pteropods, extremely vulnerable. Now, it appears, they have proof.

“These are some of the first insights into how marine creatures are affected by acidification,” Dr. Nina Bednarsek told me in a phone interview. She’s the lead author of the National Oceanic and Atmospheric Administration study, which was just published in the Proceedings of the Royal Society B. The research determined that “large portions of the shelf waters are corrosive to pteropods in the natural environment.”

“Fifty percent of those pteropods are affected by acidification,” Bednarsek said. “It’s a lot—more than we expected.” And sooner. She tells me that acidification is happening sooner and on a larger scale than scientists predicted. “This is just an indication of how much we are changing the natural environment,” she said.

The study estimates “that the incidence of severe pteropod shell dissolution owing to anthropogenic [ocean acidification] has doubled in near shore habitats since pre-industrial conditions across this region and is on track to triple by 2050.” In other words, thanks to human carbon pollution, twice as many marine creature shells are dissolving as were before the industrial era. And three times as many will be dissolving by mid-century.

Image: NOAA

This is worrisome, not just because it’s kind of horrifying on a micro-level—imagine the air that surrounds you slowly eroding, say, your cartilage—but because these sea snails are a major food source for other important species like salmon, herring, and mackerel. Their disappearance would radically transform the coastal biome.

Acidification primarily affects the snails’ outer shell layer, and is especially dangerous to juveniles, which are born with very tiny shells. The outer shell is composed of “a more soluble form, they are just dissolved away. In that sense, shells are getting more thin,” Bednarsek said. “It is just a few micron in juveniles. If you dissolve that, the whole shell can just disappear in two months time.”

This means they have to use precious energy to try to build shells with less soluble materials, while the absence of a shell restricts mobility and leaves them vulnerable to infection. So is this an existential threat to a highly prevalent species?

“Yes, basically,” Bednarsek said.

“By 2100, 50 percent of the oceans would no longer be viable for pteropods,” Dr. Richard Feely, the study’s co-author, told me, if we continue emitting carbon pollution apace. And that’s exactly what’s expected to happen.

“Estimates of future carbon dioxide levels, based on business as usual emission scenarios, indicate that by the end of this century the surface waters of the ocean could be nearly 150 percent more acidic, resulting in a pH that the oceans haven’t experienced for more than 20 million years,” NOAA estimates.
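The “150 percent more acidic” phrasing is easier to parse once you recall that pH is a base-10 logarithm of hydrogen-ion concentration, so a modest-looking pH drop is a large percentage change in acidity. A quick check of the arithmetic follows, assuming the commonly cited figures of a pre-industrial surface-ocean pH near 8.2 falling to roughly 7.8 by 2100 under business-as-usual emissions; those two numbers are assumptions for illustration, not values taken from the NOAA study.

```python
# pH = -log10([H+]), so acidity scales as 10 to the power of the pH drop.
# The pH values are commonly cited approximations, assumed here for illustration.
ph_preindustrial = 8.2
ph_2100 = 7.8

h_plus_ratio = 10 ** (ph_preindustrial - ph_2100)     # factor increase in hydrogen-ion concentration
print(f"[H+] rises by a factor of {h_plus_ratio:.2f}")               # ~2.5x
print(f"i.e. roughly {100 * (h_plus_ratio - 1):.0f}% 'more acidic'") # ~150%
```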

In other words, the oceans are on track to become an acidic mess, and plenty of things that lived in them for millions of years may simply no longer be able to. The future, it seems, is a place where sea snails’ shells begin dissolving in acid as soon as they’re born. And then, eventually, a place without sea snails.

Abrupt Climate Change: Happening Now, Impacts Visible, Likely To Be More Extreme Than Projected & Beyond Lifeforms’ Ability To Adapt

In Uncategorized on March 25, 2014 at 3:22 am

The Larsen B ice shelf in Antarctica disintegrated between January and March of 2002. This was a floating ice shelf the size of the state of Massachusetts and 700 feet thick. Melt water, heavier than ice, squeezed its way into cracks and penetrated to the bottom of the ice shelf causing the disintegration.

Oldspeak: “…What we know now is that Earth’s climate normally changes through abrupt shifts. Climate change is mostly not a slow, glacially paced thing. The changes are fast and violent and leave ecosystems shredded in their wake. They start out slowly, but then a threshold is crossed, and the temperature jumps up or down far more radically than the slow and modest warming projected by almost all climate change models today. Universally, these abrupt climate changes dwarf climate change projected by our world’s scientific institutions in their summaries of climate change projections… With this new knowledge about abrupt climate change and the galactically large risks posed by abrupt climate change, the discussion about climate in our society today has become misplaced. Emission and eventual climate change are important, but they are fundamentally not in the same ballpark as abrupt change…abrupt changes in ecosystems, weather and climate extremes and groundwater supplies critical for agriculture are not only more likely than previously understood, but also, impacts are more likely to be more extreme… some abrupt changes have already begun – like the crash of Arctic sea ice…..Other possible abrupt climate impacts include ocean extinction events where hot spikes of weather chaos create widespread conditions beyond the evolution of ocean creatures. It’s the extremes that kill. We’ve seen previews in coral bleaching events across the world already. Seventy-five percent of complex coral reefs in the Caribbean have already been decimated…. Another worrisome abrupt climate impact that is currently taking place has happened to 64 million acres of forest in the Rockies and billions of trees in the Amazon…. Across the American West, the average temperature has been 70 percent greater than the global average… The resulting stress from drought, along with the absence of extreme beetle-killing cold, has allowed a natural pine bark beetle to kill 20 times more trees than any attack ever recorded.  Drought alone killed “several” billion trees in the Amazon and now the Amazon is a net source of CO2, not a sink.  In Texas, the drought has been perpetuated for nearly a decade with greater than average rainfall – more rain and the drought still continues because of increased evaporation. It killed over 300 million mature trees in the 2011 heat wave...” -Bruce Melton

“Our planet’s thermostats and air conditioners are failing. Not 50 or 100 years in the future. Now. Rapidly. We are aggressively and ever faster depleting and poisoning the resources we need to survive and have no viable plans to replace them, while exacerbating the conditions causing our thermostats (ice caps) and air conditioners (forests) to fail…. We will need 2 to 3 earths to support our current levels of consumption. This is not sustainable. There isn’t much doubt that we are racing headlong to extinction. Our pathological anthropocentricity will be our undoing, as it has overridden our powers of self-preservation. Globalized inverted corptalitarian kleptocracy trumps survival. At some point we’ll have no choice but to recognize and accept what we’ve wrought; the non-human scale devastation to come. The risks are too great to ignore.” -OSJ

By Bruce Melton @ Truthout:

Pine beetle kill in Rocky Mountain National Park. Over 64 million acres have been killed across the Rockies of North America by a native pine beetle gone berserk because of warming. (Photo: Bruce Melton)

Today, we are burning fossil carbon one million times faster than it was naturally put in the ground, and carbon dioxide is increasing 14,000 times faster than at any time in the last 610,000 years (1,2). Climate is now changing faster than it has during any other time in 65 million years – 100 times faster than the Paleocene/Eocene extinction event 56 million years ago. (3) However, “climate change” is not the most critical issue facing society today; abrupt climate change is. Climate scientists now have the knowledge necessary to guide us beyond existing climate pollution policy. New policy needs to focus on abrupt climate change, not the relatively slow changes we see in climate models of our future. The social, economic and biological disruptive potential of abrupt climate change is far greater than that of the gradual climate change present policy is predicated upon.

Over about the last 100,000 years, the world has seen about 20 abrupt climate changes, averaging 9 to 15 degrees, including in Greenland, where the temperature changed up to 25 to 35 degrees. These abrupt climate changes happen 10 to 100 times faster than the climate change projections we have all come to know and love. Mostly they happened in several decades or less, but one of the biggest changes happened in just a few years. (4)

The evidence of these abrupt changes is clear in the highly accurate findings from ancient preserved air in ice sheets. They were likely all related to feedbacks and thresholds or tipping points. There are many different kinds of feedbacks and tipping points and the science is still unclear on many of them. Feedbacks are things like the snow and ice feedback loop: snow and ice reflect up to 90 percent of the sun’s energy back into space harmlessly as light, while ocean, rocks, soil, vegetation, etc. reflect only 10 percent back into space, and the rest is absorbed and turned into heat energy that gets trapped by the greenhouse effect. The trapped energy creates more warming, which melts more snow and ice, exposing darker surfaces that absorb even more energy as heat, and the loop continues until all the snow and ice are gone.
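The loop described above can be made concrete with a few lines of toy arithmetic: less ice means a darker surface, a darker surface absorbs more sunlight, and the extra heat melts more ice. The numbers below (ice fraction, gain, melt rate) are invented purely to show the runaway shape of the feedback; this is not a climate model.

```python
# Toy ice-albedo feedback loop -- invented numbers, not a physical climate model.
ice_fraction = 0.10          # fraction of the surface covered by snow and ice (assumed)
temperature_anomaly = 0.5    # an initial nudge of warming, in arbitrary degrees (assumed)

for step in range(10):
    albedo = 0.9 * ice_fraction + 0.1 * (1 - ice_fraction)   # ice reflects ~90%, dark surfaces ~10%
    absorbed = 1 - albedo                                     # share of sunlight converted to heat
    temperature_anomaly += 0.3 * (absorbed - 0.82)            # absorption beyond the starting level adds warming
    ice_fraction = max(ice_fraction - 0.05 * temperature_anomaly, 0.0)  # warming melts ice
    print(f"step {step}: ice fraction={ice_fraction:.3f}, warming={temperature_anomaly:.2f}")
    if ice_fraction == 0.0:
        print("all snow and ice gone -- the feedback loop ends")
        break
```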

Tipping points are a little bit more difficult to describe in environmental systems, but can easily be described in other ways. A canoe has a tipping point, beyond which a dry lovely day on the water turns into something quite different. Environmental systems behave in a similar manner. We can dump a lot of water pollution into our lakes and rivers, and nothing major seems to happen. Degradation occurs, but the lake or river generally continues to behave like a lake or river until the pollution levels reach a critical point. Then, as when one leans over just a little too far in a canoe, an algae bloom happens and the lake or river turns green or brown overnight and gets really smelly and bad tasting. This is an ecosystem tipping point. Pollution levels (nutrients from wastewater treatment plants, urban stormwater runoff, agriculture, etc.) accumulate over time to a sufficiently high level that finally, an algae population explosion occurs. Then a really devastating thing sometimes occurs if the tipping point is really critical. All the algae die, sink to the bottom and decompose simultaneously. The decomposition uses up all the oxygen in the water and there is a big fish kill on top of the stinky, smelly, unsightly algal bloom.

Our global environment is no different from a lake or river or even a canoe. Some of these 20 or so abrupt changes happened in direct response to tipping points that preceded them, like a shutdown of the North Atlantic portion of the Gulf Stream. Some of them happened because one or more feedbacks went out of control. Pinning down exact causes of the abrupt changes, however, remains a difficult task.

Unearthing the Evidence

Abrupt climate change wasn’t really a recognized phenomenon until about 20 years ago. Strong evidence of abrupt temperature changes had been found in the 1960s in Greenland ice cores, but they were poorly understood and considered anomalies at the time. Climate change science was dominated by sediment cores from oceans and lakes and slow, glacially paced changes of the 100,000-year cycles of our climate as Earth’s tilt and orbit changed around the sun. As time passed, the early evidence of abrupt change was found again and again in subsequent ice cores. Even in Antarctica, these same abrupt signals were found.

Bubbles of ancient air preserved in Greenland ice. (Photo: Bruce Melton)

Those first records of abrupt climate change from the 1960s at Camp Century were found in ice cores from a Cold War-era nuclear-powered base in the middle of the Greenland Ice Sheet. Camp Century was chosen as one of the first places to drill ice because it was an existing station in a very hostile environment. The Greenland Ice Sheet is over two miles thick and 11,000 feet high. The ice cores seemed to show radical climate jumps in the clearly visible annual layers of snow, in oxygen and methane in the preserved ancient air, and in dust that increased and decreased dramatically according to temperature.

Over the next two decades of continued ice core drilling, the same signs of abrupt changes were seen, and some confidence began to emerge about the validity of this amazing storehouse of evidence. It was not until the early 1990s, though, that the story became clear. Two identical ice cores were drilled in one of the most stable parts of the ice sheet. The cores were identical down to 100,000 years ago; then, close to bedrock, the annual layers became warped and folded. Above the level of ice at 100,000 years ago, the ice cores matched identically. The same volcanic eruptions from across the world were represented by characteristic ash from the different eruptions. Even the same dust from Siberia during really cold dry periods was found in the different ice cores. These abrupt changes were real and they were radical. Why then did sediment cores not reveal abrupt changes?

The reason was bioperturbation. Bioperturbation is what happens to sediments when worms eat through organic material on the bottom of a lake or ocean. Dozens and even hundreds of years of sediment deposition per inch are mixed and remixed by the worms. It happens to almost all sediments everywhere. The best resolution in sediments at the time was really a century or more, or even thousands of years. The abrupt nature of actual changes in the annual sediment layers was simply wiped out, or eaten up. Then we began to learn of areas of the globe where bioperturbation did not exist.

A few areas of the ocean were identified that were stagnant and devoid of oxygen. Worms can’t live without oxygen and in these areas there is no bioperturbation. The same abrupt climate jumps as were found in Greenland were now plain to see. We have also found the same evidence in the annual layers of stalagmites and other cave formations.

It took another decade for science to catch up, but what we know now is that Earth’s climate normally changes through abrupt shifts. Climate change is mostly not a slow, glacially paced thing. The changes are fast and violent and leave ecosystems shredded in their wake. They start out slowly, but then a threshold is crossed, and the temperature jumps up or down far more radically than the slow and modest warming projected by almost all climate change models today. Universally, these abrupt climate changes dwarf climate change projected by our world’s scientific institutions in their summaries of climate change projections.

Extreme Impacts

With this new knowledge about abrupt climate change and the galactically large risks posed by abrupt climate change, the discussion about climate in our society today has become misplaced. Emission and eventual climate change are important, but they are fundamentally not in the same ballpark as abrupt change.

A new National Academy of Sciences mega-report takes on the prospect of future abrupt climate changes, asking whether changes may take place “so fast that the time between when a problem is recognized and when action is required shrinks to the point where orderly adaptation is not possible?”

The good news is that some of the more popular abrupt climate change scenarios are not likely, according to the report. Popularized and wildly exaggerated in movies like The Day After Tomorrow, a shutdown of ocean currents seems less likely in time frames that matter. Likewise, concern about serious trouble from methane outgassing from melting clathrates on the ocean floor and in permafrost seems unlikely. However, we do need to realize that the climate science consensus process is not flawless. That process told us in 2007 that Antarctica would not begin to lose ice until after 2100, but now tells us in the 2013 IPCC report that Antarctic ice loss has already caught up with Greenland’s. So, when climate change consensus opinion now tells us ocean current shutdown and clathrate outgassing are not very likely, we must understand that this opinion cannot be counted as fact.

Moreover, the mega-report notes that abrupt changes in ecosystems, weather and climate extremes and groundwater supplies critical for agriculture are not only more likely than previously understood, but also, impacts are more likely to be more extreme.(5) The report tells us that there are many more types of abrupt change than temperature and that science is now becoming good enough to help us anticipate some of them, but not all of them. It also tells us that some abrupt changes have already begun – like the crash of Arctic sea ice: “More open water conditions during summer would have potentially large and irreversible effects . . . because the Arctic region interacts with large-scale circulation systems of the oceans and atmosphere, changes in the extent of sea ice could cause shifts in climate and weather around the Northern Hemisphere.”

We have already seen how increasing energy in the Arctic has increased the magnitude of jet stream loops and the speed of those loops across the planet. These loops carry more intense storms (the polar vortex) and because of their retarded movement across the globe, these more intense weather systems stall out, increasing the dry, wet or otherwise extreme conditions associated with them.

New discoveries have shown that it is likely that one of the most abrupt of all climate changes in the last 100,000 years happened 12,000 years ago. It was called the Younger Dryas, and the temperature in Greenland jumped 25 degrees in three years. Some 1,000 years later, it fell 25 degrees in a few decades. This abrupt tipping point is now a prime candidate in the extinction of 72 percent of North American mammals, including large mammals like the saber-toothed cat and mastodon. (6)

There are other types of abrupt changes that can be triggered by slow climate change. They are called abrupt climate impacts. The report gives the example of the mountain pika, one of my favorite alpine animals. (7) The pika is a bunny-sized, rabbit-like mammal with short little mouse-like ears and a peculiar little squeaky nasally call. It gathers grass and wildflowers in its home in the high mountains, mostly above treeline during the short high altitude summer, and stores this “pika hay” in caches in the rocks of scree slopes high on mountainsides.

As temperatures rise, the alpine meadows that the pika evolved with rise up the mountainside in response to warming. The alpine vegetation follows the cool zone up the mountain. At some point this process ends abruptly as the top of the mountain is reached and no place remains for the pika’s hay to grow. The Center for Biological Diversity has petitioned California and the US Fish and Wildlife Service to list the pika as endangered because of climate change, but has been turned down by both. Their reasoning is that the pika’s range is not in danger of disappearing in the next several decades. That is exactly what this article is about. Current policy simply does not take abrupt climate change into consideration. The consensus reports all mention it sooner or later, but then they caveat their way out of doing anything about it because too little is known about how these things actually happen. From the summary of “Abrupt Climate Change – Anticipating Surprises”:

Although many projections of future climatic conditions have predicted steadily changing conditions giving the impression that communities have time to gradually adapt, for example, by adopting new agricultural practices to maintain productivity in hotter and drier conditions, or by organizing the relocation of coastal communities as sea level rises, the scientific community has been paying increasing attention to the possibility that at least some changes will be abrupt, perhaps crossing a threshold or ‘tipping point’ to change so quickly that there will be little time to react. This concern is reasonable because such abrupt changes – which can occur over periods as short as decades, or even years – have been a natural part of the climate system throughout Earth’s history.

The Larsen B ice shelf in Antarctica disintegrated between January and March of 2002. This was a floating ice shelf the size of the state of Massachusetts and 700 feet thick. Melt water, heavier than ice, squeezed its way into cracks and penetrated to the bottom of the ice shelf causing the disintegration.

A much quicker example is the collapse of the West Antarctic Ice Sheet. The last time it happened 120,000 years ago, Earth was about the same temperature as it is today. We saw a similar collapse in 2002 when the Larsen B ice shelf, the size of Massachusetts, disintegrated in two months. Slow warming had created more and more melt on top of the Larsen B. Then a peculiar thing happened. The melt pools on top of the ice sheet became large enough and heavy enough (water is heavier than ice) to force cracks in the ice open. The cracks catastrophically opened all the way to the bottom of the floating ice sheet a thousand or more feet below and the entire thing broke into little bergy bits. We don’t know when this will happen to the Mexico-sized West Antarctic Ice Sheet (the largest remaining marine ice sheet), but we didn’t know a year ahead of time that collapse was going to happen to the Larsen B either. (8)

The current assumption as to how fast the West Antarctic Ice Sheet could collapse is a hundred years minimum. But the similarities in the Larsen B and the West Antarctic are high, and the consensus has wildly underestimated ice processes in Antarctica before.

Other possible abrupt climate impacts include ocean extinction events where hot spikes of weather chaos create widespread conditions beyond the evolution of ocean creatures. It’s the extremes that kill. We’ve seen previews in coral bleaching events across the world already. Seventy-five percent of complex coral reefs in the Caribbean have already been decimated. (9) Polar bears are at risk because their main prey, the ringed seal, rears its young on sea ice. The young ringed seals cannot swim until they mature – creating a large challenge for the perpetuation of that species with the absence of sea ice during the reproduction season. (10)

Another worrisome abrupt climate impact that is currently taking place has happened to 64 million acres of forest in the Rockies and billions of trees in the Amazon. In the Rockies, prolonged drought has been caused by warmer temperatures. Across the American West, the average temperature increase has been 70 percent greater than the global average, and the increase is even greater at elevations where the forests are. This is a long-term shift in relative wetness, shown in the climate models and now being realized. (11) The growing season has increased by 30 days or more in the spring, which is relatively easy to measure because of the onset of snowmelt. (12) In the fall, it is more difficult to measure, but the longer growing season and the hotter temperatures both add to the warming feedback that has perpetuated drought even as normal rainfall has returned to some areas. The resulting stress from drought, along with the absence of extreme beetle-killing cold, has allowed a natural pine bark beetle to kill 20 times more trees than any attack ever recorded. (13)

Drought alone killed “several” billion trees in the Amazon and now the Amazon is a net source of CO2, not a sink. (14) In Texas, the drought has been perpetuated for nearly a decade with greater than average rainfall – more rain and the drought still continues because of increased evaporation. It killed over 300 million mature trees in the 2011 heat wave. (15)

Making Climate Science Real

So, what can we do to prepare for possible abrupt changes in the near future? The mega-report suggests setting up an Abrupt Change Early Warning System (ACEWS). Environmental systems often send out signals that a change of state is near. When weather flickers from cold to hot or wet to dry, it may be a sign that abrupt changes are to come. The ACEWS system would be integrated with a risk management system that takes into consideration the ultimate costs of an abrupt change. Example: coral bleaching events are certainly costly to some ocean systems and economies dependent on those ocean systems. An abrupt sea level jump, however, may not have near the impact on ocean systems, but have much more devastating impacts on global socio-economic factors.
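One concrete way such a warning system can watch for an approaching threshold is the statistics of “critical slowing down”: as a system nears a tipping point, a monitored time series tends to show rising variance and rising lag-1 autocorrelation. The sketch below computes those two indicators over a rolling window; the data is synthetic, and a real ACEWS-style system would of course be far more elaborate.

```python
import random

def rolling_indicators(series, window=50):
    """Rolling variance and lag-1 autocorrelation, two common early-warning indicators."""
    indicators = []
    for i in range(window, len(series) + 1):
        w = series[i - window:i]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / window
        cov = sum((w[j] - mean) * (w[j + 1] - mean) for j in range(window - 1)) / (window - 1)
        indicators.append((var, cov / var if var > 0 else 0.0))
    return indicators

# Synthetic series whose persistence ("memory") ramps up, mimicking an approach to a tipping point.
random.seed(0)
x, series = 0.0, []
for t in range(500):
    persistence = 0.3 + 0.6 * t / 500
    x = persistence * x + random.gauss(0, 1)
    series.append(x)

ind = rolling_indicators(series)
print("early window  variance=%.2f  lag-1 autocorr=%.2f" % ind[0])
print("late window   variance=%.2f  lag-1 autocorr=%.2f" % ind[-1])
```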

Barring the creation of a full-blown abrupt change early warning system, scientists will continue to monitor ongoing changes and increase the accuracy of their measurements and their modeling efforts to simulate and recreate future and past change events. But as more knowledge on abrupt changes is discovered, one thing is becoming crystal clear: Climate change policy today has become severely dated, and we need to catch up.

Just a few years ago, when the Kyoto Protocol was still a valid way of preventing dangerous climate change, emission reduction strategies were appropriate. We did not know nearly as much about abrupt climate change and abrupt impacts as we do today. The IPCC had not pronounced that greater than 100 percent emissions reductions for a sustained period are required to prevent dangerous climate change. (16) Now we know these things, and now we know we must begin to remove CO2 directly from our atmosphere because no amount of emissions reductions can remove greater than 100 percent of annual emissions.

We also know that once fully industrialized, air capture of CO2 can be done for $25 per ton. This means the removal of 50 ppm of CO2 from the atmosphere can be done for what the US paid for healthcare in 2005 ($2.1 trillion). (17)

Why are we not yet implementing these changes? A large part of the answer is that the perceived debate has masked the facts. Climate science is not real to most people. It doesn’t really affect many of us yet; it’s not a priority, so it is not reported. Humanity needs to be brought up to speed. Once the knowledge is spread around – as crucial scientific facts, not politics – we will make the correct decisions. One only hopes we can spread that crucial knowledge before abrupt changes begin.

___________________________________________________________________________________

Notes

1. We are using fossil fuels one million times faster than Mother Nature saved them for us . . . Richard Alley, Earth: The Operators’ Manual, Norton Publishing and PBS documentary.

2. 14,000 times faster… Zeebe and Caldeira, Close mass balance of long-term carbon fluxes from ice-core CO2 and ocean chemistry records, Nature Geoscience, Advance Online Publication, April 27, 2008.

3. 100 times faster than at any time in 65 million years . . . Diffenbaugh and Field, "Changes in Ecologically Critical Terrestrial Climate Conditions," Science (special climate issue: Natural Systems in Changing Climates), Volume 341, August 2, 2013, page 490, first paragraph: "The Paleocene-Eocene Thermal Maximum (PETM) encompassed warming of 5 degrees C in less than 10,000 years, a rate of change that is 100-fold slower than that projected by RCP8.5."

4. Abrupt climate change as fast as a few years. Abrupt Climate Change – Anticipating Surprises, National Research Council of the National Academies of Science, December 2013, Preface, page vii, second paragraph.

9 to 15 degrees across the globe . . . Alley, The Two-Mile Time Machine: Ice Cores, Abrupt Climate Change, and Our Future, Princeton University Press, 2000, page 119, Figure 12.2.

Data for figure 12.2 is from Cuffey and Clow, “Temperature, accumulation, and ice sheet elevation in central Greenland through the last deglacial Transition,” Journal of Geophysical Research, volume 102(C12), pp 26,383 to 26,396.

Greenland temperature change is twice that of the global average. Chylek and Lohmann, “Ratio of Greenland to global temperature change – comparison of observations and climate models,” Geophysical Research Letters, July 2005. Chylek and Lohmann say the Greenland temperature change is 2.2 times greater than the global average. From Alley’s Figure 12.2 (Cuffey and Clow), the 25 to 35 degree F abrupt changes in Greenland would equal 9 to 15 degrees average across the globe.

Also see: 25 to 35 degrees in Greenland. National Research Council, Abrupt Climate Change: Inevitable Surprises, Committee on Abrupt Climate Change, 2002, Figure 2.5, page 37.

5. More extreme than previously understood. Abrupt Climate Change – Anticipating Surprises, National Research Council, preface, third paragraph.

6. Extinction of 72 percent of North American mammals, ibid., page 1, second paragraph.

7. Pika, ibid., page 118.

8. West Antarctic Ice Sheet, ibid., pages 7, 13, 33, 34, 59, 61, 62, 150, 161.

9. Seventy-five percent of Caribbean reefs destroyed. Alvarez-Filip, Dulvy, et al., "Flattening of Caribbean coral reefs: Region-wide declines in architectural complexity," Proceedings of the Royal Society B, June 2009.

10. “Polar Bears,” Abrupt Climate Change – Anticipating Surprises, National Research Council, page 118.

11. The American West has warmed 70 percent more than the global average. Hotter and Drier: The West's Changed Climate, Rocky Mountain Climate Organization, 2008, Executive Summary, page iv, paragraph 1.

12. Spring is coming 30 days sooner in the American West; 10–30 days over the 1948–2000 period. I. Stewart, D. Cayan, and M. Dettinger, "Changes in snowmelt runoff timing in western North America under a 'business as usual' climate change scenario," Climatic Change 62 (2004): 217–232, page 223, Section 4 (Results), second paragraph.

13. “Bark Beetle Outbreaks,” Abrupt Climate Change – Anticipating Surprises, National Research Council, page 21.

14. The Amazon has flipped from a carbon sink to a carbon source; Lewis et al., “The 2010 Amazon Drought,” Science, February 4, 2011.

15. 301 million trees killed in Texas in the drought of 2011; Texas A&M Forest Service.

16. IPCC 2013: Greater than 100 percent emissions reductions; IPCC 2013, Summary for Policy Makers, E.8 “Climate Stabilization, Climate Change Commitment and Irreversibility,” p 20, fourth bullet.

17. Lower Limit for Air Capture Costs: $25 per ton CO2 or slightly lower than the suggested minimum price for flue capture; Lackner et al., “The urgency of the development of CO2 capture from ambient air,” PNAS, August 14, 2012, page 13159, paragraph 6.

Bruce Melton

Bruce Melton is a professional engineer, environmental researcher, filmmaker, and author in Austin, Texas. Information on Melton's new book, Climate Discovery Chronicles, can be found, along with more climate change writing, climate science outreach and critical environmental issue documentary films, on his websites and at http://www.climatediscovery.com. Images copyright Bruce Melton 2012, except where referenced otherwise.

 

AAAS Report: Humans At Risk Of Pushing Climate System Toward Abrupt, Unpredictable, Unalterable Changes With Highly Damaging Impacts

In Uncategorized on March 20, 2014 at 8:31 pm

Oldspeak: “The American Association for the Advancement of Science says: The evidence is overwhelming: levels of greenhouse gases in the atmosphere are rising. Temperatures are going up. Springs are arriving earlier. Ice sheets are melting. Sea level is rising. The patterns of rainfall and drought are changing. Heat waves are getting worse, as is extreme precipitation. The oceans are acidifying…. The science linking human activities to climate change is analogous to the science linking smoking to lung and cardiovascular diseases. Physicians, cardiovascular scientists, public health experts and others all agree smoking causes cancer….This agreement is documented not just by a single study, but by a converging stream of evidence over the past two decades from surveys of scientists, content analyses of peer-reviewed studies, and public statements issued by virtually every membership organization of experts in this field… We are at risk of pushing our climate system toward abrupt, unpredictable, and potentially irreversible changes with highly damaging impacts…Disturbingly, scientists do not know how much warming is required to trigger such changes to the climate system…. as emissions continue and warming increases, the risk increases.”

AAAS Report, “What We Know: The Reality, Risks And Response To Climate Change”

“The largest and most knowledgeable general body of scientists in the world is out of its normal character, issuing its own dire warning in addition to the mounting unquestionable evidence that anthropogenic climate change is real, will get worse, and cannot be stopped. Adding the particularly terrifying nugget that basically we have no idea when the warming will be sufficient to propel the proverbial shit toward the proverbial fan. We do know that several unalterable non-linear feedback loops have been initiated and earth’s 6th mass extinction is under way. We do know our world is turning upside down, with never-before-seen extreme environmental impacts and weather events. All previous norms and customs are no longer valid. We’re basically living on a ticking time bomb, and don’t know when it will go off. And each day of increasing human emissions shortens the fuse. In short, we’re fucked. And we’ll take the vast majority of life on Mother Earth with us. Our Mother’s immune system will kill the highly virulent and destructive infection that is Humanity sooner than we think.” -OSJ

By Alex Kirby @ Climate News Network:

In a highly unusual intervention in the debate over climate policy, US scientists say the evidence that the world is warming is as conclusive as that which links smoking and lung cancer.

LONDON, 18 March – The American Association for the Advancement of Science (AAAS) says there is a “small but real” chance that a warming climate will cause sudden and possibly unalterable changes to the planet.

This echoes the words used in its 2007 report by the Intergovernmental Panel on Climate Change (IPCC), which said climate change might bring “abrupt and irreversible” impacts.

A child with kwashiorkor, caused by severe protein deficiency: child malnutrition may rise by about a fifth.
Image: Dr Lyle Conrad, Centers for Disease Control and Prevention, via Wikimedia Commons

In a report, What We Know, the AAAS makes an infrequent foray into the climate debate. The report’s significance lies not in what it says, which covers familiar ground, but in who is saying it: the world’s largest general scientific body, and one of its most knowledgeable.

The AAAS says: “The evidence is overwhelming: levels of greenhouse gases in the atmosphere are rising. Temperatures are going up. Springs are arriving earlier. Ice sheets are melting. Sea level is rising. The patterns of rainfall and drought are changing. Heat waves are getting worse, as is extreme precipitation. The oceans are acidifying.

“The science linking human activities to climate change is analogous to the science linking smoking to lung and cardiovascular diseases. Physicians, cardiovascular scientists, public health experts and others all agree smoking causes cancer.

Few dissenters

“And this consensus among the health community has convinced most Americans that the health risks from smoking are real. A similar consensus now exists among climate scientists, a consensus that maintains climate change is happening, and human activity is the cause.”

The report’s headline messages are unambiguous. It says climate change is occurring here and now: “Based on well-established evidence, about 97% of climate scientists have concluded that human-caused climate change is happening.

“This agreement is documented not just by a single study, but by a converging stream of evidence over the past two decades from surveys of scientists, content analyses of peer-reviewed studies, and public statements issued by virtually every membership organization of experts in this field.

“We are at risk of pushing our climate system toward abrupt, unpredictable, and potentially irreversible changes with highly damaging impacts…Disturbingly, scientists do not know how much warming is required to trigger such changes to the climate system.

Expensive to delay

“The sooner we act, the lower the risk and cost. And there is much we can do…as emissions continue and warming increases, the risk increases”.

The AAAS says there is scarcely any precedent for the speed at which this is happening: “The rate of climate change now may be as fast as any extended warming period over the past 65 million years, and it is projected to accelerate in the coming decades.”

Historically rare extreme weather like once-in-a-century floods, droughts and heat waves could become almost annual occurrences, it says, and there could be large-scale collapses of the Antarctic and Greenland ice sheets, and of part of the Gulf Stream, loss of the Amazon rain forest, die-off of coral reefs, and mass extinctions.

The authors acknowledge that what the AAAS is doing is unusual: “As scientists, it is not our role to tell people what they should do or must believe about the rising threat of climate change.

“But we consider it to be our responsibility as professionals to ensure, to the best of our ability, that people understand what we know: human-caused climate change is happening…”

More child malnutrition

At the end of March the IPCC, the UN’s voice on climate science, is due to release a summary of the report of its Working Group II, on impacts, adaptation and vulnerability to climate change.

The London daily The Independent, which says it has seen a draft of the report’s final version, says it will spell out a prospect of “enormous strain, forcing mass migration, especially in Asia, and increasing the risk of violent conflict.”

The newspaper says the report predicts that climate change “will reduce median crop yields by 2% per decade for the rest of the century”, against a backdrop of rising demand set to increase by 14% per decade until 2050. “This will in turn push up malnutrition in children by about a fifth”, it adds.

Other predictions in the draft, The Independent says, include possible global aggregate economic losses of between 0.2 and 2.0%; more competition for fresh water; and, by 2100, hundreds of millions of people affected by coastal flooding and displaced by land loss, mainly in Asia.

Exhaustive Study Finds Methane Emissions Up To 75% Higher Than EPA Estimates

In Uncategorized on February 25, 2014 at 8:51 pm

America's natural gas system is leaky and in need of a fix, a new study finds.

Oldspeak: “Duh. When you understand that methane (b.k.a. “natural”) gas extraction, “fracking,” creates “alarmingly high” uncontrolled gas emissions into the atmosphere, indefinitely, and that methane gas leaks are persistent throughout the extraction, production and consumption cycle, this cannot be surprising. What is surprising to me is that anyone took the EPA’s estimates seriously, when they, for some reason, excluded natural methane sources, like wetlands and geologic seeps. With the largest sea floor methane seep in the fucking world right off the coast of the Carolinas, and scientists having no idea how many more are out there, this makes no sense. And for some other ridiculously corrupt reason the EPA allowed methane gas extracting corporations to “self report” the emissions levels from their operations. That’s right. They don’t have to allow EPA access to their sites unless they feel like it. They just tell EPA whatever they like, and EPA has zero authority to trust but verify the numbers provided. And if Obama gets his wish to dramatically expand methane gas extraction operations, ignoring the environmental destruction and contamination its extraction begets, we can expect this madness to get worse. Short explanation? We’re fucked.” -OSJ

By Mark Golden @ Stanford  News Service:

A review of more than 200 earlier studies confirms that U.S. emissions of methane are considerably higher than official estimates. Leaks from the nation’s natural gas system are an important part of the problem. This finding has important implications for natural gas as a possible replacement fuel for coal.

Oil and gas processing plants are significant sources of methane, Stanford researchers have found. (INSAGO / Shutterstock)

The first thorough comparison of evidence for natural gas system leaks confirms that organizations including the Environmental Protection Agency (EPA) have underestimated U.S. methane emissions generally, as well as those from the natural gas industry specifically.

Natural gas consists predominantly of methane. Even small leaks from the natural gas system are important because methane is a potent greenhouse gas – about 30 times more potent than carbon dioxide. A study, “Methane Leakage from North American Natural Gas Systems,” published in the Feb. 14 issue of the journal Science, synthesizes diverse findings from more than 200 studies ranging in scope from local gas processing plants to total emissions from the United States and Canada.

“People who go out and actually measure methane pretty consistently find more emissions than we expect,” said the lead author of the new analysis, Adam Brandt, an assistant professor of energy resources engineering at Stanford University. “Atmospheric tests covering the entire country indicate emissions around 50 percent more than EPA estimates,” said Brandt. “And that’s a moderate estimate.”

The standard approach to estimating total methane emissions is to multiply the amount of methane thought to be emitted by a particular kind of source, such as leaks at natural gas processing plants or belching cattle, by the number of that source type in a region or country. The products are then totaled to estimate all emissions. The EPA does not include natural methane sources, like wetlands and geologic seeps.
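As a rough sketch of that bottom-up, count-times-factor accounting (the emission factors and source counts below are invented for illustration, not EPA values):

```python
# Bottom-up inventory: total = sum over source types of
# (per-source emission factor) x (number of sources of that type).
# All numbers here are made up to show the structure of the calculation.

emission_factors_t_per_yr = {   # tonnes CH4 per source per year (hypothetical)
    "gas processing plant": 120.0,
    "gas well": 0.8,
    "dairy cow": 0.1,
}
source_counts = {               # number of sources in the region (hypothetical)
    "gas processing plant": 600,
    "gas well": 500_000,
    "dairy cow": 9_000_000,
}

total_t = sum(emission_factors_t_per_yr[s] * source_counts[s] for s in emission_factors_t_per_yr)
print(f"inventory estimate: {total_t / 1e6:.2f} million tonnes CH4 per year")
```

The weakness this structure exposes is the one the new analysis highlights: if the per-source factors, many of them set in the 1990s, are too low, then every term in the sum is too low and the inventory undershoots what the atmosphere actually shows.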

The national natural gas infrastructure has a combination of intentional leaks, often for safety purposes, and unintentional emissions, like faulty valves and cracks in pipelines. In the United States, the emission rates of particular gas industry components – from wells to burner tips – were established by the EPA in the 1990s.

Since then, many studies have tested gas industry components to determine whether the EPA’s emission rates are accurate, and a majority of these have found the EPA’s rates too low. The new analysis does not try to attribute percentages of the excess emissions to natural gas, oil, coal, agriculture, landfills, etc., because emission rates for most sources are so uncertain.

Several other studies have used airplanes and towers to measure actual methane in the air, so as to test total estimated emissions. The new analysis, authored by researchers from seven universities, several national laboratories, federal government bodies and other organizations, found that these atmospheric studies, which cover very large areas, consistently indicate total U.S. methane emissions about 25 to 75 percent higher than the EPA estimate.

Some of the difference is accounted for by the EPA’s focus on emissions caused by human activity. The EPA excludes natural methane sources like geologic seeps and wetlands, which atmospheric samples unavoidably include. The EPA likewise does not include some emissions caused by human activity, such as abandoned oil and gas wells, because the amounts of associated methane are unknown.

However, the analysis also finds that some recent studies showing very high methane emissions in regions with considerable natural gas infrastructure are not representative of the entire gas system. “If these studies were representative of even 25 percent of the natural gas industry, then that would account for almost all the excess methane noted in continental-scale studies,” said a co-author of the study, Eric Kort, an atmospheric science professor at the University of Michigan. “Observations have shown this to be unlikely.”

Natural gas as a replacement fuel

Even though the gas system is almost certainly leakier than previously thought, generating electricity by burning gas rather than coal still reduces the total greenhouse effect over 100 years, the new analysis shows. Not only does burning coal release an enormous amount of carbon dioxide, mining it releases methane.

Perhaps surprisingly though, the analysis finds that powering trucks and buses with natural gas instead of diesel fuel probably makes the globe warmer, because diesel engines are relatively clean. For natural gas to beat diesel, the gas industry would have to be less leaky than the EPA’s current estimate, which the new analysis also finds quite improbable.
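A back-of-envelope way to see why the break-even leak rate is so low for trucks (all numbers below are rough assumptions of mine, not figures from the Science paper): compare natural gas combustion CO2 plus leaked methane, counted at its CO2-equivalence, against diesel combustion CO2, per unit of engine work, and solve for the leak fraction at which the two are equal.

```python
# Illustrative break-even leak rate for gas vs. diesel trucks.
# Every input is an assumed round number, not a value from Brandt et al.

GWP_CH4    = 30.0   # CO2-equivalence of methane over 100 years (approx.)
co2_gas    = 56.0   # kg CO2 emitted per GJ of natural gas burned (approx.)
co2_diesel = 74.0   # kg CO2 emitted per GJ of diesel burned (approx.)
ch4_per_gj = 18.0   # kg of CH4 contained in one GJ of natural gas (approx.)
eff_gas    = 0.35   # assumed gas engine efficiency
eff_diesel = 0.42   # assumed diesel engine efficiency (diesels are more efficient)

# Per unit of work: (co2_gas + L * ch4_per_gj * GWP_CH4) / eff_gas = co2_diesel / eff_diesel
# Solving for the leak fraction L at which natural gas stops being better:
leak_break_even = (co2_diesel * eff_gas / eff_diesel - co2_gas) / (ch4_per_gj * GWP_CH4)
print(f"break-even leak rate ~ {leak_break_even:.1%}")  # ~1% with these assumptions
```

If system-wide leakage exceeds that kind of threshold, as the analysis argues is likely, the climate case for switching trucks to gas falls apart even though the case for replacing coal in power plants survives.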

“Fueling trucks and buses with natural gas may help local air quality and reduce oil imports, but it is not likely to reduce greenhouse gas emissions. Even running passenger cars on natural gas instead of gasoline is probably on the borderline in terms of climate,” Brandt said.

The natural gas industry, the analysis finds, must clean up its leaks to really deliver on its promise of less harm. Fortunately for gas companies, a few leaks in the gas system probably account for much of the problem and could be repaired. One earlier study examined about 75,000 components at processing plants. It found some 1,600 unintentional leaks, but just 50 faulty components were behind 60 percent of the leaked gas.
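The proportions in that example are worth spelling out, because they show how concentrated the problem is:

```python
# Proportions from the component survey described above.
components_examined = 75_000
leaking_components  = 1_600
worst_offenders     = 50      # faulty components behind most of the leaked gas
share_of_leaked_gas = 0.60

print(f"{worst_offenders / components_examined:.3%} of all components surveyed")   # ~0.067%
print(f"{worst_offenders / leaking_components:.1%} of the leaking components")     # ~3.1%
print(f"were responsible for {share_of_leaked_gas:.0%} of the leaked gas")
```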

“Reducing easily avoidable methane leaks from the natural gas system is important for domestic energy security,” said Robert Harriss, a methane researcher at the Environmental Defense Fund and a co-author of the analysis. “As Americans, none of us should be content to stand idly by and let this important resource be wasted through fugitive emissions and unnecessary venting.”

One possible reason leaks in the gas industry have been underestimated is that emission rates for wells and processing plants were based on operators participating voluntarily. One EPA study asked 30 gas companies to cooperate, but only six allowed the EPA on site.

“It’s impossible to take direct measurements of emissions from sources without site access,” said Garvin Heath, a senior scientist with the National Renewable Energy Laboratory and a co-author of the new analysis. “But self-selection bias may be contributing to why inventories suggest emission levels that are systematically lower than what we sense in the atmosphere.”

The research was funded by the nonprofit organization Novim through a grant from the Cynthia and George Mitchell Foundation. “We asked Novim to examine 20 years of methane studies to explain the wide variation in existing estimates,” said Marilu Hastings, sustainability program director at the Cynthia and George Mitchell Foundation. “Hopefully this will help resolve the ongoing methane debate.”

Other co-authors of the Science study are Francis O’Sullivan of the MIT Energy Initiative; Gabrielle Pétron of the National Oceanic and Atmospheric Administration (NOAA) and the University of Colorado; Sarah M. Jordaan of the University of Calgary; Pieter Tans, NOAA; Jennifer Wilcox, Stanford; Avi Gopstein of the U.S. Department of State; Doug Arent of the National Renewable Energy Laboratory and the Joint Institute for Strategic Energy Analysis; Steven Wofsy of Harvard University; Nancy Brown of the Lawrence Berkeley National Laboratory; independent consultant Richard Bradley; and Galen Stucky and Douglas Eardley, both of the University of California-Santa Barbara. The views expressed in the study are those of the authors, and do not necessarily reflect those of the U.S. Department of State or the U.S. government.

The Health Crisis You Love: Radiation, Wireless Technology And The Digital Toxification Of America

In Uncategorized on February 13, 2014 at 8:05 pm
“As a multitude of hazardous wireless technologies are deployed in homes, schools and workplaces, government officials and industry representatives continue to insist on their safety despite growing evidence to the contrary. A major health crisis looms that is only hastened through the extensive deployment of “smart grid” technology.”

Oldspeak: In April 2012 the AAEM (American Academy of Environmental Medicine) issued a formal position paper on the health effects of RF and EMF exposure based on a literature review of the most recent research. The organization pointed to how government and industry arguments alleging the doubtful nature of the science on non-thermal effects of RF were not defensible in light of the newest studies. “Genetic damage, reproductive defects, cancer, neurological degeneration and nervous system dysfunction, immune system dysfunction, cognitive effects, protein and peptide damage, kidney damage, and developmental effects have all been reported in the peer-reviewed scientific literature…” “When you put the science together, we come to the irrefutable conclusion that there’s a major health crisis coming, probably already underway,” George Carlo cautions. “Not just cancer, but also learning disabilities, attention deficit disorder, autism, Alzheimer’s, Parkinson’s, and psychological and behavioral problems—all mediated by the same mechanism. That’s why we’re so worried. Time is running out.”

-James F. Tracy

“I’m just gonna go ahead and repost my comments from my post on this from 2011. Safe to assume that in the time since, environmental conditions have, as expected, gotten worse… ‘These are some of the documented deleterious effects of prolonged exposure to RF-EMF radiation. It is reasonable to assume there are many more we are unaware of. And we’re exposed to it CONSTANTLY. But hey, as long as Big Pharma gets to keep getting paid pumping our kids full of the ADHD meds they need because this radiation is making them go haywire and adversely affecting their memory and learning abilities, everything is fine; pay no attention to the men behind the curtain. Welcome to the largest human experiment EVER. And very few people are even aware of it. What’s most disturbing is that the non-partisan research is being ignored, in favor of obviously bought-and-paid-for-by-industry research… Silent Weapons For Quiet Wars surround us. We’ve come to see the technology that destroys us as indispensable. Ignorance Is Strength.’” -OSJ

Related Story

Radiation From Cell Phones & WiFi Networks Are Making People Sick — Are We All at Risk?

By James F. Tracy @ Global Research:

In October 2009 at Florida Power and Light’s (FPL) solar energy station President Barack Obama announced that $3.4 billion of the American Reinvestment and Recovery Act would be devoted to the country’s “smart energy grid” transition. Matching funds from the energy industry brought the total national Smart Grid investment to $8 billion. FPL was given $200 million of federal money to install 2.5 million “smart meters” on homes and businesses throughout the state.[1]

By now many residents in the United States and Canada have the smart meters installed on their dwellings. Each of these meters is equipped with an electronic cellular transmitter that uses powerful bursts of electromagnetic radiofrequency (RF) radiation to communicate with nearby meters that together form an interlocking network transferring detailed information on residents’ electrical usage back to the utility every few minutes or less. Such information can easily be used to determine individual patterns of behavior based on power consumption.

Smart grid technology is being sold to the public as a way to “empower” individual energy consumers by letting them access information on their energy usage so that they may eventually save money by programming “smart” (i.e., wireless-enabled) home appliances and equipment to coordinate with the smart meter and run when electrical rates are lowest. In other words, a broader plan behind smart grid technology involves a tiered rate system for electricity consumption, set by the utility, to which customers will have no choice but to conform.

Because of power companies’ stealth rollout of smart meters, a large majority of the public remains unaware of the dangers the devices pose to human health. This remains the case even though states such as Maine have adopted an “opt out” provision for their citizens. The devices have not been safety-tested by Underwriters Laboratories and thus lack the UL approval customary for most electronics.[2] Further, power customers are typically told by their utilities that the smart meter communicates with the power company only “a few times per day” to transmit information on individual household energy usage. However, when individuals obtained the necessary equipment to do their own testing, they found the meters emitting bursts of RF radiation throughout the home, far more intense than a cell phone call, every minute or less.[3]
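Taking the measured intervals at face value, the gap between the utilities’ “a few times per day” claim and what independent testers reported is easy to quantify (this treats each interval as if it held around the clock, which overstates the slower nighttime rate; see note [3]):

```python
# Convert the burst intervals reported by independent testers into bursts per day.
SECONDS_PER_DAY = 24 * 60 * 60

for label, interval_s in [("one burst every 90 seconds", 90),
                          ("one burst every 60 seconds", 60),
                          ("one burst every 30 seconds", 30)]:
    print(f"{label}: ~{SECONDS_PER_DAY // interval_s:,} bursts per day")

# ~960, ~1,440 and ~2,880 bursts per day respectively, versus the
# "a few times per day" figure customers are typically given.
```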

America’s Telecom-friendly Policy for RF Exposure
A growing body of medical studies now links cumulative RF exposure to DNA disruption, cancer, birth defects, miscarriages, and autoimmune diseases. Smart meters significantly add to an environment already polluted by RF radiation through the pervasive stationing of cellular telephone towers in or around public spaces and consumers’ habitual use of wireless technologies. In the 2000 Salzburg Resolution, European scientists recommended that the maximum RF exposure for humans be no more than one-tenth of a microwatt per square centimeter. In the United States, RF exposure limits are 1,000 microwatts per square centimeter, with no limits for long-term exposure.[4] Such lax standards have been shaped by outdated science and by the legal and regulatory maneuvering of the powerful telecommunications and wireless industries.
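The gap between the two figures cited above is stark when put side by side:

```python
# Comparing the two exposure figures mentioned in the text
# (units: microwatts per square centimeter).
salzburg_recommendation = 0.1     # 2000 Salzburg Resolution recommendation
us_public_limit         = 1000.0  # US limit cited above

print(f"US limit / Salzburg recommendation = {us_public_limit / salzburg_recommendation:,.0f}x")
# -> 10,000x
```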

The Environmental Protection Agency (EPA) ceased studying the health effects of radiofrequency radiation when the Senate Appropriations Committee cut the department’s funding and forbade it from further research into the area.[5] Thereafter RF limits were codified as mere “guidelines” based on the EPA’s tentative findings and are to this day administered by the Federal Communications Commission (FCC).

These weakly enforced standards are predicated on the alleged “thermal effect” of RF. In other words, if the energy emitted from a wireless antenna or device is not powerful enough to heat the skin or flesh then no danger is posed to human health.[6] This reasoning is routinely put forward by utilities installing smart meters on residences, telecom companies locating cellular transmission towers in populated areas, and now school districts across the US allowing the installation of cell towers on school campuses.[7]

The FCC’s authority to impose this standard was further reinforced with the passage of the 1996 Telecommunications Act that included a provision lobbied for by the telecom industry preventing state and local governments from evaluating potential environmental and health effects when locating cell towers “so long as ‘such facilities comply with the FCC’s regulations concerning such emissions.’”[8]

In 2001 an alliance of scientists and engineers with the backing of the Communications Workers of America filed a federal lawsuit hoping the Supreme Court would reconsider the FCC’s obsolete exposure guidelines and the Telecom Act’s overreach into state and local jurisdiction. The high court refused to hear the case. When the same group asked the FCC to reexamine its guidelines in light of current scientific studies the request was rebuffed.[9] Today in all probability millions are suffering from a variety of immediate and long-term health effects from relentless EMF and RF exposure that under the thermal effect rationale remain unrecognized or discounted by the telecom industry and regulatory authorities alike.

Growing Evidence of Health Risks From RF Exposure
The main health concern with electromagnetic radiation emitted by smart meters and other wireless technologies is that EMF and RF cause a breakdown in the communication between cells in the body, interrupting DNA repair and weakening tissue and organ function. These are the findings of Dr. George Carlo, who oversaw a comprehensive research group commissioned by the cell phone industry in the mid-1990s.

When Carlo’s research began to reveal how there were indeed serious health concerns with wireless technology, the industry sought to bury the results and discredit Carlo. Yet Carlo’s research has since been upheld in a wealth of subsequent studies and has continuing relevance given the ubiquity of wireless apparatuses and the even more powerful smart meters. “One thing all these conditions have in common is a disruption, to varying degrees, of intercellular communication,” Carlo observes. “When we were growing up, TV antennas were on top of our houses and such waves were up in the sky. Cell phones and Wi-Fi have brought those things down to the street, integrated them into the environment, and that’s absolutely new.”[10]

In 2007 the BioInitiative Working Group, a worldwide body of scientists and public health experts, released a 650-page document with over 2000 studies linking RF and EMF exposure to cancer, Alzheimer’s disease, DNA damage, immune system dysfunction, cellular damage and tissue reduction.[11]

In May 2011 the World Health Organization’s International Agency for Research on Cancer categorized “radiofrequency electromagnetic fields as possibly carcinogenic to humans based on an increased risk for glioma, a malignant type of brain cancer, associated with wireless cellphone use.”[12]

In November 2011 the Board of the American Academy of Environmental Medicine (AAEM), a national organization of medical and osteopathic physicians, called on California’s Public Utilities Commission to issue a moratorium on the continued installation of smart meters in residences and schools “based on a scientific assessment of the current available literature.” “[E]xisting FCC guidelines for RF safety that have been used to justify installations of smart meters,” the panel wrote,

“only look at thermal tissue damage and are obsolete, since many modern studies show metabolic and genomic damage from RF and ELF exposure below the level of intensity which heats tissues … More modern literature shows medically and biologically significant effects of RF and ELF at lower energy densities. These effects accumulate over time, which is an important consideration given the chronic nature of exposure from ‘smart meters.’”[13]

In April 2012 the AAEM issued a formal position paper on the health effects of RF and EMF exposure based on a literature review of the most recent research. The organization pointed to how government and industry arguments alleging the doubtful nature of the science on non-thermal effects of RF were not defensible in light of the newest studies. “Genetic damage, reproductive defects, cancer, neurological degeneration and nervous system dysfunction, immune system dysfunction, cognitive effects, protein and peptide damage, kidney damage, and developmental effects have all been reported in the peer‐reviewed scientific literature,” AAEM concluded.[14]

Radiating Children
The rollout of smart meters proceeds alongside increased installation of wireless technology and cell phone towers in and around schools in the US. In 2010 Professor Magda Havas conducted a study of schools in the 50 US state capitals and Washington DC to determine students’ potential exposure to nearby cell towers. A total of 6,140 schools serving 2.3 million students were surveyed using the antennasearch.com database. Of these, 13% of the schools, serving 299,000 students, have a cell tower within a quarter mile of school grounds, and another 50% of the schools, where 1,145,000 students attend, have a tower within a 0.6-mile radius. The installation of wireless networks and now smart meters on and around school properties further increases children’s RF exposure.[15]
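Putting the survey figures side by side (numbers as reported in the study cited above) shows how many schools and students are involved:

```python
# Figures from the Havas BRAG school survey as cited in the text.
total_schools  = 6_140
total_students = 2_300_000

cohorts = [
    ("cell tower within 1/4 mile", 0.13, 299_000),
    ("cell tower within 0.6 mile", 0.50, 1_145_000),
]

for label, school_share, students in cohorts:
    print(f"{label}: ~{school_share * total_schools:,.0f} schools, "
          f"{students:,} students ({students / total_students:.0%} of those surveyed)")
```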

Many school districts that are strapped for cash in the face of state budget cuts are willing to ignore the abundance of scientific research on RF dangers and sign on with telecom companies to situate cell towers directly on school premises. Again, the FCC’s thermal effect rule is invoked to justify tower placement together with a disregard of the available studies.

The School District of Palm Beach County, the eleventh largest school district in the US, provides one such example. Ten of its campuses already have cell towers on their grounds while the district ponders lifting a ban established in 1997 that would allow for the positioning of even more towers. When concerned parents contacted the school district for an explanation of its wireless policies, the administration assembled a document, “Health Organization Information and Academic Research Studies Regarding the Health Effects of Cell Tower Signals.” The report carefully selected pronouncements from telecom industry funded organizations such as the American Cancer Society and out-of-date scientific studies supporting the FCC’s stance on wireless while excluding the long list of studies and literature reviews pointing to the dangers of RF and EMF radiation emitted by wireless networks and cell towers. [16]

The Precautionary Principle / Conclusion
Surrounded by the sizable and growing body of scientific literature pointing to the obvious dangers of wireless technology, utility companies installing smart meters on millions of homes across the US  and school officials who accommodate cell towers on their grounds are performing an extreme disservice to their often vulnerable constituencies. Indeed, such actions constitute the reckless long term endangerment of public health for short term gain, sharply contrasting with more judicious decision making.

The 1992 Rio Declaration on Environment & Development adopted the precautionary principle as a rule to follow in the situation utilities and school districts find themselves in today: “Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”[17] In exercising the precautionary principle, public governance and regulatory bodies should “take preventive action in the face of scientific uncertainty to prevent harm. The focus is no longer on measuring or managing harm, but preventing harm.”[18]

Along these lines, the European Union and the Los Angeles School District have prohibited cell phone towers on school grounds until the scientific research on the human health effects of RF are conclusive. The International Association of Fire Fighters also interdicted cell towers on fire stations pending “’a study with the highest scientific merit and integrity on health effects of exposure to low-intensity [radio frequency/microwave] radiation is conducted and it is proven that such sitings are not hazardous to the health of our members.’”[19]

Unwitting families with smart meters on their homes and children with cell towers humming outside their classrooms suggest the extent to which the energy, telecom and wireless industries have manipulated the regulatory process to greatly privilege profits over public health. Moreover, it reveals how the population suffers for want of meaningful and conclusive information on the very real dangers of RF while the telecom and wireless interests successfully cajole the media into considering one scientific study at a time.

“When you put the science together, we come to the irrefutable conclusion that there’s a major health crisis coming, probably already underway,” George Carlo cautions. “Not just cancer, but also learning disabilities, attention deficit disorder, autism, Alzheimer’s, Parkinson’s, and psychological and behavioral problems—all mediated by the same mechanism. That’s why we’re so worried. Time is running out.”[20]

Notes

[1] Energy.gov, “President Obama Announces $3.4 Billion Investment to Spur Transition to Smart Energy Grid,” October 27, 2009,
http://energy.gov/articles/president-obama-announces-34-billion-investment-spur-transition-smart-energy-grid

[2] Ilya Sandra Perlingieri, “Radiofrequency Radiation: The Invisible Hazards of Smart Meters,” August 19, 2011, GlobalResearch.ca, http://www.globalresearch.ca/index.php?context=va&aid=26082

[3] Dr. Bill Deagle, “Smart Meters: A Call for Public Outrage,” Rense.com, August 30, 2011, http://www.rense.com/general94/smartt.htm. Some meters installed in California by Pacific Gas and Electric carry a “’switching mode power-supply’ that ‘emit sharp spikes of millisecond bursts’ around the clock and is a chief cause of ‘dirty electricity.’” See Perlingieri, “Radiofrequency Radiation: The Invisible Hazards of Smart Meters.” This author similarly measured bursts of radiation in excess of 2,000 microwatts per meter every 30 to 90 seconds during the day, and once every two-to-three minutes at night.

[4] Magda Havas, BRAG Antenna Ranking of Schools, 2010,
http://electromagnetichealth.org/wp-content/uploads/2010/04/BRAG_Schools.pdf

[5] Susan Luzzaro, “Field of Cell Phone Tower Beams,” San Diego Reader, May 18, 2011,
http://www.sandiegoreader.com/news/2011/may/18/citylights2-cell-phone-tower/?page=1&

[6] FCC Office of Engineering and Technology, http://www.fcc.gov/oet/rfsafety

[7] Luzzaro, “Field of Cell Phone Tower Beams”; Marc Freeman, “Cell Towers Could Be Coming to More Schools,” South Florida Sun Sentinel, January 5, 2012,
http://articles.sun-sentinel.com/2012-01-05/news/fl-cell-towers-schools-palm-20120105_1_cell-towers-cellular-phone-towers-stealth-towers

[8] Amy Worthington, “The Radiation Poisoning of America,” GlobalResearch.ca, October 9, 2007, http://www.globalresearch.ca/index.php?context=va&aid=7025

[9] Worthington, “The Radiation Poisoning of America.”

[10] Sue Kovach, “The Hidden Dangers of Cell Phone Radiation,” Life Extension Magazine, August 2007, http://www.lef.org/magazine/mag2007/aug2007_report_cellphone_radiation_01.htm

[11] Susan Luzzaro, “Field of Cell Phone Tower Beams”; Bioinitiative Report: A Rationale For a Biologically-based Public Exposure Standard For Electromagnetic Fields, http://www.bioinitiative.org/freeaccess/report/index.htm.

[12] World Health Organization International Agency for Research on Cancer, “IARC Classifies Radiofrequency Electromagnetic Fields as Possibly Carcinogenic,” May 31, 2011, www.iarc.fr/en/media-centre/pr/2011/pdfs/pr208_E.pdf; Joseph Mercola, “Be Aware: These Cell Phones Can Emit 28 Times More Radiation,” Mercola.com, June 18, 2011,
http://articles.mercola.com/sites/articles/archive/2011/06/18/finally-experts-admit-cellphones-are-a-carcinogen.aspx.

[13] American Academy of Environmental Medicine, “Proposed Decision of Commissioner Peevy [Mailed 11/22/2011] Before the Public Utilities Commission of the State of California,” January 19, 2012. www.aaemonline.org

[14] American Academy of Environmental Medicine, “The American Academy of Environmental Medicine Calls for Immediate Caution regarding Smart Meter Installation,” April 12, 2012, http://www.aaemonline.org/

[15] Havas, BRAG Antenna Ranking of Schools, 31-38.

[16] Donna Goldstein, “Health Organization Information and Academic Research Studies Regarding the Health Effects of Cell Tower Signals,” Planning and Real Estate Development, Palm Beach County School District, January 30, 2012.

[17] Havas, BRAG Antenna Ranking of Schools, 17.

[18] Multinational Monitor, “Precautionary Precepts: The Power and Potential of the Precautionary Principle: An Interview with Carolyn Raffensperger,” September 2004, http://multinationalmonitor.org/mm2004/09012004/september04interviewraffen.html.

[19] Luzzaro, “Field of Cell Phone Tower Beams.”

[20] Kovach, “The Hidden Dangers of Cell Phone Radiation.”

James F. Tracy is Associate Professor of Media Studies at Florida Atlantic University. He is an affiliate of Project Censored and blogs at memorygap.org.