
 Politics | The Exxon Free Environmental Thread


To: freelyhovering who wrote (9791) 3/14/2012 10:05:56 AM
From: Wharf Rat
   of 21676
 
Saw it. I'll give Palin credit: she was dumber than I had imagined.


From: Wharf Rat, 3/14/2012 10:15:53 AM
Roy Spencer's Bad Economics
Posted on 15 March 2012 by dana1981

While they can't seem to agree about what is causing global warming (except that it must somehow mainly be something other than human CO2 emissions), the few climate scientist "skeptics" do all seem to agree on one issue - that somehow reducing greenhouse gas emissions will harm the economy. Why these climate scientists consider themselves economics experts is a mystery, but as we have frequently detailed, climate economic studies consistently show that CO2 limits will actually save money. Yale economist William Nordhaus, who is quite conservative in his estimates regarding future climate change costs (also see this explanation of Nordhaus' optimism by Joe Romm), recently made this point emphatically:

"...the cost of waiting fifty years to begin reducing CO2 emissions is $2.3 trillion in 2005 prices. If we bring that number to today’s economy and prices, the loss from waiting is $4.1 trillion....The claim that cap-and-trade legislation or carbon taxes would be ruinous or disastrous to our societies does not stand up to serious economic analysis."

Even economists who are conservative and relatively optimistic about the potential impacts of climate change agree that limiting CO2 emissions will save trillions of dollars. Yet certain individuals (who lack economics expertise) are convinced that taking these measures will somehow cripple the economy.

No climate scientist "skeptic" embodies this trait of bad economic arguments more than Roy Spencer, who has gone as far as to publish a book about free market economics. Ironically, the proposed solutions to climate change which Spencer opposes are free market concepts, originated by the Republican Party. As we will see in this post, Spencer has a very incorrect view on a number of climate-related economic issues.

Particulate Pollution

In a recent post on his blog, Spencer criticized the US Environmental Protection Agency (EPA) regulation of fine particulate matter. This is a poor choice of regulation to criticize from an economic standpoint. The Office of Management and Budget (OMB) estimates the costs and benefits of US federal regulations every year. In 2011, the OMB report concluded as follows (emphasis added).

"the rules with the highest benefits and the highest costs, by far, come from the Environmental Protection Agency and in particular its Office of Air. More specifically, EPA rules account for 62 to 84 percent of the monetized benefits and 46 to 53 percent of the monetized costs. The rules that aim to improve air quality account for 95 to 97 percent of the benefits of EPA rules.

It is important to emphasize that the large estimated benefits of EPA rules are mostly attributable to the reduction in public exposure to a single air pollutant: fine particulate matter. Of its 20 air rules, the rule with the highest estimated benefits is the Clean Air Fine Particle Implementation Rule, with benefits ranging from $19 billion to $167 billion per year."

In the blog post, Spencer also shows Figure 1 below, claiming

"You will note that the most “polluted” air occurs where almost no one is around to pollute: in the deserts. This is because wind blowing over bare soil causes dust particles."



Figure 1: fine particulate measurements

The people in China and India would undoubtedly object to Spencer's claim that the most polluted air only occurs in deserts. The Chinese government is taking steps to better monitor and reduce fine particulates because of their detrimental health effects.

Additionally, one of the biggest pollution threats in developing regions like Africa is indoor combustion from biomass fires without proper ventilation, which may be responsible for 800,000 to 2.4 million premature deaths each year.

Spencer's reason for discounting pollution in India and China and biomass burning in Africa is to support this argument:

"...the most “polluted” air occurs where almost no one is around to pollute: in the deserts...If you really are worried about fine particulate air pollution, do not go outside on a windy day."

However, most dust and sand particulates are larger than the fine particulates addressed by EPA regulations. Moreover, the USA is not covered in deserts, and thus Spencer is comparing apples (coarse particulates from desert sand and dust) and oranges (the fine combustion particulates the EPA actually regulates).

Particulates from combustion are typically much smaller than sand or dust, remaining suspended in the atmosphere and penetrating deep into the lungs. They can also be coated with carcinogenic compounds, which explains their much greater adverse health effects and why the EPA regulates their emissions in order to protect public health and welfare. To equate larger dust particulates with smaller combustion particulates reveals a lack of understanding of the health threats Spencer is dismissing.

Spencer's Tragedy

Spencer finishes the particulates blog post by trying to connect the dots to climate mitigation efforts.

"And I haven’t even mentioned carbon dioxide regulations. Even if we could substantially reduce U.S. CO2 emissions in the next 20 years, which barring some new technology is virtually impossible, the resulting (theoretically-computed) impact on U.S. or global temperatures would be unmeasurable….hundredths of a degree C at best.

The cost in terms of human suffering, however, will be immense."

The main error in this argument is the Tragedy of the Commons - a dilemma arising from the situation in which multiple individuals, acting independently and rationally consulting their own self-interest, will ultimately deplete a shared limited resource even when it is clear that it is not in anyone's long-term interest for this to happen.

There is a nugget of truth to Spencer's argument, but if every nation were to make it (and if the USA, one of the world's largest emitters, can make it, then every nation can), then nobody would reduce their CO2 emissions. However, the optimal result involves every nation reducing emissions, which is why there are international climate conferences like Kyoto and Copenhagen that try to achieve agreements for all nations to reduce emissions. If everyone thought like Roy Spencer, this optimal result would become impossible to achieve.
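The incentive structure behind the Tragedy of the Commons can be sketched with a toy payoff model. All numbers below are hypothetical, chosen only so that defecting is individually rational while universal abatement is collectively best:

```python
# Illustrative Tragedy of the Commons sketch (hypothetical numbers).
N = 10            # nations
BENEFIT = 3.0     # climate benefit each nation receives per abating nation
COST = 5.0        # private cost a nation pays if it abates itself

def payoff(i_abates, others_abating):
    """Net payoff to one nation given its choice and the others' choices."""
    abaters = others_abating + (1 if i_abates else 0)
    return BENEFIT * abaters - (COST if i_abates else 0.0)

# Whatever the others do, defecting pays more for the individual nation...
for k in range(N):
    assert payoff(False, k) > payoff(True, k)

# ...yet universal abatement beats universal defection for everyone.
print(payoff(True, N - 1), payoff(False, 0))  # 25.0 0.0
```

Any choice with BENEFIT < COST but BENEFIT * N > COST reproduces the dilemma: each nation prefers to free-ride, yet all are better off if all abate.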

Moreover, the USA is one of the world's largest per capita CO2 emitters, and the largest historical CO2 emitter. We should be leading the way in reducing CO2 emissions, not making excuses that our emissions are too small to matter. Spencer's argument is simply irresponsible.

And of course, Spencer's argument that CO2 limits will result in 'immense human suffering' is entirely without basis. CO2 limits will help both the economy and the poor. In fact, one of the worst climate ironies is that the poorest nations, which contribute the least to the problem, will tend to be the most impacted by human-caused climate change (Figure 2).




Figure 2: Per capita emissions vs. vulnerability to climate change, from Samson et al. (2011)

Thus Spencer has it exactly backwards - if we follow his advice and fail to reduce CO2 emissions, that is the scenario in which human suffering will be maximized.

Spencer's Ill-Conceived Motivation

Spencer of course sees things differently, and does not intend to harm the poor. In a recent interview, Spencer discussed his motivations for speaking out on climate and economics issues (emphasis his):

"[the journalist] provided several paragraphs alluding to why scientists on the other side of the issue speak out, but nowhere could I find reasons why WE speak out.

I had told her that ill-conceived energy policies that hurt economic growth kill poor people."

This is consistent with Spencer's previous statement that

"...my job has helped save our economy from the economic ravages of out-of-control environmental extremism.

I view my job a little like a legislator, supported by the taxpayer, to protect the interests of the taxpayer and to minimize the role of government."

In short, Spencer thinks government (especially environmental) regulation harms the economy, which he believes in turn kills poor people. Of course, we have already seen that Spencer's views could not be further from the truth, as EPA particulate regulations are saving tens to hundreds of billions of dollars per year in the USA, and climate economics experts agree that reducing CO2 emissions will similarly save money. Moreover, exactly which "ill-conceived energy policies" does Spencer refer to? Most climate mitigation policies center on putting a price on carbon emissions, which does not hurt economic growth. For example, aggressive installation of solar panels has lowered electricity prices in Germany, and renewable energy standards have no statistically significant impact on electricity rates.

Spencer's misunderstanding of climate economics is based on his anti-government views, as he further illustrates in his pollution-defending blog post:

"government jobs programs...only create special interest jobs at the expense of more useful (to the consumer) private sector jobs."

This concept that the government cannot create jobs because any government jobs will "crowd out" private sector jobs is fairly common among political conservatives, but simply has no basis in our current reality. Note that fellow climate "skeptic" Bob Carter made a similar argument.

In reality, there are only a few circumstances in which this "crowding out" argument holds true: generally when the economy's resources are being fully utilized, which is rarely the case. It is certainly not true in today's stagnant economic conditions, when private investment and growth are low. Under these conditions, public investment provides jobs to the unemployed without "crowding out" private investment.

In fact, reality is disproving Spencer's "crowding out" argument at this very moment. If Spencer were correct, then cutting government spending (a.k.a. "austerity") would lead to private sector job growth and decreased unemployment. On the contrary, in countries currently practicing austerity (such as Ireland, Spain, Portugal, Greece, and the UK), unemployment is rising.

Demonizing James Hansen

Spencer has also claimed that

"James Hansen...actively campaign[s] for Malthusian energy policy changes"

Malthusianism generally refers to the concern that human population growth and resource depletion are unsustainable and will eventually lead to an ecological collapse or other catastrophe. While this may be an accurate description of Hansen's concerns (that human fossil fuel combustion will lead to catastrophic climate change if it continues unabated), Hansen's suggested energy policy is in no way Malthusian.

A Malthusian energy policy would involve limiting the human population, or rationing consumption for the sake of sustaining energy supplies. On the contrary, Hansen thinks we have more fossil fuel resources than we can afford to burn (i.e. Kharecha and Hansen 2008). The energy policy changes Hansen advocates for are quite straightforward - transition away from our reliance on fossil fuels by taxing carbon emissions, and return 100% of the taxed funds to the public through a dividend. This is a free market solution, and thus a proposal that Spencer, as a free market proponent, should support, rather than attaching negative labels to it. Hansen's straightforward explanation of this simple market-based approach is well worth viewing towards the end of the video in the link above.

Simple Economics Misunderstood

The economics of carbon pricing are actually quite simple. Even climate scientist "skeptics" like Spencer agree that human CO2 emissions are causing some climate change. There is a price tag associated with that climate change, just like there is a price tag associated with the detrimental health effects of particulate pollution.

When that cost is not reflected in the market price of the products which result in those emissions - as is currently the case with fossil fuels and CO2 - this is called a negative economic "externality," which economists consider a market failure. The problem is that when these externalities are not reflected in market prices, consumers are unable to factor them into their decision making. For example, putting a price on particulate emissions reflects their true cost to public health, encourages consumers to buy less of the products that generate those emissions and to demand lower-emission alternatives, which drives technological innovation and thus leads to lower overall emissions.
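A minimal numerical sketch of this externality-pricing logic, using a toy linear demand curve and made-up prices (none of these figures come from the article):

```python
# Internalizing a negative externality: hypothetical good with a $50
# market price and a $20/unit external (health/climate) cost.
PRICE = 50.0
EXTERNAL_COST = 20.0   # damage per unit not reflected in the price

def demand(price, a=1000.0, b=10.0):
    """Toy linear demand curve (assumed purely for illustration)."""
    return max(a - b * price, 0.0)

q_before = demand(PRICE)                  # externality ignored
q_after = demand(PRICE + EXTERNAL_COST)   # externality priced in

# Once the true social cost is in the price, consumption falls,
# shifting demand toward lower-emission alternatives.
print(q_before, q_after)  # 500.0 300.0
```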

The same principle would hold true for greenhouse gases. Prior to putting a price on particulate emissions, industries claimed the costs would be immense and damage the economy, as "skeptics" now claim about CO2 limits. In fact, industries and researchers have consistently overestimated the costs of environmental regulations. As we saw above, the costs of reducing particulate emissions have been much smaller than claimed, and have been far exceeded by the economic benefits. In fact, two of the largest externalities associated with coal combustion come from air pollution and climate change (Figure 3).



Figure 3: Average US coal electricity price vs. MMN11 and Epstein 2011 best estimate coal external costs.

That is not to say the costs will be small - the OMB found that environmental regulations have been among the most costly government regulations, but they have also resulted in the largest savings, and the largest net benefit to the economy of any government regulations. There is no reason to expect CO2 limits to cripple the economy, especially since economic studies conclude they will save money.

Spencer appears to think otherwise because he believes that all government and environmental regulations will harm the economy. However, Spencer's belief is factually wrong, and he should leave economics to the economists (Figure 4).





Figure 4: New York University survey results of economists with climate expertise when asked under what circumstances the USA should reduce its emissions

skepticalscience.com


To: Wharf Rat who wrote (9784) 3/14/2012 4:19:40 PM
From: John Vosilla


As the planet warms from greenhouse pollution, the Washington DC Cherry Blossom Festival is beginning earlier and earlier. This year, the single-flowered Yoshino cherries and double-flowered Kwanzan cherries may peak at their earliest yet. The Yoshinos may come to peak bloom even before the current record of 2000, when they peaked on March 17, the Washington Post’s Jason Samenow writes:

The May-like warmth forecast over the next week promises to give the cherry blossoms a big shot of adrenaline, bringing them to peak bloom considerably earlier than normal (which is around April 1). With the big temperature spike ahead, the peak bloom date could come close to the earliest on record of March 17, 2000.

This early bloom is no aberration — it’s part of a long-term trend of earlier blooming. The “normal” is moving with the warming of the earth. The National Park Service’s Robert DeFeo has records of the peak bloom dates of Washington DC’s heralded cherry trees since 1921. As this chart prepared by ThinkProgress Green shows, the average blooming time for the trees has moved about 10 days earlier in the last 90 years:

thinkprogress.org



To: John Vosilla who wrote (9794) 3/14/2012 8:02:49 PM
From: Wharf Rat
Message 27994464

:>)


To: Wharf Rat who wrote (9795) 3/14/2012 8:04:59 PM
From: Wharf Rat

Mar 6, 2012
February ice extent low in the Barents Sea, high in the Bering Sea
Analysis
As in January, sea ice extent in February was low on the Atlantic side of the Arctic, but unusually high on the Pacific side of the Arctic, remaining lower than average overall. At the end of the month, ice extent rose sharply, as winds changed and started spreading out the ice cover.

Sea ice extent in late winter can go up and down very quickly, getting pushed together or dispersed by strong winds. Ice extent usually reaches its annual maximum sometime in late February or March, but the exact date varies widely from year to year.


Arctic sea ice extent for February 2012 was 14.56 million square kilometers (5.62 million square miles). The magenta line shows the 1979 to 2000 median extent for that month. The black cross indicates the geographic North Pole. Sea Ice Index data.

Credit: National Snow and Ice Data Center


Overview of conditions
Arctic sea ice extent in February 2012 averaged 14.56 million square kilometers (5.62 million square miles). This is the fifth-lowest February ice extent in the 1979 to 2012 satellite data record, 1.06 million square kilometers (409,000 square miles) below the 1979 to 2000 average extent.

Continuing the pattern established in January, conditions differed greatly between the Atlantic and Pacific sides of the Arctic. On the Atlantic side, especially in the Barents Sea, air temperatures were higher than average and ice extent was unusually low. February ice extent for the Barents Sea was the lowest in the satellite record. Air temperatures over the Laptev, Kara and Barents seas ranged from 4 to 8 degrees Celsius (7 to 14 degrees Fahrenheit) above average at the 925 hectopascal (hPa) level (about 3,000 feet above sea level). In contrast, on the Pacific side, February ice extent in the Bering Sea was the second highest in the satellite record, paired with air temperatures that were 3 to 5 degrees Celsius (5 to 9 degrees Fahrenheit) below average at the 925 hPa level.



The graph above shows daily Arctic sea ice extent as of March 5, 2012, along with the ice extents for the previous four years. 2011 is shown in light blue, 2010 in pink, 2009 in dark blue, 2008 in purple, and 2007, the year with the record low minimum, in dashed green. The gray area around the average line shows the two standard deviation range of the data. Sea Ice Index data.

Credit: National Snow and Ice Data Center


Conditions in context

Overall, the Arctic gained 956,000 square kilometers (369,000 square miles) of ice during the month. This was 486,000 square kilometers (188,000 square miles) more than the average ice growth for February 1979 to 2000. The overall low ice extent for the month stemmed mostly from the low ice extent in the Barents Sea: the extensive ice in the Bering Sea was not enough to compensate. On average, the Barents Sea has 865,000 square kilometers (334,000 square miles) of ice for the month of February. This year there were only 401,000 square kilometers (155,000 square miles) of ice in that region, the lowest recorded in the satellite data record.
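The figures quoted above are internally consistent, as a quick arithmetic check confirms (all values in square kilometers, taken directly from the text):

```python
# Cross-checking the NSIDC numbers quoted in this section.
gain_2012 = 956_000          # ice gained during February 2012
gain_excess = 486_000        # amount above the 1979-2000 average gain
avg_gain_1979_2000 = gain_2012 - gain_excess

barents_avg = 865_000        # average February Barents Sea ice
barents_2012 = 401_000       # February 2012 Barents Sea ice (record low)
barents_deficit = barents_avg - barents_2012

# The Barents deficit alone is comparable to the month's overall anomaly.
print(avg_gain_1979_2000, barents_deficit)  # 470000 464000
```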

At the end of February, ice extent rose sharply. Data from the NSIDC Multisensor Analyzed Sea Ice Extent (MASIE) showed that the rise came mainly from the Bering Sea and Baffin Bay. In the Bering Sea and Baffin Bay, winds pushed the ice extent southward. Ice growth in the Kara Sea also contributed to the rise in ice extent. In the Kara Sea, westerly winds that had been keeping the area ice-free shifted, allowing the open water areas to freeze over. During late winter, ice extent can change quickly as winds push extensive ice cover together, or spread out ice floes over a greater area.



Monthly February ice extent for 1979 to 2012 shows a decline of 3.0% per decade.

Credit: National Snow and Ice Data Center


February 2012 compared to past years
Arctic sea ice extent for February 2012 was the fifth lowest in the satellite record. Including the year 2012, the linear rate of decline for February ice extent over the satellite record is 3.0% per decade. Through 2003, average February ice extent had never fallen below 15 million square kilometers (5.79 million square miles); in eight of the nine years since 2003, it has stayed below that mark.



This photograph of sea ice near Greenland was taken on March 18, 2011 from the NASA P3 aircraft. The IceBridge mission is collecting data on ice thickness, an important measure of the health of sea ice.

Credit: NASA/ATM automatic Cambot system


IceBridge thickness data
Measuring ice thickness is critical to assessing the overall health of Arctic sea ice. The passive microwave data that NSIDC presents here provide only ice extent, a two-dimensional measure of ice cover. But ice can vary in thickness from a few centimeters to several meters, and scientists want to know if the ice pack is thinning overall as well as declining in extent. A new study by NASA scientist Ron Kwok compared ice thickness data collected by airplanes during the ongoing Operation IceBridge with thickness data from the NASA Ice, Cloud and Land Elevation Satellite (ICESat), which ended its mission in 2009. IceBridge is an airborne data-collection mission that started in 2009, in order to bridge the data gap between the first ICESat and ICESat-2, which is scheduled to launch in 2016.

Kwok found good agreement between simultaneous IceBridge and ICESat freeboard measurements made in 2009. Freeboard is the elevation of sea ice above the ocean surface, and provides a measure of ice thickness. These results show that IceBridge measurements will be able to bridge the gap between the ICESat and ICESat-2 satellite missions and add to other ice thickness data from the European Space Agency (ESA) CryoSat-2. Satellite measurements of ice thickness provide a third dimension of information on the changing sea ice cover, helping scientists to more accurately assess the amount of sea ice in the Arctic.
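The article does not spell out the conversion itself, but the standard way to turn a freeboard measurement into an ice thickness estimate is the hydrostatic (isostatic) equilibrium relation. Below is a sketch using typical textbook densities, not values from the Kwok study, and ignoring snow load for simplicity:

```python
# Hydrostatic conversion from freeboard to total ice thickness.
# Densities are typical textbook values (assumptions, not study data).
RHO_WATER = 1024.0  # kg/m^3, sea water
RHO_ICE = 917.0     # kg/m^3, sea ice

def thickness_from_freeboard(freeboard_m):
    """Floating equilibrium: rho_ice * H = rho_water * (H - freeboard)."""
    return freeboard_m * RHO_WATER / (RHO_WATER - RHO_ICE)

# A 30 cm freeboard implies roughly 2.9 m of total ice thickness,
# which is why small freeboard errors matter so much.
print(round(thickness_from_freeboard(0.30), 2))  # 2.87
```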

Data collected by the IceBridge mission are archived and distributed by the NSIDC IceBridge Data program.



These images show the general effects of the positive phase (left) and negative phase (right) of the NAO. Red dots show the location of harp seal breeding grounds.

Credit: Johnston et al., 2012


Regional ice conditions and harp seals
Many animals rely on sea ice as part of their habitat. Harp seals, for example, give birth to and care for their young on floes of sea ice. Recent research by David Johnston and colleagues at Duke University showed that harp seals in the northwest Atlantic have higher mortality rates during years when the North Atlantic Oscillation (NAO) is in its negative phase, a pattern that favors low ice cover in the Labrador Sea and Gulf of St. Lawrence, where harp seals breed.

This winter, the NAO has mostly been in a positive phase and ice conditions in the Labrador Sea and Gulf of St. Lawrence have been at near-normal levels. However, in recent years, ice conditions in the region have been very low. The study showed a longer-term decline in sea ice cover of up to 6% per decade across all North Atlantic harp seal breeding grounds since 1979. While harp seals are well-suited to deal with natural short-term shifts in ice conditions, they may not be able to adapt to the combined effects of both short-term variability and long-term climate change.


Further reading
Friedlaender, A.S., D.W. Johnston, and P.N. Halpin. 2010. Effects of the North Atlantic Oscillation on sea ice breeding habitats of harp seals (Pagophilus groenlandicus) across the North Atlantic. Progress in Oceanography 86: 261-266.

Johnston, D.W., M.T. Bowers, A.S. Friedlaender, and D.M. Lavigne. 2012. The Effects of Climate Change on Harp Seals (Pagophilus groenlandicus). PLoS ONE 7(1): e29158. doi:10.1371/journal.pone.0029158.

Kwok, R., G.F. Cunningham, S.S. Manizade, and W.B. Krabill. 2012. Arctic sea ice freeboard from IceBridge acquisitions in 2009: Estimates and comparisons with ICESat. Journal of Geophysical Research 117: C02018. doi:10.1029/2011JC007654.
nsidc.org


To: Wharf Rat who wrote (9796) 3/14/2012 8:07:58 PM
From: Wharf Rat
False Balance Lives At The New York Times

By Joe Romm on Mar 14, 2012 at 4:58 pm




One of the country’s best climate reporters proves once again that false balance is alive and well at even the best papers.

The article in question is “Rising Sea Levels Seen as Threat to Coastal U.S.” by Justin Gillis. It’s on the new Climate Central report whose news release we reposted earlier today. As Gillis explains:

About 3.7 million Americans live within a few feet of high tide and risk being hit by more frequent coastal flooding in coming decades because of the sea level rise caused by global warming, according to new research.

If the pace of the rise accelerates as much as expected, researchers found, coastal flooding at levels that were once exceedingly rare could become an every-few-years occurrence by the middle of this century.

This isn’t terribly controversial among climatologists I talk to, though this report appears to be the first to add storm surges to warming-driven sea rise, spell out the danger in every U.S. coastal region and “estimate the proportion of the national population at risk from the rising sea.”

Gillis quotes the author, of course:

“Sea level rise is like an invisible tsunami, building force while we do almost nothing,” said Benjamin H. Strauss, an author, with other scientists, of two new papers outlining the research. “We have a closing window of time to prevent the worst by preparing for higher seas.”

But Strauss is the only scientist quoted in the article. To ‘balance’ Strauss, the Times quotes one of the top anti-scientist disinformers in the country, Myron Ebell, of the could-not-be-more debunked Competitive Enterprise Institute (see, for instance, “Santer, Jones, and Schneider respond to CEI’s phony attack on the temperature record”).

I’m assuming it’s the New York Times editors who are still demanding this nonsensical balance (see Science Times stunner: “… a majority of the section’s editorial staff doubts that human-induced global warming represents a serious threat to humanity”).

Even so, that’s no excuse for this misleading paragraph:



The rise appears to have accelerated lately, to a rate of about a foot per century, and many scientists expect a further acceleration as the warming of the planet continues. One estimate that communities are starting to use for planning purposes suggests the ocean could rise a foot over the next 40 years, though that calculation is not universally accepted among climate scientists.

The handful of climate researchers who question the scientific consensus about global warming do not deny that the ocean is rising. But they often assert that the rise is a result of natural climate variability, they dispute that the pace is likely to accelerate, and they say that society will be able to adjust to a continuing slow rise.

Myron Ebell, a climate change skeptic at the Competitive Enterprise Institute, a Washington research group, said that “as a society, we could waste a fair amount of money on preparing for sea level rise if we put our faith in models that have no forecasting ability.”

First off, the way this is written, it suggests Ebell is one of “the handful of climate researchers,” which he most certainly is not. He’s just a spokesperson for anti-science disinformation promoted and partially funded by polluters.

Second, why does the New York Times have to publish a full paragraph of the erroneous and questionable beliefs of the tiny fraction of actual climate researchers who don’t accept the broad scientific understanding?

Third, again, why cite only one actual scientist and put him up against a non-scientist disinformer? When Gillis wrote about sea level rise in November 2010, he quoted a dozen actual climate and cryo-scientists with only John Christy for ‘balance’ (see Coastal studies experts: “For coastal management purposes, a [sea level] rise of 7 feet (2 meters) should be utilized for planning major infrastructure”). At least that was closer to the right ratio.

For the record, the one-foot rise over the next 40 years is a projection from a major scientific study by five leading experts, including lead author Eric Rignot of the Jet Propulsion Lab (see “JPL bombshell: Polar ice sheet mass loss is speeding up, on pace for 1 foot sea level rise by 2050”).

Most of the leading experts on sea level rise who have published or spoken about this are now expecting one meter (39 inches) of SLR by 2100 or more if we keep listening to folks like Ebell and do nothing to reduce emissions. Indeed, as Gillis notes in his earlier piece, the two most widely used means of projecting future sea level rise, “yield approximately the same answer: that sea level could rise by 2 1/2 to 6 1/2 feet between now and 2100. A developing consensus among climate scientists holds that the best estimate is a little over three feet.”

Well, if seas are projected to rise 39 inches in the next 90 years (with much more upside risk than downside), it’s certainly not very controversial that they would rise around 12 inches in the next 40 years. So again, there’s no particular need to counterbalance any of these projections. Indeed, it would have made more sense to talk to an expert who thinks the projection is too low, because if we are headed for 6 1/2 feet by 2100, then one foot by 2050 could be on the low side:

Boykoff on “Exaggerating Denialism: Media Representations of Outlier Views on Climate Change”: Freudenburg: “Reporters need to learn that, if they wish to discuss ‘both sides’ of the climate issue, the scientifically legitimate “other side” is that, if anything, global climate disruption is likely to be significantly worse than has been suggested in scientific consensus estimates to date.”
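The arithmetic behind the point that one foot by 2050 "could be on the low side" is easy to sketch. This is illustrative scaling only, not a physical model:

```python
# Scaling the ~one-meter-by-2100 consensus estimate down to 40 years.
total_rise_in = 39.0   # inches projected by ~2100 (about 1 meter)
years_total = 90.0     # from the article: "in the next 90 years"
years_near = 40.0      # horizon of the one-foot projection

# Two bounding assumptions about the shape of the rise:
linear_share = total_rise_in * years_near / years_total            # constant rate
quadratic_share = total_rise_in * (years_near / years_total) ** 2  # pure acceleration

# A ~12 inch rise by 2050 sits between the two bounds, so it is
# hardly an extreme projection relative to the century estimate.
print(round(linear_share, 1), round(quadratic_share, 1))  # 17.3 7.7
```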

New York Times coverage of climate change has been improving, but they keep falling into the trap of false balance:

They would do well to adopt the NPR ethics handbook, which explicitly targets and rejects false balance.

Note: Michael Tobis (and Stephen Ban) gave us the top figure.

thinkprogress.org



To: Wharf Rat who wrote (9797) 3/14/2012 8:11:26 PM
From: Wharf Rat
Report: Keystone XL Tar Sands Pipeline More Of An Economic Liability Than Benefit

By Climate Guest Blogger on Mar 14, 2012 at 3:56 pm




by Danielle Droitsch, reposted from NRDC’s Switchboard

A new report from the Cornell University’s Global Labor Institute shows how the Keystone XL tar sands pipeline is an economic liability with the potential to cause significant job losses from a major tar sands spill.

Because tar sands oil is more corrosive and toxic than conventional oil, it can increase the frequency of pipeline spills. Moreover, a tar sands spill causes far more damage than a conventional oil spill. Take, for example, the 1.2 million gallon tar sands spill into the Kalamazoo River in Marshall, Michigan in 2010, where clean-up costs have run 10 times higher than for a typical conventional oil spill.

While there has been a lot of attention to the possible jobs created by the Keystone XL pipeline – far fewer than proponents claim – there has been very little attention to the jobs that could be lost from a tar sands spill. Keystone XL is expected to experience up to 91 significant spills over a 50-year period. Which jobs are at risk? Hundreds of thousands of workers in the agricultural and tourism sectors contribute tens of billions of dollars to the economy in the Keystone XL pipeline states. The Cornell report helps illustrate yet one more reason why the Keystone XL tar sands pipeline should be rejected.

Video: http://www.youtube.com/embed/7SS_oTglqsQ

Here are some of the key findings from the report:

Tar sands spills more likely

Tar sands oil is more corrosive and toxic than conventional oil and can therefore increase the frequency of pipeline spills. According to the Cornell report, “Between 2007 and 2010, pipelines transporting tar sands oil in the northern Midwest have spilled three times more per mile than the U.S. national average for conventional crude.”

Keystone XL likely to experience significant spills

An independent analysis conducted by the University of Nebraska concluded that Keystone XL is expected to experience 91 significant spills (greater than 50 barrels) over a 50-year period. In fact, the University of Nebraska study found Keystone XL could spill as much as 6.9 million gallons of raw tar sands crude oil at the Yellowstone River crossing. In its first year of operation (2010), the first Keystone pipeline, operated by TransCanada, spilled 35 times in the United States and Canada – a spill frequency 100 times higher than TransCanada forecast.
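As a back-of-the-envelope check, the figures quoted above imply the following spill rates (a hedged sketch: the 91-spill and 35-spill counts come from the article; the per-year rates and implied forecast are derived here for illustration only):

```python
# Spill-rate arithmetic using figures quoted in the article.
# The derived per-year rates are illustrative, not from the report itself.

forecast_spills = 91      # significant spills (>50 barrels) projected by
period_years = 50         # the University of Nebraska over 50 years

projected_rate = forecast_spills / period_years   # spills per year
print(f"Projected significant-spill rate: {projected_rate:.2f} per year")

# The first Keystone pipeline spilled 35 times in its first year (2010),
# which the article says is 100x TransCanada's forecast frequency.
observed_first_year = 35
implied_forecast = observed_first_year / 100      # implied forecast, spills/year
print(f"Implied TransCanada forecast: {implied_forecast:.2f} spills per year")
```

Even the independent Nebraska projection (roughly 1.8 significant spills per year) sits far above the frequency TransCanada's own forecast implies.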

A spill from Keystone XL threatens jobs and the economy in pipeline states

While tar sands spills can have a tremendous impact on the environment, a tar sands spill from the Keystone XL pipeline through America’s agricultural heartland could also cause significant economic damage and job losses. The farming, ranching, and tourism sectors are major sources of employment along the pipeline’s route, employing 571,000 workers with an output of $76 billion. The pipeline will also cross over 90 miles of recreational lands in the pipeline states, including state parks, national historic trails, and wildlife refuges.

“Despite TransCanada’s assurances, we know there will be leaks and spills…It is not a matter of if, it is a matter of when, how often, and how much leakage there will be…When a leak happens, it will be [the farmers’] drinking water, their livestock water supply, and their irrigation supply that will be contaminated. Their economic well-being is directly impacted by spills and leaks.” – Nebraska Farmers Union

Tar sands spills more devastating than conventional oil spills

The Cornell report also looked closely at the largest tar sands spill in U.S. history, on the Kalamazoo River in Michigan in 2010, where costs have escalated to $750 million – 10 times as much per litre as conventional crude. The cleanup of the Kalamazoo River spill, which has lasted almost two years, has been especially difficult because, according to the EPA, conventional oil response techniques have been ineffective. Today, 20 months after the tar sands spill, the entire length of the river (35+ miles) remains closed. While conventional oil floats on the surface, tar sands oil is thick and heavy and sinks in water, making it very difficult to clean up.

“Enbridge compensated us for the initial shutdown of our business, but we are concerned about the long-term impact that the spill has had on our business…one and a half years later our business is still suffering financially…” – Debra Miller, Carpet Store Owner

An accurate assessment of the economic risks still needed

Ultimately, the report said that while there has been significant attention to the Keystone XL pipeline’s potential to create jobs, “scant attention has been given to how existing jobs and economic sectors would be impacted from Keystone XL leaks and spills.” The report said a more detailed risk assessment of the Keystone XL pipeline – one that considers job losses and economic harms from one or more tar sands spills – has not been completed.

Until such an assessment is completed and a full accounting of potential job losses from pipeline spills is considered, the Obama administration should not issue any approvals allowing TransCanada to move ahead with construction of the pipeline.

thinkprogress.org


To: Wharf Rat who wrote (9792)3/14/2012 11:02:11 PM
From: freelyhovering
   of 21676
 
I agree but McCain's team set her up with rushed vetting and not having one of her 'homies' be her tutor when she started her meltdown. Schmidt did put her on about the polls in Alaska until she pressed him and by then she was already paranoid.


To: Wharf Rat who wrote (9795)3/14/2012 11:25:10 PM
From: T L Comiskey
   of 21676
 



Never trust a Cherry tree..

damn pinkos...are liars


youtube.com



From: Eric3/14/2012 11:53:40 PM
2 Recommendations   of 21676
 
Amory Lovins Lays Out His Clean Energy Plan

For four decades, Amory Lovins has been a leading proponent of a renewable power revolution that would wean the U.S. off fossil fuels and usher in an era of energy independence. In an interview with Yale Environment 360, he talks about his latest book, which describes his vision of how the world can attain a green energy future by 2050.

e360.yale.edu

