
Politics > Politics of Energy

To: Brumar89 who wrote (82699)11/13/2019 4:58:46 PM
From: Brumar89
Wind turbine waste

By Craig Rucker | November 8, 2019 | Energy

The answer to our energy challenges is not blowing in the wind.

CFACT senior policy advisor Paul Driessen explains in detail:

Wind “farms”? Like some cute, rustic Old McDonald family farm? Are you kidding me? These would be massive offshore electricity factories, with thousands, even millions, of turbines and blades towering 500-700 feet above the waves. Only a certifiable lunatic, congenital liar, complete true believer, would-be global overseer or campaign-cash-hungry politician could possibly repeat this IEA hype – or call these wind energy factories renewable, sustainable or eco-friendly.

Take a look at Driessen’s full account. The case against wind turbines is overwhelming, as countries such as Spain and Germany have already found out to their detriment. Their citizens are revolting and subsidies have been slashed.

Wind energy is expensive, intermittent, inefficient, and heavily subsidized; it is born from some of the dirtiest strip mines on Earth, and it kills birds. Worn-out turbines can't be recycled.

Every wind turbine that goes up means higher prices for everyone. They are not up to the task of powering the future.

Meanwhile, the Green-Left, blithely ignoring the inefficiencies of so-called “renewable” energy, now wants to kill our ever-popular and affordable gas cooking and heating as well.

Steve Goreham posted details on plans to ban gas appliances:

Some cooks love their electric cooktops. Many, including serious chefs, prefer the precise control and instant on and off they get cooking with gas. Pretty much everyone heating with gas loves it.

Green ideologues want to mandate electricity while making electricity more expensive and less reliable.

We should take ideology out of energy, and replace it with smarter thinking and consumer choice.


From: Brumar89  11/15/2019 6:55:59 AM
170 Years of Earth Surface Temperature Data Show No Evidence of Significant Warming

Author: Thomas K. Bjorklund, University of Houston, Dept. of Earth and Atmospheric Sciences

October 16, 2019

Key Points

1. From 1850 to the present, the noise-corrected, average warming of the surface of the earth is less than 0.07 degrees C per decade.

2. The rate of warming of the surface of the earth does not correlate with the rate of increase of fossil fuel emissions of CO2 into the atmosphere.

3. Recent increases in surface temperatures reflect 40 years of increasing intensities of the El Nino Southern Oscillation climate pattern.


This study investigates the relationships between surface temperatures from 1850 to the present and reported long-range temperature predictions of global warming. A crucial component of this analysis is the calculation of an estimate of the warming curve of the surface of the earth. The calculation removes errors in temperature measurements and fluctuations due to short-duration weather events from the recorded data. The results show the average rate of warming of the surface of the earth for the past 170 years is less than 0.07 degrees C per decade. The rate of warming of the surface of the earth does not correlate with the rate of increase of CO2 in the atmosphere. The perceived threat of excessive future global temperatures may stem from misinterpretation of 40 years of increasing intensities of the El Nino Southern Oscillation (ENSO) climate pattern in the eastern Pacific Ocean. ENSO activity culminated in 2016 with the highest surface temperature anomaly ever recorded. The rate of warming of the earth’s surface has dropped 41 percent since 2006.


Section 1


The results of this study suggest the present movement to curtail global warming may be premature. Both the highest ever recorded warming currents in the Pacific Ocean and technologically advanced methods to collect ocean temperature data from earth-orbiting satellites coincidentally began in the late 1970s. This study describes how newly acquired high-resolution temperature data and Pacific Ocean transient warming events may have convolved to result in long-range temperature predictions that are too high.

HadCRUT4 Monthly Temperature Anomalies

This analysis uses the HadCRUT4 version of monthly medians of the global time series of temperature anomalies, Column 2, 1850/01 to 2019/08 (Morice, C. P., et al. 2012). The NASA Goddard Institute for Space Studies data set of global-mean annual land and sea surface temperature anomalies, 1880 to 2018, was also analyzed using the methodology described in this report. The results are essentially the same as the results from the HadCRUT4 data set analyses. The HadCRUT4 data analysis was used for this report because the time series is longer, and the monthly global temperature anomalies are easier to import to Excel.

Only in recent years have high-resolution satellites provided simultaneously observed data on properties of the land, ocean and atmosphere (Palmer, P.I., 2018). NOAA-6 was launched in December 1979 and NOAA-7 was launched in 1981. Both were equipped with microwave radiometry devices (Microwave Sounding Unit-MSU) to precisely monitor sea-surface temperature anomalies over the eastern Pacific Ocean and the areas of ENSO activity (Spencer, et al., 1990). These satellites were among the first to use this technology.

The initial analyses of the high-resolution satellite data yielded a remarkable result. Spencer, et al. (1990), concluded the following: “The period of analysis (1979–84) reveals that Northern and Southern hemispheric tropospheric temperature anomalies (from the six-year mean) are positively correlated on multi-seasonal time scales but negatively correlated on shorter time scales. The 1983 ENSO dominates the record, with early 1983 zonally averaged tropical temperatures up to 0.6 degrees C warmer than the average of the remaining years. These natural variations are much larger than that expected of greenhouse enhancements and so it is likely that a considerably longer period of satellite record must accumulate for any longer-term trends to be revealed”.

Karl, et al. (2015) claim that the past 18 years of stable global temperatures is due to the use of biased ocean buoy-based data. Karl, et al. state that a “bias correction involved calculating the average difference between collocated buoy and ship SSTs. The average difference globally was -0.12°C, a correction that is applied to the buoy SSTs at every grid cell in ERSST version 4.” This analysis is not consistent with the interpretation of the past 18-year pause in global warming. The discussion below of the first derivative of a temperature anomaly trendline shows that the rate of increase of relatively stable and nearly noise-free temperatures peaked in 2006 and has declined since then.

The following is a summary of conclusions by Karl, et al. (2015) (called K15 below) by Mckitrick (2015): “All the underlying data (NMAT, ship, buoy, etc.) have inherent problems and many teams have struggled with how to work with them over the years. The HadNMAT2 data are sparse and incomplete. K15 take the position that forcing the ship data to line up with this dataset makes them more reliable. This is not a position other teams have adopted, including the group that developed the HadNMAT2 data itself. It is very odd that a cooling adjustment to SST records in 1998-2000 should have such a big effect on the global trend, namely wiping out a hiatus that is seen in so many other data sets, especially since other teams have not found reason to make such an adjustment. The outlier results in the K15 data might mean everyone else is missing something, or it might simply mean that the new K15 adjustments are invalid.”

Mears and Wentz (2016) discuss adjustments to satellite data and their new dataset, which “shows substantially increased global-scale warming relative to the previous version of the dataset, particularly after 1998. The new dataset shows more warming than most other middle tropospheric data records constructed from the same set of satellites.” The discussion below shows the warming curve of the earth has been decreasing in rate of increase of slope since July 1988; that is, the curve has been concave downward. Based on this observation alone, their new dataset should not show “substantially increased global-scale warming.”

Analysis of Temperature Anomalies

All temperature measurements used in this study are calculated temperature anomalies and not absolute temperatures. A temperature anomaly is the difference of the absolute measured temperature from a baseline average temperature; in this case, the average annual mean temperature from 1961 to 1990. This conversion process is intended to minimize the effects on temperatures related to the location of the measurement station (e.g., in a valley or on a mountain top) and result in better recognition of regional temperature trends.
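As a minimal illustration of the anomaly definition above, the conversion is simply measured temperature minus the station's 1961-1990 baseline mean. The station, baseline value, and readings below are invented for the example:

```python
def to_anomaly(measured_c, baseline_mean_c):
    """Convert an absolute temperature to an anomaly vs. a baseline mean."""
    return measured_c - baseline_mean_c

# Hypothetical station with a 1961-1990 annual mean of 14.2 degrees C.
baseline = 14.2
readings = [14.5, 13.9, 15.0]                      # absolute monthly means, deg C
anomalies = [round(to_anomaly(t, baseline), 2) for t in readings]
print(anomalies)   # -> [0.3, -0.3, 0.8]
```

Because each station is compared only against its own baseline, a valley station and a mountaintop station with very different absolute temperatures can still show the same regional trend.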

In Figure 1, the black curve is a plot of monthly mean surface temperature anomalies. The jagged character of the black temperature anomaly curve is data noise (inaccuracies in measurements and random, short term weather events). The red curve is an Excel sixth-degree polynomial best fit trendline of the temperature anomalies. The curve-fitting process removes high-frequency noise. The green curve, a first derivative of the trendline, is the single most important curve derived from the global monthly mean temperature anomalies. The curve is a time-series of the month-to-month differences in mean surface temperatures in units of degrees Celsius change per month. These very small numbers are multiplied by 120 to convert the units to degrees per decade (left vertical axis of the graph). Degrees per decade is a measure of the rate at which the earth’s surface is cooling or warming; it is sometimes referred to as the warming (or cooling) curve of the surface of the earth. The green curve temperature values are similar in magnitude to noise-free earth surface temperature estimates determined by University of Alabama in Huntsville for single points (Christy, J. R. May 8, 2019). The green curve has not previously been reported and is critical to analyzing long-term temperature trends.

Figure 1. The black curve is the HadCRUT4 time series of the mean monthly global land and sea surface temperature anomalies, 1850 to present. Anomalies are deviations from the 1961-1990 annual mean temperatures in degrees Celsius. The red curve is the trendline of the HadCRUT4 data set, an Excel sixth-degree polynomial best fit of the temperature anomalies. The green curve is the first derivative of the trendline converted from units of degrees C per month to degrees C per decade, that is, the slope of the trendline curve.
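The trendline-and-derivative procedure described above can be sketched in Python with NumPy in place of Excel. The anomaly series below is synthetic (a gentle quadratic plus noise standing in for the HadCRUT4 data), so the coefficients are illustrative assumptions, not the article's fit:

```python
import numpy as np

n_months = 2032                                # monthly points, 1850 onward
t = np.arange(n_months) / 120.0                # time in decades
rng = np.random.default_rng(0)
# Synthetic anomalies: slow quadratic warming plus measurement noise.
anomalies = -0.4 + 0.004 * t**2 + rng.normal(0.0, 0.1, n_months)

# "Red curve": sixth-degree polynomial best fit of the monthly anomalies.
coeffs = np.polyfit(t, anomalies, deg=6)
trend = np.polyval(coeffs, t)

# "Green curve": first derivative of the trendline. Because t is measured in
# decades, the slope comes out directly in degrees C per decade; fitting
# against month numbers instead would give degrees C per month, which the
# article multiplies by 120 (months per decade) to reach the same units.
warming_rate = np.polyval(np.polyder(coeffs), t)
```

The sign of `warming_rate` shows whether the surface is warming or cooling, and its curvature (concave up or down) is what the article uses to locate the inflection points discussed later.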

In a recent talk, John Christy (May 8, 2019), director of the Earth System Science Center at the University of Alabama in Huntsville, reported estimates of noise-free earth warming in 1994 and 2017 of 0.09 and 0.095 degrees C per decade, respectively. The 2017 average value for the green curve is 0.154: this value is 0.059 degrees per decade higher than the UAH estimate. The latest value in August 2019 for the green curve is 0.125 degrees C per decade. The average degrees C per decade value of earth warming based on the green curve over 2,032 months since 1850 is 0.068 degrees C per decade. The average rate of warming from 1850 through 1979, to the beginning of the most recent El Nino Southern Oscillation (ENSO), is 0.038 degrees C per decade.

A warming rate of 0.038 degrees C per decade would need to significantly increase or decrease to support a prediction of a long-term change in the earth’s surface temperature. If the earth’s surface temperature increased continuously starting today at a rate of 0.038 degrees C per decade, in 100 years the increase in the earth’s temperature would be only 0.4 degrees C, which is not indicative of a global warming threat to humankind.
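The arithmetic behind that 100-year figure is just the quoted rate times the number of decades:

```python
# Checking the extrapolation above: a constant rate carried forward 100 years.
rate_per_decade = 0.038                    # deg C per decade, from the text
total = rate_per_decade * 10               # 100 years = 10 decades
print(round(total, 2))                     # -> 0.38, i.e. roughly 0.4 deg C
```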

The 0.038 degrees C per decade estimate is likely beyond the accuracy of the temperature measurements from 1850 to 1979. Recent statistical analyses conclude that 95% uncertainties of global annual mean surface temperatures range from 0.05 to 0.15 degrees C over the past 140 years; that is, 95 measurements out of 100 are expected to fall within the range of uncertainty estimates (Lenssen, N. J. L., et al. 2019). Very little measurable warming of the surface of the earth occurred from 1850 to 1979.

In Figure 2, the green curve is the warming curve; that is, a time series of the rate of change of the temperature of the surface of the earth in degrees per decade. The blue curve is a time series of the concentration of fossil fuel emissions of CO2 in parts per million in the atmosphere. The green curve is generally level from 1900 to 1979 and then rises slightly due to lower frequency noise remaining in the temperature anomalies from 40 years of ENSO activity. The warming curve has declined from early 2000 to the present. The concentration of CO2 increased steadily from 1943 to 2019. There is no correlation between a rising CO2 concentration in the atmosphere and a relatively stable, low rate of warming of the surface of the earth from 1943 to 2019.

Figure 2. The green curve is the first derivative of the trendline converted to units of degrees C per decade, that is, the rate of change of the surface temperature of the earth. See Figure 1 for the same curve along with the temperature anomalies curve and the trendline curve. The blue dotted curve showing total parts per million CO2 emissions from fossil fuels in the atmosphere is modified from Boden, T. A., et al. (2017); the time frame shows only emissions since 1900, and the total reported million metric tons of carbon are converted to parts per million CO2 for the graph. The single blue point is the most recent report from NOAA of an average 414.7 parts per million CO2 in May 2019 (Research News. June 4, 2019). There is no correlation between the curves.
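The caption's unit conversion, from reported million metric tons of carbon to parts per million of atmospheric CO2, can be sketched as follows. The 2.13 GtC-per-ppm factor is a commonly quoted approximation, not the value used by Boden et al., and the conversion ignores ocean and land uptake of emitted carbon:

```python
# Approximate conversion: roughly 2.13 gigatonnes of carbon (GtC) corresponds
# to 1 ppm of CO2 in the atmosphere (a common rule-of-thumb factor).
GTC_PER_PPM = 2.13

def carbon_mt_to_ppm(million_tonnes_c):
    """Convert million metric tons of carbon to the equivalent ppm of CO2."""
    return (million_tonnes_c / 1000.0) / GTC_PER_PPM   # MtC -> GtC -> ppm

# A hypothetical year with ~9.9 GtC of fossil-fuel emissions:
print(round(carbon_mt_to_ppm(9855), 2))   # -> 4.63 ppm (before any uptake)
```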

In Figure 3, the December 1979 temperature spike (Point A) is associated with a weak El Nino event. During the following 39 years, five strong to very strong El Nino events were recorded; the last one, in 2015-2016, was the highest-intensity El Nino ever recorded (Golden Gate Weather Services, 2019). The highest ever mean global monthly temperature anomaly of 1.111 degrees C was recorded in February 2016. Since then, monthly global temperature anomalies have declined 35 percent, to 0.724 degrees C in August 2019, as the El Nino decreased in intensity.

Figure 3. An enlarged portion of Figure 1 from 1963 to 2019 with modified vertical scales to emphasize important changes in the shape of the green curve.

Points A, B and C mark very significant changes in the shape of the green warming curve (values on left vertical axis).

1. The green curve values increased each month from 0.085 degrees C per decade in December 1979 (Point A) to 0.136 degrees C per decade in July 1988 (Point B); this is a 60 percent increase in rate of warming in nearly 9 years. The warming curve is concave upward. Point A marks a weak El Nino and the beginning of increasing ENSO intensity.

2. From July 1988 to September 2006, the rate of warming increased from 0.136 degrees C per decade to 0.211 degrees per decade (Point C); this is a 55 percent increase in 18 years but about one-half the total rate of the previous 9 years because of a decrease in the rate of increase each month. The July 1988 point on the x-axis is an inflexion point at which the warming curve becomes concave downward.

3. September 2006 (Point C) marks a very strong El Nino and the peak of the nearly 40-year ENSO transient warming trend, imparting a lazy S shape to the green curve. The rate of warming has declined every month since peaking at 0.211 degrees per decade in September 2006 to 0.125 in August 2019; this is a 41 percent decrease in 13 years.

Section 2

Truth and Consequences

The “hockey stick graph”, which had been cited by the media frequently as evidence for out-of-control global warming over the past 20 years, is not supported by the current temperature record (Mann, M., Bradley, R. and Hughes, M. 1998). The graph is no longer seen in the print media.

None of the 102 climate models of the mid-troposphere mean temperature comes close enough to predicting future temperatures to warrant changes in environmental policies. The models start in the 1970s, at the beginning of a time period that culminated in the strongest ENSO ever recorded, and by 2015, less than 40 years later, the average predicted temperature of all the models is nearly 2.4 times greater than the observed global tropospheric temperature anomaly in 2015 (Christy, J. R. May 8, 2019). The true story of global climate change has yet to be written.

The peak surface warming during the ENSO was 0.211 degrees C per decade in September 2006. The highest global mean surface temperature ever recorded was 1.111 degrees C in February 2016; these occurrences are possibly related to the increased quality and density of ocean temperature data from the two earth-orbiting MSU satellites described previously. Earlier large-intensity ENSO events may not have been recognized due to the absence of advanced satellite coverage over oceans.

The use of a temperature trendline to remove high-frequency noise did not eliminate the transient effects of the longer-wavelength components of ENSO warming over the past 40 years; so, estimates of the rate of warming for that period in this study still include background noise from the ENSO. A noise-free signal for the past 40 years probably lies closer to 0.038 degrees C per decade, the average rate of warming from 1850 to the beginning of the ENSO in 1979, than to the average rate from 1979 to the present, 0.168 degrees C per decade. The higher number includes uncorrected residual ENSO effects.

Foster and Rahmstorf (2011) used average annual temperatures from five data sets to estimate average earth warming rates from 1979 to 2010. Noise removed from the raw mean annual temperature data is attributed to ENSO activities, volcanic eruptions and solar variations. The result is said to be a noise-adjusted temperature anomaly curve. The average warming rate of the five data sets over 32 years is 0.16 degrees C per decade compared to 0.17 degrees C per decade determined by this study from 384 monthly points derived from the derivative of the temperature trendline. Foster and Rahmstorf (2011) assume the warming trend is linear based on one averaged estimate, and their data cover only 32 years. Thirty years is generally considered to be a minimum period to define one point on a trend. This 32-year time period includes the highest intensity ENSO ever recorded and is not long enough to define a trend. The warming curve in this study is curvilinear over nearly 170 years (green curve on Figures 1 and 3) and is defined by 2,032 monthly points derived from the temperature trendline derivative. From 1979 to 2010, the rate of warming ranges from 0.08 to 0.20 degrees C per decade. The warming trend is not linear.

The perceived threat of excessive future temperatures may stem from an underestimation of the unusually large effects of the recent ENSO on natural global temperature increases. Nearly 40 years of natural, transient warming from the largest ENSO ever recorded may have been misinterpreted to include warming due to anthropogenic activities. There is no evidence of a significant anthropogenic contribution to surface temperatures measured over the last 40 years.

Caltech recently announced the start of a 5-year project with several other research centers to build a new climate model “from the ground up” (Perkins, R. 2018). During these five years, the world’s understanding of the causes of climate change should be greatly improved.

The scientific goal must be to narrow the range of uncertainty of predictions with better data and better models until human intervention makes sense. We have the time to get it right. A rational environmental protection program and a vibrant economy can co-exist. The challenge is to allow scientists the time and freedom to work without interference from special interests.

Acknowledgments and Data

All the raw data used in this study can be downloaded from the HadCRUT4 and NOAA websites.


1. Boden, T.A., Marland, G., and Andres, R.J. (2017). National CO2 Emissions from Fossil-Fuel Burning, Cement Manufacture, and Gas Flaring: 1751-2014, Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, doi:10.3334/CDIAC/00001_V2017.

2. Christy, J. R., May 8, 2019. The Tropical Skies Falsifying Climate Alarm. Press Release, Global Warming Policy Foundation.

3. Foster, G. and Rahmstorf, S., 2011. Environ. Res. Lett. 6 044022

4. Golden Gate Weather Services, Apr-May-Jun 2019. El Niño and La Niña Years and Intensities.

5. HadCRUT4 dataset.

6. Karl, T. R., Arguez, A., Huang, B., Lawrimore, J. H., McMahon, J. R., Menne, M. J., et al. (2015). Possible artifacts of data biases in the recent global surface warming hiatus. Science, 26 June 2015, Vol. 348, no. 6242, pp. 1469-1472.

7. Mann, M., Bradley, R. and Hughes, M. (1998). Global-scale temperature patterns and climate forcing over the past six centuries. Nature, Volume 392, Issue 6678, pp. 779-787.

8. McKitrick, R. (2015). A First Look at ‘Possible artifacts of data biases in the recent global surface warming hiatus’ by Karl et al. Department of Economics, University of Guelph, 4 June 2015.

9. Mears, C. and Wentz, F. (2016). Sensitivity of satellite-derived tropospheric temperature trends to the diurnal cycle adjustment. J. Climate. doi:10.1175/JCLID-

10. Morice, C. P., Kennedy, J. J., Rayner, N. A., Jones, P. D., (2012). Quantifying uncertainties in global and regional temperature change using an ensemble of observational estimates: The HadCRUT4 dataset. Journal of Geophysical Research, 117, D08101, doi:10.1029/2011JD017187.

11. Lenssen, N. J. L., Schmidt, G. A., Hansen, J. E., Menne, M. J., Persin, A., Ruedy, R., et al. (2019). Improvements in the GISTEMP Uncertainty Model. Journal of Geophysical Research: Atmospheres, 124, 6307–6326. doi:10.1029/2018JD029522.

12. NOAA Research News: June 4, 2019.

13. Palmer, P. I. (2018). The role of satellite observations in understanding the impact of El Nino on the carbon cycle: current capabilities and future opportunities. Phil. Trans. R. Soc. B 373: 20170407.

14. Perkins, R. (2018).

15. Spencer, R. W., Christy, J. R. and Grody, N. C. (1990). Global Atmospheric Temperature Monitoring with Satellite Microwave Measurements: Method and Results 1979–84. Journal of Climate, Vol. 3, No. 10 (October) pp. 1111-1128. Published by American Meteorological Society.


To: Brumar89 who wrote (82701)11/15/2019 7:01:43 AM
From: Brumar89
The mammoth gap between energy trends and climate goals

Climate protestors outside Congress. Photo: John Lamparski/Getty Images

Existing and announced policies worldwide won't be nearly enough to rein in carbon emissions, despite the strong growth of climate-friendly energy sources, according to a new report from the International Energy Agency.

Why it matters: The IEA's annual World Energy Outlook reports are among the most prominent attempts to model where energy systems are headed in the decades ahead. These big and data-rich studies (this year's weighs in at 810 pages) are widely cited by policymakers, analysts and other stakeholders.

What they did: The report models the long-term effect of three core scenarios on energy demand and how it is met...

1. Existing policies.
2. The combination of current and announced plans.
3. A "sustainable" pathway consistent with the Paris Agreement goal of holding temperature rise well under 2°C.

Adapted from IEA's 2019 World Energy Outlook; Chart: Axios Visuals
What they found: Check out the chart above. Even under nations' announced policies, energy demand is projected to rise by roughly 1% annually until 2040 (the end of the modeled period).

Under that pathway, the increase in global carbon emissions slows but does not peak, instead rising roughly 100 million tonnes annually between 2018 and 2040.

That's a far cry from the deep emissions cuts needed to meet the Paris goals, which are a benchmark for avoiding some of the most damaging effects of warming.

The big picture: Under the announced policies scenario, low-carbon sources — notably solar — meet more than half of demand growth through 2040.

Use of natural gas rises significantly too, while coal demand in 2040 is slightly below today's levels.

Global oil demand grows but then "flattens out" in the 2030s. In 2040, demand is roughly 106 million barrels per day.

Oil use in passenger cars peaks in the late 2020s, but that's offset by rising demand for oil in the petrochemical and other sectors.

Overall, fossil fuels would still have a 74% share of the global energy mix in 2040.

The bottom line: IEA executive director Fatih Birol urged emphasis on deploying the basket of technologies and bringing about efficiency gains consistent with their Paris-aligned scenario.

“The world urgently needs to put a laser-like focus on bringing down global emissions," he said in a statement. "This calls for a grand coalition encompassing governments, investors, companies and everyone else who is committed to tackling climate change."


To: Brumar89 who wrote (82702)11/15/2019 7:04:29 AM
From: Brumar89
'Deep Electrification’ Means More Natural Gas

Jude Clemente, Contributor

I cover oil, gas, power, LNG markets, linking to human development.

For environmental reasons, there’s an ongoing push to “electrify everything,” from cars to port operations to heating.

The idea is that a “deep electrification” will help lower greenhouse gas emissions and combat climate change.

The reality, however, is that more electrification will sharply increase the need for electricity, an obvious fact that seems to be getting forgotten.

The majority of this increase occurs in the transportation sector: electric cars can increase home power usage by 50% or more.

The U.S. National Renewable Energy Laboratory (NREL) says that “electrification has the potential to significantly increase overall demand for electricity.”

NREL reports that a “high” electrification scenario would up our power demand by around 40% through 2050.

A high electrification scenario would grow our annual power consumption by 80 terawatt hours per year.

For comparison, that is like adding a Colorado and Massachusetts of new demand each year.

The Electric Power Research Institute (EPRI) confirms that electrification could increase our power demand by over 50%.

Electricity is slated to more than double its share of final energy consumption to around 50% in the decades ahead.

From load shifting to higher peak demand, deep electrification will present major challenges for us.


To: Brumar89 who wrote (82703)11/15/2019 7:16:57 AM
From: Brumar89
Global governments ‘still coming to terms’ with America’s energy abundance, State Department official says



The U.S. has undergone an “unprecedented energy transformation” and is changing international energy market dynamics to such an extent that other countries are only just accepting it, according to the U.S. assistant secretary of state for energy resources.

The boom in energy production in the states means that in the fourth quarter of 2020, the U.S. is expected to become a net energy exporter, exporting more energy products than it imports.

An oil tanker is anchored near the Port of Long Beach, California, U.S.
Tim Rue | Bloomberg | Getty Images

The U.S. has undergone an “unprecedented energy transformation” and is changing international energy market dynamics to such an extent that other countries are only just accepting it, according to Frank Fannon, the U.S. assistant secretary of state for energy resources.

Speaking to an audience at the Abu Dhabi International Petroleum Exhibition and Conference (ADIPEC) Tuesday, Fannon said that the U.S.' evolution into a global leader in energy production offered new opportunities, and challenges, to consumers and producers around the world.

“Governments around the world are still coming to terms with this new reality and reconciling the implications, this is understandable. The U.S. shift from scarcity to abundance occurred with unprecedented speed and scale,” Fannon said.

“For more than 40 years, U.S. law prohibited oil exports, but Congress lifted the ban in 2015 and the private sector responded,” Fannon said. “Today, the U.S. is the largest producer in the world and on track to produce 13 million barrels (per day) next month.”

He also noted that the country had gone from being the 15th largest exporter of liquefied natural gas (LNG) in 2016, to the third largest exporter currently and that it was changing energy market dynamics by introducing liquidity and choice into the market. The U.S. is the second largest producer of renewable energy, Fannon also stated.

America has certainly shaken up the norms when it comes to energy production and supply. Aside from its so-called “shale oil revolution” that has made it the largest crude oil producer in the world, closely followed by Russia and Saudi Arabia, it has also (in the space of a few years) become one of the major global players in terms of liquefied natural gas (LNG) exports.

The boom in energy production in the states means that in the fourth quarter of 2020, it is expected to become a net energy exporter, exporting more energy products than it imports.

Achieving that status of “energy independence” is seen as an important milestone for a country looking to reduce its dependence (and all the security implications that entails) on external producers. It also means an economic boost for the U.S., as well as potential political wins. But the country still faces stiff competition from other more established energy producers, like Russia. Europe, for example, has become something of a battleground between Washington and Moscow competing to supply natural gas to the continent.

The Russia question
Another market dynamic that the U.S. is contending with is the established, and apparently strengthening, relationship between OPEC and non-OPEC producer Russia.

Not only does the 14-member oil producing group led by Saudi Arabia have an agreement with Russia (and other producers) to curb oil output, the two countries have also sealed energy investments and partnerships in recent years — most recently when President Vladimir Putin visited Saudi Arabia in October.

Asked by CNBC’s Hadley Gamble if the U.S. had left a vacuum in the Middle East that countries like Russia were seeking to fill, Fannon said: “I don’t see a vacuum at all.”

He said countries like Russia were spurred on to seal investment deals because of competition from the U.S.


“I’m not surprised that Russia is eager to be in the region, and elsewhere. We might not like it but it’s rational. What they see, whether it’s in this region or around the world, is what’s happening to global energy markets because of U.S. energy and U.S. energy innovation,” he said.

“I see there’s a race going on to get here and create positions because these infrastructure (projects), these energy projects, can have 50-year time horizons. They see what’s happening and so they’re rushing to lock in these positions because they’re fearful,” he said.

Appointed to the role by President Donald Trump in January 2018, Fannon is largely responsible for energy diplomacy at the State Department. He said the U.S.' energy policy was “a proxy for other foreign policy issues.”

“It can be an area that facilitates cooperation, irrespective of perhaps historical clouds,” he said.


To: Brumar89 who wrote (82704)11/15/2019 12:30:57 PM
From: Brumar89
   of 83319
A warmer planet necessarily means colder winters. Warmer Winters. More Snow. Less Snow. More Rain. Less Rain. Bigger Tornadoes. No Tornadoes.
Whatever scares the bejesus out of you… that is what climate change will bring on. Even more acne!


To: Brumar89 who wrote (82705)11/15/2019 12:31:16 PM
From: Brumar89
1 Recommendation   of 83319
Chicken Littles vs Adelie Penguins
Guest Blogger

By Jim Steele


Throughout recorded history, doomsday cults have attracted thousands of gullible people. Charismatic cult leaders of the Order of the Solar Temple or Heaven’s Gate convinced their followers to commit suicide due to a coming “environmental apocalypse”. To prevent environmental collapse, a recent mass shooter justified his killings as reducing over-population, while a Swedish scientist has suggested cannibalism. Thus, it’s worrisome that charismatic congresswoman Ocasio-Cortez similarly warns our world is doomed in 12 years. Equally disturbing is the carefully orchestrated fear-mongering, such that the United Nations gave ill-informed, 16-year-old Greta Thunberg center stage to rage that CO2 is causing ecosystem collapse. Terrifying children with ‘the sky is falling’ fears will only bring about dire, unintended consequences.

Who is filling our children’s heads with stories of ecosystem collapse?

For one, Al Gore wrote in 2012, in “The Fate of the Adelie Penguins, A Message from Al Gore”: “As temperatures rise along the West Antarctic Peninsula and the winter sea ice blankets the ocean three months fewer per year than 30 years ago, the local ecosystem is in danger. Everything from the base of the food chain – the phytoplankton (microscopic plants and bacteria) and krill (shrimp like creatures), to one of the continent’s most iconic inhabitants, the Adelie penguins, are under threat…There is an important lesson for us in the story of the Adelie penguins.”

Indeed, Adelie penguins provide an “important lesson”. Don’t trust apocalyptic hype!

Adelie penguins may be the best-studied bird on earth. In 2009, the International Union for the Conservation of Nature (IUCN) estimated between 4 and 5 million adults, happily listing them as a species of “Least Concern”. However, using dubious IPCC climate models, scientists led by ornithologist David Ainley predicted the most northerly Adelie colonies would soon disappear as ice-melting warmth crept southward. They predicted that between the years 2025 and 2052, 70% of the total Adelie population would be lost. Bullied by that virtual death count, the IUCN downgraded Adelies from “Least Concern” to “Near Threatened”.

In real life, by 2016 Adelie abundance had nearly doubled to 7.6 million, and once again Adelies are a species of Least Concern. So how were scientists so misled?

Ice Age glaciers had forced Adelies to abandon most of Antarctica’s coast. With warming, glaciers retreated and Adelies rapidly returned to breed and multiply. However, there was one exception. For over 5,400 years Adelies avoided ice-free coastlines along Antarctica’s northwestern peninsula. Scientists dubbed this the “northern enigma”. Due to unfavorable weather, breeding Adelies still avoid much of that region, currently labeled the “Adelie Gap”. As might be expected, breeding colonies adjacent to the “Adelie Gap” are the least stable, with some experiencing population declines, and those declining colonies were enough to confirm some scientists’ climate fears.

In the 1990s, the northwestern sector of the Antarctic peninsula coincidentally experienced rising temperatures and declining sea ice. Although Antarctic sea ice was not decreasing elsewhere, researchers believed the melting ice and warmer temperatures were just what CO2-driven climate models predicted. But then the peninsula’s winds shifted. The peninsula’s sea ice has now been growing and temperatures have been cooling for over a decade. Furthermore, in contrast to Ainley’s models, colonies at the most northerly limits of the Adelies’ range are not disappearing. Those colonies are thriving and increasing, such as the Sandwich Island colonies and the northerly colonies on the Antarctic peninsula’s east side.

Media headlines are guided by the maxim ‘if it bleeds, it leads.’ Likewise scientific journals. Good news about thriving colonies, or no change at all, fails to capture headlines. But the addiction to eye-catching catastrophes misleads the public and scientists alike. Despite no warming trend at an Emperor penguin colony, David Ainley was so inebriated by global warming fears that he fabricated a warming temperature graph to falsely explain the colony’s decline! Similarly, extreme researchers of polar bear populations wrongly argued, “we’re projecting that, by the middle of this century, two-thirds of the polar bears will be gone from their current populations”. Again, in reality polar bear abundance has increased.

By perpetuating bogus claims of a world ending in 12 years, the Chicken Littles are doing far more harm than blinding children to scientific evidence that many species, from polar bears to Adelie penguins, are thriving. Our children miss the “important lesson” that a “climate crisis” is only a theory supported by scary narratives, not facts. So how do we protect our children from Chicken Littles who seek to enroll vulnerable minds into their doomsday cults? How do we motivate our children to be good critical thinkers, and not blind group thinkers mesmerized by fear and ‘end of the earth’ scenarios?

Jim Steele is director emeritus of the Sierra Nevada Field Campus, SFSU and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism.


To: Brumar89 who wrote (82706)11/15/2019 4:31:33 PM
From: Brumar89
   of 83319
Back to Hanoi for Jane Fonda!? Vietnam can’t build coal plants fast enough as economy booms – Fonda urged to take her DC climate protest to Hanoi instead

Vietnam’s coal and crude oil imports surged in the first ten months of this year, government data released on Tuesday showed, highlighting the Southeast Asian country’s increasing reliance on imported energy to support its fast-growing economy.

Vietnam has one of the fastest-growing economies in Asia, backed by robust exports and foreign investment. Economic growth this year is expected to surpass the government’s target range of 6.6%-6.8%, as the country benefits from the Sino-U.S. trade war.

The strong growth has boosted demand for coal. Imports of the commodity, mostly from Australia and Indonesia, during the January-October period more than doubled from a year earlier to 36.8 million tonnes, valued at $3.25 billion, the Customs Department said in a statement.

The imported coal will mostly be used for the country’s growing fleet of coal-fired power plants, which will still play a key role in its power generation mix for the years to come even as Hanoi promotes renewables.

The country’s crude oil imports rose 80.6% from a year earlier to 6.8 million tonnes during the period, the department said.

Crude oil, once a key export earner for Vietnam, has seen output decline recently as reserves at existing fields fall and as China’s increasingly assertive stance in the region hampers offshore exploration.

Government data showed crude oil output in the first ten months of this year fell 7.2% from a year earlier to 9.3 million tonnes. Meanwhile, its coal output rose 10.5% to 37.9 million tonnes.

The Ministry of Industry and Trade said in July Vietnam will contend with severe power shortages from 2021 as demand outpaces construction of new plants, with electricity demand expected to exceed supply by 6.6 billion kilowatt hours (kWh) in 2021, and 15 billion kWh in 2023.
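A quick back-of-envelope sketch puts the ministry’s figures in perspective. Note the assumptions are mine, not the article’s: that the 10% expansion compounds annually, and that the $6.7 billion is spent each year over 2016–2030.

```python
# Back-of-envelope sketch of the ministry figures quoted above.
# Assumptions (not stated in the article): 10% capacity growth
# compounds annually over 2016-2030, and $6.7 billion is spent yearly.
years = 2030 - 2016                      # 14 years of expansion
capacity_multiplier = 1.10 ** years      # compounded growth factor
total_investment = 6.7 * years           # billions of dollars

print(f"capacity multiplier: {capacity_multiplier:.1f}x")     # ~3.8x
print(f"total investment: ~${total_investment:.0f} billion")  # ~$94 billion
```

Under those assumptions, generation capacity would nearly quadruple by 2030, at a cumulative cost of roughly $94 billion.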

Vietnam will need an average of $6.7 billion a year to expand its annual power generation capacity by 10% between 2016 and 2030, the ministry had said.

Tuesday’s customs data also showed Vietnam recorded a trade surplus of $9.01 billion during the first ten months of this year, widening from a surplus of $7.24 billion a year earlier.


From: russet11/15/2019 6:45:03 PM
   of 83319
NOVEMBER 15, 2019
U.S. natural gas production, consumption, and exports set new records in 2018

Source: U.S. Energy Information Administration, Natural Gas Annual 2018
The U.S. Energy Information Administration’s (EIA) Natural Gas Annual 2018 shows that the United States set new records in natural gas production, consumption, and exports in 2018. In 2018, dry natural gas production increased by 12%, reaching a record-high average of 83.8 billion cubic feet per day (Bcf/d). This increase was the largest percentage increase since 1951 and the largest volumetric increase in the history of the series, which dates back to 1930. U.S. natural gas consumption increased by 11% in 2018, driven by increased natural gas consumption in the electric power sector. Natural gas gross exports totaled 10.0 Bcf/d in 2018, 14% more than the 2017 total of 8.6 Bcf/d. Several new liquefied natural gas (LNG) export facilities came online in 2018, allowing for more exports.

Source: U.S. Energy Information Administration, Natural Gas Annual 2018
U.S. natural gas consumption grew in each end-use sector. Demand for natural gas as a home heating fuel was greater in 2018 than in 2017 because of slightly colder weather during most of the winter. Similarly, the summer of 2018 saw record-high temperatures that increased demand for air conditioning and, therefore, electricity—much of which was fueled by natural gas. U.S. electric power sector consumption of natural gas grew by 14% in 2018, more than in any other end-use sector. The electric power sector has been shifting toward natural gas in the past decade because of favorable prices and efficiency gains.

Source: U.S. Energy Information Administration, Natural Gas Annual 2018
U.S. natural gas production growth was concentrated in the Appalachian, Permian, and Haynesville regions. Pennsylvania and Ohio, states that overlay the Appalachian Basin, had the first- and third-largest year-over-year increases for 2018, increasing by 2.0 Bcf/d and 1.7 Bcf/d, respectively. Louisiana had the second-largest volumetric increase in dry production, increasing by 1.8 Bcf/d as a result of increased production from the Haynesville shale formation. Texas remained the top natural gas-producing state, with a production level of 18.7 Bcf/d, as a result of continued drilling activity in the Permian Basin in western Texas and eastern New Mexico.

Principal contributor: Mike Kopalek


To: Brumar89 who wrote (82707)11/16/2019 7:02:11 AM
From: Brumar89
1 Recommendation   of 83319
New Science Scandal: ‘Fatally Flawed Hurricane Paper Should Be Retracted’
NOVEMBER 16, 2019

By Paul Homewood

Roger Pielke Jr has now weighed in on the flawed Grinsted hurricane paper:

Earlier this week a paper published by the Proceedings of the National Academy of Sciences (PNAS) by a team of authors led by Aslak Grinsted, a scientist who studies ice sheets at the University of Copenhagen, claimed that “the frequency of the very most damaging hurricanes has increased at a rate of 330% per century.”

The press release accompanying the paper announced that United States mainland “hurricanes are becoming bigger, stronger and more dangerous” and with the new study, “doubt has been eradicated.”

If true, the paper (which I’ll call G19, using its lead author’s initial and year of publication) would overturn decades of research and observations indicating that, over the past century or more, there are no upward trends in U.S. hurricane landfalls and no upward trends in the strongest storms at landfall. These conclusions have been reinforced by the assessments of the Intergovernmental Panel on Climate Change (IPCC), the U.S. National Climate Assessment, and most recently the World Meteorological Organization.

In fact, however, the new PNAS paper is fatally flawed. The conclusions of major scientific assessments remain solid. As I’ll show below, G19 contains several major errors and as a result it should be retracted.

The first big problem with G19 is that it purports to say something about climatological trends in hurricanes, but it uses no actual climate data on hurricanes. That’s right, it instead uses data on economic losses from hurricanes to arrive at conclusions about climate trends. The economic data that it uses is based on research that I and colleagues have conducted over more than two decades, which makes me uniquely situated to tell you about the mistakes in G19.

Compare the counts of hurricanes reported in G19 with those that can be found in climate data from the National Oceanic and Atmospheric Administration.

From 1900 to 1958, the first half of the period under study, NOAA reports that there were 117 total hurricanes that struck the mainland U.S. But in contrast, G19 has only 92; they are missing 25 hurricanes. In the second half of the dataset, from 1959 to 2017, NOAA has 91 hurricanes that struck the U.S., while G19 has 155, that is, 64 extra hurricanes.

The AP passed along the incorrect information when it reported that the new study looks at “247 hurricanes that hit the U.S. since 1900.” According to NOAA, from 1900 to 2017 there were in fact only 197 hurricanes that made 208 unique landfalls (9 storms had multiple landfalls).
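The arithmetic behind the mismatch is simple enough to check directly, using the counts as quoted above:

```python
# Landfalling-hurricane counts as quoted above: NOAA vs. the G19 paper.
noaa = {"1900-1958": 117, "1959-2017": 91}
g19 = {"1900-1958": 92, "1959-2017": 155}

# NOAA's two halves sum to the 208 unique landfalls it reports.
assert sum(noaa.values()) == 208

# G19's two halves sum to the 247 "hurricanes" repeated by the AP.
assert sum(g19.values()) == 247

# Per-period discrepancies: 25 storms missing early, 64 extra late.
print(noaa["1900-1958"] - g19["1900-1958"])  # 25
print(g19["1959-2017"] - noaa["1959-2017"])  # 64
```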

Part of this difference can be explained by the fact that G19 focuses on economic damage, not hurricanes. If a hurricane from early in the 20th century resulted in no reported damage, then according to G19 it did not exist. That’s one reason why we don’t use economic data to make conclusions about climate. A second reason for the mismatched counts is that G19 counts many non-hurricanes as hurricanes, and disproportionately so in the second half of the dataset.

The mismatch between hurricane counts in G19 versus those of NOAA by itself calls into question the entire paper. But it gets much worse.

The dataset on losses from hurricanes used by G19 to generate its top-line conclusions is based on my research. That dataset has been maintained by a company called ICAT located in Colorado. The ICAT dataset was initially created about a decade ago by a former student and collaborator of mine, Joel Gratz, based entirely on our 2008 hurricane loss dataset (which I’ll call P08).

In the years since, ICAT has made some significant changes to its dataset, most notably, by replacing P08 loss estimates with loss estimates from the “billion dollar disasters” tabulation kept by the NOAA National Centers for Environmental Information (NCEI). The replacement data begins in 1980, at the start of the NCEI dataset.

This process created a new hybrid dataset: from 1900 to 1980 the ICAT dataset is based on P08, and from 1980 to 2018 it is based on NCEI. This is hugely problematic for G19, which was apparently unaware of the details of the dataset that they had found online.

In our comprehensive update of P08 published last year (Weinkle et al. 2018, or W18) we explained that the NCEI methodology for calculating losses included many factors that had historically not been included in tabulations of the U.S. National Hurricane Center, “for instance, to include federal disaster aid, federal flood insurance payouts, national and local agricultural commodity effects and other macro-economic impacts.”

That meant that one cannot, as ICAT has done, simply append the NCEI dataset from 1980 to the end of the P08 dataset starting in 1900. They are not apples to apples. Indeed, a big part of our work in the W18 update of P08 was to ensure that the data was apples to apples across the entire dataset, and we performed several statistical consistency checks to ensure that was the case.

The new PNAS paper, G19, unwittingly uses the ICAT dataset that staples together P08 and NCEI. I have shown with several graphs on Twitter why this matters: before 1940, G19 and W18 loss estimates for individual storms are just about the same. After 1980, however, G19 loss estimates for individual storms are on average about 33% higher than those of W18. The result is a data discontinuity that introduces spurious trends to the dataset.
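The splicing problem can be illustrated with a toy series (synthetic numbers, not the actual loss data): a perfectly flat loss record that gets a ~33% step applied from 1980 onward acquires a positive least-squares trend even though nothing real changed.

```python
# Toy illustration (synthetic data, not the real P08/NCEI records):
# a loss series with no true trend, inflated ~33% from 1980 onward to
# mimic splicing two datasets with different accounting methodologies.
years = list(range(1900, 2018))
flat = [100.0] * len(years)                      # no underlying trend
spliced = [v * 1.33 if y >= 1980 else v for v, y in zip(flat, years)]

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

print(f"slope of flat series:    {ols_slope(years, flat):.3f}")    # 0.000
print(f"slope of spliced series: {ols_slope(years, spliced):.3f}") # > 0
```

The spliced series shows a rising trend that is purely an artifact of the methodology change, which is the mechanism Pielke argues contaminates G19.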

So what does this all mean?

It means that G19 has identified trends in hurricane losses that are the result of two datasets being improperly combined. This is why G19 results in trends that are inconsistent with the climatological record of U.S. hurricanes while W18 results in trends that are fully consistent with the climatological record of U.S. hurricanes.

When an analysis of economic loss trends from hurricanes is inconsistent with the climate record, the response should not be to claim that the climate record is flawed, but instead to take a closer look at what biases and errors may have crept into the economic analysis.

Anyone wanting to understand trends in U.S. mainland hurricanes should look at data on U.S. mainland hurricanes, not economic data on losses. Below is the historical record of U.S. hurricanes. So far, 2019 has had two landfalls (the season ends November 30).

The figure below shows landfalls of the strongest storms (major hurricanes, Category 3+); 2019 has had none.

The bottom line here is that a fatally flawed paper on climate science passed peer review at a significant journal. It used a dataset found online that had not undergone peer review, much less any quality control. The flawed conclusions of G19 have been loudly promoted by activist scientists and uncritical media.

The result has been a polluting of our discussions of climate science and policy. I have no doubt that good science will win out in the long run, but if we do not enforce basic standards of research quality along the way, we will make that battle much more difficult than it need be.

Full post

As my analysis a few days ago showed, there are many other concerns about Grinsted’s paper.

Roger’s new findings are utterly damning, and it is hard to see how PNAS can do anything other than withdraw the paper.
