
Collision of Mileage Regulations and Technology

May 31, 2016

Fleet mileage targets for 2022 – 2025 model years must be reviewed in 2018 before they can become final. They were initially augural, i.e., best estimates, when the joint EPA/NHTSA 2017 – 2025 national program was established.

The standards were established jointly by the National Highway Traffic Safety Administration (NHTSA) and the EPA.

The combined passenger car and light truck fleet-wide compliance targets were set at 54.5 mpg and 163 grams of CO2 per mile for 2025.

But the actual rules, which run 577 pages in the Federal Register, are far more complicated.

First, the targets are set based on the vehicle’s footprint, i.e., the wheelbase multiplied by the track width. Smaller cars must meet higher standards.

Next, each manufacturer receives credits for adopting technologies that don’t affect test results, such as active grille shutters, stop-start systems, and high-efficiency lights.

Next, there will be extra credits for selling electric vehicles (EVs), plug-in hybrids (PHEVs), and fuel cell vehicles. For example, the number of EVs sold in 2021 will be multiplied by a factor of 1.5. There will also be air conditioning credits and incentives for natural gas vehicles.
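To illustrate how such multipliers affect compliance, here is a minimal sketch that computes a sales-weighted fleet average with and without a 1.5x credit for EVs. The vehicle figures are invented for illustration, and the sketch assumes the sales-weighted harmonic-mean averaging used for CAFE mpg compliance; the actual EPA rules work in grams of CO2 per mile against footprint-based targets.

# Minimal sketch: sales-weighted harmonic-mean fleet mpg, with an
# illustrative 1.5x multiplier that counts each EV 1.5 times.
# All vehicle figures are invented for illustration only.

fleet = [
    # (model, sales, mpg-equivalent, multiplier)
    ("compact car",  400_000,  42.0, 1.0),
    ("pickup truck", 500_000,  28.0, 1.0),
    ("battery EV",    50_000, 110.0, 1.5),  # assumed mpg-equivalent
]

def fleet_mpg(vehicles, use_multipliers=True):
    """Sales-weighted harmonic mean of mpg across the fleet."""
    total_count = 0.0
    gallons_per_mile = 0.0
    for _, sales, mpg, mult in vehicles:
        count = sales * (mult if use_multipliers else 1.0)
        total_count += count
        gallons_per_mile += count / mpg
    return total_count / gallons_per_mile

print(f"Fleet mpg without multipliers: {fleet_mpg(fleet, False):.1f}")
print(f"Fleet mpg with 1.5x EV credit: {fleet_mpg(fleet, True):.1f}")

Even a small number of multiplied EV sales lifts the computed fleet average, which is the point of the credit.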

Vehicle mileage, October 2007 to April 2016

The above graph, from the Wall Street Journal (WSJ) and the University of Michigan, shows that light-vehicle mileage improved by nearly 6 mpg between 2007 and 2014, but that there has been no improvement since then.

Since mid-2014, the price of gasoline has gone from nearly $4 per gallon to around $2.

The benefit analysis used by NHTSA to justify the regulations was based on gasoline at $3.86 per gallon.

Looming before Americans is the immense increase required by 2025: from 25.2 mpg to 54.5 mpg, or to 46.2 mpg after special credits, such as those for air conditioning, are taken into consideration.

The EPA says the increase can be easily achieved, but Americans will need to decide whether that’s true.

Chart depicting large increase in mpg required by 2025.

The above graph, based on the initial graph from the WSJ, depicts the huge increase in mpg required between now and the 2025 model year.

It will require automobile manufacturers to increase fleet mileage to roughly 183% of today’s level, an increase of about 83%.

Merely duplicating the rate of increase achieved between 2007 and 2014 would result in a fleet mileage requirement of approximately 34 mpg, about 135% of today’s level.

Even more telling, the regulations require a 2025 level more than one-third higher than what merely replicating the improvement achieved between 2007 and 2014 would produce.
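For readers who want to check the arithmetic, the short sketch below reproduces these ratios from the figures used in this article: 25.2 mpg today, 46.2 mpg as the credit-adjusted 2025 target, and roughly 34 mpg if the 2007 to 2014 pace were simply repeated.

# Reproduce the ratios cited above from the article's own figures.
current_mpg = 25.2           # fleet mileage today
target_2025 = 46.2           # 2025 target after special credits
replicated_pace_2025 = 34.0  # approx. level if the 2007-2014 pace simply continued

print(f"Target vs today: {target_2025 / current_mpg:.0%}")                     # ~183% of today's level
print(f"Replicated pace vs today: {replicated_pace_2025 / current_mpg:.0%}")   # ~135% of today's level
print(f"Target vs replicated pace: {target_2025 / replicated_pace_2025 - 1:.0%} higher")  # ~36% higher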

Such herculean efforts seem misplaced when the original reasons for imposing fleet mileage requirements have largely disappeared.

The two main reasons were to decrease oil consumption and reduce CO2 emissions.

Specifically, from the Federal Register:

“Reducing U.S. petroleum imports lowers both the financial and strategic risks caused by potential sudden disruptions in the supply of imported petroleum to the U.S.”

The U.S. now has huge reserves of oil as the result of fracking. Canada can provide additional oil, if needed.

There is no longer any reason to reduce oil consumption so as to protect the country from sudden interruptions in oil supply.

Cutting CO2 emissions hardly seems appropriate given that CO2 probably is not the cause of global warming. The latest CLOUD reports from CERN provide added support to Dr. Svensmark’s hypothesis for sun-induced global warming.

The EPA was quoted as saying just the opposite:

“The standards…function as insurance so that fuel prices don’t stall progress on cutting greenhouse gases.”

This confirms again that the real reason for fuel mileage standards is to cut CO2 emissions.

The EPA’s reasoning is also irrational, since it provides incentives for using natural gas, i.e., methane, even though methane is now a target of the EPA and environmental organizations.

A review of the mileage regulations is now being initiated, and Americans should watch closely during 2017 and 2018 to ensure that fleet mileage regulations don’t impose a job-killing economic penalty.

Besides, with the recent decline in gasoline prices, Americans have voiced their preference for bigger, heavier, higher-horsepower vehicles, as demonstrated by the decline in fuel economy from mid-2014 to today’s 25.2 mpg.

Americans should be free to choose the vehicles they prefer, without government dictating what they can buy.

* * * * * *

Nothing to Fear, Chapter 15, An Alternative Hypothesis, describes why the sun is the far more likely cause of global warming.

Nothing to Fear is available from Amazon and some independent book sellers.

Link to Amazon: http://amzn.to/1miBhXy

Book Cover, Nothing to Fear


© Power For USA, 2010 – 2016. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author, Donn Dears LLC, is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Power For USA with appropriate and specific direction to the original content.

Absurd Cost of California Wind

May 27, 2016

The Pathfinder wind farm project would cost $8 billion, including the construction of a 2,100 MW wind farm in Wyoming, a compressed air energy storage (CAES) facility in Utah, and the transmission lines needed to bring electricity from the Wyoming wind farm to Los Angeles, California.

Map of proposed 500 mile transmission line from Wyoming to Utah, CAES facility.

(Transmission line proposals are still tentative, including a DC line from Utah to Los Angeles.)

Four companies are involved in the project: Pathfinder Renewable Wind Energy, Magnum Energy, Dresser-Rand, and Duke-American Transmission.

The centerpiece of the project is the $1.5 billion CAES storage facility at the existing Intermountain Power Generation site in Utah. It will be ten times larger than the only other CAES facility in the United States, located in Alabama, which is rated at 110 MW.

CAES facilities are expensive and add to the cost of electricity. Obviously, a $1.5 billion CAES facility has to be amortized against the electricity it handles, a cost that will be borne by consumers. CAES is, however, less expensive than batteries, which are not suited to storage requirements this large. Only pumped storage could provide the required amount of storage, but it requires ideal siting, with a lake formed behind a dam, and many environmentalists are opposed to building dams.

Storage of this size would not be necessary if it weren’t for Renewable Portfolio Standards (RPS) requiring the replacement of fossil fuels with renewables such as wind.

The California Independent System Operator (CAISO) Duck curve demonstrates why storage is necessary, especially in California where the state has mandated an RPS of 33% by 2020 and 50% by 2030.

CAISO Duck Curve

The belly of the curve shows how renewables displace conventionally generated electricity, with increasingly large amounts displaced each year and increasingly steep ramping in the evening when the sun sets.

Storage is required to minimize the ramping, which creates huge stresses on the system. In addition, storage is needed to replace wind and solar when the wind doesn’t blow and the sun doesn’t shine, since fossil fuels have been precluded as backup alternatives.

A more complete explanation is in Nothing to Fear.

The Utah location is ideal for a CAES facility, as the existing Intermountain generating facility is located over a large salt deposit that is approximately 10,000 feet thick, which can be excavated hydraulically to create caverns capable of storing large quantities of compressed air.

There would be four caverns, with each cavern being 300 feet in diameter and over 1,000 feet high.
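As a rough sense of scale, treating each cavern as a simple cylinder with the dimensions quoted above gives the following back-of-the-envelope volume (a sketch only; actual cavern geometry will differ).

import math

# Back-of-the-envelope volume of the four salt caverns described above,
# treated as simple cylinders (300 ft diameter, ~1,000 ft tall).
diameter_ft = 300
height_ft = 1_000
caverns = 4

volume_per_cavern = math.pi * (diameter_ft / 2) ** 2 * height_ft
total_volume = caverns * volume_per_cavern

print(f"Per cavern: {volume_per_cavern:,.0f} cubic feet")   # ~70.7 million ft^3
print(f"All four:   {total_volume:,.0f} cubic feet")        # ~283 million ft^3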

The entire project, from start to finish, demonstrates the absurd cost of wind energy.

  • The wind farm is likely to have a capacity factor of 35%, which is better than wind farms built in Eastern states. The entire wind farm would therefore be equivalent to a single 900 MW natural gas combined cycle (NGCC) power plant. The cost of electricity from the wind farm will be around 10 cents per kWh before it’s transmitted to Utah.
  • The CAES facility is not 100% efficient so there will be less electricity shipped to Los Angeles than was received from the wind farm in Wyoming.
  • The amortized cost of the CAES facility must be added to the cost of electricity.
  • There are also transmission line losses, probably twice as large as with traditional fossil fuel power generation, due to the long distances involved.

Interestingly, environmental groups constantly berate line losses from fossil fuel power plants, yet those losses are probably about half the line losses that will be incurred by this wind project. Line losses are typically cited at around 6% for traditional generating plants, which are rarely located more than 500 miles from where the electricity is consumed.

This wind project is shipping electricity about twice that distance, or around 1,000 miles.

  • The approximately $3 billion cost of building the nearly 1,000 miles of transmission lines should also be included. However, a new 900 MW NGCC power plant would also require a new transmission line, but that line would probably be less than 1/3 the length.

All told, the cost of electricity to the consumer from this project could be at least three times higher than from a single NGCC power plant: around 18 cents per kWh for the Wyoming wind versus 6 cents per kWh for an NGCC plant.
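Here is a back-of-the-envelope sketch of that comparison using the figures cited above. The roughly 85% capacity factor assumed for the NGCC plant is an illustrative assumption, not a figure from the project.

# Rough comparison of the Wyoming wind project with a single NGCC plant,
# using the figures cited in this article. The ~85% NGCC capacity factor
# is an assumed illustrative value, not from the article.
wind_capacity_mw = 2_100
wind_capacity_factor = 0.35

ngcc_capacity_mw = 900
ngcc_capacity_factor = 0.85   # assumption

wind_avg_output = wind_capacity_mw * wind_capacity_factor   # ~735 MW average
ngcc_avg_output = ngcc_capacity_mw * ngcc_capacity_factor   # ~765 MW average

wind_cost_per_kwh = 0.18   # delivered, per the article's estimate
ngcc_cost_per_kwh = 0.06

print(f"Average wind output: {wind_avg_output:.0f} MW")
print(f"Average NGCC output: {ngcc_avg_output:.0f} MW")
print(f"Cost ratio, wind vs NGCC: {wind_cost_per_kwh / ngcc_cost_per_kwh:.0f}x")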

In simpler terms:

The entire $8 billion project could be replaced with a single 900 MW NGCC power plant located 100 miles or so from Los Angeles, at one-fifth, or approximately 20%, of the cost of the Pathfinder and Intermountain project.

With a project of this size, and a CAES facility of a size never built before, there is no guarantee that costs won’t escalate.

It should be noted that electricity in California is already twice as expensive as in Louisiana, Arkansas, and Oklahoma, where coal or natural gas is the primary fuel for generating electricity.

This is part of the penalty for trying to eliminate the use of fossil fuels for generating electricity in California.

* * * * * *

Nothing to Fear, Chapter 9, The Utility Death Spiral, explains why displacing fossil fuels with wind and solar will result in the bankruptcy of Utilities and the possible takeover of the industry by the government.

Nothing to Fear is available from Amazon and some independent book sellers.

Link to Amazon: http://amzn.to/1miBhXy

Book Cover, Nothing to Fear


Americans Have Been Warned

May 24, 2016

When tyrants announce their intentions, people are prone to yawn.

Reactions range from, “They don’t mean what they say,” to “It can’t happen here.”

Mein Kampf (1925), The Communist Manifesto (1848), and Rules for Radicals (1971) all explicitly describe plans for imposing a tyrannical form of government on the people.

Book cover for Communist Manifesto

All too often the tyrants mean what they say, and successfully impose their will on the people.

Today, Americans have been warned by another brand of tyranny, flying under the banner of environmentalism.

It’s especially insidious, because all Americans want a clean environment, and want to do what is best for their children and grandchildren.

But we have been warned, as is clearly demonstrated by the Sierra Club.

Three years ago, the Sierra Club declared war on natural gas while in the midst of its campaign to destroy the coal industry.

Its strategy wasn’t merely to stop the use of coal; it was to stop the use of all fossil fuels … and natural gas is an abundant and inexpensive fossil fuel as a result of fracking.

Lena Moffitt runs the Sierra Club’s Beyond Dirty Fuels campaign, and here is what she recently said … a clear warning to all Americans.

“We have moved to a very clear and firm and vehement position of opposing gas. Our board recently passed a policy that we oppose any new gas-fired power plants. We also have a policy opposing fracking on our books.”

This comes on the heels of the destruction of the coal industry, with the resulting loss of possibly 200,000 jobs, depending on the multiplier applied for employees of businesses supporting the coal industry, and with all the families of those now unemployed being affected.

Unapologetic about the effect their efforts have had on families, Ms. Moffitt went on to say:

“We are doing everything we can to bring the same expertise that we brought to taking down the coal industry and coal-fired power in this country to taking on gas in the same way.”

She continued:

“I look forward to seeing the same success brought to taking down gas plants to ensure that we’re actually moving to a 100% clean energy future.” (Emphasis added.)

Can any statement be more clear?

The Sierra Club is willing to destroy industries and throw millions out of work while raising the cost not only of electricity, but of everything else for which natural gas is used.

Prior to fracking, the United States was running out of natural gas, and was preparing to import expensive natural gas from the Mideast.

Thousands of jobs had been lost as fertilizer and other manufacturers dependent on inexpensive natural gas moved their jobs offshore. Some of those jobs are just now returning because of the cheap natural gas that’s now available due to fracking.

Approximately 68 million American homes are heated by natural gas. The cost of heating American homes will increase by approximately 300% if the Sierra Club is successful.

Nothing could be clearer:

Extreme environmentalists are out to destroy the American standard of living.

They are doing this in the name of climate change, when there is insufficient evidence that emissions from natural gas and other fossil fuels are the cause of global warming.

And when it’s also abundantly clear that it’s impossible to cut CO2 emissions 80%, as the extreme environmentalists dictate, without destroying America’s standard of living.

Americans have been warned.

Here is what Investor’s Business Daily had to say:

“These are modern-day Stalinists who will burn down the village to rebuild it. If this group has its way, it will send America back to stone-age energy sources. It will rapidly de-industrialize America, drive up unemployment and make our nation much, much poorer.”

Will China and other developing nations follow suit? Not likely.

Will Americans ignore the warning?

* * * * * *

Nothing to Fear, Chapter 14, An Impossible Objective, explains why it’s impossible to cut CO2 emissions 80% without destroying America’s standard of living.

Nothing to Fear is available from Amazon and some independent book sellers.

Link to Amazon: http://amzn.to/1miBhXy

Book Cover, Nothing to Fear


Regulations Hurt GDP Growth

May 20, 2016

Fracking has increased the availability of low-cost oil and natural gas.

By definition, low-cost oil and natural gas result in an increase in productivity and an increase in GDP.

From the previous article:

∆ GDP = ∆ Population + ∆ Productivity

And:

Productivity = Output / Input

The case for natural gas is straightforward.

Chart of Henry Hub Natural Gas Price from EIA

Prior to the advent of fracking, the United States was running out of natural gas. It was becoming necessary for the United States to build terminals for importing LNG at world market prices, which were linked to the price of oil. The price of natural gas had increased to around $8 per million BTU by 2008, with spikes to even higher levels.

Fracking has resulted in an abundance of natural gas at a current price of around $2 per million BTU, and at times less.

Natural gas has an important effect on the economy, and the lower price means an increase in productivity and GDP. A lower input cost, in the formula for productivity, results in increased productivity.

The case for oil is somewhat convoluted, but fracking has resulted in the United States becoming the swing producer, replacing Saudi Arabia in this regard.

Until the United States developed fracking in conjunction with horizontal drilling, Saudi Arabia controlled prices by either increasing or decreasing production, depending on whether it desired a lower or higher price for oil.

Because of fracking, the price of oil will be capped at a lower price than would otherwise have been the case. If the price of oil increases above a certain point, oil output from fracking will increase, thereby increasing supply and capping the price of oil.

This results in the denominator being lower in the formula for productivity, which results in increased productivity and GDP.
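A simple illustration of the point, using invented numbers: with output held constant, a lower energy input cost shrinks the denominator of the productivity formula and raises the ratio.

# Illustration only: holding output constant, cheaper energy inputs
# shrink the denominator of Productivity = Output / Input.
output_value = 1_000_000           # value of goods produced (illustrative)
other_inputs = 700_000             # labor, materials, etc. (illustrative)

for gas_price in (8.0, 2.0):       # $/MMBtu, roughly pre- and post-fracking
    energy_cost = 50_000 * gas_price / 8.0   # scale an illustrative energy bill
    productivity = output_value / (other_inputs + energy_cost)
    print(f"Gas at ${gas_price:.0f}/MMBtu -> productivity ratio {productivity:.3f}")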

The EPA has reported there are no systemic adverse environmental consequences from fracking, so there are no environmental costs.

Unfortunately, both Democrat candidates for president have vowed to outlaw fracking.

Introducing new regulations or outlawing fracking will result in higher-cost natural gas and oil, which means reduced GDP growth.

As with Obamacare, government intervention will result in lower GDP growth, which hurts all Americans.

Moody’s Analytics estimated that hundreds of thousands of jobs were lost as the result of the oil price crash last year, and that GDP growth was cut by half a percentage point.

Oil prices are now recovering, so these hundreds of thousands of jobs will be restored as fracking and oil production increase in the United States.

But, these jobs will be permanently lost if fracking is banned.

A petition signed by 45 environmental groups urged the President to stop all offshore drilling as well as fracking, so as to limit global warming.

Such a move would hurt all Americans as per capita GDP would be hammered even further.

It’s abundantly clear that government intervention and regulations are stifling growth, and that eliminating government involvement would allow the economy to grow at a faster rate, perhaps at 3.5% again, as it did between 1950 and 2000.

As shown in the previous article, this will make a huge difference in America’s standard of living, as expressed by per capita GDP.

 

* * * * * *

Nothing to Fear, Chapter 9, The Utility Death Spiral, explains why displacing fossil fuels with wind and solar will result in the bankruptcy of Utilities and the possible takeover of the industry by the government.

Nothing to Fear is available from Amazon and some independent book sellers.

Link to Amazon: http://amzn.to/1miBhXy

Book Cover, Nothing to Fear


Impact of Regulations on Growth

May 17, 2016

For the past several years, the United States has been mired in a period of slow growth.

The average rate of GDP growth between 2007 and 2015 was a paltry 1.2%.

This anemic growth compares with an average growth rate of 3.5% between 1950 and 2000.

An excellent article by John Mauldin, at mauldineconomics.com, captures the fundamentals of growth using the formula:

∆ GDP = ∆ Population + ∆ Productivity.

(Delta is the Greek symbol for change.)

His article explored the population issue in depth. It also established that productivity growth had declined from 2.6% between 2000 and 2007 to 1.2% between 2007 and 2015.

This is a very large decline in productivity and affects future per capita GDP, a measure of economic prosperity. Per capita GDP for the United States was $54,630 in 2014, according to the World Bank.

Starting from 2014, the difference between a 3.5% growth rate and a 1.2% growth rate, compounded over roughly two decades, is huge: $116,444 versus $71,023 in per capita GDP, a difference of $45,421.
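The sketch below shows the compounding behind those dollar amounts; starting from the 2014 base of $54,630, the cited figures correspond to roughly 22 years of growth at the two rates.

# Compound per capita GDP from the 2014 base at the two growth rates.
base_2014 = 54_630          # per capita GDP, 2014 (World Bank, per the article)
years = 22                  # the cited dollar figures match ~22 years of compounding

for rate in (0.035, 0.012):
    future = base_2014 * (1 + rate) ** years
    print(f"{rate:.1%} growth for {years} years: ${future:,.0f}")
# ~ $116,400 at 3.5% versus ~ $71,000 at 1.2%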

While some aspects of population, more specifically growth in the size of the workforce, are not under government control, changes in productivity are largely determined by government policies and actions.

World Map of GDP from Wikipedia

(Black >$50,000,  Ranging to Yellow <$2,000)

Therefore, the key to restoring economic growth is to improve productivity by changing government policies, or, in many cases, eliminating government regulations and intervention.

Steve Forbes’ book, Reviving America, shows how government intervention in healthcare, during and since WWII, created the high-cost, inefficient healthcare system of today.

A simplified explanation of productivity is:

Achieving greater output with less input, or output divided by input.

It is, essentially, a measure of the economy’s efficiency.

In Gilbreth’s day, around 1900, providing a worker with the correct-size shovel, or arranging the workplace to require fewer body movements, which Gilbreth measured using therbligs, would result in improved productivity.

The movie Cheaper by the Dozen was based on the lives of the Gilbreths.

Over the last few decades, computers and information technologies provided the impetus for productivity improvements.

Improvements in productivity require investment in new technologies, not making a worker work faster or harder.

Robots, software and the capturing and utilization of data are potential forces for improving productivity.

Investment requires taking risks. If government regulations increase risk, investments in new technologies and products might not be made.

The 2010 Dodd-Frank financial law has increased risk in the financial markets. A new rule under Dodd-Frank would make directors and officers of companies liable for decisions that turn out to be wrong. This alone will cripple investments in new technologies, which by their very nature may be risky.

EPA regulations have put thousands of coal miners out of work, and virtually killed the coal industry. Some may argue that government support for wind and solar, using taxpayer dollars, has created new jobs offsetting lost jobs in the coal industry, but this is debatable.

However, these EPA regulations have hurt productivity.

The end result is that government regulations have shifted the economy from inexpensive and abundant electricity, to unreliable and expensive electricity, which, automatically, by definition, hurts productivity and lowers GDP growth.

Referring to the formula for productivity, output divided by input:

More expensive electricity means a larger denominator, which results in lower productivity and lower economic growth.

This further substantiates that Renewable Portfolio Standards (RPS) requiring the use of wind and solar will hurt GDP growth.

As John H. Cochrane, in his article, Ending America’s Slow-Growth Tailspin, said:

“If it takes years to get the permits to start projects and mountains of paper to hire people, if every step risks a new criminal investigation, people don’t invest, hire or innovate. The U.S. needs simple, common-sense, Adam Smith policies.”

An excellent example of how government regulations can severely harm economic growth is the set of regulations designed to halt fracking.

The next article will explore the effect of government regulations on oil and natural gas.

 

* * * * * *

Nothing to Fear, Chapter 9, The Utility Death Spiral, explains why displacing fossil fuels with wind and solar will result in the bankruptcy of Utilities and the possible takeover of the industry by the government.

Nothing to Fear is available from Amazon and some independent book sellers.

Link to Amazon: http://amzn.to/1miBhXy

Book Cover, Nothing to Fear


Betting Your Future on Computer Models

May 13, 2016

Computer models are a major, if not crucial, tool used by the Intergovernmental Panel on Climate Change (IPCC) to predict global warming and climate change from CO2.

But how accurate are these computer models?

The IPCC has gone so far as to devote an entire page on its website to arguing that the models can be relied on, but it’s clear from recent experience that they are not able to predict future temperatures with any degree of accuracy.

Comparison of multiple IPCC models with observations, by Roy Spencer, UAH

(HadCRUT temperatures are compiled by the Hadley Centre of the UK Met Office.)
(UAH temperatures are compiled by the University of Alabama, Huntsville.)

The HadCRUT and UAH data points are actual temperatures, and they are following the most conservative of the IPCC models. They are also far below the average of all IPCC model results.

Temperatures are actually rising far more slowly than the computer models are predicting.

Yet, it’s the average of all models that is being used to establish the need for cutting CO2 emissions. Some of the more radical environmentalists are using the higher and more extreme forecasts when claiming the need to cut CO2 emissions to prevent a climate catastrophe.

The above chart by Roy Spencer was criticized by environmental groups for various reasons, but the following chart substantiates its findings.

This chart, composed by Energy Matters, takes the temperature-forecast chart from the IPCC’s First Assessment Report (FAR) of 1990 and superimposes actual temperatures.

Comparison of the IPCC FAR (1990) temperature forecasts with HadCRUT4. HadCRUT4 data was downloaded from WoodForTrees and annual averages calculated. Chart from Energy Matters

(Note that the first chart extends to 2030, while the second extends to 2100.)

Both demonstrate that actual temperatures are following the most conservative computer projections. Note also, that actual temperatures have risen only 1 degree C over the past 160 years.

The wide divergence in results from the various computer models makes their results highly suspect.

But we are being asked to make huge sacrifices in our standard of living so as to cut CO2 emissions 80% by 2050, when the reality is that the computer models are not providing good evidence that CO2 will cause a drastic increase in temperatures.

We are also being asked to have a carbon tax applied to the use of fossil fuels.

Once again, computer programs are being used to demonstrate that a carbon tax will curtail CO2 emissions and prevent temperatures from rising dangerously high.

A recent Wall Street Journal (WSJ) article cited computer simulations showing it would be necessary to impose a carbon tax of $425, in today’s dollars, to prevent temperatures from rising over 2 degrees C, the supposed tipping point. According to the article, such a tax would reduce GDP by 5 to 10%.

But even staunch supporters of the need to cut CO2 emissions disagree.

From the WSJ article, “‘The models are biased on the pessimistic side,’ said Joe Romm, senior fellow at the left-leaning think tank Center for American Progress.”

Computers are marvelous tools, and have helped the United States improve its productivity, but computers are subject to human errors, such as developing inaccurate algorithms and using inaccurate data.

We shouldn’t be betting our future on computer models that are subject to the infamous GIGO, garbage in, garbage out.

Especially when their output appears to be wrong, with actual temperature rise far below the projections of the computer models.

* * * * * *

Nothing to Fear, Chapter 4, Why The CO2 Hypothesis is Wrong, explains how atmospheric CO2 has not affected temperatures, using an IPCC chart showing that CO2 remained constant at 280 ppm for the 2,000 years prior to 1850.

Nothing to Fear is available from Amazon and some independent book sellers.

Link to Amazon: http://amzn.to/1miBhXy

Book Cover, Nothing to Fear


Data Everywhere, But Not a Byte to Use

May 10, 2016

There is considerable discussion about data and how it can improve energy efficiency and reduce the use of electricity.

Frequently, it’s about capturing data on energy usage in the home so as to reduce the use of electricity and cut CO2 emissions. Magazines, such as Intelligent Utility and EnergyBiz, have frequent articles on the subject, but they focus on the limited concept of reducing the use of electricity to cut CO2 emissions.

The only technology that can significantly reduce the use of electricity is the LED, in homes, in industry, and in commercial buildings. See Only LEDs Can Significantly Cut Electricity Usage.

Using data to know when to turn on the dishwasher is not the real opportunity.

Controlling home appliances and other equipment has frequently been referred to as the Internet of Things, but it is too narrow a concept.

The importance of data isn’t that it might reduce the use of electricity. The real opportunity lies in improving quality, reliability and lowering costs … for both companies and customers.

General Electric Company appears to have embraced this concept faster than most companies.

GE Company, Schenectady, Edison Avenue entrance. Photo by D. Dears

Other companies also recognize the importance of data in the digital world, but GE has invested in the necessary software engineers, located in San Ramon, California, and has organized around a concept, which it calls GE Digital.

For example, Baker Hughes released its “FieldPulse model-based, predictive analytics software, enabling operators to optimize production by giving them a clear understanding of an asset’s performance in real time.”

In one sense, GE has an advantage that some other companies lack, in that it has multiple products and installations that can benefit from analyzing data.

But what is involved in making use of digital data?

The initial step is to be able to install sensors in the equipment that’s being manufactured and installed, or in assets, such as pipelines. It can be a sensor to measure stresses on turbine blades and buckets, or to measure flow rates of fuel pumps, etc.

Sensors can be installed on nearly any component or piece of equipment to capture data on stresses, temperatures, flow rates, and almost anything that moves, bends, stretches, or gives off a signal. Data can also be captured on what people are doing, such as the time a truck driver spends standing idle due to a breakdown somewhere else in the material-movement system.

The next step is to capture the data, which can amount to trillions of bytes that seem unrelated to each other, and then to store it while making it accessible to various programs. Use of the cloud allows for this interoperability and for analysis of the data using multiple algorithms.

The next, and probably most difficult, activity is to understand how the data can be used: to forecast problems, predict when equipment should be maintained and what type of maintenance is required, determine how parts can be designed for longer life, improve equipment scheduling, improve operator and machine interfaces, plus a thousand other attributes that can result in improved quality, reliability, and lower costs.

And this requires developing algorithms and the analytics that will utilize the data to achieve the desired results.
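As a toy illustration of what such an analytic might look like, the sketch below flags equipment for maintenance when the rolling average of a sensor reading drifts outside a control band. It is a minimal, invented example of the kind of algorithm described here, not a description of GE’s or anyone else’s actual software.

from collections import deque

def drift_monitor(readings, window=20, baseline=75.0, tolerance=5.0):
    """Flag readings whose rolling average drifts outside baseline +/- tolerance.

    A toy stand-in for the predictive-maintenance analytics described above:
    readings could be bearing temperatures, vibration levels, flow rates, etc.
    """
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        recent.append(value)
        avg = sum(recent) / len(recent)
        if len(recent) == window and abs(avg - baseline) > tolerance:
            alerts.append((i, avg))
    return alerts

# Simulated temperature sensor: normal around 75, then a slow upward drift.
simulated = [75.0 + 0.02 * i for i in range(200)] + [79.0 + 0.1 * i for i in range(100)]
for index, rolling_avg in drift_monitor(simulated)[:3]:
    print(f"Maintenance flag at sample {index}: rolling average {rolling_avg:.1f}")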

Finally, compare this vision of digital data utilization with using LEDs for street lighting to reduce the use of electricity.

LED street lights can reduce the use of electricity by 80 to 90%, while using data will result in only small, though still cost effective, reductions in the use of electricity.

Using data to reduce the use of electricity has a tiny payoff, such as knowing when to turn off the LED streetlights … or when to run the dishwasher or clothes dryer.

Using data to cut electricity usage is a myopic view of the importance of digital data.

Utilities have major opportunities for using data to improve reliability and control costs.

This is why using data is important, not merely to reduce the use of electricity to cut CO2 emissions.

Sensors on transformers, reclosers, capacitors, breakers, and regulators, for example, measuring temperatures, surges, voltage drops, and impedances, along with other data captured from the distribution system, can help isolate problems more quickly and reduce downtime for homeowners and businesses, while also reducing costs.

Smart City and Smart Grid are buzzwords that imply the use of data but ignore the hard work and sophistication required to achieve improvements in quality and reliability and to lower costs. A smart meter, for example, isn’t the key to a smart grid: at best, it’s merely a sensor.

Reducing the use of electricity so as to cut CO2 emissions is the wrong objective for capturing and using data.

The big payoff will come from using digital data to improve quality, reliability and costs … for manufacturers, utilities, drillers, airlines, etc., and customers.

* * * * * *

Nothing to Fear explains why attempting to cut CO2 is a fool’s errand.

Nothing to Fear is available from Amazon and some independent book sellers.

Link to Amazon: http://amzn.to/1miBhXy

Book Cover, Nothing to Fear

