Monday, August 1, 2016

Rancid wine in cracked bottles: The Tim Ball story

Tim Ball is spinning his wheels at WUWT, pushing this long-discredited, mendacious presentation of Greenland ice core data:

This deception was previously employed by Don Easterbrook:

. . . regarding which, Skeptical Science has already pointed out the rather glaring denier falsehood:
Easterbrook plots the temperature data from the GISP2 core, as archived here. Easterbrook defines “present” as the year 2000. However, the GISP2 “present” follows a common paleoclimate convention and is actually 1950. The first data point in the file is at 95 years BP. This would make 95 years BP 1855 — a full 155 years ago, long before any other global temperature record shows any modern warming. In order to make absolutely sure of my dates, I emailed Richard Alley, and he confirmed that the GISP2 “present” is 1950, and that the most recent temperature in the GISP2 series is therefore 1855. . . .

Unfortunately for Don, the first data point in the temperature series he’s relying on is not from the “top of the core”, it’s from layers dated to 1855. The reason is straightforward enough — it takes decades for snow to consolidate into ice.
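For anyone who wants to check the conversion themselves, it is one line of arithmetic. A minimal sketch (the 95-years-BP figure is the first entry in the archived GISP2 temperature file):

```python
# GISP2 ages follow the paleoclimate "before present" (BP) convention,
# in which "present" is fixed at 1950, not the year you read the data.
GISP2_PRESENT = 1950

def bp_to_calendar_year(age_bp):
    """Convert an age in years BP to a calendar year."""
    return GISP2_PRESENT - age_bp

# The first (youngest) data point in the archived temperature series
# is at 95 years BP:
print(bp_to_calendar_year(95))  # -> 1855, not 2000, and not "today"
```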
Of course both Easterbrook and now Ball are also playing that ever-popular denier game of pretending one regional temperature record can stand as a proxy for global temperatures. But what really elevates the nonsense into immortality is treating 1855 temperatures as modern-day temperatures, and denying global warming on that basis.

Have the razor-sharp critical intellects at WUWT picked up on this rather obvious deception? Not in the first hundred comments, which are mostly deniers arguing about whether warming is caused by a 100,000-year cycle (rather than CO2) or a 21,000-year cycle (rather than CO2). Tom Dayton points out the obvious, but "AndyG55" is ready with his unanswerable comeback: Michael Mann is a stupidhead LOL:


 Steve McIntyre splits the baby in his own inimitable style, warning deniers away from the data set without stating the obvious point that YOU CAN'T REFUTE GLOBAL WARMING WITH A RECORD THAT STOPS IN 1855.


Other than that, it's the same old nonsense. Someone introduces the idea of a 41,000-year cycle, so they chase that tennis ball for a while. Then, in response to some mealy-mouthed word salad by Easterbrook himself, Nick Stokes, who is somehow not banned from the monkey house, owns him and Steve McIntyre in one go:

. . . and that actually shuts up these venerable patriarchs of weaponized ignorance. Bravo, Nick. Bravo.

Thursday, June 2, 2016

We can do 100% renewables. But we probably shouldn't.


Peter Sinclair has a post up taunting "renewable haters," who are invited to be embarrassed that nuclear plants, under pressure from cheap natural gas, may require public money to stay in operation. Following hard on the heels of that, Exelon has announced the shuttering of the Clinton and Quad Cities nuclear plants: 3GW of near-zero-carbon energy gone for want of $110 million in subsidy per year (the combined losses of the two plants in the current market).

As an enthusiastic taunter of those I feel deserve it, I know the people Sinclair is talking about: people who position nuclear as the honest, workaday, practical solution, whereas renewables are impractical fairy dust, a con sustained by massive public money. Which is ridiculous on all counts: nuclear energy has always required public support, with the government providing most of the R&D, permanent waste disposal at bargain prices (how's that coming, guys?), loan guarantees, even free insurance against the possibility of a meltdown. Meanwhile wind has reached 5% of US electricity production, and sunny countries and regions, such as Jordan, are finding solar energy profitable without subsidies as prices for modules continue to fall.

But the vices of nuclear advocates should not be confused with the virtues of nuclear energy. And just because we can build a 100% RE grid, does not mean we should.

Looking out into the world today, it is obviously imperative to get human civilization to net zero or net negative GHG emissions as soon as possible. Every year, every month that we don't pushes us further into the heart of a global disaster.

Renewables require careful load-balancing across large areas, storage, and dynamic demand management to begin to approach 100% of the energy supply. Contrariwise, every 1% of baseload power you add makes the intermittent load easier to manage and cheaper overall. Science of Doom has a great post on the math here, and it's worth quoting his conclusion at some length:

What is the critical problem? Given that storage is extremely expensive, and given the intermittent nature of renewables with the worst week of low sun and low wind in a given region – how do you actually make it work? Because yes, there is a barrier to making a 100% renewable network operate reliably. It’s not technical, as such, not if you have infinite money.
It should be crystal clear that if you need 500GW of average supply to run the US you can’t just build 500GW of “nameplate” renewable capacity. And you can’t just build 500GW / capacity factor of renewable capacity (e.g. if we required 500GW just from wind we would build something like 1.2-1.5TW due to the 30-40% capacity factor of wind) and just add “affordable storage”.
So, there is no technical barrier to powering the entire US from a renewable grid with lots of storage. Probably $50TR will be enough for the storage. Or forget the storage and just build 10x the nameplate of wind farms and have a transmission grid of 500GW around the entire country. Probably the 5TW of wind farms will only cost $5TR and the redundant transmission grid will only cost $20TR – so that’s only $25TR.
Hopefully, the point is clear. It’s a different story from dispatchable conventional generation. Adding up the possible total energy from wind and solar is step 1 and that’s been done multiple times. The critical item, missing from many papers, is to actually analyze the demand and supply options with respect to a time series and find out what is missing. And find some sensible mix of generation and storage (and transmission, although that was not analyzed in this paper) that matches supply and demand.
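For a feel of what that time-series bookkeeping looks like, here is a minimal sketch in Python. The demand and wind series are made-up placeholders (hourly resolution, a ~33% mean capacity factor); the point is the accounting, not the particular numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
HOURS = 8760  # one year at hourly resolution

# Placeholder series: ~500 GW average demand; hourly wind output as a
# fraction of nameplate, averaging ~33% with plenty of low-output hours.
demand = 500 + 50 * rng.standard_normal(HOURS)   # GW
cf = np.clip(rng.beta(2, 4, HOURS), 0, 1)        # fraction of nameplate

# Naive sizing: build enough nameplate that AVERAGE supply equals
# average demand, with no storage.
nameplate = demand.mean() / cf.mean()            # GW
supply = nameplate * cf

shortfall = np.maximum(demand - supply, 0)
print(f"nameplate built:  {nameplate:,.0f} GW")
print(f"hours in deficit: {(shortfall > 0).sum():,} of {HOURS:,}")
print(f"worst single gap: {shortfall.max():,.0f} GW")
print(f"energy unserved:  {shortfall.sum() / 1000:,.0f} TWh")
```

Matching average supply to average demand still leaves roughly half the hours in deficit; that residual is what storage, overbuild, extra transmission, or dispatchable plants have to cover, and it is where the cost blows up.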

What's more, baseload renewable sources such as geothermal, hydroelectric dams, and tidal power all require large areas with appropriate geography (and geology) to be successful. Geothermal and tidal power are starting from an extremely small base, while hydroelectric dams (which have significant environmental costs of their own) are already close to their saturation point.

Compare the Exelon plants, Clinton and Quad Cities. Their combined capacity is 3GW, which at the industry-standard 0.9 capacity factor works out to roughly 24,000 GWh per year. Those two plants, alone, produce more electricity than all the geothermal plants in the nation, combined. They produce more clean energy than all the utility solar plants in the nation, combined. Keeping that for a tiny subsidy of $100-150 million a year would be a bargain: it comes to about $0.005/kWh. We could subsidize our entire electrical grid to that extent and spend a small fraction of 1% of GDP.
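The arithmetic, for anyone who wants to check it (a quick sketch using the figures above):

```python
capacity_gw = 3.0        # Clinton + Quad Cities combined
capacity_factor = 0.9    # industry-standard for US nuclear
hours = 8760             # hours in a year

gwh_per_year = capacity_gw * capacity_factor * hours
subsidy_per_kwh = 110e6 / (gwh_per_year * 1e6)   # GWh -> kWh

print(f"{gwh_per_year:,.0f} GWh per year")       # ~23,650 GWh
print(f"${subsidy_per_kwh:.4f} per kWh")         # ~$0.005/kWh
```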

Nuclear energy is, by far, the largest source of low-carbon energy in the United States. Doubling or tripling our capacity could be done easily with the political will to do so. At a bare minimum, we should be maintaining the plants we have to the end of their useful life. Subsidies aren't a dirty word here. At least until we have a comprehensive carbon tax, all low-carbon energy will require subsidies or unfunded mandates, including wind and solar, especially once they reach a scale where their fluctuations necessitate storage.

Different countries and regions with different resources, relationships, and geography are going to need different mixes of sources to get to net zero. Ruling out either more RE or more nuclear seems irresponsible to me.

Thursday, May 5, 2016

SteveF makes a hash of climate sensitivity; I propose a solution



Over at the Blackboard they are hawking a “heat balance based empirical estimate of climate sensitivity” which delightfully uses the IPCC's own numbers to show that climate sensitivity has got to be low!!! OMG!!! 
I will show how the IPCC AR5 estimate of human forcing (and its uncertainty) leads to an empirical probability density function for climate sensitivity with relatively “long tail”, but with a most probable and median values near the low end of the IPCC ‘likely range’ of 1.5C to 4.5C per doubling.
And how is Steve going to do that? Well, he's going to give us both barrels of the lukewarmer shotgun: oversimplification and argument from incredulity.

Let's leave aside for today the argument from incredulity (in which his own method produces a fat tail of dangerously high climate sensitivities, and he says, basically, "but that can't be true, because hand-waving") and look at the oversimplification at the heart of SteveF's method for estimating climate sensitivity.
We can translate any net forcing value to a corresponding estimate of effective climate sensitivity via a simple heat balance if:
1) We know how much the average surface temperature has warmed since the start of significant human forcing.
2) We assume how much of that warming has been due to human forcing (as opposed to natural variation).
3) We know how much of the net forcing is being currently accumulated on Earth as added heat.
The Effective Sensitivity, in degrees per watt per sq meter, is given by:
ES = ΔT/(F – A)      (eq. 1)
That is fairly close to true, leaving aside changes in the albedo of the earth over time, and problems with taking the average temperature, which I discuss below. The chief problem is that he doesn't know any of those things with sufficient accuracy to constrain climate sensitivity in any meaningful way.
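To see how badly those uncertainties bite, here is a minimal Monte Carlo sketch of eq. 1. The input distributions are illustrative stand-ins loosely in the neighborhood of AR5-era values, not SteveF's actual choices:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative stand-in distributions (NOT SteveF's actual inputs):
dT = rng.normal(0.9, 0.1, N)   # warming attributed to humans, C
F  = rng.normal(2.3, 0.5, N)   # net anthropogenic forcing, W/m^2
A  = rng.normal(0.6, 0.2, N)   # heat currently accumulating, W/m^2

net = F - A
keep = net > 0.1                    # drop the near-singular draws
ecs = 3.7 * dT[keep] / net[keep]    # eq. 1, scaled by ~3.7 W/m^2 per doubling

print(f"median ECS: {np.median(ecs):.1f} C")
print(f"5th-95th:   {np.percentile(ecs, 5):.1f} to {np.percentile(ecs, 95):.1f} C")
```

Note what the division does: modest, symmetric uncertainty in (F – A) comes out the other side as a wildly asymmetric distribution for sensitivity, which is exactly the fat tail Steve then waves away.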

Climate scientists who do actual work with the climate are doing a fine job of reducing the uncertainty of these numbers, but in every case, the opening move is always to average available measurements over a period of time. But a heat balance model by definition requires accurate accounting of how much heat is coming in and how much is going out, and those numbers are changing over time.

How much has the average surface temperature warmed since the start of significant anthropogenic forcing? Right now, the answer is about 1.5C (not 0.9C, as Steve estimates). El Nino will subside and that number will (temporarily) fall, but that is beside the point: if you are using present-day forcings, then you have to use present-day temperatures.

The same goes for ocean heat uptake: if you are going to compare that to surface temperatures, you need to know what the uptake was at the moment when you took values for the forcings and the total warming. That's tricky, because we know ocean heat uptake varies significantly over time. It's lower than usual right now, because of El Nino, but how low?

If you are going to use ocean heat uptake averaged over X number of years (which to my understanding is basically mandatory to get any kind of an accurate number), then you also have to average the net forcing over those same years, and you also have to average the warming compared to preindustrial over those years.

But simple averaging still will not work, because heat loss varies with temperature to the fourth power. An average warming of +0.9C could reflect a steady, linear increase, or a long period of flat temperatures followed by a prolonged spike in temps to +4C. The latter will radiate more heat into space than the former. The average temperature is the same, but the heat balance is not the same. So we had better stick to instants of time if this method is to have any hope of delivering accurate results.
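A toy calculation with the Stefan-Boltzmann law shows the effect (treating the surface as a simple blackbody, which is a gross simplification, but the nonlinearity is the point):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T0 = 288.0         # rough global mean surface temperature, K

def mean_emission(temps):
    """Average blackbody emission over a temperature series (K)."""
    return sum(SIGMA * t**4 for t in temps) / len(temps)

ramp  = [T0 + 0.1 * i for i in range(10)]    # steady rise, mean anomaly +0.45
spike = [T0] * 8 + [T0 + 0.5, T0 + 4.0]      # flat then a spike, mean also +0.45

print(f"ramp:  {mean_emission(ramp):.3f} W/m^2")
print(f"spike: {mean_emission(spike):.3f} W/m^2  (identical mean temperature)")
```

The gap is only a few hundredths of a W/m^2 at these magnitudes, but the direction is systematic: the spiky series radiates more at the same mean temperature, and averaging first quietly assumes that difference away.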

To reiterate: for an estimate of heat balance to give you climate sensitivity, you need the heat balance AT THAT MOMENT, not averaged over time. The amount of heat the oceans were absorbing ten or twenty years ago can only be compared to the temperature ten or twenty years ago, and the net forcing of ten or twenty years ago. If you are comparing the average heat uptake by the oceans since 1993 with the last 13 years of temperatures and forcing estimates from a moment in 2010, you are comparing apples and oranges. All of these things change over time, and to use the relationship between them to estimate climate sensitivity, only contemporaneous estimates can hope to be valid.

Consider a building occupied by an unknown number of people. You want to know how many people are inside. If you know how many people were in the building at midnight Tuesday, how many entered on Wednesday, and how many left on Wednesday, you know how many are in the building when Thursday dawns (assuming no births or deaths, nerds).

On the other hand, knowing how many people started in the building, the number who left the building on Sunday, and the average number of people entering the building each day over the last year, doesn’t help you a hell of a lot. But this is what Steve has tried to do with his "heat balance based empirical estimate of climate sensitivity."

For this to work, do you need to use the present moment? No, you do not. In fact, there may be excellent reasons to use some instant in the past, such as the benefit of hindsight in estimating warming or forcings or the presence of an exceptional event (like a large volcanic eruption) that lets you really test some of your theories about forcings and heat balance and temperature.

In other words, you need a series of “moments,” for which you estimate the levels of various forcings and the heat uptake of the oceans. Then you can predict what the temperature “should” be, based upon the inputs, and compare it to what the temperature was (and is.)

Since resolving the state of the climate in each of these moments is a fairly powerful method of determining the inputs for the next “moment” (and whether it can do so, compared against historical observations, is a good test of its accuracy), we might want to calculate a series of moments, each derived from the one before, based upon our best estimates of forcings, ocean heat uptake, and the like.

Since everything we said about average temperatures over time could also be said about temperatures averaged over the global surface (i.e., that different local temperatures can yield the same average, but different heat loss), we had better break the earth into boxes, or "cells," and calculate temperature, heat uptake, heat loss, etc., for each cell.
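Put those pieces together and the calculation takes a shape like the following toy sketch (placeholder numbers, no heat transport between cells, purely to show the structure):

```python
import numpy as np

SIGMA = 5.670e-8                           # Stefan-Boltzmann constant
DAY = 86400.0                              # time step, seconds

temps = np.full((36, 72), 288.0)           # 5-degree cells, temperature in K
absorbed = np.full(temps.shape, 390.0)     # W/m^2 absorbed, placeholder forcing
heat_capacity = 1.0e8                      # J/m^2/K per cell, placeholder

for day in range(365):                     # a year of daily "moments"
    emitted = SIGMA * temps**4             # outgoing longwave, per cell
    temps += (absorbed - emitted) * DAY / heat_capacity
    # (a real model would also move heat between cells and into the ocean)

print(f"global mean after one year: {temps.mean():.2f} K")
```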

Not all the same color
 

With these modest adjustments the calculations will have a much better chance of doing what Steve wants them to do, which is to take measurements of the climate system, account for ocean heat storage, and estimate climate sensitivity.

I call it a climate model. Trademark pending.

Wednesday, April 27, 2016

Punch-drunk with warming, WUWT branches out to being wrong about Palestine

2014 was the warmest year on record. Until 2015. 2015 looks to hold the title until 2016 goes in the books. Massive coral bleaching, accelerated sea level rise. To paraphrase Cedric Coleman, it's hard out there for a denier.

Eric Worrall of the monkey house tries to paper over the gaping hole in the movement's foundation with a "jazz hands" routine about Mahmoud Abbas, who took the opportunity presented by a UN signing ceremony for a climate accord to call out Israeli settlements in the occupied territories:
"The Israeli occupation is destroying the climate in Palestine and the Israeli settlements are destroying nature in Palestine," Abbas told the gathering of 175 countries signing a landmark climate deal.
Eric chooses to pretend that Abbas claimed Israeli settlements cause global warming:
 Regardless of your position on the Israel / Palestine situation, suggesting that Israeli settlements contribute significantly to global warming is utterly implausible.
Which of course is probably why he didn't say that. He said the occupation was damaging the environment in Palestine, which it most certainly is. Israelis have established over 200 illegal settlements in the West Bank alone. These communities are sited to control resources and strategic points and to restrict the development of Palestinian communities. They are not established via any rational planning process, which would certainly not scatter hundreds of tiny towns, some just campers and trailers running off diesel generators, over thousands of square kilometers. This is indeed destructive to the environment in the West Bank.

Saturday, March 12, 2016

Lucia's sadly selective statistical showpersonship

Lucia Liljegren is spending the twilight of lukewarmism as a non-laughable position mostly posting recipes and notifying her followers of the arrival of major holidays (three of her last ten posts). But 'twas not always thus. During the "hiatus," Lucia was fond of comparing the IPCC's multi-model mean with global temperatures, despite the fact that these were models of climate, not weather forecasts, and that patient people had explained to her over and over that "about 0.2C/decade over the next several decades" was not a prediction one could falsify based on a few years of data.

She liked making this mistake so much, she did it again and again and again (and again and again…and again.) But this all stopped rather abruptly in November of 2014, which funnily enough was exactly the time El Nino came to a stuttering start after an unprecedented 50-month absence:


As a result, every month after November 2014 (anomaly 0.68C) has been hotter (anomaly-wise) than November 2014 itself:

And yet despite the rather dramatic turn in the data, and despite the fact that she liked making this comparison well enough to make it over and over again with different temperature records and updates to the present, she never updated her final graph, which looked like this:


Why did Dr Liljegren suddenly lose interest in this exercise? Why the statistical-torture hiatus? We may never know, but said graph with more recent GISTEMP measurements superimposed looks like this:



It's a mystery, really.

UPDATE: MartinM has better graph-fu than I and has updated lucia's graph of the multi-model mean vs the 13-month mean.


Monday, January 4, 2016

Announcement of new elements allows WUWT dittoheads to flex their ignorance

Every now and again, Anthony likes to cut-n-paste a science press release, to try and delude himself and his readers into thinking his website has something to do with science other than whining about it. Unfortunately, changing the subject merely serves to illustrate the pathetic science illiteracy and reactionary politics of his little band.

Recently actual scientists were able to synthesize four new elements, creating materials never seen on earth before. And Anthony, "citizen scientist," was able to copy their press release!

This should be easy. Not much is required of the commenters by way of response here. Scientists toiled for years in obscurity, and today they leave their mark on history. Yay science! Unfortunately, this simple PR exercise is beyond the ken of Tony's tinfoil hat brigade.

Needless to say, "just get past…around 110" is not anyone's idea of how to synthesize stable superheavy elements, the search for which is still very much a thing. But perhaps I am getting sidetracked from "JPS"'s main point, which seems to be: Scientists were wrong about an island of stability (wrong) so global warming is a lie!

Multiple dim bulbs simply reject the idea that anything has been discovered. It's just another hoax!



"Mark" one goes off on a weird tangent about element names, but "Mark" two stands ready to pull the discussion back to what the site is all about:

Having belittled homosexuals, the tinfoil hats decide it's time to bring up slavery and the New World Order:




I'd like to reiterate: this is a puff piece about a feel-good story about the discovery of new elements. But these deniers can't get through a simple press release on a totally non-climate-related subject without devolving into an anti-intellectual, homophobic, paranoid cri de coeur. It does not make one hopeful for their output over the course of the rest of this, an election year.

Monday, December 14, 2015

GISTEMP November: +1.05C

h/t Gavin Schmidt, via Twitter


Another month, another record. 2015 is now all but certain to go into the books as the hottest ever, claiming the title from 2014(!)

We are in the grip of a strong El Nino, so the coming years will likely see some regression towards the mean. I fully expect to see a "no warming since 2015" denier talking point by mid-2017 at the latest.

Beneath the noise, the world will continue to warm.