Tuesday, August 23, 2016

NOAA's Climate Explorer fools climate quack Bob Tisdale at WUWT

Bob Tisdale has just discovered a terrific new NOAA web tool that is designed to help communities in the USA adapt to climate change (archived here). Naturally enough, Bob doesn't bother to find out the first thing about the tool or the data that underpins it. Instead he all but accuses NOAA of fraud and fakery in his usual "climate hoax" conspiratorial manner.

Climate Explorer - for the USA


First let's look at what the collaborative effort (NOAA plus others) is offering planners and communities in the USA. It's called The Climate Explorer. You can choose a city and see what may happen to your temperature and rainfall over time, under different scenarios. There are two scenarios: high emissions and low emissions. If you choose Chico, Butte County, California, you'll see the following options, each with further options:
  • temperature
  • precipitation
  • other: heating degree days and cooling degree days.

For temperature, you can see the observed temperature and compare it to the hindcast of the downscaled models, as well as view projected changes. Here's an image, which I've annotated with arrows, showing the flexibility of the web tool:


You can select or deselect items at the bottom of the chart, change the timescale by moving the bar below it, download the data, and copy the image.

It's very neat. What's immediately obvious (or should be) is that there was a huge amount of effort that went into developing this. I'm not just talking about the user interface. That in itself would have been a big job. It is very attractive, sleek and slick, and user friendly. No, what I'm referring to is the amount of effort required to downscale models and get data on a grid size that is useful for cities and regions.



This information isn't hard to find. At the top right hand corner of the front page is a link to "About Climate Explorer". Early in that article there's a clue to the underlying data. It states:
Users can compare graphs of observed conditions to climate model simulations for the same period: in this case, climate models were initialized to reflect conditions in 1950, and then run through 2005. Comparing the range of observations against the simulations for this period can provide insights on the models’ collective ability to capture the range of observed variability for each climate variable. In some cases, the simulations and observations show a good match. In other cases, these comparisons may reveal consistent biases or limitations of the models.

Further down the same page it briefly describes how the modeled data was developed and links to the temperature and precipitation data.


Why downscale General Circulation Models (GCMs)?


In the top right hand corner there's also a link to Definitions and FAQ, which has a lot more information in (mostly) non-technical language - or at least language that a reasonably well-educated person should be able to follow. If you follow the links from the NEX-DCP30 definition, you'll find out how to access the downscaled data. You'll also find a link to a technical note. Here's an excerpt, explaining why scientists go to the trouble of downscaling GCM outputs:
The demand for downscaling of GCM outputs arises from two primary limitations inherent with current global simulation results. First, most GCMs are run using relatively coarse resolution grids (e.g., a few degrees or 10² km), which limit their ability to capture the spatial details in climate patterns that are often required or desired in regional or local analyses. Second, even the most advanced GCMs may produce projections that are globally accurate but locally biased in their statistical characteristics (i.e., mean, variance, etc.) when compared with observations.

The Bias-Correction Spatial Disaggregation (BCSD) method used in generating the NEX USDCP30 dataset is a statistical downscaling algorithm specifically developed to address these current limitations of global GCM outputs [Wood et al. 2002; Wood et al. 2004; Maurer et al. 2008]. The algorithm compares the GCM outputs with corresponding climate observations over a common period and uses information derived from the comparison to adjust future climate projections so that they are (progressively) more consistent with the historical climate records and, presumably, more realistic for the spatial domain of interest. The algorithm also utilizes the spatial detail provided by observationally-derived datasets to interpolate the GCM outputs to higher-resolution grids. 
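
To give a feel for what the bias-correction half of that means in practice, here's a bare-bones sketch of empirical quantile mapping - the general kind of adjustment BCSD builds on - which maps each model value onto the observed distribution at the same quantile, using a common historical period as the reference. The numbers and arrays below are synthetic and purely illustrative; this is not NOAA's code, and the real processing also deals with the long-term trend separately (see the dot points further down).

```python
import numpy as np

def quantile_map(gcm_hist, obs_hist, gcm_future):
    """Empirical quantile mapping: find where each future model value sits
    in the model's own historical distribution, then read off the observed
    value at that same quantile."""
    ranks = np.searchsorted(np.sort(gcm_hist), gcm_future) / len(gcm_hist)
    ranks = np.clip(ranks, 0.0, 1.0)
    return np.quantile(obs_hist, ranks)

rng = np.random.default_rng(0)
obs_hist = rng.normal(15.0, 2.0, 600)     # "observed" monthly means, deg C (synthetic)
gcm_hist = rng.normal(13.5, 3.0, 600)     # model run that is cool-biased and too variable
gcm_future = rng.normal(16.5, 3.0, 600)   # raw model output for a future period

corrected = quantile_map(gcm_hist, obs_hist, gcm_future)
print(f"raw future mean:       {gcm_future.mean():.2f} C")
print(f"corrected future mean: {corrected.mean():.2f} C")
```

The corrected series takes on the statistical character of the observations over the overlap period while keeping the model's relative change, which is the point of the exercise.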

If you do want to use all the data, be warned. You may want a powerful computer, lots of bandwidth, and you'll certainly want more than Microsoft Excel. The authors of the technical note stated (my emphasis):
With the help of the computational resources provided by NEX and the NASA Advanced Supercomputing (NAS) facility, we have applied the BCSD method to produce a complete dataset of downscaled CMIP5 climate projections to facilitate the assessment of climate change impacts in the United States. The dataset compiles over 100 climate projections from 34 CMIP5 GCMs (Table 1) and four RCP scenarios (as available) for the period from 2006 to 2100, as well as the historical experiment for each model for the period from 1950-2005. Each of these climate projections is downscaled over the coterminous US at a spatial resolution of 30 arc-seconds (approximately 800 meters), resulting in a data archive size of more than 12TB (1TB = 10¹² Bytes).
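
As an aside, the "30 arc-seconds is approximately 800 meters" figure is easy to sanity-check with a little spherical geometry. Here's a rough back-of-the-envelope sketch (the mean Earth radius and the 40N example latitude are my own assumptions, not values from the tech note); it also shows why a coarse model cell of the order of 10² km dwarfs the downscaled grid:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres (approximate)

def arcsec_to_metres(arcsec, latitude_deg=0.0):
    """Ground distance spanned by an angle given in arc-seconds.
    Along a meridian this doesn't depend on latitude; along a parallel
    it shrinks by cos(latitude)."""
    angle_rad = math.radians(arcsec / 3600.0)
    north_south = EARTH_RADIUS_M * angle_rad
    east_west = north_south * math.cos(math.radians(latitude_deg))
    return north_south, east_west

ns, ew = arcsec_to_metres(30, latitude_deg=40.0)   # a mid-latitude US location
print(f"30 arc-seconds: ~{ns:.0f} m north-south, ~{ew:.0f} m east-west")

# Compare with a 1-degree cell, already finer than many GCM grids:
deg_ns, _ = arcsec_to_metres(1 * 3600)
print(f"1 degree: ~{deg_ns/1000:.0f} km, i.e. roughly "
      f"{(deg_ns/ns)**2:,.0f} downscaled cells inside one 1-degree cell")
```

At 40N that works out to a cell roughly 930 m north-south by 710 m east-west, so "approximately 800 meters" is a fair round number, and each 1-degree cell holds over ten thousand of them.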

Weaselly Bob Tisdale doesn't bother with research


Bob Tisdale, needless to say, can't be bothered learning about downscaling. He's not even the slightest bit impressed with the beauty of the website, making no comment about its ease of use and attractive design. Instead he uses words like "manipulated" and "misleading" and "weasel words", as if he, resident pseudoscientist at WUWT, could understand the first thing about it. Bob thinks climate models are weather forecasts. He doesn't even know the difference between climate and weather, let alone the first thing about climate modeling. He doesn't know that there is a difference between a projection, a prediction and a forecast, let alone what that difference is.

Bob is very silly. He decides to prove that the downscaled model outputs "Have Been Manipulated By NOAA To Show Better Model Performance Than Actual Outputs". (Note the capitalisation, typical of dim deniers. No, it's not a heading.) Recall how the scientists described the careful downscaling they performed to overcome the limitations of global models. Downscaling is the technique used to look more closely at climate on a regional scale.


Bob hasn't heard of downscaling


What Bob did was take the global multi-model mean downloaded from KNMI Climate Explorer and select a small geographic area (coordinates 40N-42.5N, 90W-87.5W, land-only values). He then compared that to the output from the NOAA downscaled data. He found, unsurprisingly, that it was different. His data was on a very coarse scale compared to the downscaled data. Bob jumped to the wrong conclusions because he didn't bother to take the five minutes it would have taken him to read the information provided on NOAA's website. Bob wrote how he took a wild guess (and got it wildly wrong):
I suspect (but don’t know for certain) NOAA determined the climate-model simulated temperature anomalies for the variables, which drastically reduces the ranges and spans, and then shifted the results so that they capture the observations-based data.

Well, no, Bob. It wasn't nearly that simple. The scientists performed complex calculations using supercomputers. I can't do justice to the method in a blog article. You can read about it for yourself. The crux of it is:
  • Compilation of over 100 climate projections from the 34 CMIP5 GCM simulations (Table 1) across the four RCP scenarios.
  • Interpolation to a common 1-degree grid, plus extraction of the long-term trend.
  • Bias correction.
  • Putting the long-term trend back in.
  • Conducting spatial disaggregation - spatially interpolating the adjusted GCM data to the finer-resolution grid of the 30-arc-second PRISM data (that's about 800 m across). This involved multiple steps to preserve the spatial detail of the observational data.

As stated in the paper:
...the [spatial disaggregation] algorithm essentially merges the observed historical spatial climatology with the relative changes at each time step simulated by the GCMs to produce the final results.

So you can see, even from the dot points above, that the exercise was a lot more complicated than Bob's guess of simply shifting anomalies from the global multi-model mean to match the observations.
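
To make that contrast concrete, here's a toy numpy sketch of the spatial disaggregation idea in the quote above: interpolate the coarse model change onto a fine grid, then lay it over the observed high-resolution climatology, so the fine spatial detail comes from observations and only the climate-change signal comes from the model. The grid sizes, the made-up climatology and the warming numbers are all invented for illustration; the real NEX-DCP30 processing uses the PRISM climatology on a 30-arc-second grid and several more steps.

```python
import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(1)

# A coarse "GCM" temperature-change field: 2 x 2 cells covering the region.
coarse_change = np.array([[1.8, 2.1],
                          [2.0, 2.3]])   # deg C of warming by some future period

# A fine observed climatology for the same region (a stand-in for PRISM):
# 128 x 128 cells with some spatial texture, e.g. from topography.
fine_climatology = 14.0 + 3.0 * rng.standard_normal((128, 128)).cumsum(axis=0) / 50

# Spatial disaggregation, in caricature: bilinearly interpolate the coarse
# change field onto the fine grid and apply it to the observed climatology.
# (Temperature changes are applied additively; precipitation would
# typically use ratios instead.)
fine_change = zoom(coarse_change, 64, order=1)   # (2, 2) -> (128, 128)
downscaled_future = fine_climatology + fine_change

print(downscaled_future.shape)
print(f"regional mean change: {fine_change.mean():.2f} C "
      f"(coarse model mean: {coarse_change.mean():.2f} C)")
```

The regional average of the interpolated change stays essentially the same as the coarse model's average, which is the sense in which downscaling adds local detail without inventing a different climate signal.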

If you haven't guessed by now, deniers like Bob Tisdale are only interested in trashing science and pretending they know something about it. They don't. Bob barely understands what an anomaly from a baseline is, and hasn't the first clue about climate modeling.


What about rising seas?


Oh, and you'll groan at this. After complaining about what he didn't like (and didn't understand) in the new NOAA tool, and after all the articles at WUWT saying that rising seas aren't a problem, Bob Tisdale whined:
At NOAA’s Climate Explorer, visit a coastal city or town like Miami (Miami-Dade County), or New York City (New York County), or Los Angeles (Los Angeles County).  What’s missing?

Sea level.

For communities near coastlines, possible sea level rise is the primary concern, not temperature, not precipitation, not heating or cooling degree days.
I'll have to read the comments to see what the WUWT deniers have to say about that. (Incidentally, rising seas are a more immediate problem on the eastern seaboard of the USA than in Los Angeles County. Not that I'd expect Bob Tisdale to understand that.)


Postscript: Not just NOAA


Let me also point out, because Bob Tisdale didn't (it would spoil his rant against NOAA), that this tool isn't just a creature of NOAA. It involves a number of programs, agencies and other entities listed at the bottom of the page, including NASA, the USGS and more.


From the WUWT comments


There aren't many comments yet, so I might come back later.

bazzer1959 talks about "sides" and complains about Bob's charts:
August 22, 2016 at 4:50 am
Graphs which show very similar colours are completely incomprehensible. It’s impossible to discern the individual runs. As that isn’t necessary here, for the sake of this posting, then there is no need to show them at all. Just show an average. I appreciate your posting, as always, but it does annoy me when graphs are shown that are impossible to follow. The whole issue is complicated enough, and must always be simplified whenever possible. Both ‘sides’ are guilty of producing stuff that an ordinary person finds it rather hard to comprehend.

Lee Osburn said he found a disclaimer page. I couldn't find the word on any page. I don't know if he thinks that model outputs aren't data - they are.
August 22, 2016 at 5:29 am
Crying out loud, just went back and checked out their “disclaimer” page and it says that it is not real data…. So— that may just reflect on the quality of all of their data. All set in stone. Without observations, we have nothing to compare anything to. All just bullshit! 

Steve Case agrees with Bob about rising sea level.
August 22, 2016 at 5:48 am
For communities near coastlines, possible sea level rise is the primary concern, not temperature, not precipitation, not heating or cooling degree days.
B I N G O ! 

Mark from the Midwest wants to flood Ted Cruz and Lamar Smith with letters from WUWT's "climate hoax" conspiracy theorists. That would be turning the tables! (I don't know how NOAA's scientists would feel about being cc'd in.)
August 22, 2016 at 6:10 am
If everyone on this site wrote to Lamar Smith, and/or Ted Cruz, and cc’d the appropriate persons at NOAA, plus the house and senate reps from their own districts, about the nonsense it would help. It would particulallry help due to the fact that a large number of regular commentors here can use those 3 little letters behind their name, that tend to get some attention, (aka piled-higher-and-deeper). And even more of you have specific education on this subject matter that you can cite in your communication.
I’ve been invovled in House testimony where a junior rep talked about the “overwhelming” response from her district, and when queried by the chair had to admit that it was basically 22 letters, all from Marin County, (a very clearly representative cross-section of the U.S. public opinion).
Sometimes a few hundred well contructed letters from folks with credentials can make a big difference. 

Forrest Gardener isn't impressed by models that can accurately hindcast. I'd say that it's because he doesn't understand what's involved or what it means.
August 22, 2016 at 6:17 am
I sort of understand the problem, but of what possible value are the NOAA hindcasts? Accurate hindcasts are about as meritorious as saying that they have plotted a graph which looks just like the real data or that they’ve built a self driving car which does just fine as long as the actual road is programmed in.
The thing they would have to do to claim merit would be to show what their predictions were when the predictions were actually made and how those predictions turned out.
Although Bob makes some good points, it looks to me like he let NOAA off the hook with that one. 

I don't know if usurbrain looked at the NOAA website, or if he or she has seen how well scientists' predictions in the past have come to pass. I would say that using the brain is something that usurbrain avoids like crazy when it comes to anything climate.
August 22, 2016 at 6:27 am
About 40 years ago I developed computer models for Supercritical Steam Power plants and Nuclear Power plants. These models were used for developing the accident analysis need for regulatory approval and for developing full scale simulators used in operator training. Back then this was done with punch cards on a remote “main-frame” (which had less power than the CPU in todays cell-phone). If I had spent the amount of time they did, and wasted the thousands of hours of computer time as they have, and my end product was this farcical and inaccurate I have been escorted out the door with no chance to even get my personal belongings. Same would also be true for the “models” used in the IPCC climate change prognostications.
Why does any intelligent person accept this BS?

References and further reading


The Climate Explorer - link to the NOAA web-based tool

NEX-DCP30: Downscaled 30 Arc-Second CMIP5 Climate Projections for Studies of Climate Change Impacts in the United States - tech note about the downscaling method.

Climate Model Downscaling - article on the GFDL website, describing two common methods of downscaling general circulation models.

Methods of Downscaling Future Climate Information and Applications - slide presentation from Linda O. Mearns, one of the experts in downscaling. It has good graphics.

How reliable are climate models? - article by GP Wayne on SkepticalScience.com, plus a good UQx Denial101 video with Dana Nuccitelli about the accuracy of climate models.

Crystal Serenity - article by Tamino showing how accurate James Hansen's relatively early predictions turned out to be.

William M Briggs is no futurist. About forecasts, scenarios and projections - HotWhopper article about the difference between predictions, projections, scenarios and forecasts.




12 comments:

  1. "If you haven't guessed by now, deniers like Bob Tisdale are only interested in trashing science and pretending they know something about it. They don't."

    Yup.

  2. Way off topic, and a shameless plug, so Sou, if it is inappropriate, feel free to delete this comment.

    A friend of mine recently presented me with a graphic from Tony Heller comparing hot days in 1936 to hot days in 2016. I wrote a blog post about it. Some here may find it amusing.

    Click here: Unfinished Progress


    Replies
    1. That's a great article, DC. The cause and effect bit was very nicely done, too.

    2. Yep, very thorough. The comments on the Daily Kos reposting are also worth reading. And there's a plug right back here to Sou's!

    3. Thanks, Sou. It was fun to write.

      bill, I had to give a shoutout to HotWhopper. Sou does a great job and deserves the mention.

    4. Very good post, D.C. And it was refreshing to see the quality of comments it drew on the Daily Kos. I've gotten too used to the idiotic level found on denier sites or even on mainstream news sites.

  3. There's another great tool from NASA's GISS data. It allows you to set a base period, and then a target period, and see how global and regional temperatures are expected to change. It's available here: http://data.giss.nasa.gov/gistemp/maps/

  4. TB does not equal 1024 bytes. Try 1e+12

    Replies
    1. It said 1012, not 1024. In other words, ten to the twelfth, with the superscripting lost in quoting.

    2. Yes, what Philip said. Sorry for the confusion. Fixed now.

  5. NASA/NOAA - must wonder why they bother sometimes!!!!

