Now, providing data and code might be new in the world of computer science, but it's been the norm in climate science for years. There's more data than deniers know what to do with. Articles on anti-science blogs like WUWT show that deniers don't have a clue what to do with the data that is available. In fact, most of them don't even know that there is a heap of data and code they can play with.
Anyway, Roy Spencer has promised to release his new code for UAH v6.0 after it's come out of beta. No date has been set, AFAIK. Anthony and his denier rabble will be watching, no doubt. And they'll pounce on it and complain that he fiddled with his data - "adjusting" it endlessly.
Here is a chart of the latest version from UAH, together with RSS and the previous UAH version:
[Chart - data sources: UAH v6, UAH v5.6 and RSS]
How to build a strawman
Building strawmen is straight from Denier 101. Anthony Watts knows that section by heart. For example, does Anthony explain how he got all the temperature records and metadata that he's had poor little Evan Jones beavering away on for the past five years or more? Guess what. He got it from the guvmint - who gave it to him for free. Has he made his "data and code" available? Not yet. All he's done is make sweeping, unsubstantiated statements.
From the WUWT comments
Eric Worrall wants governments around the world to use the Amazon cloud at $150/month per terabyte (or per 1000 GB, which is what the cloud companies call a terabyte):
May 4, 2015 at 5:59 pm
About time.
In these days of cheap storage and even cheaper internet access, data size is no longer an excuse.
For example, Amazon S3 web storage – 1000GB storage, 10,000GB monthly transfer costs $150 / month by my calculation. Any university not prepared to pay a sum so trivial to ensure open access should hang its head in shame.
Most agencies store climate data on their own servers AFAIK - and make an awful lot of it available to the general public - for free. Not that Eric would know what to do with it. He's no Nick Stokes, or Caerbannog, or Clear Climate Code.
Mickey Reno claims that he wants to "test" adjustments. He can do it now and could have done it for years, had he really wanted to:
May 4, 2015 at 7:16 pm
Good job! Don’t forget to archive operational metadata and dataset version info in your archive. In the long run, I would hope large national bureaus like NOAA and Hadley would store raw, unedited data, including remote sensing data, and ALL edits, adjustments, infills and cleanups to those datasets would be reproducible derivations, NOT end products which overwrite the whole. Only in this way can we ever hope to test the adjustments themselves.
Deniers at WUWT eat up the garbage they are fed and think they are feasting on a banquet. They don't recognise that they're being fed from the dumpster at the back of the restaurant. No class, no style, no scepticism.
"Deniers at WUWT eat up the garbage they are fed and think they are feasting on a banquet. They don't recognise that they're being fed from the dumpster at the back of the restaurant. No class, no style, no scepticism."
ReplyDeleteNicely summarized!
Sharing data and code is old hat in computer science. What the heck is this press release about? (It's real, I was able to google the first paragraph and find it.)
Here's my thesis code from 2007:
http://sparse-meshing.com/svr/0.2.1/download.html
Not unprecedented; one of my committee members started this page in 1997:
https://www.cs.cmu.edu/~quake/triangle.html
Compare the "new, unprecedented" release:
http://www.cl.cam.ac.uk/research/srg/netos/qjump/download.html
Basically, it's become easier over the past decade to make a nice web page.