
Thursday, June 20, 2013

How Anthony Watts gets a bit confused about ocean heat content

Sou | 5:04 AM

Willis Eschenbach has posted an article on Anthony Watts' WUWT blog.  He's wondering this time about ocean heat and forcing.  He's done something similar to what Bob Tisdale did a little while ago, which I wrote about here.

Here is Willis' chart.  Willis has done some sums on ocean temperature and plotted this chart in units of watts/sq metre.  Willis calculated from scratch and may not have got the conversion ratios quite right.  I didn't check.  I do know that it's not that easy to work out the specific heat of sea water going all the way down to two kilometres deep into the ocean.
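
For what it's worth, the arithmetic behind such a conversion is simple in principle, even if choosing sensible density and specific heat values all the way down the water column is not.  Here is a rough sketch in Python, with made-up numbers and simplified constants - it's only meant to show the shape of the calculation, not to reproduce Willis' spreadsheet:

    # Rough sketch: convert a 0-2000 m mean temperature change into heat content
    # and then into an equivalent flux in W per square metre of Earth's surface.
    # All constants and numbers are illustrative, not Willis' or NODC's values.

    SECONDS_PER_YEAR = 3.156e7
    EARTH_AREA = 5.1e14   # m^2, whole surface of the Earth
    OCEAN_AREA = 3.6e14   # m^2, ocean surface
    DEPTH = 2000.0        # m, depth of the layer considered
    RHO = 1025.0          # kg/m^3, rough mean seawater density
    CP = 3990.0           # J/(kg K), rough mean specific heat of seawater

    def ocean_heat_joules(delta_t_kelvin):
        """Heat (J) needed to warm the 0-2000 m layer by delta_t_kelvin."""
        mass = RHO * OCEAN_AREA * DEPTH       # kg of water in the layer
        return mass * CP * delta_t_kelvin     # joules

    def equivalent_flux(delta_t_kelvin, years=1.0):
        """Average forcing (W/m^2 of Earth's surface) implied by that warming."""
        return ocean_heat_joules(delta_t_kelvin) / (years * SECONDS_PER_YEAR * EARTH_AREA)

    # Example: a 0.02 K warming of the 0-2000 m layer spread over a decade
    print(ocean_heat_joules(0.02))    # about 5.9e22 J
    print(equivalent_flux(0.02, 10))  # about 0.37 W/m^2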


This next chart is also from Willis' spreadsheet.  He doesn't plot it or show it in his article, but it shows up in his calculations.  This time it's what he calculated as the heat accumulated in the ocean.  He did this calculation before working out the year to year differences and converting to Watts/sq metre.  In other words, it's part of the very same data he used to generate the chart above.  This one shows the cumulative effect on the ocean of the year to year changes shown in the top chart.  



This next chart is based on a chart from SkepticalScience.  What I did was take the Skeptical Science ocean heat content, add in the heat content for the latest few years from NODC/NOAA (with an estimate from ocean temperature to fill the missing year 2004), and work out the difference in heat being added to the ocean each year.  This is the same as Willis' chart shown above but using data from a different and more trusted source.  It shows more heat being added than Willis' chart, partly because the base is different and partly because of Willis' different calculation, but both show a large accumulation.
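
That "difference each year" step is just a matter of differencing the heat content series and spreading each year's change over the surface of the Earth.  A minimal sketch in Python, assuming you already have an annual heat content series in joules (the numbers below are invented for illustration, not the NODC values):

    import numpy as np

    SECONDS_PER_YEAR = 3.156e7
    EARTH_AREA = 5.1e14  # m^2, whole surface of the Earth

    # Invented annual 0-2000 m heat content anomalies in units of 10^22 J,
    # standing in for the NODC/SkepticalScience series.
    ohc_1e22_joules = np.array([0.0, 0.6, 1.1, 1.9, 2.4, 3.3, 4.1, 4.8])

    # Year-to-year change in joules, then spread over a year and the whole
    # surface of the Earth to give an equivalent forcing in W/m^2.
    annual_change_joules = np.diff(ohc_1e22_joules) * 1e22
    forcing_w_per_m2 = annual_change_joules / (SECONDS_PER_YEAR * EARTH_AREA)

    print(np.round(forcing_w_per_m2, 2))  # roughly [0.37 0.31 0.5 0.31 0.56 0.5 0.43]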



Lots of people fall for Willis' line and go on about how the ocean isn't really heating up etc.  What is a surprise (but probably shouldn't be) is a comment from Anthony Watts.

How Anthony Watts is tricked...


In the comments to Willis' article, Anthony Watts put up the SkS version of the above chart (which looks pretty much the same except it includes land and atmosphere as well), writing (my bold):
June 19, 2013 at 4:13 am

Hmmm, this rather puts the kibosh on this graph from the SkS zealots:
Makes me wonder how Murphy finds such a large trend in OHC, but Levitus does not find any trend in forcing.
No, Anthony. It's Willis who says he found the "average forcing is small", that the overall mean is "not significantly different from zero", and that only a few of the individual years are significantly different from zero.  Willis finally gets around to pointing out Anthony's error, though he argues again that "it's not statistically significant", writing:
[REPLY: Thanks, Anthony. There is a large trend in OHC, as you show in your graph, but it is not statistically significant. This is because of the high autocorrelation of the data (lag-1 autocorrelation of 0.92). As usual, SkS forgot to mention that ... w.].
I'm not about to check that out, but I will just observe that the standard errors quoted by NOAA are much smaller than the reported heat content.  SkepticalScience has links to relevant papers, but they are behind a paywall.  It's certain that the ocean is heating up (sea levels are rising, for one thing), just as it's certain the oceans are getting more acidic.

Small mercies.  I suppose Willis could have denied it was warming altogether or put it down to one of Bob Tisdale's magical ENSO leaps.



Addendum:

20 June 2013 3:50 pm AEST

Thanks to Dana, here is the up-to-date SkepticalScience.com chart showing how heat is accumulating on Earth:

Heat accumulation on Earth via SkepticalScience
Source: SkepticalScience.com

Read the comments below for some good insights and further information, including other references to scientific publications and data.

20 comments:

  1. Note the data and graph from Nuccitelli et al. (2012) (which I'm partial to for obvious reasons!) are more up to date.
    http://www.skepticalscience.com/graphics.php?g=65

    Levitus OHC data are available here.
    http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/heat_global.html

  2. The usual crap. See Lyman et al. (2010) Robust warming of the global upper ocean:

    We fit a line using weighted least squares (Supplementary Information) to the mean OHCA curve (Fig. 2, black line), using the overall uncertainty (Fig. 2, red error bars) for each year in the fit. These uncertainties are large enough that interannual variations, such as the 2003–2008 flattening, are statistically meaningless. We estimate a warming rate of 0.63 ± 0.28 W/m2 (uncertainties at the 90% confidence level) for 1993–2003, which is slightly (but not significantly) higher than the value of 0.5 ± 0.18 W/m2 stated in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. The fit to the entire 16-yr record, including the well-sampled Argo years, yields a more robust warming rate of 0.64 ± 0.11 W/m2. The large uncertainties in OHCA introduced by the XBTs would undoubtedly have a similar effect on trends in thermosteric sea level (not shown).

    Replies
    1. I've noticed that after years of ignoring it, the deniers are now starting to attack OHC. Not surprising. It is not doing what they want it to. One has only to look at OHC 0-2000m over the last decade. Especially at the three-month mean (red).

      Woo. Look at all that not-missing energy piling up.

  3. I think that once again Willis has miscalculated the auto-correlation coefficient. There is no way that series has "lag-1 autocorrelation" of 0.92. He has, once again, failed to make the series stationary before calculating the autocorrelation. The ar(1) value he calculates is so high precisely because the series has a positive trend.

    Willis knows a few mechanical details about calculations, but he doesn't really understand what he is doing.
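
    A quick illustration of the point, in Python with synthetic numbers (nothing to do with the actual OHC series): a series that is nothing but a trend plus white noise has a lag-1 autocorrelation close to 1, even though its residuals are completely uncorrelated.

        import numpy as np

        rng = np.random.default_rng(0)

        def lag1_autocorr(x):
            """Sample lag-1 autocorrelation of a series."""
            x = x - x.mean()
            return np.sum(x[:-1] * x[1:]) / np.sum(x * x)

        t = np.arange(60)
        noise = rng.normal(0, 1, 60)   # completely uncorrelated residuals
        trended = 0.5 * t + noise      # the same noise sitting on a linear trend

        print(lag1_autocorr(noise))    # near 0 - the residuals are white
        print(lag1_autocorr(trended))  # near 1 - the trend alone creates it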

    Replies
    1. Yes, Willis is confused. Autocorrelation in the *residuals* of a linear fit affects the estimate of the *uncertainty* in the trend. The autocorrelation of the raw data, rather than the residuals, is irrelevant. As you note, any data with a linear trend has a very strong autocorrelation, even if the residuals are completely uncorrelated.

      The deniers really do have a fundamental problem with the difference between signal and noise.

      Delete
    2. As far as I can tell, he hasn't done anything remotely like considering auto-correlation coefficients. His calculation works in the following way. He takes the difference between the heat content in two successive years and then divides by the number of seconds in a year and by the surface area of the Earth. This gives him what he is calling the "forcing" (though it is really just the fraction of the excess flux associated with ocean heating, which he clearly does not realise).

      To get the error he adds the errors on the data for the two successive years in quadrature, Sqrt((Delta J1)^2 + (Delta J2)^2). He then divides this by the number of seconds in a year and by the surface area of the Earth. This in itself is wrong (I believe), since he should be calculating the range in the gradient from these errors, not summing them in quadrature. Ultimately, he therefore produces an error value for every year which he associates with the error in the forcing (flux). To get the final error he simply takes the standard deviation of these errors. Not only is the error calculation wrong (I believe) but what he's presented is the 1 sigma variation in these errors, so - in some sense - the error in the error, not the error in the flux.
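
      To make that concrete, here is the procedure as I've described it written out as a few lines of Python. This is only my reading of the spreadsheet, with invented numbers - not something anyone should copy:

          import numpy as np

          SECONDS_PER_YEAR = 3.156e7
          EARTH_AREA = 5.1e14  # m^2

          # Invented annual heat content values (J) and their reported errors (J).
          heat = np.array([10.0, 10.6, 11.1, 11.9, 12.4]) * 1e22
          err = np.array([0.3, 0.3, 0.25, 0.25, 0.2]) * 1e22

          # Step 1: year-to-year difference, divided by seconds in a year and the
          # surface area of the Earth, giving the annual "forcing" in W/m^2.
          forcing = np.diff(heat) / (SECONDS_PER_YEAR * EARTH_AREA)

          # Step 2 (as described above): add the two years' errors in quadrature
          # and convert the same way, rather than propagating them into a gradient.
          forcing_err = np.sqrt(err[1:]**2 + err[:-1]**2) / (SECONDS_PER_YEAR * EARTH_AREA)

          # Step 3 (as described above): quote the standard deviation of those
          # per-year errors - an "error in the error", not an error in the flux.
          print(forcing)
          print(forcing_err)
          print(np.std(forcing_err, ddof=1))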

  4. Tisdale has to attack OHC. If it is rising significantly then his entire ocean-forcing mechanism is bunk, yes?

    Replies
    1. I have not looked into the details of how Bob Tisdale explains his magical ENSO leaps of heat. Do you understand them?

      Any time I've seen someone ask him about the mechanism, he has avoided the question and referred them to his book. He appears to be quite comfortable with a rise in ocean heat content overall while denying global warming, which is weird. (What I understand of Tisdale's approach is along the lines of: "The whole earth is warming but it's not *global* warming!")

      One of these days when I have the time I may look into it further. He's not all that relevant. His "theories" seem to me to be just one of many slightly crackpot "theories" that conflict with other crackpot "theories" that you read on WUWT, any or all of which may be accepted at one time or other by some or all deniers, despite their conflicting nature.

  5. Just looking at the "Willis Ocean Heat Contents" graph, it seems very weird that the trend (i.e. the mean of the differences) would not be significant.

    Replies
    1. Indeed, but that's easy to explain. It's because his error calculation is complete nonsense :-)

  6. I think I have just found another huge balls-to-the-wall error in Willis's work. He's currently being torn a new one by some guy called "xymininininm" or something, and as part of his defense he is claiming that in order to adjust for the auto-correlation you need to use a formula from Nychka, D. Rather than using the standard 1/(1-phi) correction for autocorrelation, Willis claims the Nychka method calculates an "effective n" that is then used to estimate the standard error.

    I found some people using this method on other denialist sites - including that Lucia person. From there I tracked down the original paper (pdf; not paywalled) and it turns out that Willis's correction is NOT a correction to the variance. It is a correction to the degrees of freedom of a t-distribution used to calculate the confidence interval for a regression coefficient.

    Over at WUWT right now Willis is claiming that the effective n for calculating the t statistic of a slope is 1 (reduced from 58 or something by his "Nychka correction"). In fact, all that happens is the t statistic is T(1) distributed instead of T(58). This, instead of reducing the t-statistic by a factor of 50, simply increases the cut-off for significance from 1.97 to 2.2.

    So, I think his stats are really really wrong. And I think other denialists are making the same mistake of interpreting an effective degrees of freedom for a t distribution as the n used in calculating standard errors. It looks like McIntyre, Willis and Lucia have all made this mistake. Interesting ...
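
    For anyone who wants to see the distinction, a change in the degrees of freedom only moves the critical value you compare the t statistic against; it doesn't divide the statistic itself by anything. A quick check in Python (scipy assumed; the degrees of freedom below are arbitrary examples, not anyone's actual effective n):

        from scipy import stats

        # Two-tailed 95% critical t values for various degrees of freedom.
        for df in (58, 20, 10, 3, 1):
            print(df, round(stats.t.ppf(0.975, df), 2))
        # 58 -> 2.0, 20 -> 2.09, 10 -> 2.23, 3 -> 3.18, 1 -> 12.71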

    Replies
    1. Sorry, my comment below was meant to be a reply to Anonymous as it was their comment that made me look at it again.

  7. Okay, I see what he's done now. I've been looking through his spreadsheet and had thought he was calculating his error from the standard deviation of his error calculation (which he gets from summing the error in the two subsequent heat content measurements in quadrature). But I think you're right: he's taking the standard deviation of the annual "forcings" he's calculated (which does not depend in any way on the reported errors in the measurements) and then doing some kind of correction that depends on Sqrt(n) and a factor of 1.96. So maybe I had misunderstood how he was calculating his error (although the number turns out to be the same), but it doesn't change the conclusion. The error analysis is complete nonsense.

  8. Willis's work is a lot worse than this. He is calculating the trend of the *derivative* of OHC, not the trend of OHC.

    He finds this trend slightly positive, i.e. that OHC is accelerating.

    It's not just his statistics; his interpretation of his own data is wrong.
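
    A toy example in Python, with an invented heat content series, of what fitting a line to the differences actually tests - acceleration, not warming:

        import numpy as np

        # Invented, steadily rising heat content series (arbitrary units).
        years = np.arange(2000, 2013)
        ohc = np.array([0.0, 0.5, 1.1, 1.8, 2.4, 3.2, 3.9,
                        4.8, 5.6, 6.5, 7.5, 8.4, 9.5])

        trend_of_ohc = np.polyfit(years, ohc, 1)[0]                # warming rate
        trend_of_diff = np.polyfit(years[1:], np.diff(ohc), 1)[0]  # change in that rate

        print(trend_of_ohc)   # clearly positive: the ocean is gaining heat
        print(trend_of_diff)  # small and positive: the rate of gain is itself rising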

  9. Here's the (easy) reasoning that makes Willis's error very explicit:

    http://wattsupwiththat.com/2013/06/19/forcing-the-ocean-to-confess/#comment-1342359

  10. @Anonymous, indeed I had ignored his calculation of the trend as it was sufficiently obvious that he didn't really know what he was talking about when he was so surprised that the mean of his calculated values was so much smaller than the downwelling flux (0.5 kW/m^2). As you say, however, his trend is actually the trend of the derivative and so he's shown that there is evidence for acceleration.

  11. Blogged at:
    http://davidappell.blogspot.com/2013/06/wuwt-ocean-misunderstanding-and.html

    Replies
    1. Thank you for sharing that link.

      Excellent informative post that rounded off what I learned over here quite well.
      My most basic take-away message is that we are in good hands with the experts who understand this stuff - it's a shame the Willises and Wattzers have zero ethical standards; they could learn stuff from you folks if they weren't so dogmatically blinded.

      Cheers,
      Peter

  12. I have put a polite and reasonable comment at the post, trying to point out to Willis some of his errors. Reproduced below the line here. Also, is it my memory or has he majorly edited that post to remove some graphs?
    ---
    Willis, I think you are misusing the Nychka method, and all your calculations are wrong.

    The Nychka method provides a formula to calculate the degrees of freedom for a t-test or 95% confidence interval, but it is not the effective sample size used in calculating variance. This is not clear in the document you link to, but is clear in the paper by Lee and Lund[1] that Lucia uses to refer to the Nychka article. In this article they state clearly the correction factor for the variance of the OLS estimate of the slope, and then give the Nychka formula for the degrees of freedom of t.

    In fact, the efficiency of a properly calculated estimator of the slope is challenging to calculate and doesn't rely on a specific estimate of an "effective sample size" (see Lee and Lund), but it has been calculated for specific instances. In your case, a simple approximation of the standard error of the slope would be that it is about sqrt(0.22) times the standard error of an ordinary least squares regression slope, which would mean that the t-statistic is divided by sqrt(0.22), so probably about 3 times bigger. You can use this document to estimate efficiency for an AR(1) autocorrelation of 0.8, assuming that the x-axis has no serial dependence (since it is just time). Or you can plug the numbers into the formula from Lee and Lund if you like that sort of detail. But what should happen is that your t statistic from the OLS estimator will become bigger by a factor of about 3, which is equivalent to an effective n of about 20. Not 4 as you suggest. Then you can compare this value against a t statistic with degrees of freedom calculated using the Nychka method - that is the "effective sample size" minus 2 (because it's regression).

    If you do this I think you will find your p value is much less than 0.08.

    Note however that Lee and Lund observe that Nychka's method a) is not published in peer-reviewed literature and b) tends to give degrees of freedom below 0 for large autocorrelations, so may not be reliable.

    You also don't need to use any of this strange Monte Carlo stuff to handle testing in these cases. Every stats package can get you an auto-regression adjusted best linear unbiased estimator for the slope, directly from the data with a couple of lines of code. It's a trivial and well-established task. So rather than using a complex combination of Nychka adjustments and MC runs, just use the standard methods in R.

    I think you have misunderstood some of the background material for time series analysis. Instead of relying on complex and little-used methods promoted by other bloggers, I recommend you purchase a good textbook and work through the basic parts. I recommend Brockwell and Davis[2]. Until you do, you will continue to make the kind of basic errors that you made here.

    I hope that helps.

    ---
    References
    1. Lee J, Lund R. Revisiting simple linear regression with autocorrelated errors. Biometrika. 2004;91(1):240-245. {available online free if you search}
    2. Brockwell P, Davis R. Introduction to Time Series and Forecasting. Springer, New York. {I'm sure any edition will be fine}
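
    By way of illustration of the "couple of lines of code" I mean above, here is the same kind of autoregression-adjusted fit done in Python with statsmodels' GLSAR rather than in R. The data are simulated (a trend plus AR(1) noise), not the Levitus series:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)

        # Simulate a trending series with AR(1) errors, standing in for annual OHC.
        n = 58
        t = np.arange(n, dtype=float)
        errors = np.zeros(n)
        for i in range(1, n):
            errors[i] = 0.8 * errors[i - 1] + rng.normal(0, 1)
        y = 0.3 * t + errors

        # Autoregression-adjusted linear fit: GLSAR iteratively estimates the AR(1)
        # coefficient of the residuals and refits the slope accordingly.
        X = sm.add_constant(t)
        results = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10)

        print(results.params)   # intercept and slope
        print(results.bse)      # standard errors adjusted for the AR(1) term
        print(results.pvalues)  # significance of the trend after the adjustment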

