Caring For The Earth And Its People
< 7 >
Being an engineer (electrical, acoustical) puts me in a dicey position for suggesting technical ideas for a complicated climate-change project, especially since my SB was 54 years ago. So it takes time to wake up old ideas.
Recently the amazing work on gravitational waves brought to the fore the idea of extracting signals from noise, although that work was on a level never before imagined.
However, there was much discussion about characterizing the noise in order to make extracting the signal more accurate.
Some months ago I had downloaded NASA temperature data for the period 1880 to 2017. Looking at the unmodified curve suggested a hint of an exponential. So it seemed time for an exercise in curve fitting. I spent a whole night modifying a one-term exponential until I got the blue center curve.
Since I was doing all this in Excel, I was finally reminded that Excel did curve fitting also. Only two of its trendline types worked: linear, which was not useful, and a polynomial expansion. The green and violet (upper and lower) curves are 4th and 5th order. Weighting a combination of the two curves would likely have brought them closer to mine.
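The Excel-style polynomial trendlines can be reproduced with NumPy's least-squares polynomial fit. The sketch below uses synthetic anomaly values as a stand-in, since the actual NASA series is not reproduced here; the shape (a rough exponential plus noise) and the 1880–2017 span follow the description above.

```python
import numpy as np

# Synthetic placeholder for the NASA temperature anomalies, 1880-2017.
# (Illustration only -- not the real downloaded data.)
rng = np.random.default_rng(42)
years = np.arange(1880, 2018)
t = years - 1880
true_trend = 0.25 * np.exp(0.022 * t) - 0.45        # rough exponential shape
anoms = true_trend + rng.normal(0.0, 0.1, t.size)   # add measurement "noise"

# 4th- and 5th-order least-squares fits, analogous to Excel's
# polynomial trendline options
p4 = np.polyfit(t, anoms, 4)
p5 = np.polyfit(t, anoms, 5)
fit4 = np.polyval(p4, t)
fit5 = np.polyval(p5, t)

# Residual spread shows how closely each curve tracks the noisy series
print(round(float(np.std(anoms - fit4)), 3))
print(round(float(np.std(anoms - fit5)), 3))
```

Either fit can then be evaluated past 2017 (e.g. `np.polyval(p4, 2100 - 1880)`) to get the kind of extrapolation discussed below, with the usual caveat that polynomials behave wildly outside the fitted range.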
Obviously, the further out one goes, the less likely the curves are to be useful, although they agree fairly well with projections out to 2100. What struck me, however, was that most analyses I have seen looked only at a small segment of recent data, which appeared to be a straight line. The question that came to mind was whether there was any useful information in the earlier parts of the curve, which in turn suggested rethinking the arguments between deniers and non-deniers during the early part of the research, perhaps before 1960. I was suddenly struck by the fact that the signal (temperature) in the early data would be quite noisy, so comparing it to "denier" data was actually fruitless, which is how those arguments struck me. It seems that you got the answer you wanted because there was no right answer.
However, the Excel algorithm, whether a least-squares fit or something like a traveling averaging function, did a credible job of establishing a baseline, but more importantly it pulled some data out of the noise in the earlier years. Also, averaging the results of many calculations and experiments would act just like noise in the early years, which would average out, while the heat from warming would be comparatively noise-free. Hence, the 4th- and 5th-order polynomial curves and predictions of 4 to 5°C at 2100 were likely in the ballpark.
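The noise-averaging idea above can be sketched directly with a centered moving average. The data here is again synthetic, assuming a slow, nearly noise-free warming trend buried under zero-mean noise, standing in for the noisier early decades of the record; the 11-year window is an arbitrary illustrative choice.

```python
import numpy as np

# Synthetic stand-in: slow warming trend plus zero-mean noise.
rng = np.random.default_rng(7)
t = np.arange(138)                         # 1880..2017 as an index
signal = 0.005 * t                         # slow, nearly noise-free trend
noisy = signal + rng.normal(0.0, 0.15, t.size)

# Centered 11-year moving average (a "traveling averaging function")
window = 11
kernel = np.ones(window) / window
smoothed = np.convolve(noisy, kernel, mode="valid")

# Compare spread around the true trend before and after smoothing;
# zero-mean noise largely averages out, the trend survives.
center = signal[window // 2 : -(window // 2)]
print(round(float(np.std(noisy[window // 2 : -(window // 2)] - center)), 3))
print(round(float(np.std(smoothed - center)), 3))
```

Averaging an 11-point window shrinks uncorrelated noise by roughly a factor of sqrt(11), which is exactly why a smoothing pass can recover usable information from the noisy early years of the series.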