Sunday, April 14, 2013
TempLS and paleo - Marcott et al
TempLS is a program I wrote for getting global average surface temperature anomalies. I've been using it every month to calculate an index from GHCN/ERSST data, which I compare with GISS when it comes out. It differs from the indices usually used in that it associates offsets with stations and does a least squares fit, so it avoids fitting a fixed anomaly period into the station data. The math basis is described here. And here is a somewhat dated overview.
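In outline, the fit solves for a per-station offset and a global anomaly series together. Here is a toy sketch of that alternating least-squares step - not the actual TempLS code, and the array names are made up:

    import numpy as np

    # x: station-by-time array of readings (NaN where missing); w: matching area weights.
    # Assumes every station and every time step has at least one reading.
    def offset_fit(x, w, niter=50):
        ok = ~np.isnan(x)
        wts = np.where(ok, w, 0.0)
        xf = np.where(ok, x, 0.0)
        L = np.zeros(x.shape[0])          # station offsets
        G = np.zeros(x.shape[1])          # global anomaly series
        for _ in range(niter):            # alternate the two sets of normal equations
            L = (wts * (xf - G)).sum(axis=1) / wts.sum(axis=1)
            G = (wts * (xf - L[:, None])).sum(axis=0) / wts.sum(axis=0)
            G -= G.mean()                 # fix the additive indeterminacy between L and G
        return L, G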
I've been in discussion at Lucia's (see also recent threads here), and it seemed to me that a method she was proposing for anomaly reconstruction for paleo could be implemented in TempLS. Although TempLS is oriented toward monthly data with seasons, it can fix on just one month of the year, so all I had to do was get data in the right input format. As with my earlier recons, I did not attempt the Monte Carlo variation of dating and calibration. I used the published dating.
It works quite well - I'll show early results below. It avoids the dimple. I use the same 5x5 weighting scheme. However, TempLS has extensive facilities for weighting by different methods, such as unstructured triangular meshes, which are well suited to sparse sets like these.
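For concreteness, here is a sketch of the kind of 5°x5° grid weighting I mean, assuming the scheme is simply cell area divided by the number of records in the cell (a simplification; the names are illustrative):

    import numpy as np

    # lats, lons: arrays of proxy locations in degrees.
    def grid_weights(lats, lons, cell=5.0):
        nlon = int(360 / cell)
        ilat = np.floor((lats + 90.0) / cell).astype(int)
        ilon = np.floor((lons + 180.0) / cell).astype(int) % nlon
        cell_id = ilat * nlon + ilon
        area = np.cos(np.radians((ilat + 0.5) * cell - 90.0))   # cell area ~ cos(latitude at cell centre)
        counts = np.bincount(cell_id, minlength=cell_id.max() + 1)
        w = area / counts[cell_id]                               # share the cell area among its records
        return w / w.sum()                                       # normalised weights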
Details below.
The Marcott et al data comes from the Supplementary Materials spreadsheet. In these plots I have shifted all curves to a 4500 to 5500 BP anomaly base for plotting. However, I am using the Marcott et al CI's as given in the spreadsheet, and the TempLS calc did not use this period, so I did not modify the CI's to reflect this shift. I smoothed all data using a century moving average, which cuts 40 years off each end. The reason I smoothed the Marcott data is that it is given to only two significant figures, so it is whiskery when plotted unsmoothed.
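The plot preparation is simple enough to spell out. Here is a sketch (variable names are illustrative, not those in the spreadsheet); the series are on 20-year steps, so a centred 100-year average is a 5-point window, which is what trims 40 years from each end:

    import numpy as np

    def rebase_and_smooth(age_bp, temp, base=(4500, 5500), step=20):
        # shift to the 4500-5500 BP anomaly base
        temp = temp - temp[(age_bp >= base[0]) & (age_bp <= base[1])].mean()
        n = int(100 / step)                                  # 5 points = 100 years
        smoothed = np.convolve(temp, np.ones(n) / n, mode="valid")
        ages = age_bp[(n - 1) // 2 : len(age_bp) - n // 2]   # centred ages; 40 yr lost at each end
        return ages, smoothed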
Here is the plot over the full period, with the TempLS and Marcott reconstructions and shaded CI's. One thing to note is that the TempLS solution is dimple-free. The overlap is fairly good - TempLS strays a bit at the ends. I'm checking for possible reasons.
And here is the plot with just the CI's for comparison. TempLS is mostly lower, but again does not include error due to calibration and dating uncertainty.
Here are versions restricted to the last two millennia.
Could you "de-rez" the instrumental records from stations close to proxy locations to match the resolution of those proxies and then attach them to the end of those series? Wouldn't that give us an idea of what the reconstruction would look like if there were more proxy data in recent times?
CCE,
The main problem is that if you de-rez the instrumental, there isn't much left - especially if you're looking for local records which may be very short. I think the global record would be more appropriate.
But I think the main thing is that this is asking of proxies what they aren't good at. They really can't resolve the last century, or at least these can't.
Looks like you have the material for a publishable comment here. Very nice synthesis of different ideas. I figure the last few hundred years just suffers from too few proxies. If you merged in other "calibrated temperature proxy" data sets you might be able to produce a more reliable reconstruction, now that you have the framework for computing it.
Thanks, Carrick,
Actually, for the first time in my blogging career, I am thinking of publishing. It would involve upgrading the weighting, and also representing the solution not as a 20-year discrete series but as a Fourier series or some such. Then the linear interpolation isn't needed, which would make for much smaller matrices, among other things.
I agree. Nice piece of work.
I think the Fourier series approach is a great idea. With an OLS solution you can of course handle irregular spacing of data points (though you lose orthogonality of the cosine and sine functions), plus you can also enforce a low-pass cut-off in your reconstructed series.
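A minimal sketch of the idea being discussed - a truncated cosine/sine basis fitted by ordinary least squares at the irregular sample times, with the truncation order as the explicit low-pass cut-off (interface and names are illustrative only):

    import numpy as np

    def fourier_fit(t, y, nmax, period=None):
        period = period if period is not None else t.max() - t.min()
        cols = [np.ones_like(t, dtype=float)]         # constant term (absorbs offsets)
        for k in range(1, nmax + 1):
            w = 2 * np.pi * k / period
            cols += [np.cos(w * t), np.sin(w * t)]
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # OLS; no orthogonality required
        return coef, A @ coef                         # coefficients and fitted values at t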
Sounds like the right approach to me.
Carrick,
Yes, I'm making progress. It doesn't work with collocation (with irregular points), but I'm expecting that a finite element Galerkin approach will work - I have to integrate the trig functions exactly. I wasn't relying on orthogonality.
One possibility is that the jiggling of date points can be done analytically.
Carrick,
I think I've reached a dead end there. I was trying to do a Galerkin method using the triangular shape functions of the data as test functions. But that just tapers the multipliers of the high-order coefficients, which are then free to get bigger - HF noise. I realised I had to use the trig functions as test functions, but that just ends up as a Fourier series of my very first approximation - a simple mean.
Nick, I was pretty sure you weren't going to rely on orthogonality here (that comment was for other people, who are used to being able to when the data are uniformly sampled)...
Have you looked at the singular values of the matrix?
Carrick,
Oddly, I do have orthogonality, although I'm not taking advantage of it. I'm currently thinking in continuum terms, FEM+spectral. The data points relate to the continuum via triangular shape functions and everything is integrated (exactly) over time. The irregular sampling just means different shape functions.
The question is, what test functions? I want a near-orthogonal matrix to invert. That means the test functions have to multiply whatever is operating on the unknowns (the Fourier coefficients) to produce that effect after integration. Shape functions, which I tried first, don't. It should be the trig functions themselves.
It all looked good. The trig functions erase the proxy offsets, which are all constant over time. And I do get a diagonal matrix. The thing is that whatever operates on the unknowns is independent of the proxy. To solve, I might as well add them all. And then I find myself just doing Fourier analysis on the averaged proxy expression. It doesn't really lead anywhere.
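In symbols, a rough restatement of that argument: write the interpolated proxies as x_p(t), the offsets as mu_p, and the reconstruction as G(t) = sum_k a_k phi_k(t) in a trig basis. Using the same trig functions as test functions gives

    \sum_p \int \phi_j \Big( x_p(t) - \mu_p - \sum_k a_k \phi_k(t) \Big)\, dt = 0
    \quad\Rightarrow\quad
    \sum_p \int \phi_j\, x_p\, dt \;=\; N \sum_k a_k \int \phi_j\, \phi_k\, dt ,

since each trig test function integrates to zero (which erases the offsets), and orthogonality makes the right-hand side diagonal. So each a_k is just the Fourier coefficient of the average of the N interpolated proxies - Fourier analysis of the simple mean.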
I agree with Carrick. BTW, there are probably quite a few extensions of the proxies used. Tierney for example has a higher resolution 1500 year to present version of the 60ka Lake Tanganyika TEX86. That is one with spikes :)
Yes, I'll try to get the area weighting sorted, and then can probably look around for more series.
One thing about the idea of using a function space for the time sequence is that it's an explicit clamp on resolution, and should ensure that you just can't get spurious spikes.
"However, I am using the Marcott et al SI's as from the spreadsheet, and the TempLS calc did not use this period, so I did not modify the SI's to reflect this shift."
I think I'm having a brain cramp here. What do you mean by "SI's"?
LL, No, I was. I meant CI's. It's a mistake I often make, for some reason. I'll fix.