I have been meaning to tidy up the way TempLS deals with the regular lat/lon SST grid on the globe. I use ERSST, which has a 2x2° grid. This is finer than I need; it gives the sea much more coverage than the land gets, and besides being overkill, it distorts near coasts, making them more marine. So I had reduced it to a regular 4x4° grid, and left it at that.
But that has problems near the poles, as you can see in this image:
The grid packs in lots of nodes along the upper latitudes. This is ugly and inefficient, and it may distort the average by making the polar region more marine than it should be, although I'm not sure about that.
So I looked for a better way of culling nodes to get a much more even mesh. The ideal is to have triangles close to equilateral. I have been able to get it down to something like this:
I don't think there is much effect on the resulting average, mainly because SST is still better resolved than land. But it is safer, and looks more elegant.
And as an extra benefit, in comparing results I found a bug in TempLS that had been puzzling me. Some, but not all, months had been showing a lot of drift after the initial publication of results. I found this was due to my system for saving time by storing meshed weights for past months. The idea is that if the station mix changes, the weights will be recalculated. But for nodes which drop out (mostly through acquiring a quality flag) this wasn't happening. I have fixed that.
Below the jump, I'll describe the algorithm and show a WebGL mesh in the new system.
I first decide on a reduction plan near the equator, which could be no reduction at all. In this case, I decided to keep all the latitudes, but omit every second node along each latitude. So I am reducing there by a factor of 2 instead of 4. I also stagger adjacent latitudes. I should have been doing this before. It means that the triangles are isosceles, and closer to equilateral.
I then work out, using the cos-latitude contraction toward the poles, where I can get closer to the equatorial spacing by keeping just one node in three, or one in four, and so on, still staggering rows with equal spacing.
Because I have doubled the density at the equator, I end up with more sea nodes, but not twice as many: about 3500 instead of 2300, for a total of about 6000 nodes altogether in recent years.
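The culling plan above can be sketched in code. This is a minimal illustration of the idea — a one-in-k cull per latitude row, with k growing toward the poles so the along-row spacing stays near the equatorial spacing, and alternate rows staggered — not actual TempLS code; the function and parameter names are my own for the example.

```python
import math

def cull_grid(lat_step=2.0, lon_step=2.0, base_keep=2):
    """Thin a regular lat/lon grid for meshing.

    Along each latitude row, keep every k-th node, with k chosen so the
    along-row spacing (k * lon_step * cos(lat)) stays close to the
    equatorial spacing (base_keep * lon_step). Adjacent rows are
    staggered by half the keep interval, so the triangles come out
    isosceles and closer to equilateral.
    """
    nodes = []
    n_lat = int(180 / lat_step)
    n_lon = int(360 / lon_step)
    lons = [i * lon_step for i in range(n_lon)]
    for row in range(n_lat):
        lat = -90 + lat_step / 2 + row * lat_step   # cell centres
        c = math.cos(math.radians(lat))
        # one-in-k culling: k ~ base_keep / cos(lat), never below base_keep
        k = max(base_keep, round(base_keep / max(c, 1e-9)))
        offset = (row % 2) * (k // 2)               # stagger alternate rows
        nodes.extend((lat, lon) for lon in lons[offset::k])
    return nodes

nodes = cull_grid()   # thinned node list for the whole globe, land not yet masked
```

Near the equator this keeps one node in two at 4° spacing; by 89° latitude it keeps only a couple of nodes per row.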
So here is the mesh plot, using the recent MoyGLV2 WebGL system. As usual, you can trackball and zoom, but not much else here.
I don't fully understand your meshing procedure.
Two questions: (1) why subsample rather than just using all the data, it doesn't sound like that much; (2) why limit yourself to using the 2-degree points from your input data rather than putting points on the globe freely, in a regular mesh -- including up in the Arctic.
Numerobis,
You're right that there isn't that much data (though meshing every month, when I have to, takes a while). The motive for culling is to get roughly comparable density between sea and land. Otherwise in the boundary regions there is a bias - the land can have no influence on the densely meshed sea, while sea data will fill in wherever there is a gap in land data, as there often is. That still happens, but not so much.
Putting points freely requires interpolation of the SST data. Generally that's fine; SST is fairly smooth, and 2° data gives a good basis. It gets tricky near land. It would have been tricky for ERSST too; I hope they know more about it than I do. However, I don't think freely moving points would get a much better mesh than this method.
On the general meshing procedure, I use a convex hull algorithm. So in culling, I try to ensure that a reasonable mesh could be found, but I rely on the hull process to find it.
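The hull step can be illustrated with a small sketch: for points on the unit sphere, the triangular facets of their 3D convex hull form a spherical Delaunay triangulation. This example uses scipy.spatial.ConvexHull as a stand-in — an assumption for illustration, not what TempLS itself uses.

```python
import math
import numpy as np
from scipy.spatial import ConvexHull

def lonlat_to_xyz(lon_deg, lat_deg):
    """Cartesian coordinates on the unit sphere."""
    lam, phi = math.radians(lon_deg), math.radians(lat_deg)
    return (math.cos(phi) * math.cos(lam),
            math.cos(phi) * math.sin(lam),
            math.sin(phi))

# A small staggered node set, like the culled grid above
pts = np.array([lonlat_to_xyz(lon + 30 * (i % 2), lat)
                for i, lat in enumerate((-60, -30, 0, 30, 60))
                for lon in range(0, 360, 60)])

# 'Qt' asks Qhull for triangulated (simplicial) facets
hull = ConvexHull(pts, qhull_options="Qt")
triangles = hull.simplices    # shape (n_facets, 3), indices into pts
```

Since every point on the sphere is a hull vertex, Euler's formula gives exactly 2n - 4 triangles for n points, which is a handy sanity check on the mesh.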
With a bit of work I bet we could improve the data pipeline a bit to get a nice interpolation that's not dependent on data density and handles the shores better (for some version of better). A constrained Delaunay, rather than the Delaunay triangulation you're computing, would let the shores be represented as precisely as you want. And we could add points to break down big triangles as well (or at least to limit how fast element size shrinks, which is associated with interpolation error).
But it looks like mesh error leads to single-digit percent differences in anomalies. So it's likely not worth it.
Nick, do you mean culling of data points literally, or are you just averaging them to reduce their numbers? The first alternative would lead to a larger loss of info and more noise.
Implementing GHCN v4 would give a new redundancy problem. Do you think it is possible to use rural stations only, to get the numbers down? GISS effectively only use rural stations for long-term trends, due to their UHI correction...
Olof,
"Nick, do you mean culling of data points literally"
Yes. I originally tried replacing the remaining points by local averages that included the culled ones, but found it didn't make much difference, and was complicated with land around. SST varies very smoothly; it is itself "optimal interpolation". Culling potentially adds noise, but there is so little compared with land.
I did this partly thinking about GHCN V4. With rational culling, one might well end up with V3 :)