Thursday, April 14, 2011

Quiet time


USHCN adjustments plotted for USA and States


There has been a lot of interest in USHCN adjustments. Paul Homewood has been tabulating data from various states, most recently Ohio. Steven Goddard has been getting publicity with various flaky graphs. In criticising one of these, I posted a plot of average US adjustments. In doing so, I followed SG's practice of a simple average across the USA. It would be better to use some kind of area weighting.

Zeke Hausfather has been writing a series at Lucia's, and there have been various posts at WUWT.

So I thought it would be useful to post a complete series of plots of the effects of USHCN adjustments on the individual states, and then an average of these weighted by state area. This should give similar results to gridding. So there is an active plot below the jump. Note that the results are in °F, which seems to be traditional for USHCN.
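The area weighting amounts to a weighted mean of the per-state series. A minimal R sketch of the idea (the names `adj` and `area` are illustrative for this example, not variables from TempLS):

```r
# Illustrative sketch of state-area weighting.
# adj  : a years-by-states matrix of mean adjustments (deg F)
# area : a vector of state areas, in the same state order
# Both are made-up example data, standing in for the real series.
set.seed(1)
adj  <- matrix(rnorm(10 * 3), nrow = 10)   # 10 years, 3 "states"
area <- c(5, 2, 1)                         # state areas
wt   <- area / sum(area)                   # normalized weights
us_avg <- as.vector(adj %*% wt)            # area-weighted US average, one value per year
# for a single year, base R's weighted.mean(adj[1, ], area) gives the same thing
```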

Saturday, April 9, 2011

TempLS Ver 2.1 release

This is the first release for a while. The structure is still much like ver 2, and the math basis is the same. The main differences are:
  • The code has been internally reorganized. Most variables have been gathered into R-style lists. And more code has been sequestered into functions, to clarify the underlying structure of the program.
  • The new work on area-based weighting has been incorporated.
  • Other work has been done with weighting. In particular, subsets of stations can be weighted differently relative to each other.
  • Hooks have been added so that the adaptations for the Antarctic studies can be included.


The preprocessor, which merges and converts data files available on the web to a standard TempLS format, is unchanged. Data and inventory files from Ver 2 will still work.

A reminder of how TempLS is used. In an R environment, a jobname is defined, and a .r file with that name is created which defines some variables relevant to the problem at hand. It may define several consecutive runs. The variables are variants of the supplied defaults.

Then source("TempLSv2.1.r") solves and writes data and jpg files to a directory jobname.
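A sketch of what such a jobname file might look like. The variable names below are purely illustrative; the real names come from the supplied defaults, not from this example:

```r
# Hypothetical jobname file "myjob.r" -- a sketch only.
# In practice you override whichever of the supplied defaults
# the run needs; the names used here are made up.
jobname <- "myjob"    # output goes to a directory of this name
yr0 <- 1900           # start year (illustrative)
yr1 <- 2010           # end year (illustrative)
# Then, from the R console:
# source("TempLSv2.1.r")   # solves and writes data and jpg files to ./myjob
```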

The files you need are in a zip file, TempLSv2.1.zip. This 16 MB file contains preprocessed data sets, so you won't need to use the preprocessor. I've had to separate the GSOD data; GSOD.zip is another 7 MB.

If you just want to look at the code and information (everything readable), there is a 170 KB TempLS2.1_small.zip. A ReadMe.txt file is included. There is also a partly updated manual from V2.0, most of which is still applicable. More details below, and past files on TempLS can be found on the Moyhu index. I'll describe below the things that are new.

Monday, April 4, 2011

Blogger's spam filter

In about September last year, Blogger, the Google outfit that hosts this site, introduced a new spam filter. Until that time I had been very happy with them. I still am, in most aspects. But the spam filter is making dialogue on the site impossible. It is supposed to learn from the comments I liberate, but it seems to be just getting worse. I have not yet been blest with a single item of real spam, but about one in three genuine comments goes to the bin for no obvious reason at all.

The problem is compounded because, since I'm on Australian time, trapped comments can sit in the spam bin for hours before I can release them.

I've done what I can to raise the problem with Blogger. They don't offer direct communication, but delegate feedback to a forum. The response is that, no, you don't get a choice here; your sacrifice is for the general good. We seem to be conscripted into an experiment in which Google uses our rescue activity to improve its spam database.

So I'm looking at Wordpress. Blogger has just announced a revamp, but from their aggressive attitude about the filter, I don't see much hope. I'll wait a few days longer, and listen to advice on whether Wordpress is likely to be better, but it's looking inevitable.