Over on the right, under Resources, there is a link to a "Document Store".
That's where I keep various bits of code and other texts. The TempLS
material is there, for example. I had been keeping it at a location on drop.IO.
Steve Mosher warned me that drop.IO has been taken over by Facebook, and
will be closing. I looked around, and discovered that there is a much
better alternative. Google Sites allows users to set up a free site with
100 MB of space, to which you can upload files, images, etc. It also offers
a range of templates replete with gadgets.
So I have set one up. It's a bit of a work in progress. The templates are
impressive, but none quite matched what I wanted. I chose a project template, which has
lots of facilities, but also some management talk that isn't really appropriate here. I've
tried to keep that to a minimum.
So if you go to the new Document Store you'll
see a home page, with some intro stuff (not much yet), and a pointer to a
page of "Project
Documents". This is the new repository. The Google facility allows me
much more freedom in writing descriptive text (so I'll have to do it :().
I'll also use it to host images - I've set up a "Picture Gallery". There's
no real reason to look at that, as it's just for images to link in to
posts, but you can if you want. I had been using TinyPic, but I don't need to do that any more.
I'll also transfer to it ongoing items like the category index of posts, and the temperature tracking page. I'll also make an updated TempLS page and keep it there.
The links within past posts still point to drop.IO, and I probably won't fix many of them. But you can always use the link under resources.
One merit of the new scheme is that I can link to individual files.
Wednesday, November 24, 2010
Totally off topic, but I see you have been engaging at NoConsensus. I noticed your comment:
ReplyDelete"Sec 4 basically recomputes the artificial PC examples of M&M, and says that decentering can introduce bias, which is really not in dispute – the question is how much (in MBH98/99), which they never say."
I have no wish to get involved there, but in the case of Fig 4-1 and 4-4, nothing was even recomputed; those simulated PC1s were simply rendered from M&M's "top 1%" hickey stick archive (a fact missed by Wegman et al). Wegman et al also completely misinterpreted M&M's procedure in GRL 2005, claiming that they had tested the Mann et al PCA on AR1(.2) red noise (low-order, low correlation), when in fact M&M had used persistent, high-autocorrelation ARFIMA noise. So even the "analysis" at the heart of Wegman et al was wrong. No wonder he didn't release "their" code. It was actually just M&M's code, minimally corrected to run properly, and completely misunderstood.
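To make the contrast concrete, here is a minimal R sketch of the two noise models and of short-centering. It is not the M&M or Wegman code; the series length, proxy count and long-memory parameter are just illustrative, and it assumes the fracdiff package is available.

library(fracdiff)   # for fracdiff.sim; any ARFIMA simulator would do

set.seed(1)
nyears <- 581                     # illustrative series length
nproxy <- 70                      # illustrative number of pseudo-proxies
calib  <- (nyears - 78):nyears    # a late "calibration" window, as used in short-centering

# Low-persistence AR1(0.2) noise - the model Wegman et al described
ar1 <- replicate(nproxy, as.numeric(arima.sim(list(ar = 0.2), nyears)))

# Persistent, long-memory ARFIMA(0,d,0) noise - closer to what M&M actually used
arf <- replicate(nproxy, fracdiff.sim(nyears, d = 0.45)$series)

# "Short-centered" PCA: subtract the calibration-period mean rather than the full mean
short_center <- function(X, idx) sweep(X, 2, colMeans(X[idx, , drop = FALSE]))
pc1_short <- prcomp(short_center(arf, calib), center = FALSE)$x[, 1]

# Conventional full-period centering, for comparison
pc1_full <- prcomp(arf, center = TRUE)$x[, 1]

# plot(pc1_short, type = "l")   # the short-centered PC1 tends to show a hockey-stick bend

The point is just that decentering rewards series whose calibration-period mean happens to differ from their long-term mean, and persistent noise supplies far more of those than low-correlation AR1(0.2) noise does.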
For more, see:
http://deepclimate.org/2010/11/16/replication-and-due-diligence-wegman-style
Oops make that "hockey stick" please!
Thanks, DC, I wasn't aware of that until you dug out the information. I assumed that they'd used M&M code, not just copied the result.
To be clear, they did use M&M code and that's what the M&M code *does*. It just copied a selection of the previously saved PC1s (it's a bug - it should be saving and displaying a new 1% every time, but the path is wrong).
But also notice that they got the description of the M&M methodology completely wrong. They claimed it was AR1(.2) red noise!
Nick,
I put all those code excerpts just for you! You should really dig into that.