I’ve been a fan of Clear Climate Code since its inception, a project that aims to rewrite the rather messy NASA GISS temperature code in clear Python (not that I understand a lot about programming, but I admire the effort). So far they have been quite successful and are able to perfectly reproduce the steps used to create the GISTEMP global average surface temperature curve.
Among the most fascinating aspects is the amount of constructive skepticism the CCC people show. When they found a bug in GISTEMP, they sent the information to NASA, who fixed it. Ultimately, GISS may adopt the new code when it’s finished, and that would be a marvellous success story for citizen science. What CCC does is simply the opposite of what many contrarians do: improving science, not sowing doubt about its validity.
And they’ve got more plans. Just a few days ago, Nick Barnes, the driving force behind CCC, announced the creation of the Climate Code Foundation (h/t Stoat). It is meant to act as an umbrella for projects related to temperature code, so far CCC and Open Climate Code (though I can’t tell how much work, if any, has been done under the latter initiative). One project Barnes has already mused about publicly is the creation of clear code for paleo temperature reconstructions. Whether that will be the next thing remains to be seen.
Something related to these efforts is what Ron Broberg is doing over at The Whiteboard. After RealClimate reported on this project, many new people became aware of his attempt to link the GHCN network of weather stations with the GSOD network, thereby creating a much more robust dataset. Ron’s motto is what many participants in the climate debate would like to see more often from contrarians: “Trust but verify”. With this attitude, they would become real skeptics, finally helping to advance our understanding of the climate system rather than undermining what we already know.
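To give a flavour of what linking two station networks involves, here is a minimal sketch of one plausible approach: pairing each station in one network with the nearest station in the other, by great-circle distance. This is purely illustrative; the station IDs and coordinates below are made up, and Ron’s actual matching procedure may well differ (e.g. it may rely on WMO identifiers rather than distance alone).

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical sample records: (station_id, name, lat, lon).
# These are illustrative values, not real GHCN/GSOD entries.
ghcn_stations = [
    ("42572506000", "BOSTON", 42.36, -71.01),
    ("42572219000", "ATLANTA", 33.65, -84.42),
]
gsod_stations = [
    ("725090", "BOSTON LOGAN INTL", 42.361, -71.010),
    ("722190", "ATLANTA HARTSFIELD", 33.640, -84.427),
    ("037720", "HEATHROW", 51.478, -0.461),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def match_stations(ghcn, gsod, max_km=5.0):
    """Pair each GHCN station with the nearest GSOD station within max_km."""
    pairs = []
    for gid, _name, glat, glon in ghcn:
        best = min(gsod, key=lambda s: haversine_km(glat, glon, s[2], s[3]))
        if haversine_km(glat, glon, best[2], best[3]) <= max_km:
            pairs.append((gid, best[0]))
    return pairs

print(match_stations(ghcn_stations, gsod_stations))
```

In practice one would also have to reconcile conflicting metadata and decide what to do with stations that have no counterpart in the other network, which is where most of the real work lies.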
All these efforts show what many people were already aware of: that surface temperature records are indeed reliable. However, they advance upon earlier work by forming an independent, online-based reconfirmation carried out by individuals without any visible affiliation to classical climate science. This increases trust in temperature records and is therefore highly welcome.
The UK MetOffice has understood the challenge related to surface temperature records and announced a conference on this issue to be held next week. Via surfacetemperatures.org, they discuss the need for more robust, more spatially and temporally resolved meteorological datasets for use in climate science. If you’re interested in why they’re doing this, you might want to read Peter Stott’s and Peter Thorne’s Nature article.
In 16 white papers, the MetOffice has laid out its plans for this task. White Paper #14, which seeks input from the community (i.e. you), contains a nice graph showing how this might work and how social media platforms might be used to facilitate the flow of information.
One of the best things about this initiative is that it reverses the usual process of dealing with contrarians. This is not a mere reaction, as the IPCC was forced to produce after that darn glacier error, but a proactive approach, and it puts climatologists in a much better position than many of them have found themselves in since the UEA email theft.