Balneus

Australian Lefty on Politics, Governance, Science and Info Management

Keeping high policy uninformed

Posted by Dave Bath on 2008-05-15


Just published (2008-05-15) in Nature (doi:10.1038/453257a) is The Next Big Climate Challenge, which argues that significant funding is needed for climate modellers, and indeed that climate modelling and the supercomputing grunt it requires should be treated as international "big science", with projects funded internationally in the same manner as (although hopefully more effectively than) CERN’s hadron smasher and space telescopes.

Policy cannot be informed without information.  Localized climate policy requires localized models.  This means big computers and big bucks, given that current models, and the computers they run on, are limited to "cells" of about 100 km: enough to be certain there is a problem, but not fine-grained enough to inform hard practical policy decisions.

And even higher resolutions — a kilometre or less, say — may well be needed to handle such critical issues as cloud formation realistically.  Hence the need for computers a couple of generations beyond the current state of the art.
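
As a rough back-of-the-envelope (my own illustrative figures, not the article’s): refining the horizontal grid multiplies the number of cells by the square of the refinement factor, and the time step has to shrink roughly in proportion to the cell size to stay numerically stable, so compute cost grows roughly as the cube of the refinement.

    # Back-of-the-envelope scaling: cost ~ (number of cells) x (number of time
    # steps); cells grow with the square of the horizontal refinement, and the
    # time step shrinks in proportion to cell size (CFL condition). Rough only.
    def relative_cost(old_km, new_km):
        refinement = old_km / new_km     # e.g. 100 km -> 1 km is a factor of 100
        cells = refinement ** 2          # finer grid in both horizontal directions
        timesteps = refinement           # smaller cells need shorter time steps
        return cells * timesteps

    print(relative_cost(100, 10))        # ~1e3 times today's cost for 10 km cells
    print(relative_cost(100, 1))         # ~1e6 times today's cost for 1 km cells

Even allowing for smarter algorithms, a factor of around a million between 100 km and 1 km cells is a gap measured in machine generations, not incremental upgrades.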

Didn’t see any of this in Penny Wong’s 2008-05-14 Media Statement about $2.3B over four years to tackle climate change, did we?

There’s money to "adapt", but adapt to what?  How will agriculture be affected, and in what areas?  What do we do if a metropolis will get more rainfall than productive land (farmland and water catchments) a hundred kilometres away?

The Nature article points to the need for public purses to put climate modelling on the best computers in the world, which are now often limited for use…

in areas of national security such as communications intelligence or nuclear weapons design. And climate prediction is a national security issue if ever there was one.

It smells like the budget money is all about scoring political points for "doing something", rather than figuring out what the right things to do actually are.


See Also:


6 Responses to “Keeping high policy uninformed”

  1. Jacques Chester said

    The budget for a top-10 supercomputer – north of $100 million these days – sounds impossibly high, but in practice it’s actually pretty cheap for the amount of science you can do.

    Perhaps money could be invested in heterogeneous supercomputing. Right now you build your supercomputer out of X thousand Cells, or X thousand Opterons or SH-4s or whatever, and they’re all alike. It’d be nice if you could do something like Google and simply run a rolling inventory. So instead of $100 million every five years, you spend $20-30 million per year to keep the machine near the top 10.
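
    As a toy model of the trade-off (assuming, purely for illustration, that price-performance doubles every two years and that boxes retire at five years old), a rolling fleet mostly avoids the deep trough in capacity just before each big-bang refresh:

        # Toy comparison: one big machine every five years vs. rolling annual buys.
        # Assumes price-performance doubles every two years and hardware retires
        # at five years old -- illustrative numbers only, not a procurement plan.
        def perf_per_dollar(year):
            return 2 ** (year / 2.0)

        def big_bang(year, budget=100, cycle=5):
            last_buy = (year // cycle) * cycle            # most recent full refresh
            return budget * perf_per_dollar(last_buy)

        def rolling(year, budget_per_year=25, lifetime=5):
            buys = range(max(0, year - lifetime + 1), year + 1)
            return sum(budget_per_year * perf_per_dollar(y) for y in buys)

        for year in (4, 5, 9):                            # just before/after refreshes
            print(year, round(big_bang(year)), round(rolling(year)))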

    I’d suggest building it in the ACT. No real advantages in terms of power supply, but it’s so freaking cold that HVAC should be a doddle.

  2. Jacques Chester said

    This also reminds me that one of the US labs designed a family of ASICs for weather modelling which they figured would give an order-of-magnitude improvement per watt. That could make it far more doable, at the cost of flexibility.

  3. Dave Bath said

    Maybe we should send a Beowulf to rip the Bl**dy arms off those Canberra grendels! (In-joke for linux geeks).

    It’s not just the hardware; it’s also the investment you make to develop the models, and the extra effort to collect the data to feed them,…

    http://www.top500.org/stats/list/30/apparea can be used to see which application areas have the most grunt. Look at the number of top-500 machines in finance (and I bet “undeclared” means military most of the time). Do you notice that weather research gets less than twice the share that “gaming” does?

    Hmmm. As of 2007/11, India had the 4th most powerful box in the world. Oz has only one machine in the top 500 list (at number 200), sitting at http://www.apac.edu.au/

  4. zombinol said

    Government is, as usual, inefficient, and the consulting companies and hardware vendors do nothing to rein in the inefficiencies lest their sales and income volumes drop. Spending any amount on new hardware to produce computing power is inefficient, short-sighted and obscene in the extreme: of all the existing government computing capacity (CPU, memory, storage and network bandwidth, across all Australian governments), a gross amount is sitting idle and consuming energy while doing nothing.

    I speak of the spare capacity that is not being used in ALL computers in ALL governments in Australia, and of course ALL computers everywhere.

    At best, 10% of a desktop computer’s computing capacity is utilised during the 8-hour work day, and none of it is utilised for the other 16 hours. Servers and mainframes are usually utilised at somewhat higher levels, but there is still immense spare computing capacity.
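
    In round numbers, using the figures above plus a purely hypothetical fleet of 100,000 government desktops at 10 GFLOPS each:

        # Effective utilisation of a desktop fleet (figures from the comment above;
        # fleet size and per-box GFLOPS are hypothetical, for illustration only).
        work_hours, idle_hours = 8, 16
        busy_fraction_at_work = 0.10

        effective = (work_hours * busy_fraction_at_work) / (work_hours + idle_hours)
        print(f"effective utilisation: {effective:.1%}")      # about 3.3%

        fleet_gflops = 100_000 * 10                           # 100k boxes, 10 GFLOPS each
        print(f"idle capacity: ~{fleet_gflops * (1 - effective) / 1000:.0f} TFLOPS")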

    Grid computing provides the computing power required for the complex modelling without requiring the proliferation of more hardware.

    So, spending $100 million or less to harness those spare CPU cycles, through the further development and implementation of grid-based computing, would solve the computing-power problem and deliver a platform next to which a single supercomputer would look small and inefficient by comparison.

    Also, apart from putting the idle capacity to use, there is the benefit of no single point of failure: 50,000 computers in 1,000 locations smacks of ultra-redundancy, whereas a supercomputing platform located in one or two sites is a huge liability to manage from the failure perspective. It also concentrates the power load, while the 50,000-computer scenario distributes the load across the whole country.

    But then again what would I know, I’m not a Politician.

    You cannot solve the problem with the same kind of thinking that created the problem – Albert Einstein

  5. Jacques Chester said

    zombinol;

    Not every problem is suited to loosely coupled distributed computing systems; weather simulation, for example, is not. Some things are, protein folding for example.
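
    Roughly speaking, a weather model is a stencil computation: every cell’s next value depends on its neighbours at every time step, so nodes must exchange boundary ("halo") data constantly and network latency dominates. Folding-style work units, by contrast, are independent. A toy contrast in Python (illustrative only, nothing like a real model):

        # Tightly coupled: each cell's update needs its neighbours every time step,
        # so splitting the grid across slow links means waiting on the network.
        def step(cells):
            return [cells[i] if i in (0, len(cells) - 1)
                    else cells[i] + 0.1 * (cells[i-1] - 2*cells[i] + cells[i+1])
                    for i in range(len(cells))]

        # Loosely coupled: work units depend only on their own inputs, so they can
        # be farmed out to thousands of idle desktops with almost no communication.
        def independent_jobs(seeds):
            return [(s * 2654435761) % 1000 for s in seeds]   # stand-in for real work

        grid = [0.0] * 9 + [100.0] + [0.0] * 9
        for _ in range(5):
            grid = step(grid)
        print([round(x, 2) for x in grid])
        print(independent_jobs(range(3)))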

    Not such a bad idea though, installing a BOINC client on government computers.

  6. Dave Bath said

    One of the problems with gridding “spare” capacity is the protection of privileged information (especially privacy-related information).

    Of course, we could require fully compartmentalized solutions using mandatory access control (MAC) on a process-by-process basis. But that’s a few extra bucks… and more importantly, the costs of awareness training…
