Australian National University to switch on largest supercomputer next week
- 05 October, 2012 15:25
The Australian National University (ANU) will begin performance testing of the country’s most powerful supercomputer on October 10 following a system build that began in mid-August.
Earth system and climate change scientists and researchers will be the first recipients of the supercomputer’s massive 1.2 petaflops of processing power – provided by 57,000 Intel-based cores – when the machine goes into production by early 2013.
The machine uses Fujitsu’s PRIMERGY x86 high performance computing (HPC) clustered design and Intel Xeon E5 CPUs. It has 176 terabytes of memory and 12 petabytes of disk storage. It is eight rows deep in the data centre and each row is 14 metres wide.
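As a rough sanity check (not from the article), the quoted aggregate figures imply a per-core peak of about 21 gigaflops. That is consistent with a Sandy Bridge-era Xeon E5 executing 8 double-precision floating-point operations per cycle at around 2.6GHz, although the clock speed and flops-per-cycle figures below are illustrative assumptions, not quoted specifications.

```python
# Sanity check of the quoted aggregate specs (1.2 petaflops and 57,000
# cores are from the article; the clock/FLOP figures are assumptions).
peak_flops = 1.2e15          # 1.2 petaflops (article)
cores = 57_000               # Intel-based cores (article)

per_core = peak_flops / cores
print(f"Per-core peak: {per_core / 1e9:.1f} gigaflops")

# Assumed Xeon E5 (Sandy Bridge) characteristics -- not stated in the article:
clock_hz = 2.6e9             # assumed base clock
flops_per_cycle = 8          # AVX: 8 double-precision FLOPs per cycle
implied = clock_hz * flops_per_cycle
print(f"Implied peak per core under these assumptions: {implied / 1e9:.1f} gigaflops")
```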
Professor Lindsay Botten, director at National Computational Infrastructure (NCI) – an initiative between the ANU and the Australian government – said around half of the machine’s power will be dedicated to modelling Earth systems such as weather and long-term climate change.
The Bureau of Meteorology will share the machine with the CSIRO and several universities to do climate and Earth system science work, said Professor Botten.
“It will assist in solving a lot deeper problems; [researchers] will have more elaborate calculations, which will enable them to consider more elaborate research questions.”
Professor Botten said researchers want to use the supercomputer to provide seasonal weather modelling over a period of months rather than just a few days.
“You can see the economic impact of [weather changes]. If you can start to get [longer-term weather] models that work accurately, you can make some economic decisions like, ‘Do I put a crop in the ground or not?’” he said.
He added that a machine of this size and memory capacity will also enable weather forecasters to work at much higher resolution to gain more accurate results about the potential impact of severe thunderstorms.
Other government agencies, universities and a small number of private enterprises will in future access the supercomputer via a Linux-based terminal or through several browser-based Web services.
Technicians from Fujitsu in Australia and Japan have assembled the supercomputer, which was delivered to a dedicated data centre at the ANU “in three semi-trailer loads per weekend for four weekends,” said Professor Botten.
“By the end of this month, it will essentially be fully built and start undergoing performance testing from October 10 to the end of October,” he said.
In November, Fujitsu will run acceptance testing to demonstrate the robustness of the machine using various benchmarking tools. The ANU will then load the CentOS operating system – derived from Red Hat Enterprise Linux – onto the system.
The machine consumes 1.5 megawatts of power – the equivalent of up to 500 electric ovens switched on around the clock, according to Professor Botten.
“[We] are looking at [spending] $3 million to $4 million per year for electricity,” he said.
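The quoted power and cost figures can be cross-checked with some simple arithmetic: a continuous 1.5-megawatt draw comes to about 13,140 megawatt-hours a year, which lands in the quoted $3–4 million range at a typical commercial tariff (the tariff used below is an assumption for illustration, not a figure from the article).

```python
# Cross-check of the power and cost figures (1.5 MW and $3-4M/yr are from
# the article; the electricity tariff is an assumption for illustration).
power_mw = 1.5
hours_per_year = 24 * 365                  # 8,760 hours
energy_mwh = power_mw * hours_per_year     # 13,140 MWh per year

tariff_per_kwh = 0.27                      # assumed AUD/kWh, not from the article
annual_cost = energy_mwh * 1000 * tariff_per_kwh
print(f"Annual energy: {energy_mwh:,.0f} MWh")
print(f"Annual cost at ${tariff_per_kwh}/kWh: ${annual_cost / 1e6:.1f} million")

# The "500 electric ovens" comparison implies roughly 3 kW per oven:
print(f"Per-oven draw: {power_mw * 1000 / 500:.1f} kW")
```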
According to Professor Botten, the machine is eight times larger than its predecessor at NCI and a “factor of 10” behind the fastest machine in the world – dubbed Sequoia – at the US Department of Energy’s Lawrence Livermore National Laboratory.
This $100 million, four-year supercomputing project is a partnership between the ANU and other universities, CSIRO, Bureau of Meteorology, Geoscience Australia and the Australian government.
“There was $50 million [allocated] for the infrastructure, about $26 million for the machine, $23 million in the building and a couple of million dollars to do some upgrades,” he said.
The supercomputer will be “50 times” more powerful than the clustered machine launched yesterday by eResearch in South Australia, according to Professor Botten.
Follow Byron Connolly on Twitter: @ByronConnolly