
Predicting the future with supercomputers

Emma Wallis
March 31, 2017

Scientists will tell you it's impossible to predict the unpredictable. Those in the field of data assimilation get as close as possible. DW's Emma Wallis visits them at Japan's K supercomputer.

K computer front view, Riken Institute, Kobe, Japan (Image: Riken)

There are no crystal balls at the Riken Advanced Institute for Computational Science (AICS) in Kobe, Japan. What there is, though, is a vast supercomputer behind glass on a site spanning over two hectares.

The computer itself looks like a vast, gently buzzing art installation. Every other neat rectangular cabinet is clothed in a red banner bearing the Japanese character K, which stands for 10 quadrillion. Fittingly, K runs at 10 petaflops, or 10 quadrillion floating-point operations per second. The character can also mean gate, and in some translations square jewel, blessing or jubilation.

"What we're doing here is exploring the future," says Dr Takemasa Miyoshi, who leads the data assimilation team at the Riken AICS in Kobe.  He describes data assimilation as "combining simulation with real world data [so that] the simulation represents the real world precisely."

Specially designed

The K computer has already crunched several huge simulations of how a potential earthquake might play out in cities like Tokyo. The simulations look not only at how buildings react to tremors, but also at how people are likely to behave as they evacuate. From these simulations, researchers can design better emergency responses and plan support for buildings where weaknesses are detected.

Akihiko Okada works with the research communications team at the Riken institute in Kobe. Like everyone I spoke to, he accepts with pragmatism that his institution sits in one of the most seismically active spots on the planet. Part of the island itself came about after the Great Hanshin earthquake of 1995, which caused serious destruction in the city of Kobe and the deaths of more than 6,000 people: rubble from the quake was used to build up the island.

Okada seems pleased that computers like the K can work to prevent more death and destruction should another disaster strike like the Tohoku earthquake of 2011, which killed more than 16,000 people and triggered the Fukushima nuclear disaster.

"People say when we had that Tohoku earthquake, it was above and beyond our imagination - it's too big, so they believed they couldn't prepare for it," Okada explains. "But using this computer, we try to simulate various cases. And we would like to eliminate that feeling of the unexpected so we can save people's lives."

The door to the future

At the data assimilation conference in Kobe, scientists from several different fields are in attendance - from climate science to biology, physics, meteorology and bioinformatics. One of them is Professor Dr Roland Potthast, the director of data assimilation at the German Weather Center near Frankfurt. Even with the precise models meteorologists use and the enormous amounts of data they can now gather, that data is just a pinprick compared with what is potentially out there, he says.

"The models are huge, 10 to the power of 9, so you have 10 million degrees of freedom in your model. And the data is just a tiny bit of that."


To illustrate the issue, Potthast describes an ideal - but unlikely - scenario for collecting weather data: An airplane would fly through the atmosphere and, every eight miles (13 kilometers), it would record 90 layers of data stretching 43 miles high. In reality, however, "You always have comparatively sparse data," he adds.

That's why, he concludes, we need computers like the K to run data assimilation cycles. The German Weather Center itself has two supercomputers: One runs forecasts 24 hours a day to cater, in part, to the flight paths running across Germany and Europe; the other is used for experiments. Without this constant number crunching, he tells DW, all flights would have to be grounded, since a plane can't take off without an accurate weather forecast for its flight path.

An AI solution?

Artificial intelligence, or machine learning, is one method some are trying out as a crystal ball. Others use filtering algorithms that reduce the uncertainty in a model and so obtain more accurate results.
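To give a rough feel for what such a filter does - this is emphatically not the specific algorithm running at Riken or at the German Weather Center, and every number below is made up - the toy example that follows pushes an ensemble of forecasts through a single update. The spread of the ensemble, which stands in for the model's uncertainty, shrinks once the observation is assimilated.

```python
# Toy ensemble filter for a single scalar quantity. Purely illustrative;
# operational systems work on millions of variables at once.
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, observation, obs_var):
    """One stochastic ensemble Kalman update for a scalar state."""
    forecast_var = np.var(ensemble, ddof=1)           # ensemble spread = forecast uncertainty
    gain = forecast_var / (forecast_var + obs_var)    # how strongly to pull toward the observation
    # each member assimilates its own randomly perturbed copy of the observation
    perturbed_obs = observation + rng.normal(0.0, np.sqrt(obs_var), size=ensemble.shape)
    return ensemble + gain * (perturbed_obs - ensemble)

# Invented forecast ensemble (say, temperatures in C) and one invented observation
forecast = rng.normal(22.0, 2.0, size=50)
analysis = enkf_update(forecast, observation=20.5, obs_var=0.25)
print(forecast.std(), analysis.std())   # the spread - the uncertainty - drops after the update
```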

Takemasa Miyoshi heads the data assimilation team at the Riken AICS (Image: Riken Institute)

Miyoshi acknowledges that AI can be a tool in his trade, but it's not the only one they use. What his team is really looking forward to is the next generation of supercomputer, which should be ready for use in 2021.

"The Riken AICS is planning the next generation of the K ... with Fujitsu. So we will have more computing power in the future."

But it's not just computing power they need. Now the race is on to design software that can tap the potential of that massive power. That, along with the cables, seems to be the real key to glimpsing the future.

One of the new computer's first tasks, Miyoshi explains, will be to work on an already initiated project that aims to count every single tree on planet Earth in order to assess environmental damage. Since there are far more trees on Earth than stars in our own galaxy, it's a very good thing the next-generation supercomputer is slated to be 100 times faster.

 
