Solving complex computer problems in an entirely new way

Scientists develop the next generation of reservoir computing.


Scientists at The Ohio State University have developed the next generation of reservoir computing, a machine learning approach used to solve the hardest computing problems. The new design makes reservoir computing 33 to a million times faster.

With that speed, the scientists solved a complex computing problem in less than a second on a desktop computer. The new approach also requires significantly fewer computing resources and less input data.

Daniel Gauthier, lead author of the study and professor of physics at The Ohio State University, said, “Reservoir computing is a machine learning algorithm developed in the early 2000s and used to solve the ‘hardest of the hard’ computing problems, such as forecasting the evolution of dynamical systems that change over time.”

Previous research has found that reservoir computing is well suited to learning dynamical systems and can provide accurate forecasts of how they will behave in the future.

Reservoir computing uses an artificial neural network. Scientists feed data on a dynamical system into a “reservoir” of randomly connected artificial neurons. The network produces useful output that the scientists can interpret and feed back into the network, building an increasingly accurate forecast of how the system will evolve.
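To make that mechanism concrete, here is a minimal sketch of a classic reservoir computer (an echo-state network). The task, reservoir size, and all parameters are illustrative choices, not those used in the study; the key feature is that the reservoir weights are random and fixed, and only the linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                # reservoir size (illustrative choice)
W_in = rng.uniform(-0.5, 0.5, (N, 1))  # input weights: random and fixed
W = rng.uniform(-0.5, 0.5, (N, N))     # recurrent reservoir weights: random and fixed
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect the neuron states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 30, 0.1)
u = np.sin(t)
X = run_reservoir(u[:-1])   # reservoir states for each input step
y = u[1:]                   # targets: the next value of the signal

# The only trained part: a linear readout, fit by ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

pred = X @ W_out
print("prediction RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

Because only `W_out` is trained, fitting reduces to a single linear solve; the random reservoir itself is the “black box” Gauthier describes.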

Gauthier said, “One issue has been that the reservoir of artificial neurons is a ‘black box’ and scientists have not known exactly what goes on inside of it – they only know it works.”

“The artificial neural networks at the heart of reservoir computing are built on mathematics. We had mathematicians look at these networks and ask, ‘To what extent are all these pieces in the machinery really needed?’”

When the scientists investigated that question, they found that the whole reservoir computing system could be significantly simplified. They tested the idea on a forecasting task involving a model weather system. Their next-generation reservoir computing was a clear winner over today’s state-of-the-art.

In one relatively simple simulation done on a desktop computer, the new system was 33 to 163 times faster. When the goal was high forecasting accuracy, it was about 1 million times faster, matching the accuracy of the current generation with the equivalent of just 28 neurons.
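The simplification behind these speedups can be illustrated with a nonlinear vector autoregression, the mathematical core of next-generation reservoir computing: instead of a large random reservoir, the features are just a few time-delayed inputs and their products, feeding a trained linear readout. The toy task below (one-step prediction of a chaotic logistic map) and all parameters are my illustrative choices, not the weather system used in the study.

```python
import numpy as np

# Generate a chaotic time series from the logistic map x_{t+1} = 3.9 x_t (1 - x_t).
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])

k = 2  # number of time delays; the "warmup" is just these k data points

def features(series, t):
    """Constant term, the last k values, and their pairwise products."""
    lin = series[t - k + 1 : t + 1]
    quad = np.outer(lin, lin)[np.triu_indices(k)]
    return np.concatenate(([1.0], lin, quad))

ts = range(k - 1, 499)
X = np.array([features(x, t) for t in ts])
y = np.array([x[t + 1] for t in ts])

# As in classic reservoir computing, only a linear readout is trained.
ridge = 1e-9
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)

pred = X @ W_out
print("max prediction error:", np.max(np.abs(pred - y)))
```

There is no random reservoir at all here: the feature vector has only six entries, and because the logistic map is quadratic in its state, the quadratic features capture it almost exactly. That shrinkage of the model is why warmup data and computing resources drop so dramatically.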

Gauthier said, “For our next-generation reservoir computing, there is almost no warmup time needed. The warmup is training data that needs to be added as input into the reservoir computer to prepare it for its actual task.”

“Currently, scientists have to put in 1,000 or 10,000 data points or more to warm it up. And that’s all data that is lost, that is not needed for the actual work. We only have to put in one or two or three data points.”

“What’s exciting is that this next generation of reservoir computing takes what was already very good and makes it significantly more efficient.”

In the future, scientists are planning to solve more complex problems such as forecasting fluid dynamics.

Journal Reference:
  1. Gauthier, D.J., Bollt, E., Griffith, A. et al. Next-generation reservoir computing. Nat Commun 12, 5564 (2021). DOI: 10.1038/s41467-021-25801-2