Published in AI

AI can’t handle chaos

30 September 2019


Butterfly wings all get in a knot

Predictive technology and AI will always fail because computers cannot grasp the concept of chaos.

In a new study, reported in Advanced Theory and Simulations, a team of boffins has discovered that complex calculations performed by computers can be off by as much as 15 percent, due to a "pathological" inability to grasp the true mathematical complexity of chaotic dynamical systems.

UCL boffin Peter Coveney said the work shows that the behaviour of chaotic dynamical systems is richer than any digital computer can capture.

"Chaos is more commonplace than many people may realise and even for very simple chaotic systems, numbers used by digital computers can lead to errors that are not obvious but can have a big impact."

In chaos theory, the phenomenon is called the 'butterfly effect': the hypothetical notion that the infinitesimal flap of a butterfly's wings could create a disaster like an iPhone.

At the heart of the problem, the researchers say, is floating-point arithmetic, which represents real numbers using finite approximations and rounded conversions.
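As a quick illustration of what those approximations mean in practice (this is standard IEEE-754 double-precision behaviour, not code from the study itself):

```python
# A double carries only about 15-16 significant decimal digits,
# so even an innocuous decimal like 0.1 is stored approximately.
total = 0.1 + 0.2
print(total)          # 0.30000000000000004, not 0.3
print(total == 0.3)   # False
print((0.1).hex())    # the exact binary value the machine really holds
```

The error here is tiny, but chaotic systems amplify tiny errors exponentially, which is the whole point of the paper.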

In the research, the team compared a known simple chaotic system called the Bernoulli Map with a digital calculation of the same system, and uncovered what it says are "systematic distortions" and a "newfound pathology" in the simulation of chaotic dynamical systems.
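The effect is easy to reproduce with the classic doubling-map form of the Bernoulli Map, x → 2x mod 1 (a simplified sketch of the kind of system the paper studies, not the authors' own code). Every double is a dyadic rational p/2^k, so under repeated doubling the simulated orbit dies completely, while the true mathematical orbit of a typical start point stays chaotic forever:

```python
def bernoulli_step(x):
    """One step of the doubling map x -> 2x mod 1, in double precision."""
    x = 2.0 * x
    return x - 1.0 if x >= 1.0 else x

x = 0.1
for _ in range(60):
    x = bernoulli_step(x)
print(x)  # 0.0 -- the float orbit has collapsed onto the fixed point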

The researchers found that by asking a computer to repeatedly iterate a single simple calculation, they could generate huge distortions across large amounts of data.

"What neither he nor others realised, and is highlighted in our new work, is that any such finite (rational) initial condition describes a behaviour which may be statistically highly unrepresentative", the report said.
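That point can be seen even with exact arithmetic. In a toy illustration (again, not the paper's own code), any rational starting value of the doubling map lands on a finite cycle, so it only ever visits a handful of states and cannot reproduce the full statistics of the chaotic system:

```python
from fractions import Fraction

def exact_step(x):
    """x -> 2x mod 1 computed exactly on rationals."""
    x = 2 * x
    return x - 1 if x >= 1 else x

# Follow the orbit of 1/7 until a state repeats.
x = Fraction(1, 7)
seen = {}
n = 0
while x not in seen:
    seen[x] = n
    x = exact_step(x)
    n += 1
print(n - seen[x])  # cycle length 3: 1/7 -> 2/7 -> 4/7 -> 1/7
```

A three-state loop is about as unrepresentative of chaos as it gets.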

While the researchers acknowledge that the Bernoulli Map is a simple chaotic system that isn't necessarily representative of more complex dynamic models, they warn that the insidious nature of their floating-point butterfly means no scientist should let their guard down around computers.

"We do not believe that practitioners should draw any comfort from the fact that their models are more complex than this one", the authors write.

"We would suggest that if so simple a system exhibits such egregious pathologies, a more complex system will probably exhibit even more devilish ones."
