What is it about?
Neural networks are increasingly employed to model, analyze, and control non-linear dynamical systems ranging from physics to biology. Owing to their universal approximation capabilities, they regularly outperform state-of-the-art model-driven methods in accuracy, computational speed, and/or control. On the other hand, neural networks are often treated as black boxes whose decisions are hard to explain. In this paper, we analyze how neural networks successfully manage the longstanding challenge of classifying signals as chaotic or regular. We consider a neural network with an architecture that lends itself well to analysis: Large Kernel Convolutional Neural Networks (LKCNNs). We open its black box to reveal the underlying learning mechanisms.
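To make the classification task concrete, here is a minimal sketch of the classical baseline the network is compared against. It assumes, for illustration, logistic-map time series (a standard benchmark for chaotic vs. regular dynamics); the function names are our own, not from the paper. A classical method labels a signal by the sign of its Lyapunov exponent: positive means chaotic, negative means regular.

```python
import numpy as np

def logistic_series(r, n=500, x0=0.5, discard=1000):
    """Iterate the logistic map x -> r*x*(1-x); return the trajectory
    after discarding an initial transient."""
    x = x0
    for _ in range(discard):
        x = r * x * (1 - x)
    xs = np.empty(n)
    for i in range(n):
        xs[i] = x
        x = r * x * (1 - x)
    return xs

def lyapunov_exponent(r, xs):
    """Average log of the map's derivative |r*(1-2x)| along the orbit.
    Positive => nearby trajectories diverge => chaotic."""
    return np.mean(np.log(np.abs(r * (1 - 2 * xs)) + 1e-12))

# r = 3.5 yields a stable period-4 (regular) orbit; r = 3.9 is chaotic
regular = logistic_series(3.5)
chaotic = logistic_series(3.9)
print(lyapunov_exponent(3.5, regular) > 0)  # False: regular signal
print(lyapunov_exponent(3.9, chaotic) > 0)  # True: chaotic signal
```

This baseline needs access to the map and its derivative; the point of the paper is that an LKCNN learns to separate the two classes from the raw sequence alone.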
Why is it important?
We have shown that, to classify signals with high accuracy, LKCNNs use qualitative properties of the input sequence. This enables them to outperform classical methods, and their advantage is most pronounced for regular signals that are almost chaotic. We also investigated the connection that emerges between the periodicity of the input and periodicity within the network layers, and showed that this aspect is paramount for performance. This could provide new baseline requirements for neural network training.
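The periodicity property mentioned above can be checked directly on the input signals. The sketch below is illustrative only (function names and the logistic-map setup are our assumptions, not the paper's code): a regular orbit repeats after some shift p, while a chaotic one never does.

```python
import numpy as np

def minimal_period(xs, tol=1e-4, max_p=50):
    """Smallest shift p with x[n+p] ~= x[n] for every n; None if aperiodic."""
    for p in range(1, max_p + 1):
        if np.max(np.abs(xs[p:] - xs[:-p])) < tol:
            return p
    return None

def orbit(r, n=400, x0=0.5, discard=1000):
    """Logistic-map trajectory x -> r*x*(1-x) after a discarded transient."""
    x = x0
    for _ in range(discard):
        x = r * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        out[i] = x
        x = r * x * (1 - x)
    return out

print(minimal_period(orbit(3.5)))  # 4: stable period-4 orbit
print(minimal_period(orbit(3.9)))  # None: chaotic, never repeats
```

In the paper's analysis, it is an analogous periodic structure inside the network's layer activations, mirroring the period of the input, that turns out to matter for classification performance.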
Read the Original
This page is a summary of: How neural networks learn to classify chaotic time series, Chaos: An Interdisciplinary Journal of Nonlinear Science, December 2023, American Institute of Physics.
DOI: 10.1063/5.0160813.