TY - JOUR
T1 - Exploring Flip Flop memories and beyond: training Recurrent Neural Networks with key insights
AU - Jarne, Cecilia
N1 - Publisher Copyright:
Copyright © 2024 Jarne.
PY - 2024/3
Y1 - 2024/3
N2 - Training neural networks to perform different tasks is relevant across various disciplines. In particular, Recurrent Neural Networks (RNNs) are of great interest in Computational Neuroscience. Open-source frameworks dedicated to Machine Learning, such as TensorFlow and Keras, have produced significant changes in the development of technologies that we currently use. This work contributes by comprehensively investigating and describing the application of RNNs to temporal processing through a study of a 3-bit Flip Flop memory implementation. We delve into the entire modeling process, encompassing equations, task parametrization, and software development. The obtained networks are meticulously analyzed to elucidate their dynamics, aided by an array of visualization and analysis tools. Moreover, the provided code is versatile enough to facilitate the modeling of diverse tasks and systems. Furthermore, we show how memory states can be efficiently stored at the vertices of a cube in the dimensionally reduced space, supplementing previous results with a distinct approach.
AB - Training neural networks to perform different tasks is relevant across various disciplines. In particular, Recurrent Neural Networks (RNNs) are of great interest in Computational Neuroscience. Open-source frameworks dedicated to Machine Learning, such as TensorFlow and Keras, have produced significant changes in the development of technologies that we currently use. This work contributes by comprehensively investigating and describing the application of RNNs to temporal processing through a study of a 3-bit Flip Flop memory implementation. We delve into the entire modeling process, encompassing equations, task parametrization, and software development. The obtained networks are meticulously analyzed to elucidate their dynamics, aided by an array of visualization and analysis tools. Moreover, the provided code is versatile enough to facilitate the modeling of diverse tasks and systems. Furthermore, we show how memory states can be efficiently stored at the vertices of a cube in the dimensionally reduced space, supplementing previous results with a distinct approach.
KW - computational neuroscience
KW - dynamics
KW - eigenvalue distribution
KW - flip flop
KW - recurrent neural networks
UR - http://www.scopus.com/inward/record.url?scp=85190379989&partnerID=8YFLogxK
U2 - 10.3389/fnsys.2024.1269190
DO - 10.3389/fnsys.2024.1269190
M3 - Journal article
C2 - 38600907
AN - SCOPUS:85190379989
SN - 1662-5137
VL - 18
JO - Frontiers in Systems Neuroscience
JF - Frontiers in Systems Neuroscience
M1 - 1269190
ER -