
The brain must store and process continuously varying quantities to survive, whether navigating a forest or holding your gaze steady. However, theoretical models of such continuous memories in neural networks have, to date, typically required fine-tuning. In a recent paper published in PRX Life, and featured on the journal cover, Prof. Tankut Can and his co-authors present a general principle, frozen stabilization (FS), by which neural networks self-organize to store continuous memories. Importantly, these memories are robust to changes in the neural wiring and do not require any fine-tuning of network connectivity. The principle of FS is also constructive: it provides a general recipe for building network models with robust continuous memories. This could have practical applications in machine learning and in neuroscience modeling.
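To give a sense of the fine-tuning problem that FS sidesteps, below is a minimal sketch using a classic linear "line attractor", a standard textbook construction rather than the mechanism introduced in the paper; the network size, connectivity, and simulation parameters are illustrative assumptions.

```python
# Minimal sketch of the fine-tuning problem in continuous memory models
# (a textbook linear "line attractor", not the paper's frozen-stabilization
# mechanism). All parameter choices here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 100                          # number of neurons (assumed)
u = rng.standard_normal(N)
u /= np.linalg.norm(u)           # direction along which the continuous value is stored

# Rank-one connectivity tuned so the dynamics dx/dt = -x + W x have eigenvalue
# exactly 1 along u: every point c*u is then a fixed point (a continuum of memories).
W_tuned = np.outer(u, u)

def simulate(W, x0, dt=0.01, T=500.0):
    """Euler-integrate the linear rate dynamics dx/dt = -x + W x."""
    x = x0.copy()
    for _ in range(int(T / dt)):
        x = x + dt * (-x + W @ x)
    return x

x0 = 2.0 * u                     # initial state encoding the value 2.0
print("tuned connectivity, recalled value:",
      round(float(u @ simulate(W_tuned, x0)), 3))       # stays near 2.0

# Shrink the weights by just 1%: the line of fixed points disappears and the
# stored value decays toward zero, the kind of fragility under changes in the
# wiring that frozen stabilization is reported to avoid.
print("perturbed connectivity, recalled value:",
      round(float(u @ simulate(0.99 * W_tuned, x0)), 3))
```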
https://journals.aps.org/prxlife/abstract/10.1103/PRXLife.3.023006