Self-organized sequence processing in recurrent neural networks with multiple interacting plasticity mechanisms

Abstract

The highly recurrent connectivity of neocortical circuits makes recurrent neural network (RNN) models well suited for investigating the computational properties of biologically inspired neurodynamics. Reservoir computing (RC) models, an extension of the RNN paradigm, provide a framework for state-dependent computation in which information is encoded as state-space trajectories, in line with recent findings in neurobiology. Over the past few years, several attempts have been made to endow these network models with adaptive mechanisms that mimic the various forms of neural plasticity known to exist in the brain, where they play a fundamental role in shaping the dynamics and information-processing capabilities of the underlying networks. In this thesis, we analyze the dynamic properties of a simple reservoir computing model in which several self-organizing plasticity mechanisms operate concomitantly. We investigate how different combinations of three biologically inspired adaptive mechanisms shape the reservoir’s dynamic properties, and how effective they are in acquiring an internal representation of structured symbol sequences. Replicating previous work, we demonstrate that only when combined do these mechanisms allow the plastic reservoir networks to achieve an input separation that outperforms static (i.e., without plasticity) reservoir networks. We further assess how the symbol sequences are internally represented in different network settings. All reservoir networks reflect the input structure in their state dynamics, but plasticity is clearly beneficial: by modifying the network parameters, it increases the network’s ability to learn the temporal structure of the input sequences.
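The abstract does not name the three adaptive mechanisms. In self-organizing reservoir models of this kind (e.g., the SORN of Lazar et al., 2009, a plausible candidate for the replicated previous work), a common combination is spike-timing-dependent plasticity (STDP), synaptic normalization, and intrinsic plasticity. The following Python sketch illustrates how such rules can operate concomitantly in a binary reservoir; the architecture, parameter values, and rule forms are illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch of a binary reservoir with three interacting plasticity
# rules, assuming a SORN-style combination: STDP, synaptic normalization,
# and intrinsic plasticity. All values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_E, N_I = 200, 40            # excitatory / inhibitory reservoir units
eta_stdp, eta_ip = 1e-3, 1e-3 # plasticity learning rates
h_ip = 0.1                    # target mean firing rate (intrinsic plasticity)

# Sparse excitatory recurrence; fixed inhibitory pathways.
W_EE = rng.random((N_E, N_E)) * (rng.random((N_E, N_E)) < 0.05)
np.fill_diagonal(W_EE, 0.0)
W_EI = rng.random((N_E, N_I)) * 0.1   # inhibitory -> excitatory
W_IE = rng.random((N_I, N_E)) * 0.1   # excitatory -> inhibitory
T_E = rng.random(N_E) * 0.5           # adaptive excitatory thresholds
T_I = rng.random(N_I) * 0.5           # fixed inhibitory thresholds

x = (rng.random(N_E) < h_ip).astype(float)  # binary excitatory state
y = np.zeros(N_I)                           # binary inhibitory state

def step(u):
    """One reservoir update with input drive u, applying all three rules."""
    global x, y, W_EE, T_E
    x_prev = x
    # Threshold dynamics: recurrent excitation minus inhibition plus input.
    x = (W_EE @ x_prev - W_EI @ y + u - T_E > 0).astype(float)
    y = (W_IE @ x_prev - T_I > 0).astype(float)
    # 1) STDP: strengthen pre-before-post pairs, weaken the reverse order,
    #    restricted to existing connections.
    mask = W_EE > 0
    W_EE += eta_stdp * (np.outer(x, x_prev) - np.outer(x_prev, x)) * mask
    W_EE = np.clip(W_EE, 0.0, 1.0)
    # 2) Synaptic normalization: keep each unit's total input weight fixed.
    row_sums = W_EE.sum(axis=1, keepdims=True)
    W_EE = np.divide(W_EE, row_sums, out=W_EE, where=row_sums > 0)
    # 3) Intrinsic plasticity: nudge thresholds toward the target rate h_ip.
    T_E += eta_ip * (x - h_ip)
    return x

# Usage: drive the reservoir with a structured symbol sequence, each symbol
# activating a small, fixed cluster of input units, and record the states.
n_symbols = 4
U = np.zeros((n_symbols, N_E))
for s in range(n_symbols):
    U[s, rng.choice(N_E, 10, replace=False)] = 0.5
sequence = rng.integers(0, n_symbols, size=1000)
states = np.array([step(U[s]) for s in sequence])
```

In this kind of setup, the recorded states can then be used to assess input separation (e.g., how linearly separable the symbol classes are in reservoir state space), the measure by which plastic and static reservoirs are compared in the abstract.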

Publication
MSc Thesis