State-dependent processing in Spiking Neural Networks

Abstract

Cognitive and behavioral processes are indissociable from their biophysical substrate and from the characteristics of the underlying processing elements. As such, neural computation and the properties of functional neurodynamics ought to be understood primarily as complex biophysical phenomena: nested interactions, spanning multiple temporal and spatial scales and distributed across massively parallel modular hierarchies. The observable dynamics, at both mesoscopic and microscopic scales, result from complex nonlinear interactions, the ensemble actions of very large and heterogeneous neuronal populations. These populations, shaped by evolutionary and developmental constraints and permanently subject to functional re-organization, constitute highly proficient adaptive processing systems. To a first approximation, this complex circuitry can be seen as a set of large excitable reservoirs whose symmetry-breaking inhomogeneities (present at multiple levels) naturally give rise to rich, high-dimensional dynamics that support cognitive function and computation. Furthermore, the intrinsic recurrent dynamics endow the system with fading memory, but place critical constraints on processing precision. Neural computation ought to be studied in context, accounting for the emergence of mental phenomena and thus guided and constrained by findings from the cognitive and behavioral sciences. These can provide tasks and computational specifications, as well as performance constraints, while the necessary parallels between structure and function are gradually and systematically established. In this context, it is reasonable to study the implementation of those aspects of cognitive function that express themselves as temporal sequences, given that these are ubiquitous across cognitive domains (from sensorimotor sequencing to language processing).
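The reservoir view and its fading-memory property can be illustrated with a minimal rate-based, echo-state-style sketch (a simplifying assumption for illustration; the thesis concerns spiking networks, and all parameter choices below are arbitrary). Rescaling the random recurrent weights to a spectral radius below one places the network in a contractive regime: input perturbations from the distant past are progressively washed out by ongoing drive.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # reservoir size (illustrative choice)
# Random recurrent weights, rescaled to spectral radius 0.8; this
# contractive regime is what produces the fading-memory property.
W = rng.normal(0, 1, (N, N))
W *= 0.8 / max(abs(np.linalg.eigvals(W)))
w_in = rng.normal(0, 1, N)  # input projection weights

def run(inputs, x=None):
    """Drive the reservoir with a scalar input sequence; return the final state."""
    x = np.zeros(N) if x is None else x
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)
    return x

# Two input histories that differ only in their distant past,
# followed by a long shared suffix of random drive.
past_a, past_b = [1.0, -1.0], [-1.0, 1.0]
shared = list(rng.normal(0.0, 0.5, 200))

d_before = np.linalg.norm(run(past_a) - run(past_b))
d_after = np.linalg.norm(run(past_a + shared) - run(past_b + shared))
# d_after is orders of magnitude smaller than d_before:
# the differing past has faded from the network state.
```

The same contraction that grants stability also limits how long input-specific information can persist, which is the precision/memory trade-off the abstract alludes to.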
Interestingly, the cortical architecture and its underlying functional relations also appear to be highly sensitive to serial order and temporal structure. The ability to perceive statistically repeating spatiotemporal patterns and to abstract the underlying rules may constitute a domain-general mechanism for the acquisition of predictive relations, as it allows for efficient generalization over compact representations. The ever-changing network of synaptic (and intrinsic neuronal) properties determines, on short to medium timescales, a circuit's ability to process time-varying input in an active, predictive manner. This processing is dynamically patterned as transient sequences of network states which, through the orchestration of multiple plasticity mechanisms, come to reflect an increasingly restricted and accurate internal model of the relevant knowledge structure. Throughout this thesis, we propose to model the underlying systems (at various degrees of biological plausibility) in functional contexts, in order to gain knowledge about the system itself and about the computational relevance of its internal features. We attempt to shed light on the nature of the on-line integration of information, while evaluating the character of on-line processing memory and finite-precision computation in systems where the current state continuously interacts with and modifies the processing characteristics.
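The claim that serial order is encoded in transient network states can also be sketched with a minimal rate-based recurrent network (again a simplification for illustration; symbol codes, network size, and scaling are assumptions of this sketch). Presenting the same symbols in two different orders leaves the network in clearly distinct states, which a downstream linear readout could separate:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 80  # network size (illustrative choice)
# Random recurrent weights, rescaled for stable transient dynamics.
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
# Each symbol is a fixed random input vector (hypothetical encoding).
symbols = {s: rng.normal(0, 1, N) for s in "ABC"}

def state_after(sequence):
    """Final network state after presenting a symbol sequence."""
    x = np.zeros(N)
    for s in sequence:
        x = np.tanh(W @ x + symbols[s])
    return x

# Same symbol set, different serial order: because each update depends
# on the current state, the resulting trajectories, and hence the final
# states, differ -- the state itself carries the order information.
d = np.linalg.norm(state_after("ABC") - state_after("ACB"))
# d is clearly nonzero
```

This state-dependence is what makes the network's response to any input a function of its recent history, the core property studied throughout the thesis.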

Publication
PhD Thesis