The issue of noise in biological systems is primarily a question of relations: between order and disorder, between stability and flexibility, and between processes at different temporal and spatial scales. In this paper, we use computational models of cortical structures to investigate relations between structure, dynamics, and function of such systems. In particular, we investigate the nature and role of noise at different organizational levels of the nervous system, emphasizing the neural network level. We show that microscopic noise can induce global network oscillations and pseudo-chaos, which make neural information processing more efficient. We find optimal noise levels for which the convergence to stored memory attractor states reaches a maximum, akin to stochastic resonance. We finally discuss the relation between neural and mental processes, and how computational models may relate to real neural systems.
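The noise-aided retrieval of stored memory attractors described in the abstract can be sketched with a toy model. The following is an illustrative Hopfield-style attractor network with additive noise in the update rule, not the cortical model used in the paper; all parameters (network size, corruption level, noise values) are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy model (not the paper's cortical model): a small
# Hopfield-style attractor network with additive noise in the update
# rule, to sketch how retrieval quality depends on noise level.
N = 100                                        # binary (+/-1) units
patterns = rng.choice([-1, 1], size=(3, N))    # stored memory patterns
W = (patterns.T @ patterns) / N                # Hebbian weight matrix
np.fill_diagonal(W, 0)

def recall_overlap(noise_sd, steps=1000):
    """Asynchronous noisy updates from a corrupted cue; returns the
    final overlap (normalized dot product) with the target pattern."""
    target = patterns[0]
    state = target.copy()
    state[rng.random(N) < 0.3] *= -1           # corrupt 30% of the cue
    for _ in range(steps):
        i = rng.integers(N)
        h = W[i] @ state + noise_sd * rng.standard_normal()
        state[i] = 1 if h >= 0 else -1
    return float(state @ target) / N

for sd in (0.0, 0.2, 1.0, 3.0):
    m = np.mean([recall_overlap(sd) for _ in range(20)])
    print(f"noise sd = {sd:.1f}  mean overlap = {m:.2f}")
```

At this small scale and low memory load, retrieval typically degrades monotonically with noise; the optimal nonzero noise level reported in the paper arises in richer dynamics (e.g. escaping spurious attractors), which this minimal sketch does not attempt to reproduce.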
We investigate the pairwise mutual information and transfer entropy of ten-channel, free-running electroencephalograms (EEGs) recorded from thirteen subjects under two behavioral conditions: eyes-open resting and eyes-closed resting. Mutual information measures nonlinear correlations; transfer entropy determines the directionality of information transfer. For all channel pairs, mutual information is generally lower with eyes open than with eyes closed, indicating that EEG signals at different scalp sites become more dissimilar as the visual system is engaged. On the other hand, transfer entropy increases on average almost two-fold when the eyes are opened. The largest one-way transfer entropies are to and from the Oz site, consistent with the involvement of the occipital lobe in vision. The largest net transfer entropies are from F3 and F4 to almost all the other scalp sites.
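The two quantities used in this abstract can be sketched with simple plug-in estimators on binned signals. This is an illustrative sketch, not the authors' estimator: the binning scheme, history length of one, and the synthetic driven pair standing in for EEG channels are all assumptions made for demonstration.

```python
import numpy as np

# Illustrative sketch (not the authors' estimator): plug-in estimates of
# pairwise mutual information (MI) and history-length-1 transfer entropy
# (TE) on binned signals, with synthetic data standing in for EEG channels.
rng = np.random.default_rng(1)

def discretize(x, bins=4):
    """Equal-width binning of a continuous signal into integer labels."""
    edges = np.linspace(x.min(), x.max(), bins + 1)
    return np.clip(np.digitize(x, edges) - 1, 0, bins - 1)

def entropy(*symbols):
    """Joint Shannon entropy in bits of one or more symbol sequences."""
    joint = np.stack(symbols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def mutual_info(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(x) + entropy(y) - entropy(x, y)

def transfer_entropy(src, dst):
    # TE(src -> dst) = H(dst_t | dst_{t-1}) - H(dst_t | dst_{t-1}, src_{t-1})
    d_now, d_past, s_past = dst[1:], dst[:-1], src[:-1]
    return (entropy(d_now, d_past) - entropy(d_past)
            - entropy(d_now, d_past, s_past) + entropy(d_past, s_past))

# Synthetic pair: x is an AR(1) process that drives y with a one-step lag,
# so information should flow predominantly from x to y.
n = 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
y = np.zeros(n)
y[1:] = x[:-1] + 0.3 * rng.standard_normal(n - 1)

xd, yd = discretize(x), discretize(y)
print(f"MI(x,y)    = {mutual_info(xd, yd):.3f} bits")
print(f"TE(x -> y) = {transfer_entropy(xd, yd):.3f} bits")
print(f"TE(y -> x) = {transfer_entropy(yd, xd):.3f} bits")
```

In this construction TE(x → y) exceeds TE(y → x), mirroring how transfer entropy assigns a direction to information flow while mutual information, being symmetric, does not.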