Posted on 2024-07-12, 18:00, authored by Andre Van Schaik, Levin Kuhlmann, Michael Hauser-Raspe, Jonathan Manton, Jonathan Tapson, David B Grayden
Bayesian spiking neurons (BSNs) provide a probabilistic and intuitive interpretation of how spiking neurons could work, and have been shown to be equivalent to leaky integrate-and-fire neurons under certain conditions [1]. The study of BSNs has been restricted mainly to small networks because online learning, which currently relies on a maximum-likelihood expectation-maximisation (ML-EM) approach [2, 3], is quite slow. Here a new approach to estimating the parameters of Bayesian spiking neurons, referred to as fast learning (FL), is presented and compared to online ML-EM learning.
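For orientation, the sketch below simulates a single BSN integrating Poisson input spikes into a log-odds estimate of a hidden binary state, assuming the standard Deneve-style BSN inference dynamics referenced in the literature cited above. All parameter names and values are illustrative choices for the sketch, not values from this work, and output spike generation and learning are omitted.

```python
import numpy as np

# Minimal sketch of Deneve-style BSN inference (illustrative parameters only).
rng = np.random.default_rng(0)

dt, T = 1e-3, 2.0            # time step and total duration (s)
r_on, r_off = 2.0, 2.0       # hidden-state transition rates (Hz)
q_on, q_off = 40.0, 10.0     # input firing rate when hidden state is on / off (Hz)
n_inputs = 10

w = np.log(q_on / q_off)     # log-likelihood ratio contributed by each input spike
theta = q_on - q_off         # constant rate offset per input (Hz)

# Hidden binary state x(t) simulated as a two-state (telegraph) Markov process.
steps = int(T / dt)
x = np.zeros(steps, dtype=bool)
for t in range(1, steps):
    if x[t - 1]:
        x[t] = rng.random() >= r_off * dt   # stays on unless it switches off
    else:
        x[t] = rng.random() < r_on * dt     # switches on with small probability

# Poisson input spikes conditioned on the hidden state (Bernoulli approximation per step).
rates = np.where(x[:, None], q_on, q_off)
spikes = rng.random((steps, n_inputs)) < rates * dt

# Integrate the log-odds L(t) = log[ P(x=1 | inputs) / P(x=0 | inputs) ].
L = np.zeros(steps)
for t in range(1, steps):
    drift = (r_on * (1 + np.exp(-L[t - 1]))
             - r_off * (1 + np.exp(L[t - 1]))
             - n_inputs * theta) * dt
    jump = w * spikes[t].sum()               # each input spike adds its evidence
    L[t] = L[t - 1] + drift + jump

print("posterior P(x=1) at end of run:", 1 / (1 + np.exp(-L[-1])))
```

In this formulation the synaptic weight w and offset theta are fixed by the assumed input statistics; the learning problem addressed in the paper is estimating such parameters online from the spike trains themselves.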
Funding
Understanding cortical processing: Neuronal activity and learning in recurrently connected networks