Learning non-linear probabilistic computations in recurrent spiking networks
Lefebvre, Jérémie; Black, Michael (2013)
MLA
Lefebvre J., et al. "Learning non-linear probabilistic computations in recurrent spiking networks.", timms video, Universität Tübingen (2013): https://timms.uni-tuebingen.de:443/tp/UT_20130926_009_bestcon_0001. Accessed 28 Apr 2024.
APA
Lefebvre, J. & Black, M. (2013). Learning non-linear probabilistic computations in recurrent spiking networks. timms video: Universität Tübingen. Retrieved April 28, 2024 from the World Wide Web https://timms.uni-tuebingen.de:443/tp/UT_20130926_009_bestcon_0001
Harvard
Lefebvre, J. and Black, M. (2013). Learning non-linear probabilistic computations in recurrent spiking networks [Online video]. 26 September. Available at: https://timms.uni-tuebingen.de:443/tp/UT_20130926_009_bestcon_0001 (Accessed: 28 April 2024).
Information
title: Learning non-linear probabilistic computations in recurrent spiking networks
alt. title: Bernstein Conference 2013: Computational Vision
creators: Lefebvre, Jérémie (author), Black, Michael (annotator)
subjects: Bernstein Conference, Computational Neuroscience, Computational Vision, Non-linear Probabilistic Computations, Recurrent Spiking Networks, Jérémie Lefebvre
description: Bernstein Conference 2013, 24 to 27 September 2013
abstract: Cortical networks are known to perform canonical operations, such as exponentiation, linear filtering, and normalization: key modular tasks that the sensory cortices combine to build accurate input representations [1]. According to the paradigm of probabilistic population codes (PPCs) [2], these form the basis of more complex probabilistic computations, such as multisensory integration and marginalization [3], that operate on neural codes representing probability distributions. Previous work using PPCs has demonstrated how some of these computations can be implemented in biological neural networks, but only at the population scale. Moreover, the issue of learning has been largely ignored. To address these issues, we propose a computational framework designed to explore the ways in which fully recurrent spiking networks can learn to perform probabilistic computations. Specifically, we show that a variation of the STDP learning rule can be used to train a recurrent network to perform near-optimal probabilistic computations using spike response models [4]. The algorithm first generates population response patterns that correspond to a target posterior reflecting the operation of interest. Along with input patterns of activity, this output pattern is used as a training signal to learn feedforward and recurrent weight matrices. To test our method, we considered a simple task: cue combination, where the activity of two populations is summed probabilistically to preserve information [2] (see the illustrative code sketch after this record). Our simulations confirmed that the trained network can learn non-trivial connectivity matrices to perform cue integration with near-zero loss of information.
References:
[1] M. Carandini and D. J. Heeger, Normalization as a canonical neural computation, Nature Reviews Neuroscience 13: 51-62, 2012.
[2] W. J. Ma et al., Bayesian inference with probabilistic population codes, Nature Neuroscience 9: 1432-1438, 2006.
[3] J. M. Beck et al., Marginalization in neural circuits with divisive normalization, The Journal of Neuroscience 31(43): 15310-15319, 2011.
[4] W. Gerstner and W. M. Kistler, Spiking Neuron Models, Cambridge University Press, 2002.
publisher: ZDV Universität Tübingen
contributors: Bernstein Center for Computational Neuroscience Tübingen (BCCN) (producer), Bethge, Matthias (organizer), Wichmann, Felix (organizer), Lam, Judith (organizer), Macke, Jakob (organizer)
creation date: 2013-09-26
dc type: image
localtype: video
identifier: UT_20130926_009_bestcon_0001
language: eng
rights: URL: https://timmsstatic.uni-tuebingen.de/jtimms/TimmsDisclaimer.html?638499455594155425
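The cue-combination task mentioned in the abstract follows the probabilistic population code framework of Ma et al. [2]: for Poisson-like codes with matched tuning, simply summing two population responses yields a near-optimal combined posterior. The sketch below is not the speakers' implementation; it is a minimal NumPy illustration of that summation property, and all parameters (neuron count, tuning width, gains, stimulus range) are assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch of cue combination with probabilistic population codes
# (PPCs, ref. [2] above). All parameter values below are assumed for illustration.

rng = np.random.default_rng(0)

n_neurons = 64
pref = np.linspace(-10.0, 10.0, n_neurons)   # preferred stimuli of the population
sigma = 2.0                                  # Gaussian tuning-curve width (assumed)
grid = np.linspace(-10.0, 10.0, 401)         # stimulus grid used for decoding

def mean_rates(s, gain):
    """Gaussian tuning curves scaled by a gain that encodes cue reliability."""
    return gain * np.exp(-(s - pref) ** 2 / (2.0 * sigma ** 2))

def posterior(counts):
    """Posterior over the grid for Poisson-like codes with a flat prior:
    log p(s | r) is proportional to sum_i r_i * log f_i(s), up to a constant."""
    log_f = -(grid[None, :] - pref[:, None]) ** 2 / (2.0 * sigma ** 2)
    log_p = counts @ log_f
    log_p -= log_p.max()                     # numerical stability
    p = np.exp(log_p)
    return p / p.sum()

s_true = 1.5
r1 = rng.poisson(mean_rates(s_true, gain=20.0))   # responses to cue 1 (reliable)
r2 = rng.poisson(mean_rates(s_true, gain=5.0))    # responses to cue 2 (noisier)

# Summing the two population responses combines the cues: the posterior decoded
# from r1 + r2 matches the normalized product of the two single-cue posteriors,
# i.e. near-optimal integration with no extra information loss.
p_sum = posterior(r1 + r2)
p_prod = posterior(r1) * posterior(r2)
p_prod /= p_prod.sum()

print("decoded mean from summed activity  :", float(grid @ p_sum))
print("decoded mean from posterior product:", float(grid @ p_prod))
print("max abs difference between the two :", float(np.abs(p_sum - p_prod).max()))
```

Running the sketch prints two matching decoded means and a vanishing difference between the two posteriors, which is the sense in which summation of PPC activity "preserves information" in the cue-combination task described above. The talk itself goes further, training feedforward and recurrent weights with an STDP-like rule so that a spiking network realizes this computation.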