Below: Main recent and current projects - My research trajectory - List of topics/keywords with links to main publications.
When dealing with a difficult categorization task, the brain has to face two independent sources of uncertainty: categorization uncertainty and neuronal uncertainty. The latter stems from neuronal noise, whereas the former is intrinsic to the category structure in stimulus space: categories like phonemes or colors typically overlap, so that a given stimulus might belong to different categories. In work carried out with Laurent Bonnasse-Gahot,
we propose a general neural theory of category coding, in which these two sources of uncertainty are quantified by means of information theoretic tools. We derive analytical formulae which capture different psychophysical consequences of category learning - namely, a better discrimination between categories, and longer reaction times to identify the category of a stimulus lying at the category boundary. Our approach allows us to model experimental data. One main contribution of this work is to exhibit, in both quantitative and qualitative terms, the interplay between discrimination and identification.
[Publications : BGN08, BGN12]
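The notion of categorization uncertainty above can be made concrete with a minimal sketch, assuming two overlapping 1-D Gaussian categories with equal priors (all parameters here are illustrative, not taken from the papers): the uncertainty is the entropy of the category posterior, maximal at the category boundary and vanishing deep inside a category.

```python
import math

# Two overlapping 1-D Gaussian categories (illustrative parameters),
# with categorization uncertainty measured as the posterior entropy H[P(c|x)].
MU_A, MU_B, SIGMA = -1.0, 1.0, 1.0  # category means and common width; equal priors

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_a(x):
    """P(category A | stimulus x) under equal priors."""
    pa, pb = gauss(x, MU_A, SIGMA), gauss(x, MU_B, SIGMA)
    return pa / (pa + pb)

def category_entropy(x):
    """Entropy (bits) of the category posterior at stimulus x."""
    p = posterior_a(x)
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Uncertainty peaks at the category boundary (x = 0) ...
print(category_entropy(0.0))  # 1.0 bit: maximally ambiguous stimulus
# ... and nearly vanishes deep inside a category.
print(category_entropy(3.0))  # small: stimulus lies well inside category B
```

This is the stimulus-intrinsic part of the uncertainty only; the neuronal part would add noise to the representation of x itself.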
Most experiments are based on reinforcement learning protocols: e.g., a monkey learns a task through trial and error, with rewards in case of success. An ongoing project with the biologist Barry Richmond (NIH, Bethesda, USA) and his team concerns the behavioral learning of categories in monkeys (see CEHMRN13).
I also have some interest in modelling at the interface between neuro-computation and social cognition.
In the mid-1980s I turned to the modelling of neural networks, following the pioneering work of J. J. Hopfield on attractor neural networks. Since then my main contributions in the domains of neural networks and computational neuroscience have been in the modelling of short-term memory in the human brain, the theory of supervised and unsupervised learning, and the study of neural coding, making use of both statistical physics and information-theoretic tools. In recent years I have been involved in collaborations with biologists. Some of my most recent works are motivated by problems at the boundary between linguistics and neuroscience (in particular in collaboration with psycholinguists). This led me to study the neural coding of phonemes, and the neural mechanisms underlying decision making in multiple-choice tasks.
While working on problems motivated by the modelling of human cognitive processing, I have also addressed issues in the related fields of machine learning (theoretical and practical algorithmic aspects within the supervised learning framework), statistical inference, data analysis (in particular in bioinformatics) and signal processing (notably blind source separation and independent component analysis). The analysis of optimal neural coding also led me to study the statistical properties of natural images. Finally, for about ten years I acted as a consultant for a private research laboratory, working on applications of machine learning techniques.
My first work in the field of economic and social science dates back to the mid-1990s: a collaboration on the topic of market organization with Gérard Weisbuch at the ENS and the economist Alan Kirman (EHESS). Since 2002, I have worked on collective phenomena in economic and social sciences in collaboration with physicists, mathematicians, economists and other social scientists.
Within brackets: published or submitted [articles]; the same paper can be associated with several topics/keywords.
Sensory coding: information maximization and redundancy reduction
(written with N. Parga, 1997);
Information theoretic approach to neural coding and parameter estimation
(NIPS Workshop on "Statistical Theories of Cortical Function", 1998)
Keywords / topics:
neural coding, sensory coding, signal processing, data analysis; mutual information, Fisher information, and in particular:
neural coding versus parameter estimation
unsupervised versus supervised learning
link between infomax and typical and maximal mutual information between input and output:
perceptrons [NP92, NP93, KPN97], linear networks [DGCPN95], spiking neurons [NBP98, BN98, BGN07]
continuous case (e.g. coding of an orientation) [BN98, BGN07]
categorical perception and decision making [BGN08, BGN12, CEHMRN13]
learning and clustering (unsupervised and supervised): bounds and retarded learning [WN94, HN99, BGN02]
independent component analysis (ICA) and blind source separation (BSS) [NP94, NP97, NBP98, PN00, ACN99, ACN00]
natural image analysis [TMPN98, TNP03, MN04]
perception of space [PON03]
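The link between neural coding and parameter estimation listed above can be illustrated with a minimal sketch, assuming a single Poisson neuron coding an orientation through a hypothetical bell-shaped tuning curve (all parameters are illustrative): the Fisher information J(theta), whose inverse bounds the variance of any unbiased estimator (Cramér-Rao), is zero at the peak of the tuning curve and largest on its flanks, where the slope is steepest.

```python
import math

# A single Poisson neuron coding an orientation theta, with an
# illustrative von Mises-like tuning curve (parameters are assumptions).
R_MAX, KAPPA, PREF = 30.0, 2.0, 0.0  # peak rate, tuning width, preferred orientation

def tuning(theta):
    """Mean firing rate f(theta)."""
    return R_MAX * math.exp(KAPPA * (math.cos(theta - PREF) - 1.0))

def fisher_info(theta, eps=1e-5):
    """J(theta) = f'(theta)^2 / f(theta) for Poisson spike counts,
    with the derivative taken numerically."""
    df = (tuning(theta + eps) - tuning(theta - eps)) / (2 * eps)
    return df ** 2 / tuning(theta)

# Local discriminability vanishes at the peak of the tuning curve ...
print(fisher_info(0.0))          # ~0: flat at the preferred orientation
# ... and is large on the flank, where the rate changes fastest.
print(fisher_info(math.pi / 4))
```

A population version of this quantity is what the perceptron, linear-network and spiking-neuron analyses cited above compare with the mutual information between input and output.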
Collaboration with Nestor Parga
Application of ICA to geophysical data: collaboration with Filipe Aires and Alain Chédin, group ARA of LMD, Ecole Polytechnique, and IPSL.
For general references on ICA, see ICA at CNL and ICA-Central (a web site devoted to ICA/BSS).
On natural image analysis, see the web page of Antonio Turiel (previously at UAM, LPSENS, INRIA; now at the Institute of Marine Sciences, Barcelona).
Modeling memory: What do we learn from attractor neural networks?
(written with N. Brunel).
Keywords / topics:
attractor neural networks (ANN), associative memory, working memory, supervised learning, storage capacity, information capacity, sparse coding, in particular:
duality relating neural coding and parameter estimation [NP92]
dynamics of attractor neural networks [14, 15]
storage/information capacity of perceptrons [NP92, BNT92], [20-22]; capacity in the sparse coding limit
storing temporal sequences [LN93], [11, 13]
review paper on modeling with ANN
palimpsests (working memory) [9, 10, 12]
neural networks: from physics to psychology [book 1993]
cerebellum: learning, Purkinje versus perceptron [BHINB04, BBHN07, CNB12]
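The associative-memory mechanism behind these topics can be shown in a minimal Hopfield-style sketch, with binary units and Hebbian weights; network size, pattern count and corruption level are arbitrary choices for the illustration, not parameters from any of the papers above.

```python
import random

# Minimal Hopfield-style attractor network: binary (+1/-1) units,
# Hebbian weights, asynchronous dynamics. An illustrative sketch.
N = 64
random.seed(0)
patterns = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(3)]

# Hebbian storage: w_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling.
W = [[0.0] * N for _ in range(N)]
for xi in patterns:
    for i in range(N):
        for j in range(N):
            if i != j:
                W[i][j] += xi[i] * xi[j] / N

def recall(state, sweeps=10):
    """Asynchronous dynamics: align each unit with its local field."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(N):
            h = sum(W[i][j] * s[j] for j in range(N))
            s[i] = 1 if h >= 0 else -1
    return s

# Corrupt 10 bits of the first stored pattern and let the dynamics relax:
# the state falls back into the nearby attractor (associative recall).
cue = list(patterns[0])
for i in random.sample(range(N), 10):
    cue[i] = -cue[i]
overlap = sum(a * b for a, b in zip(recall(cue), patterns[0])) / N
print(overlap)  # close to 1.0: the stored pattern is recovered
```

The storage-capacity results cited above ask how many such patterns can be stored per neuron before cross-talk between them destroys this retrieval.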
Collaborations with Nicolas Brunel (Chicago) and, at ENS, with Vincent Hakim (LPS) and Boris Barbour (Biology Department).
See also related works at LPSENS by S. Cocco, R. da Silveira, V. Hakim, Th. Mora and J. Ninio.
collaboration with Marc Mézard: tiling algorithm.
collaboration with Florence d'Alché-Buc: neural trees, trio-learning.
Collective phenomena and cognitive aspects in markets organization and social systems
Market and social organization - Discrete choices under social influence:
collaboration with G. Weisbuch and G. Deffuant (Irstea) [opinion dynamics: WDAN02]
collaboration with Denis Phan (GEMASS), Mirta B. Gordon (LIG, Grenoble) and Viktoriya Semeshenko (Buenos Aires), Jean Vannimenus (LPSENS)
[market and social organisation with heterogeneous agents and social influence; multiple equilibria, hysteresis, learning agents]
[PGN03, PPN03, WEHIA2003 / NGPV05, GNPV05, MGN06, SGN07, GNPS09, GNPS13] [book chapter: PGN04]
collaboration with R. Cont (while at the CMAP, Ecole Polytechnique) and F. Ghoulmie [financial markets - heterogeneous agents, stylised facts: GCN05]
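The multiple-equilibria and hysteresis phenomenology mentioned above can be sketched with a mean-field toy model of binary choice under social influence, in which the average choice m solves m = tanh(beta * (h + J * m)); the parameter values here are illustrative, not those of the cited papers.

```python
import math

# Mean-field binary choice under social influence: the average choice m
# is the self-consistent solution of m = tanh(beta * (h + J * m)).
# h: external (private) incentive, J: social coupling, beta: inverse noise.
def equilibrium(m0, h=0.0, J=1.5, beta=1.0, iters=200):
    """Fixed-point iteration from an initial average choice m0."""
    m = m0
    for _ in range(iters):
        m = math.tanh(beta * (h + J * m))
    return m

# With strong enough social coupling (beta * J > 1) and no external field,
# two stable equilibria coexist; which one is reached depends on the
# initial condition -- the multiple-equilibria / hysteresis scenario.
up, down = equilibrium(0.5), equilibrium(-0.5)
print(up > 0 > down)  # True: two distinct equilibria of opposite sign
```

Sweeping h slowly up and then down would trace the hysteresis loop: the population stays on one equilibrium branch until it disappears.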
Review paper, in French, with M. B. Gordon: "Physique statistique de phénomènes collectifs en sciences économiques et sociales" (Statistical physics of collective phenomena in economics and social sciences)
(preprint.pdf), in Mathématiques et Sciences Humaines, no. 172, Winter 2005, special issue "Modèles et méthodes mathématiques dans les sciences sociales : apports et limites".
Urban Social Dynamics
Project DyXi (2009-2011):
Dynamiques Citadines Collectives : Hétérogénéités Spatiales et Individuelles
(Urban Collective Dynamics: Individual and Spatial Heterogeneities)
modeling learning and evolution; evolution/perception of phonetic categories
collaboration with Janet Pierrehumbert (Northwestern University, Evanston)