From statistical physics to statistical inference and back

Edited by Peter Grassberger and Jean-Pierre Nadal

NATO ASI Series, Volume C428

Kluwer Acad. Pub., Dordrecht, 1994 (now published by Springer)

(ISBN 0-7923-2775-6)


This volume (NATO ASI Series C 428), following a meeting organised in Cargese (Corsica) from
August 31 to September 12, 1992, is on the notion of inference, at the frontier between physics
and mathematics. The word "inference" denotes the ability to derive a general rule from a particular
set of observations. Physicists, for modeling physical systems with a large number of degrees of
freedom, and mathematicians, for performing data analysis, have developed their own concepts
and methods for making the "best" inference. Cognitive science is one particular field among
others where the notion of inference plays a central role. In particular, both the physicists' and the
mathematicians' approaches have recently been used in the theory of formal neural networks.

Even if there are well-known connections between the various methods (statistical physics is based
on the concept of entropy, which is related to Shannon's measure of information), there is a need
for clarification. The meeting was a first attempt to detail the possible bridges and gaps between
the different approaches. Although it gathered both physicists and mathematicians, it was mainly
intended for physicists - who, like Monsieur Jourdain speaking in prose, make inferences
every day without knowing it.

The book presents the concepts and methods of the main approaches (maximum entropy principle,
statistical physics approach to learning, Bayesian approach, minimum description length...) and
presents applications (physical systems, neural networks and learning theory, spin glasses, coding,
forecasting time series, ...). After an introductory chapter which gives an idea of the spirit of the
meeting, there is a section presenting the main theoretical approaches to inference. Then comes a section
on (Shannon) coding and the statistical physics of disordered systems, where the unexpectedly
profound relationship between these two domains is introduced. The next section, on learning
(mainly neural network theory), is followed by a section on dynamical systems. Finally, the last
section is on the notion of inference in quantum mechanics.

There exist volumes on specific topics - e.g. on learning theory, on the maximum entropy principle,
etc. - but no other single volume puts together, in a constructive way, the most relevant
approaches to inference theory the way this one does.
