Master 2 Course: COMPUTATIONAL SCIENCE 2017

The lecture notes on Overleaf are available here. Please feel free to edit, modify, and complete the notes in LaTeX (and add your name in the "scribe" section to make sure I remember!).

Lecture 1: Probability
A good general reference is "All of Statistics", which contains many of the probabilistic statements we discussed.
Sanov's theorem and Cramér's theorem are discussed in this very nice blog post.
The first homework consists of a few exercises in probability. This homework will not be graded, so you do not have to send it to me. It is, however, important that you try to do it! If you want to see the answer to the pooling/sampling question in Python, just click here for the notebook.

Lecture 2: Statistics
Again, frequentist statistics is well described in "All of Statistics". The Wikipedia pages on maximum likelihood, the Cramér-Rao bound, and Fisher information are really well written. If you are curious about the history of these things, the life of Sir Ronald Fisher is interesting (the good and the bad).

Homework 2: click here for homework 2. To be given back before 13/10.
Need help? If you are new to Python, you can take a look at the simple codes I provide.

Lecture 3: Sampling simple distributions
A Python example for sampling a discrete distribution using bisection search in tower sampling can be found here.
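If you just want the gist without opening the notebook, here is a minimal sketch of the idea (my own toy version, not the linked example; the distribution p = (0.2, 0.5, 0.3) is only an illustration):

```python
import bisect
import random

def build_tower(weights):
    """Cumulative sums of the weights: the 'tower'."""
    tower, total = [], 0.0
    for w in weights:
        total += w
        tower.append(total)
    return tower

def tower_sample(tower):
    """Draw index k with probability weights[k]/total, in O(log n),
    by bisection search for a uniform level inside the tower."""
    u = random.random() * tower[-1]
    return bisect.bisect_right(tower, u)

# quick check on p = (0.2, 0.5, 0.3)
tower = build_tower([0.2, 0.5, 0.3])
counts = [0, 0, 0]
for _ in range(100_000):
    counts[tower_sample(tower)] += 1
print([c / 100_000 for c in counts])   # ~ [0.2, 0.5, 0.3]
```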
My favorite book on Monte Carlo is the one by Werner Krauth, Statistical Mechanics: Algorithms and Computations. It will be a very good reference for all of this part of the course. If you are really interested in the subject, I strongly recommend following his MOOC on Coursera, as it is both fun and deep!

Lecture 4: Monte Carlo Markov Chains
The presentation follows this short introduction to the subject by Werner Krauth. You may watch his videos that correspond to the topics discussed in class: the Monte Carlo Markov chain and the convergence of the 3x3 pebble game.
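To get a feel for the pebble game, here is a minimal sketch in Python (a toy version of my own, not Krauth's code): a pebble hops between neighboring sites of a 3x3 grid, and a move that would leave the grid is rejected, so the pebble stays put. The stationary distribution is uniform over the nine sites.

```python
import random

moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def pebble_chain(n_steps, start=(0, 0)):
    """Run the 3x3 pebble game and count visits to each site."""
    x, y = start
    visits = {}
    for _ in range(n_steps):
        dx, dy = random.choice(moves)
        if 0 <= x + dx <= 2 and 0 <= y + dy <= 2:
            x, y = x + dx, y + dy          # accept the move
        # a rejected move still counts as a step at the current site
        visits[(x, y)] = visits.get((x, y), 0) + 1
    return visits

visits = pebble_chain(900_000)
for site, count in sorted(visits.items()):
    print(site, count / 900_000)           # each site -> ~1/9
```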

Homework 3: click here for homework 3. To be given back before 27/10.

Lecture 5: The Hard-Sphere model
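As a taste of the lecture, here is a minimal sketch of the standard Markov-chain move for hard disks in the unit box: pick a disk, propose a small random displacement, and reject the move if the disk would overlap another disk or cross a wall. This is a simplified toy version; the radius, step size, and starting configuration below are only illustrative choices.

```python
import random

def markov_disks(L, sigma, n_steps, delta=0.1):
    """Metropolis moves for hard disks of radius sigma in the unit square.
    L is a list of (x, y) disk centers, modified in place."""
    for _ in range(n_steps):
        k = random.randrange(len(L))
        x = L[k][0] + random.uniform(-delta, delta)
        y = L[k][1] + random.uniform(-delta, delta)
        inside = sigma <= x <= 1.0 - sigma and sigma <= y <= 1.0 - sigma
        overlap = any((x - bx) ** 2 + (y - by) ** 2 < (2 * sigma) ** 2
                      for j, (bx, by) in enumerate(L) if j != k)
        if inside and not overlap:
            L[k] = (x, y)    # accept; otherwise keep the old configuration
    return L

# four disks on a square lattice as a legal starting configuration
L = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
print(markov_disks(L, sigma=0.15, n_steps=10_000))
```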

Homework 4: click here for homework 4. To be given back before 17/11.

Lectures on machine learning
The demos for the machine learning examples are shown in Python on GitHub. Here is the notebook for the k-NN algorithm, and this one is for linear regression.
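If you want to see the core of k-NN without opening the notebook, here is a minimal plain-NumPy sketch (not the linked notebook itself; the two-blob data is just a toy illustration):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by majority vote among its k nearest
    training points (Euclidean distance)."""
    # squared distances between every test point and every training point
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d2, axis=1)[:, :k]    # indices of the k nearest
    votes = y_train[nearest]                   # their labels
    return np.array([np.bincount(v).argmax() for v in votes])

# toy example: two Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(knn_predict(X, y, np.array([[0.0, 0.0], [3.0, 3.0]])))  # -> [0 1]
```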
During the last lecture, I derived the JL (Johnson-Lindenstrauss) lemma. The proof can be found in detail here.
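A quick numerical illustration of what the lemma promises (my own toy check, not the linked write-up): project n points from dimension d down to m dimensions with a random Gaussian matrix scaled by 1/sqrt(m), and all pairwise distances are preserved up to a small relative error.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, d, m = 50, 2_000, 200
X = rng.normal(size=(n, d))
P = rng.normal(size=(d, m)) / np.sqrt(m)   # random projection
Y = X @ P

# distance ratios after/before projection, over all pairs of points
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i, j in combinations(range(n), 2)]
print(min(ratios), max(ratios))   # all ratios concentrate around 1
```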
I added demos for the kernel and random projection methods that we discussed in the last lectures: here is a generic demo on random features and kernels, and here is a demo on the famous MNIST dataset. Finally, I wanted to show you how easy it is to use neural networks these days, so this is a deep learning implementation for the MNIST database.
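To complement the demos, here is a minimal sketch of random Fourier features, the standard construction of Rahimi and Recht for approximating the RBF kernel (not the linked demo itself; the dimensions below are just illustrative). The kernel k(x, y) = exp(-||x - y||^2 / 2) is approximated by the inner product of the features phi(x) = sqrt(2/m) * cos(W x + b), with W ~ N(0, I) and b uniform on [0, 2*pi].

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 5_000
W = rng.normal(size=(m, d))                 # random frequencies
b = rng.uniform(0, 2 * np.pi, size=m)       # random phases

def phi(x):
    """Random Fourier feature map approximating the RBF kernel."""
    return np.sqrt(2.0 / m) * np.cos(W @ x + b)

x = rng.normal(size=d)
y = x + 0.3 * rng.normal(size=d)
exact = np.exp(-np.linalg.norm(x - y) ** 2 / 2)
approx = phi(x) @ phi(y)
print(exact, approx)    # the two numbers should be close
```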
The exam from last year is posted here.