Monte Carlo integration examples

This notebook shows some examples of Monte Carlo integration which we'll look at in class.

If you are looking at the HTML export of this notebook, the source file is here: monte-carlo.ipynb; you can download it and run it in Jupyter Notebook if you have it. (If you'd like to get it, installing Anaconda is a quick way.)

Some 1D examples

Here is an integral we'd like to compute:

$$I = \int_0^1 \left(1 - \sqrt{1 - x^4}\right)\,dx$$

It doesn't have a closed form, but the value (computed by Wolfram Alpha in terms of an elliptic integral) is approximately 0.125981.

Here is a strategy to estimate this integral using uniform sampling.
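A minimal sketch of what that estimator could look like (the function names here are my own, not necessarily the ones used in the code cells):

```python
import numpy as np

def f(x):
    # the integrand: 1 - sqrt(1 - x^4)
    return 1 - np.sqrt(1 - x**4)

def estimate_uniform(N, rng=np.random.default_rng()):
    # draw N uniform samples on [0, 1]; since the uniform density is 1,
    # the estimator is just the sample mean of f
    x = rng.uniform(0.0, 1.0, N)
    return np.mean(f(x))

print(estimate_uniform(10_000))   # should land near 0.125981
```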

Let's see how the error depends on the number of samples...
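One way to measure this (a sketch, assuming the `estimate_uniform` function above): run many independent trials at each sample count and plot the RMS error on log-log axes.

```python
import numpy as np
import matplotlib.pyplot as plt

I_ref = 0.125981                        # reference value from Wolfram Alpha
Ns = np.logspace(1, 4, 10).astype(int)  # sample counts from 10 to 10,000
trials = 100

rms_err = []
for N in Ns:
    errs = [estimate_uniform(N) - I_ref for _ in range(trials)]
    rms_err.append(np.sqrt(np.mean(np.square(errs))))

plt.loglog(Ns, rms_err, 'o-', label='uniform sampling')
plt.loglog(Ns, 0.3 / np.sqrt(Ns), 'k--', label='slope $-1/2$ reference')
plt.xlabel('number of samples $N$')
plt.ylabel('RMS error')
plt.legend()
plt.show()
```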

This log-log plot comes out as a straight line, showing that the noise and the number of samples have a power-law relationship; its slope of $-1/2$ shows that the noise is inversely proportional to the square root of the number of samples, which is what we expected.

Importance sampling

Here is another sampling strategy that tries to mimic the integrand with something that is easy to integrate analytically.
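One plausible choice (my own sketch; the density actually used here may differ): near $x = 0$ the integrand behaves like $x^4/2$, so a density proportional to $x^4$, namely $p(x) = 5x^4$ on $[0, 1]$, mimics it and is easy to sample by inverting the CDF $P(x) = x^5$.

```python
import numpy as np

def estimate_importance(N, rng=np.random.default_rng()):
    # sample from p(x) = 5 x^4 on [0, 1] by inverting the CDF P(x) = x^5
    u = rng.uniform(0.0, 1.0, N)
    x = u ** (1 / 5)
    p = 5 * x**4
    # importance-sampling estimator: average of f(x) / p(x)
    return np.mean(f(x) / p)

print(estimate_importance(10_000))    # should again land near 0.125981
```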

A quick verification that this sampling strategy is internally consistent.
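For instance (a sketch, using the hypothetical $p(x) = 5x^4$ sampler above), we can check that the generated samples really follow the density we intended by plotting a normalized histogram against $p$:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, 100_000)
x = u ** (1 / 5)                 # samples intended to follow p(x) = 5 x^4

t = np.linspace(0, 1, 200)
plt.hist(x, bins=50, density=True, alpha=0.5, label='sample histogram')
plt.plot(t, 5 * t**4, 'k', label='$p(x) = 5x^4$')
plt.legend()
plt.show()
```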

From this plot we can see that importance sampling has significantly decreased the noise (by a factor of around 10 in this case), but since the slope is unchanged, the convergence rate is still the same ($1/\sqrt{N}$).

Stratified sampling

Another strategy for reducing variance is to change the set of random samples being fed into the process. We can do that in our example just by replacing the function that generates the samples with one that generates stratified samples.
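A sketch of such a generator (assuming the estimators take a flat array of sample positions, as in the sketches above): split $[0, 1]$ into $N$ equal strata and place one uniform sample in each.

```python
import numpy as np

def stratified_samples(N, rng=np.random.default_rng()):
    # one jittered sample in each of N equal-width strata of [0, 1]
    return (np.arange(N) + rng.uniform(0.0, 1.0, N)) / N

def estimate_stratified(N, rng=np.random.default_rng()):
    x = stratified_samples(N, rng)
    return np.mean(f(x))

print(estimate_stratified(100))
```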

This plot shows that in this example stratification has a huge benefit. Comparing to the reference line at slope -1, you can see the slope of the dashed lines on that plot is about -1.5. Again this is as expected for a nice smooth 1D integrand.

Let's try a messier integrand to see if it takes more samples for stratification to start winning.
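The particular integrand isn't reproduced here, but as a stand-in, adding a high-frequency wiggle to the original integrand has the right character: its integral over $[0, 1]$ is unchanged, yet stratification can't help until the strata are narrower than the wiggles.

```python
import numpy as np

def f_messy(x):
    # hypothetical "messier" integrand: the original plus a high-frequency
    # oscillation whose integral over [0, 1] is zero, so the reference
    # value stays 0.125981
    return 1 - np.sqrt(1 - x**4) + 0.1 * np.sin(60 * np.pi * x)
```

The same convergence comparison as before can then be rerun with `f_messy` in place of `f`.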

Now you can see that we have to get past 100 samples before the better convergence rate of stratified sampling starts to take over.

Stratified sampling in 2D

I just wanted to illustrate the appearance of stratified point sets in 2D.
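A sketch of how such a point set can be generated and plotted (a jittered $n \times n$ grid, compared with plain uniform samples):

```python
import numpy as np
import matplotlib.pyplot as plt

def stratified_2d(n, rng=np.random.default_rng()):
    # n*n points: one jittered sample in each cell of an n-by-n grid on [0,1]^2
    i, j = np.meshgrid(np.arange(n), np.arange(n))
    x = (i.ravel() + rng.uniform(size=n * n)) / n
    y = (j.ravel() + rng.uniform(size=n * n)) / n
    return x, y

rng = np.random.default_rng(1)
fig, axes = plt.subplots(1, 2, figsize=(8, 4))
axes[0].plot(rng.uniform(size=256), rng.uniform(size=256), '.')
axes[0].set_title('uniform')
x, y = stratified_2d(16, rng)
axes[1].plot(x, y, '.')
axes[1].set_title('stratified')
plt.show()
```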

Let's try using these point sets to integrate a Gaussian over the unit square. Per Wolfram Alpha the answer is about 0.557746.
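I'm assuming the Gaussian is $e^{-(x^2+y^2)}$, since $\left(\int_0^1 e^{-x^2}\,dx\right)^2 \approx 0.746824^2 \approx 0.557746$ matches the quoted value; here is a sketch of the two estimators, reusing `stratified_2d` from above:

```python
import numpy as np

def g(x, y):
    # assumed 2D Gaussian; its integral over the unit square is about 0.557746
    return np.exp(-(x**2 + y**2))

def estimate_2d_uniform(N, rng=np.random.default_rng()):
    x, y = rng.uniform(size=N), rng.uniform(size=N)
    return np.mean(g(x, y))

def estimate_2d_stratified(n, rng=np.random.default_rng()):
    # n*n stratified samples from the stratified_2d generator above
    x, y = stratified_2d(n, rng)
    return np.mean(g(x, y))

print(estimate_2d_uniform(100), estimate_2d_stratified(10))
```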

Stratification is getting us a more accurate answer at 100 samples; let's make the plot to see the trend...
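A sketch of that plot, using the hypothetical 2D estimators above:

```python
import numpy as np
import matplotlib.pyplot as plt

I_ref = 0.557746
ns = np.array([2, 4, 8, 16, 32, 64])    # grid sizes; N = n*n samples
trials = 100
rng = np.random.default_rng(2)

err_u, err_s = [], []
for n in ns:
    N = n * n
    err_u.append(np.sqrt(np.mean([(estimate_2d_uniform(N, rng) - I_ref)**2
                                  for _ in range(trials)])))
    err_s.append(np.sqrt(np.mean([(estimate_2d_stratified(n, rng) - I_ref)**2
                                  for _ in range(trials)])))

plt.loglog(ns**2, err_u, 'o-', label='uniform')
plt.loglog(ns**2, err_s, 's-', label='stratified')
plt.xlabel('number of samples $N$')
plt.ylabel('RMS error')
plt.legend()
plt.show()
```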

Now we see that stratification gets us from $N^{-1/2}$ to $N^{-1}$.