# Category Archives: statistics

## Laplace approximation in PyMC3, revisited

I found an even better example of the value of the Laplace approximation, and it's just a small tweak to the example I did a few weeks ago: http://nbviewer.ipython.org/gist/aflaxman/6d0a9ff2441348f3a130

Filed under statistics

## MCMC in Python: Gaussian mixture model in PyMC3

PyMC3 is really coming along. I tried it out on a Gaussian mixture model that was the subject of some discussion on GitHub: https://github.com/pymc-devs/pymc3/issues/443#issuecomment-109813012

http://nbviewer.ipython.org/gist/aflaxman/64f22d07256f67396d3a
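The notebook does this in PyMC3 itself; as a rough illustration of what fitting a mixture model by MCMC involves, here is a hand-rolled Gibbs sampler for a two-component mixture with fixed unit variances. Everything below (the simulated data, the conjugate updates, the flat priors) is my own sketch, not the notebook's code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two well-separated Gaussian components (sd = 1).
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 300)])

# Gibbs sampler for a 2-component mixture with unit variances,
# flat priors on the means, and a Beta(1, 1) prior on the weight.
mu = np.array([-1.0, 1.0])
w = 0.5
mus = []
for it in range(500):
    # 1. Sample component assignments given the current parameters.
    dens = np.stack([(1 - w) * np.exp(-0.5 * (x - mu[0])**2),
                     w * np.exp(-0.5 * (x - mu[1])**2)])
    p1 = dens[1] / dens.sum(axis=0)
    z = rng.random(x.size) < p1
    # 2. Sample each mean given its assigned points (conjugate normal update).
    for k, mask in enumerate([~z, z]):
        n = mask.sum()
        mu[k] = rng.normal(x[mask].mean(), 1 / np.sqrt(n)) if n else rng.normal(0, 10)
    # 3. Sample the mixture weight given the assignments (conjugate beta update).
    w = rng.beta(1 + z.sum(), 1 + (~z).sum())
    mus.append(mu.copy())

# Posterior means after discarding burn-in; should land near (-3, 3).
mu_hat = np.array(mus)[100:].mean(axis=0)
print(mu_hat)
```

With well-separated components and an initialization on the right side of each mode, label switching is not an issue here; a real PyMC3 model would need to worry about it.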

Filed under MCMC, software engineering, statistics

## Laplace approximation in Python: another cool trick with PyMC3

I admit that I’ve been skeptical of the complete rewrite of PyMC that underlies version 3. It seemed to me to be motivated by an interest in using unproven new step methods that require knowing the derivative of the posterior distribution. But it is really coming together, and regardless of whether the Hamiltonian Monte Carlo stuff pays off, there are some cool tricks you can do when you can get derivatives without a hassle.

Exhibit 1: A Laplace approximation approach to fitting mixed effect models (as described in http://www.seanet.com/~bradbell/tmb.htm)

http://nbviewer.ipython.org/gist/aflaxman/9dab52248d159e02b2ae
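The gist has the PyMC3 version; the idea itself fits in a few lines of scipy. A minimal sketch on a toy Beta posterior (the model and step sizes here are my own illustration, not TMB's or PyMC3's API): find the mode, take the curvature there, and call the result Gaussian.

```python
import numpy as np
from scipy.optimize import minimize

# Toy posterior: Beta(3, 5) up to a constant, i.e. 2 successes and
# 4 failures under a uniform prior on p.
def neg_log_post(theta):
    p = theta[0]
    if not 0 < p < 1:
        return np.inf
    return -(2 * np.log(p) + 4 * np.log(1 - p))

# 1. Find the posterior mode (MAP estimate).
res = minimize(neg_log_post, x0=[0.5], method="Nelder-Mead")
p_map = res.x[0]

# 2. Finite-difference second derivative of the negative log-posterior
#    at the mode (the 1x1 Hessian).
h = 1e-5
hess = (neg_log_post([p_map + h]) - 2 * neg_log_post([p_map])
        + neg_log_post([p_map - h])) / h**2

# 3. The Laplace approximation is a Gaussian centered at the mode,
#    with variance equal to the inverse Hessian.
sigma2 = 1.0 / hess
print(p_map, sigma2)
```

For this toy case you can check the answer by hand: the mode is 1/3 and the curvature there is 27, so the approximating Gaussian has variance 1/27. The PyMC3 trick in the gist is that steps 1 and 2 come for free from the automatic differentiation machinery.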


Filed under software engineering, statistics

## By no means unhelpful

Good advice from Density Estimation for Statistics and Data Analysis by Bernard W. Silverman:

Filed under statistics

## Kish Stuff

A student came by interested in survey statistics, and we got to talking about what an amazing person Leslie Kish must have been. We did some googling on it. Here are a few items we found:

http://projecteuclid.org/download/pdf_1/euclid.ss/1032209665

http://www.amstat.org/about/statisticiansinhistory/index.cfm?fuseaction=biosinfo&BioID=9

https://asapresidentialpapers.info/documents/Kish_Leslie_1977_edit_(wla_092809).pdf


Filed under statistics

## Non-parametric regression in Python: Gaussian Processes in sklearn (with a little PyMC)

I’ve got a fun class going this quarter, on “artificial intelligence for health metricians”, and the course content mixed with some of the student interest has got me looking at the options for doing Gaussian process regression in Python. `PyMC2` has some nice stuff, but the `sklearn` version fits with the rest of my course examples more naturally, so I’m using that instead.

But `sklearn` doesn’t have the fanciest of fancy covariance functions implemented, and at IHME we have been down the road of the Matern covariance function for over five years now. It’s in `PyMC`, so I took a crack at a mash-up. (Took a mash at a mash-up?) There is some room for improvement, but it is a start. If you need to do non-parametric regression for something that is differentiable more than once, but fewer than infinitely many times, you could try starting here: http://nbviewer.ipython.org/gist/aflaxman/af7bdb56987c50f3812b
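For a sense of what the mash-up computes, here is a minimal numpy sketch of GP regression with a Matern 3/2 covariance (a toy example of my own, not the notebook's sklearn/PyMC code): this is the kernel whose sample paths are once, but not twice, differentiable.

```python
import numpy as np

def matern32(a, b, ell=1.0, sigma2=1.0):
    """Matern 3/2 covariance between 1-d input arrays a and b.

    Rougher than the infinitely-smooth RBF kernel, smoother than the
    exponential kernel: sample paths are exactly once differentiable.
    """
    r = np.abs(a[:, None] - b[None, :])
    s = np.sqrt(3) * r / ell
    return sigma2 * (1 + s) * np.exp(-s)

rng = np.random.default_rng(1)
x = np.linspace(0, 5, 30)
y = np.sin(x) + 0.1 * rng.normal(size=x.size)

# Standard GP regression equations, with a small noise term (nugget)
# on the diagonal of the training covariance.
K = matern32(x, x) + 0.01 * np.eye(x.size)
x_new = np.array([1.0, 2.5, 4.0])
K_s = matern32(x_new, x)

# Posterior mean at the new points: K_s @ K^{-1} @ y.
alpha = np.linalg.solve(K, y)
y_pred = K_s @ alpha
print(y_pred)
```

The predictions track sin(x) closely; swapping in a different kernel function is the whole game, which is why getting Matern into the `sklearn` machinery is worth the trouble.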

p.s. Chris Fonnesbeck has some great notes on doing stuff like this and much more here: http://nbviewer.ipython.org/github/fonnesbeck/Bios366/blob/master/notebooks/Section5_1-Gaussian-Processes.ipynb


Filed under statistics

## Bayesian Correlation in PyMC

Here is a StackOverflow question with a nice figure:

Is there a nice, simple reference for just what exactly these graphical model figures mean? I want more of them.

Filed under statistics