There has been some interesting traffic on the PyMC mailing list lately. It seems that there is a common trouble with the “Adaptive Metropolis” step method, and its failure to converge. I’ve been quite impressed with this approach, and I haven’t had the problems that others reported, so I started wondering: Have I just been lucky? Have I not been looking carefully?
I decided to do some experiments to make Metropolis and Adaptive Metropolis shine, and since I’m an aspiring math filmmaker these days, I made my experiments into movies.
I consider the Metropolis step method the essence of MCMC. You have a particular point in parameter space, and you tentatively perturb it to a new random point, and then decide to accept or reject this new point with a carefully designed probability, so that the stationary distribution of the Markov chain is something you’re interested in. It’s magic, but it’s good, simple magic.
Here is the simplest example I could come up with, sampling from a uniform distribution on the unit square in the plane using Metropolis steps. Any self-respecting step method can sample from the unit square in two dimensions!
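For anyone who wants to play along at home, here is roughly what that simplest example looks like in code. This is not the exact script behind my movie, just a minimal NumPy sketch of random-walk Metropolis targeting the uniform density on the unit square; since the target density is constant on its support, the acceptance ratio is 1 inside the square and 0 outside, so the accept/reject step reduces to rejecting any proposal that wanders out:

```python
import numpy as np

def in_unit_square(x):
    """Support of the target density: True inside [0,1]^2."""
    return bool(np.all((x >= 0) & (x <= 1)))

def metropolis_unit_square(n_steps=10000, step_size=0.25, seed=0):
    """Random-walk Metropolis sampler for the uniform distribution on [0,1]^2."""
    rng = np.random.default_rng(seed)
    x = np.array([0.5, 0.5])  # start in the middle of the square
    samples = np.empty((n_steps, 2))
    for i in range(n_steps):
        # Tentatively perturb the current point with a Gaussian step
        proposal = x + rng.normal(scale=step_size, size=2)
        if in_unit_square(proposal):
            x = proposal  # accept: density ratio is 1/1
        # else reject: density ratio is 0/1, so the chain stays put
        samples[i] = x
    return samples

samples = metropolis_unit_square()
print(samples.mean(axis=0))  # should be near [0.5, 0.5]
```

The `step_size` here is the knob that Adaptive Metropolis tunes automatically; pick it too small and the chain crawls, too large and almost every proposal lands outside the square and gets rejected.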
I’m supposed to be writing a lot right now. Papers, grants, documentation, there’s lots to write (blogs don’t count). But I’ve been a little bit blocked, so I’ve been reading instead. Since I spend so much time reading things where the idea is paramount and the prose is barely functional, I thought I’d mix it up and read some things that are well written just because they are well written. Maybe I’ll get inspiration from authors who put words together well.
On that note, I just finished up Slumberland, the latest novel by Paul Beatty. I like the idea that great writing is happening currently, and not just something in the “classics” section of the library.
Beatty writes epics about Black superheroes, so there was no reason for me to expect math to make an appearance in his latest story. It’s about an American deejay who moves to pre-unification Berlin with the perfect beat. That makes the math content here the exact opposite of what I complained about after I saw Salt last summer, where screenwriters made Angelina Jolie math-phobic for no reason.
It was quite a pleasant surprise when early on in the book, following some laugh-out-loud funny dialogue that I won’t even hint at here, the surprising results of a math test come out:
The scores were posted outside the classroom in descending order. It was the first computer printout I’d ever seen. There was something affirming about seeing my name and score—FERGUSON W. SOWELL: 100/100—at the top of the list in what was then a futuristic telex font. I felt official. I was real.
Thank you, Paul Beatty, for making your superhero a math whiz on the side.
All-time-most-popular post on healthyalgorithms: pictures of teacup pigs I copied from a long-forgotten newspaper article
New hobby: making mazes with MCMC
Together, they look like this:
The stats helper monkeys at WordPress.com mulled over how this blog did in 2010, and here’s a high level summary of its overall blog health:
The Blog-Health-o-Meter™ reads Wow.
About 3 million people visit the Taj Mahal every year. This blog was viewed about 32,000 times in 2010. If it were the Taj Mahal, it would take about 4 days for that many people to see it.
In 2010, there were 52 new posts, growing the total archive of this blog to 116 posts. There were 39 pictures uploaded, taking up a total of 5 MB. That’s about 3 pictures per month.
The busiest day of the year was August 27th with 430 views. The most popular post that day was MCMC in Python: Global Temperature Reconstruction with PyMC.
Where did they come from?
The top referring sites in 2010 were code.google.com, blog.computationalcomplexity.org, reddit.com, Google Reader, and math.cmu.edu.
Some visitors came searching, mostly for teacup pigs, tea cup pigs, teacup pig, math art, and pymc.
Attractions in 2010
These are the posts and pages that got the most views in 2010.
MCMC in Python: Global Temperature Reconstruction with PyMC August 2010
Paper rejected, Cheer Up with Baby Animals November 2009
MCMC in Python: PyMC for Bayesian Probability November 2008
MCMC in Python: PyMC for Bayesian Model Selection August 2009
Multilevel (hierarchical) modeling: what it can and cannot do in Python December 2009