Five Interesting Papers
Originally shared publicly via Google+; check there for comments.
Had a very nice dinner with Mike Tipping and David Duvenaud on Tuesday night in Cambridge. Great to catch up with Mike, who’s had a big influence on ML with probabilistic PCA and sparse Bayesian learning. He’s been working in the commercial sector for a few years, but it was really nice when David mentioned, in passing, perspectives on sparsity that originated with Mike!
Mike followed up with an email asking the following question (shared with permission): “If I were to read 5 papers from the last couple of years that capture the interesting/important stuff happening in ML, what would they be?”
So below’s my answer: I love the fact that four of them are on arXiv. I also know that at least two of them had trouble getting published (delayed in publication, unenthusiastic reviewers, and so on).
They are chosen partly as reflections of where I think the field is going, and partly as reflections of where I think it should be going. And of course the list is totally subjective and missing great papers by some of my favourite researchers: it’s a personal list, but Mike and I share similar tastes. It will be interesting to hear Mike’s opinion when he’s read them.
Stochastic Variational Inference by Hoffman, Blei, Wang and Paisley. A way of doing approximate inference for probabilistic models with potentially billions of data points … need I say more?
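Well, maybe a little. The recipe: subsample a minibatch, pretend it’s the whole data set when forming the natural-gradient update of the global variational parameter, then blend it in with a decaying step size. Here’s a toy sketch of that loop (my own minimal example, not the paper’s; the model, inferring a Gaussian mean under a N(0, 1) prior, is chosen purely so the conjugate update has a closed form):

```python
# Minimal sketch of stochastic variational inference (SVI) in the spirit of
# Hoffman et al.: noisy natural-gradient steps on a global variational
# parameter, computed from minibatches. Toy model, my own choice: x_i ~ N(mu, 1)
# with prior mu ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
data = rng.normal(loc=2.0, scale=1.0, size=N)   # synthetic data, true mean 2.0

# Natural parameters (eta1, eta2) of q(mu) = N(m, s^2):
# eta1 = m / s^2, eta2 = -1 / (2 s^2). Initialise at the prior N(0, 1).
lam = np.array([0.0, -0.5])
prior = np.array([0.0, -0.5])

batch_size = 100
for t in range(1, 1001):
    rho = (t + 10.0) ** -0.7                    # Robbins-Monro step size
    batch = rng.choice(data, size=batch_size)
    # "As if" the minibatch were the full data set: rescale by N / batch_size.
    lam_hat = prior + (N / batch_size) * np.array([batch.sum(), -0.5 * batch_size])
    lam = (1.0 - rho) * lam + rho * lam_hat     # natural-gradient step

m = -lam[0] / (2.0 * lam[1])                    # recover the posterior mean
print(f"SVI posterior mean ~= {m:.3f}, exact = {data.sum() / (N + 1):.3f}")
```

The point is that each step touches only 100 of the 100,000 points, which is what makes the billions-of-data-points claim plausible.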
Austerity in MCMC Land: Cutting the Metropolis-Hastings Budget by Korattikara, Chen and Welling. Oh … I do need to say more … because these three are at it as well, but from the sampling perspective. Probabilistic models for big data … an idea so important it needed to be in the list twice.
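The trick, roughly: the Metropolis-Hastings accept/reject decision can be rewritten as asking whether the average per-datum log-likelihood ratio exceeds a threshold, and a subsample is often enough to answer that confidently. A compressed sketch of the sequential test at the paper’s core (function name and defaults are mine; the paper’s error analysis and the surrounding sampler loop are omitted):

```python
# Sketch of the approximate MH test from Korattikara, Chen & Welling: grow a
# subsample until a t-test is confident about which side of the threshold the
# mean log-likelihood ratio falls on, instead of touching all N points.
import numpy as np
from scipy import stats

def approximate_mh_test(loglik_ratio, mu0, eps=0.05, batch=100):
    """Decide MH accept/reject from a growing subsample.

    loglik_ratio: per-datum log p(x_i | theta') - log p(x_i | theta), in
    random order. mu0 is the threshold (1/N) * log(u * p(theta) *
    q(theta'|theta) / (p(theta') * q(theta|theta'))), u ~ Uniform(0, 1).
    Returns True (accept) when the mean is judged > mu0 at level eps.
    """
    N = len(loglik_ratio)
    n = 0
    while True:
        n = min(n + batch, N)
        sample = loglik_ratio[:n]
        mean, sd = sample.mean(), sample.std(ddof=1)
        if n == N:
            return mean > mu0                    # saw everything: exact test
        # standard error with finite-population correction
        se = max(sd / np.sqrt(n) * np.sqrt(1.0 - (n - 1) / (N - 1)), 1e-12)
        p_val = stats.t.sf(abs(mean - mu0) / se, df=n - 1)
        if p_val < eps:                          # confident enough to decide
            return mean > mu0
```

Easy decisions (most of them, in practice) are made cheaply; only borderline ones pay for more data.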
Practical Bayesian Optimization of Machine Learning Algorithms by Snoek, Larochelle and Adams. This paper represents the rise of probabilistic numerics; I could also have chosen papers by Osborne, Hennig or others, but there are already too many good papers out there to single out. Definitely an exciting area, be it optimisation, integration or differential equations. I chose this paper because it seems to have blown the field open to a wider audience, focussing as it did on deep learning as an application, so it lets me capture both an area of developing interest and an area that hits the national news.
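In case you haven’t seen the idea: fit a Gaussian process surrogate to the evaluations so far, pick the next point via an acquisition function such as expected improvement, evaluate, repeat. A minimal 1-D sketch using scikit-learn’s GP (my simplification; the paper’s MCMC treatment of GP hyperparameters, costs and parallelism are all omitted, and the objective is a stand-in):

```python
# Minimal Bayesian-optimisation loop in the spirit of Snoek et al.:
# GP surrogate + expected-improvement acquisition over a 1-D grid.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                       # stand-in for an expensive experiment
    return np.sin(3 * x) + 0.1 * x ** 2

grid = np.linspace(-3, 3, 400).reshape(-1, 1)
X = np.array([[-2.0], [1.5]])           # two initial evaluations
y = objective(X).ravel()

for _ in range(10):
    gp = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = y.min()                      # we are minimising
    z = (best - mu) / np.maximum(sd, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, [x_next]])
    y = np.append(y, objective(x_next))

print(f"best x ~= {X[np.argmin(y)].item():.3f}, f = {y.min():.3f}")
```

Twelve function evaluations in total, which is the whole appeal when each evaluation is a day of GPU time.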
Kernel Bayes’ Rule by Fukumizu, Song and Gretton. One of the great things about ML is how we have different (and competing) philosophies operating under the same roof. But because we still talk to each other (and sometimes even listen to each other) these ideas can merge to create new and interesting things. Kernel Bayes’ Rule makes the list.
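For the curious, here’s the flavour of the machinery in code. This is the conditional mean embedding, the main building block (the full kernel Bayes’ rule adds a prior-reweighting step that I omit; the toy data and parameter values are mine): distributions become RKHS elements, and conditioning becomes regularised Gram-matrix algebra.

```python
# Conditional mean embedding: represent P(X | Y = y*) as weights over a joint
# sample, so any conditional expectation becomes a weighted sum. This is the
# building block underlying kernel Bayes' rule, not the full rule itself.
import numpy as np

def rbf_gram(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=(n, 1))                    # joint sample: X ~ N(0, 1),
y = x + 0.3 * rng.normal(size=(n, 1))          # Y = X + noise

Gy = rbf_gram(y, y)
reg = 1e-3                                     # regularisation strength
y_star = np.array([[1.0]])
# Weights w_i such that the embedding of P(X | Y = y*) is sum_i w_i k(., x_i)
w = np.linalg.solve(Gy + n * reg * np.eye(n), rbf_gram(y, y_star)).ravel()

# E[f(X) | Y = y*] ~= sum_i w_i f(x_i); here f(x) = x.
print("E[X | Y = 1] ~=", float(w @ x.ravel()))  # analytically ~ 0.92
```

No densities anywhere: the whole computation runs on kernel evaluations, which is exactly the philosophical merger I mean.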
ImageNet Classification with Deep Convolutional Neural Networks by Krizhevsky, Sutskever and Hinton (http://www.cs.toronto.edu/~hinton/absps/imagenet.pdf). An obvious choice, but you don’t leave the Beatles off lists of great bands just because they are an obvious choice.