indii.org
http://www.indii.org/
Lawrence Murray: software, research, photography.
Last updated: Sun, 27 Jan 2019 17:22:48 -0800

Gecko
http://www.indii.org/archives/gecko/index.html
Sun, 27 Jan 2019 | Lawrence Murray

The Wet Comes To Bali
http://www.indii.org/archives/wet-in-bali/index.html
Mon, 26 Nov 2018 | Lawrence Murray
Light and dark.

The Kimberley
http://www.indii.org/archives/the-kimberley/index.html
Tue, 20 Nov 2018 | Lawrence Murray
Gorges and waterholes in the remote north-west of Australia.

Automated learning with a probabilistic programming language: Birch
http://www.indii.org/research/automated-learning-with-a-probabilistic-programming-language-birch/index.html
Sat, 13 Oct 2018 | Lawrence Murray
This work offers a broad perspective on probabilistic modeling and inference in light of recent advances in probabilistic programming, in which models are formally expressed in Turing-complete programming languages. We consider a typical workflow and how probabilistic programming languages can help to automate this workflow, especially in the matching of models with inference methods. We focus on two properties of a model that are critical in this matching: its structure—the conditional dependencies between random variables—and its form—the precise mathematical definition of those dependencies. While the structure and form of a probabilistic model are often fixed a priori, it is a curiosity of probabilistic programming that they need not be, and may instead vary according to random choices made during program execution. We introduce a formal description of models expressed as programs, and discuss some of the ways in which probabilistic programming languages can reveal the structure and form of these, in order to tailor inference methods. We demonstrate the ideas with a new probabilistic programming language called Birch, with a multiple object tracking example.
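The abstract's central point, that a model's structure need not be fixed a priori but may vary with random choices made during execution, can be illustrated with a minimal sketch. The example below is hypothetical Python, not Birch, and the model itself is invented for illustration:

```python
import random

def model(rng=None):
    """A probabilistic program whose structure varies per execution.

    The number of latent variables n is itself random, so the set of
    random variables x[0..n-1] and their chain of conditional
    dependencies is only revealed by running the program.
    """
    rng = rng or random.Random()
    n = 1 + rng.randrange(5)              # this random choice fixes the structure
    x = [rng.gauss(0.0, 1.0)]             # x[0] ~ Normal(0, 1)
    for t in range(1, n):
        x.append(rng.gauss(x[t - 1], 1.0))  # x[t] ~ Normal(x[t-1], 1)
    return x
```

A language that can inspect such a program at run time, rather than requiring the dependency graph up front, can discover the realized structure and choose an inference method to match.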
Saltoluokta
http://www.indii.org/archives/saltoluokta/index.html
Sat, 13 Oct 2018 | Lawrence Murray
Summer north of the Arctic Circle.

Birch
http://www.birch-lang.org/
Sat, 24 Mar 2018 | Lawrence Murray
An object-oriented, universal probabilistic programming language.

Sahara
http://www.indii.org/archives/sahara/index.html
Sun, 11 Mar 2018 | Lawrence Murray
Back to Morocco.

Improving the particle filter for high-dimensional problems using artificial process noise
http://www.indii.org/research/improving-the-particle-filter-for-high-dimensional-problems/index.html
Thu, 01 Mar 2018 | Lawrence Murray
The particle filter is one of the most successful methods for state inference and identification of general non-linear and non-Gaussian models. However, standard particle filters suffer from degeneracy of the particle weights for high-dimensional problems. We propose a method for improving the performance of the particle filter for certain challenging state space models, with implications for high-dimensional inference. First we approximate the model by adding artificial process noise in an additional state update, then we design a proposal that combines the standard and the locally optimal proposal. This results in a bias-variance trade-off, where adding more noise reduces the variance of the estimate but increases the model bias. The performance of the proposed method is evaluated on a linear Gaussian state space model and on the non-linear Lorenz’96 model. For both models we observe a significant improvement in performance over the standard particle filter.
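The core of the idea can be sketched on the abstract's linear Gaussian example: a bootstrap particle filter with an additional artificial-noise state update. This is a simplified illustration only; the paper's full method also designs a proposal combining the standard and locally optimal proposals, which this sketch omits, and all parameter values here are illustrative.

```python
import math
import random

def particle_filter(y, N=500, a=0.9, q=1.0, r=1.0, eps=0.1, rng=None):
    """Bootstrap particle filter for x_t = a*x_{t-1} + v_t, y_t = x_t + e_t,
    with v_t ~ N(0, q) and e_t ~ N(0, r), plus an additional artificial
    process-noise update of variance eps. The extra noise trades a little
    model bias for reduced variance of the particle weights.
    """
    rng = rng or random.Random(0)
    x = [rng.gauss(0.0, 1.0) for _ in range(N)]
    means = []
    for yt in y:
        # standard propagation through the model dynamics
        x = [a * xi + rng.gauss(0.0, math.sqrt(q)) for xi in x]
        # artificial process noise: the additional state update
        x = [xi + rng.gauss(0.0, math.sqrt(eps)) for xi in x]
        # weight by the Gaussian observation likelihood
        w = [math.exp(-0.5 * (yt - xi) ** 2 / r) for xi in x]
        s = sum(w)
        w = [wi / s for wi in w]
        means.append(sum(wi * xi for wi, xi in zip(w, x)))
        # multinomial resampling
        x = rng.choices(x, weights=w, k=N)
    return means

# simulate observations from the model, then recover filtered state means
rng = random.Random(1)
x_true, ys = 0.0, []
for _ in range(50):
    x_true = 0.9 * x_true + rng.gauss(0.0, 1.0)
    ys.append(x_true + rng.gauss(0.0, 1.0))
est = particle_filter(ys, rng=random.Random(2))
```

Setting `eps = 0` recovers the standard bootstrap filter, making the bias-variance trade-off directly observable by varying `eps`.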
Under the Dhow Sail
http://www.indii.org/archives/under-the-dhow-sail/index.html
Mon, 01 Jan 2018 | Lawrence Murray
Sailing in Zanzibar.

Delayed Sampling and Automatic Rao-Blackwellization of Probabilistic Programs
http://www.indii.org/research/delayed-sampling-and-automatic-rao-blackwellization-of-probabilistic-programs/index.html
Fri, 22 Dec 2017 | Lawrence Murray
We introduce a dynamic mechanism for the solution of analytically-tractable substructure in probabilistic programs, to reduce variance in Monte Carlo estimators. For inference with Sequential Monte Carlo, it yields improvements such as locally-optimal proposals and Rao-Blackwellization, with little modification to model code necessary. A directed graph is maintained alongside the running program, evolving dynamically as the program triggers operations upon it. Nodes of the graph represent random variables, and edges the analytically-tractable relationships between them (e.g. conjugate priors and affine transformations). Each random variable is held in the graph for as long as possible, sampled only when used by the program in a context that cannot be resolved analytically. This allows it to be analytically conditioned on as many observations as possible before sampling. We have implemented the approach in both Anglican and a new probabilistic programming language named Birch. We demonstrate it on a number of small examples, and a larger mixed linear-nonlinear state-space model.
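The delayed-sampling idea, keeping a random variable in closed form and conditioning it analytically on observations for as long as possible, can be sketched for the simplest conjugate case, a Gaussian prior with Gaussian likelihood of known variance. The class and method names below are illustrative, not Birch's or Anglican's API, and the graph machinery of the paper is reduced to a single node:

```python
import random

class DelayedGaussian:
    """A Gaussian random variable kept in closed form as long as possible.

    While the variable is only observed through a conjugate likelihood,
    we condition analytically; sampling happens only when the program
    demands a concrete value, and then from the posterior, not the prior.
    """
    def __init__(self, mean, var):
        self.mean, self.var = mean, var
        self.sampled = None

    def observe(self, y, lik_var):
        # conjugate Normal-Normal update: the posterior stays Gaussian
        assert self.sampled is None, "already realized; cannot marginalize"
        k = self.var / (self.var + lik_var)   # Kalman-style gain
        self.mean += k * (y - self.mean)
        self.var *= (1.0 - k)

    def value(self, rng):
        # sampling is delayed until a concrete value is actually needed
        if self.sampled is None:
            self.sampled = rng.gauss(self.mean, self.var ** 0.5)
        return self.sampled

mu = DelayedGaussian(0.0, 10.0)
for y in [1.9, 2.1, 2.0]:
    mu.observe(y, lik_var=1.0)   # analytic conditioning, no sampling yet
x = mu.value(random.Random(0))   # realized from the posterior
```

A naive ancestral-sampling interpreter would have drawn `mu` from its prior before seeing any data; delaying the draw until after the three observations is exactly the variance reduction the abstract describes.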