BP_SM___CML_presentation.pdf (1.12 MB)

An ideal match? Investigating how well-suited Concurrent ML is to implementing Belief Propagation for Stereo Matching

posted on 03.12.2020, 01:31 by James Cooper
Belief Propagation, introduced by Judea Pearl, is a family of algorithms for computing marginal probabilities over Bayesian Networks and Markov Random Fields, and is explicitly built around message passing. In the context of Computer Vision, so-called Loopy Belief Propagation has found some success as an algorithm for stereo matching, where the entries of the output 'disparity map' operate as communicating nodes in a Markov Random Field. Concurrent ML, introduced by John Reppy, is an approach to concurrent programming based on synchronous message passing. Loopy Belief Propagation and Concurrent ML would therefore appear to be an excellent match; however, no evidence could be found of anyone having attempted to combine the two. This talk provides a brief overview of stereo matching, Loopy Belief Propagation for stereo matching, and Concurrent ML, before discussing the presenter's work so far on applying Concurrent ML to image-based tasks.


This is a slightly updated version of a presentation I gave at Victoria University of Wellington on 24 November 2020. It addresses the topic described in the abstract above. I have attempted to rewrite parts of the slides so that they make sense to a reader without my spoken commentary (I'm not sure I was entirely successful).



University of Auckland