BP_SM___CML_presentation__Copy_for_Weekly_Research_Seminars.pdf
An ideal match? Investigating how well-suited Concurrent ML is to implementing Belief Propagation for Stereo Matching
Belief Propagation, introduced by Judea Pearl, is a family of algorithms
for computing marginal probabilities over Bayesian Networks and Markov
Random Fields, and is explicitly based around concepts of message
passing. In the context
of Computer Vision, so-called Loopy Belief Propagation has found some
success as an algorithm for stereo matching, where the entries of the
output 'disparity map' operate as communicating nodes in a Markov Random
Field. Concurrent ML, introduced by John Reppy,
is an approach to concurrent programming based on synchronous message
passing. Thus, Loopy Belief Propagation and Concurrent ML would appear
to be an excellent match. No evidence could be found, however, of
anyone having attempted to meld the two in the past.
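(For context, the formula below is not part of the original abstract: in the min-sum variant of Loopy Belief Propagation commonly used for stereo matching, at iteration t each pixel p sends each neighbour q a message scoring every candidate disparity d_q,

    m^{t}_{p \to q}(d_q) = \min_{d_p} \Big( D_p(d_p) + V(d_p, d_q) + \sum_{s \in \mathcal{N}(p) \setminus \{q\}} m^{t-1}_{s \to p}(d_p) \Big),

where D_p is the data (matching) cost at pixel p, V is the smoothness cost between neighbouring disparities, and \mathcal{N}(p) is the set of neighbours of p. It is this per-node, per-neighbour exchange that makes the pairing with a message-passing concurrency model look so natural.)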
This talk will provide a brief overview of stereo matching, Loopy
Belief Propagation for stereo matching, and Concurrent ML, before
discussing the presenter's work so far on applying Concurrent ML to
image-based tasks.
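As a minimal sketch of what that pairing might look like, assuming the CML library distributed with SML/NJ (this is illustrative only and not taken from the slides; the structure BPNodeSketch and the stubbed updateMessage are invented names), each node of the Markov Random Field can run as its own Concurrent ML thread, exchanging belief-propagation messages with its neighbours through CML mailboxes:

    structure BPNodeSketch =
    struct
      (* A BP message: one value per candidate disparity label. *)
      type message = real vector

      (* Placeholder for the real update rule (e.g. min-sum): combine the
       * node's data cost with the messages just received from neighbours. *)
      fun updateMessage (dataCost : message, _ : message list) : message =
        dataCost

      (* One MRF node.  Each iteration it posts its current outgoing message
       * to every neighbour, then collects the neighbours' messages and
       * recomputes its own. *)
      fun node (iters : int,
                dataCost : message,
                toNbrs : message Mailbox.mbox list,
                fromNbrs : message Mailbox.mbox list) () =
        let
          fun loop (0, _) = ()
            | loop (n, outMsg) =
                ( List.app (fn mb => Mailbox.send (mb, outMsg)) toNbrs
                ; loop (n - 1,
                        updateMessage (dataCost, List.map Mailbox.recv fromNbrs)) )
        in
          loop (iters, dataCost)
        end

      (* Spawn one CML thread per node and run the whole system. *)
      fun run (nodes : (int * message * message Mailbox.mbox list
                            * message Mailbox.mbox list) list) =
        RunCML.doit
          (fn () => List.app (fn args => ignore (CML.spawn (node args))) nodes,
           NONE)
    end

Buffered mailboxes are used here rather than synchronous channels so that every node can post its outgoing messages before blocking on receives; with purely synchronous channels, the symmetric exchange would instead need CML's selective communication (events and choose) to avoid deadlock.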
------------------------------------------------------------------------
This version is both shortened and slightly updated. I prepared it for another presentation given to the School of Computer Science at the University of Auckland in April 2021. I had to reduce the content considerably to make it fit the significantly reduced speaking time. I also updated some of it to reflect how I got on with trying to pivot to Manticore. [Spoiler alert: Not very well in the end :(]