IEEE Members: $11.00
Non-members: $15.00
Length: 01:01:10
In this talk, we present an outline of a powerful Bayesian estimation framework we call "Plug-and-Play Priors" (PnP). We invented this family of algorithms (including "consensus equilibrium") to solve large-scale ill-posed inverse problems, with a particular focus on computational imaging applications. At its core, PnP is powerful because it allows off-the-shelf denoisers and other ML models and filters to be used as Bayesian prior models. In our original work, we built PnP on top of the alternating direction method of multipliers (ADMM), but several researchers have since adapted our methods to other frameworks such as FISTA. We also present theoretical conditions on these off-the-shelf models and filters (used as priors) that ensure the PnP iterates converge. Through these theoretical guarantees, Plug-and-Play has indeed come a long way from what could very well have been "Plug-and-Pray". We will also discuss several imaging examples that demonstrate what PnP has been able to achieve, and why it has spawned a new sub-area of computational imaging. We end the talk by mentioning some surprising and impressive applications and extensions that other researchers have achieved as they have built on top of our original PnP algorithm.
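To make the "plug-in" idea concrete, here is a minimal sketch of a PnP-ADMM iteration for a linear inverse problem y = A x + noise, under assumptions not taken from the talk: a quadratic data-fidelity term solved by a few gradient steps, and a Gaussian filter standing in for the off-the-shelf denoiser. All names (A, At, denoise, rho, num_iters) are illustrative placeholders, not the authors' implementation.

```python
# Minimal PnP-ADMM sketch (illustrative assumptions, not the talk's implementation).
import numpy as np
from scipy.ndimage import gaussian_filter  # stand-in "off-the-shelf" denoiser


def pnp_admm(y, A, At, denoise, rho=1.0, num_iters=50):
    """Plug-and-Play ADMM: the prior's proximal operator is replaced by `denoise`."""
    x = At(y)                      # initialize from the adjoint (back-projection)
    v = x.copy()                   # splitting variable
    u = np.zeros_like(x)           # scaled dual variable
    for _ in range(num_iters):
        # 1) Data-fidelity step: argmin_x ||A x - y||^2 + (rho/2)||x - (v - u)||^2,
        #    approximated here by a few gradient-descent steps for simplicity.
        z = v - u
        for _ in range(10):
            grad = At(A(x) - y) + rho * (x - z)
            x = x - 0.1 * grad
        # 2) Prior step: apply the plugged-in denoiser instead of a proximal map.
        v = denoise(x + u)
        # 3) Dual update.
        u = u + x - v
    return x


# Usage sketch: simple denoising (A = identity), Gaussian filter as the prior.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64))
    clean[16:48, 16:48] = 1.0
    y = clean + 0.2 * rng.standard_normal(clean.shape)
    identity = lambda img: img
    x_hat = pnp_admm(y, identity, identity,
                     lambda img: gaussian_filter(img, sigma=1.0))
```

Swapping `denoise` for a stronger denoiser (e.g., a learned model) changes only that one line, which is the essence of the plug-and-play construction; the convergence conditions mentioned in the abstract concern properties of that plugged-in operator.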