Links to blog post and companion paper; Documentation updates (#111)
StannisZhou authored Feb 10, 2022
1 parent e580b6d commit 2d86b75
Showing 4 changed files with 26 additions and 16 deletions.
20 changes: 11 additions & 9 deletions README.md
@@ -6,11 +6,13 @@

# PGMax

-PGMax implements general factor graphs for discrete probabilistic graphical models (PGMs), and hardware-accelerated differentiable loopy belief propagation (LBP) in [JAX](https://jax.readthedocs.io/en/latest/).
+PGMax implements general [factor graphs](https://en.wikipedia.org/wiki/Factor_graph) for discrete probabilistic graphical models (PGMs), and hardware-accelerated differentiable [loopy belief propagation (LBP)](https://en.wikipedia.org/wiki/Belief_propagation) in [JAX](https://jax.readthedocs.io/en/latest/).

- **General factor graphs**: PGMax supports easy specification of general factor graphs with potentially complicated topology, factor definitions, and discrete variables with a varying number of states.
- **LBP in JAX**: PGMax generates pure JAX functions implementing LBP for a given factor graph. The generated pure JAX functions run on modern accelerators (GPU/TPU), work with JAX transformations (e.g. `vmap` for processing batches of models/samples, `grad` for differentiating through the LBP iterative process), and can be easily used as part of a larger end-to-end differentiable system.

+See our [blog post](https://www.vicarious.com/posts/pgmax-factor-graphs-for-discrete-probabilistic-graphical-models-and-loopy-belief-propagation-in-jax/) and [companion paper](https://arxiv.org/abs/2202.04110) for more details.
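The `vmap`/`grad` composition described in the bullet above can be sketched with a generic pure JAX function. This is an illustrative stand-in for the BP runners PGMax generates, not the PGMax API: `run_iterations`, `loss`, and all other names here are hypothetical.

```python
import jax
import jax.numpy as jnp


def run_iterations(messages, log_potentials, num_iters=10, damping=0.5):
    # Stand-in for an LBP runner: a pure function iterating a damped,
    # normalized update (the real PGMax message updates differ).
    def body(msgs, _):
        new_msgs = jax.nn.log_softmax(msgs + log_potentials)
        return damping * msgs + (1 - damping) * new_msgs, None

    messages, _ = jax.lax.scan(body, messages, None, length=num_iters)
    return messages


def loss(log_potentials, messages):
    # Differentiable scalar computed from the iterative process.
    return jnp.sum(run_iterations(messages, log_potentials) ** 2)


messages = jnp.zeros(4)
batch_potentials = jnp.stack([jnp.arange(4.0), -jnp.arange(4.0)])

# vmap: process a batch of models/samples with a single call.
batched = jax.vmap(run_iterations, in_axes=(None, 0))(messages, batch_potentials)
assert batched.shape == (2, 4)

# grad: differentiate through the full iterative process.
g = jax.grad(loss)(batch_potentials[0], messages)
assert g.shape == (4,)
```

Because `run_iterations` is a pure function, the same pattern lets it sit inside a larger end-to-end differentiable pipeline.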

[**Installation**](#installation)
| [**Getting started**](#getting-started)

@@ -42,6 +44,7 @@ By default the above commands install JAX for CPU. If you have access to a GPU,

## Getting Started


Here are a few self-contained Colab notebooks to help you get started on using PGMax:

- [Tutorial on basic PGMax usage](https://colab.research.google.com/drive/1PQ9eVaOg336XzPqko-v_us3izEbjvWMW?usp=sharing)
@@ -52,14 +55,13 @@ Here are a few self-contained Colab notebooks to help you get started on using P

## Citing PGMax

-To cite this repository
+Please consider citing our [companion paper](https://arxiv.org/abs/2202.04110) if you use PGMax in your work:
```
-@software{pgmax2021github,
-  author = {Guangyao Zhou* and Nishanth Kumar* and Miguel L\'{a}zaro-Gredilla and Dileep George},
-  title = {{PGMax}: {F}actor graph on discrete variables and hardware-accelerated differentiable loopy belief propagation in {JAX}},
-  howpublished = {\url{http://github.com/vicariousinc/PGMax}},
-  version = {0.2.2},
-  year = {2022},
+@article{zhou2022pgmax,
+  author = {Zhou, Guangyao and Kumar, Nishanth and L{\'a}zaro-Gredilla, Miguel and Kushagra, Shrinu and George, Dileep},
+  title = {{PGMax: Factor Graphs for Discrete Probabilistic Graphical Models and Loopy Belief Propagation in JAX}},
+  journal = {arXiv preprint arXiv:2202.04110},
+  year = {2022}
+}
```
-where * indicates equal contribution.
+First two authors contributed equally.
5 changes: 1 addition & 4 deletions docs/source/conf.py
@@ -20,10 +20,7 @@

project = "PGMax"
copyright = "2021, Vicarious FPC Inc"
-author = (
-    "Nishanth Kumar, Guangyao (Stannis) Zhou,"
-    + " Miguel Lazaro-Gredilla, Dileep George"
-)
+author = "Guangyao Zhou, Nishanth Kumar, Miguel Lazaro-Gredilla, Shrinu Kushagra, Dileep George"

# The full version, including alpha/beta/rc tags
release = "0.2.2"
Expand Down
9 changes: 9 additions & 0 deletions docs/source/examples.rst
@@ -0,0 +1,9 @@
Examples
=========


Here are a few self-contained Colab notebooks to help you get started on using PGMax:

- `Tutorial on basic PGMax usage <https://colab.research.google.com/drive/1PQ9eVaOg336XzPqko-v_us3izEbjvWMW?usp=sharing>`_
- `Implementing max-product LBP <https://colab.research.google.com/drive/1mSffrA1WgQwgIiJQd2pLULPa5YKAOJOX?usp=sharing>`_ for `Recursive Cortical Networks <https://www.science.org/doi/10.1126/science.aag2612>`_
- `End-to-end differentiable LBP for gradient-based PGM training <https://colab.research.google.com/drive/1yxDCLwhX0PVgFS7NHUcXG3ptMAY1CxMC?usp=sharing>`_
8 changes: 5 additions & 3 deletions docs/source/index.rst
@@ -1,17 +1,19 @@
PGMax Reference Documentation
==============================
-PGMax implements general factor graphs for probabilistic graphical models (PGMs) with discrete variables, and hardware-accelerated differentiable loopy belief propagation (LBP) in `JAX <https://jax.readthedocs.io/en/latest/>`_.

-- General factor graphs: PGMax goes beyond pairwise PGMs, and supports arbitrary factor graph topology, including higher-order factors.
-- LBP in JAX: PGMax generates pure JAX functions implementing LBP for a given factor graph. The generated pure JAX functions run on modern accelerators (GPU/TPU), work with JAX transformations (e.g. ``vmap`` for processing batches of models/samples, ``grad`` for differentiating through the LBP iterative process), and can be easily used as part of a larger end-to-end differentiable system.
+PGMax implements general `factor graphs <https://en.wikipedia.org/wiki/Factor_graph>`_ for discrete probabilistic graphical models (PGMs), and hardware-accelerated differentiable `loopy belief propagation (LBP) <https://en.wikipedia.org/wiki/Belief_propagation>`_ in `JAX <https://jax.readthedocs.io/en/latest/>`_.

+- General factor graphs: PGMax supports easy specification of general factor graphs with potentially complicated topology, factor definitions, and discrete variables with a varying number of states.
+- LBP in JAX: PGMax generates pure JAX functions implementing LBP for a given factor graph. The generated pure JAX functions run on modern accelerators (GPU/TPU), work with JAX transformations (e.g. ``vmap`` for processing batches of models/samples, ``grad`` for differentiating through the LBP iterative process), and can be easily used as part of a larger end-to-end differentiable system.

+See our `blog post <https://www.vicarious.com/posts/pgmax-factor-graphs-for-discrete-probabilistic-graphical-models-and-loopy-belief-propagation-in-jax/>`_ and `companion paper <https://arxiv.org/abs/2202.04110>`_ for more details.

.. toctree::
:maxdepth: 1
:caption: Getting Started:

installation
examples

.. toctree::
:maxdepth: 1
