Michael Data

A Markov Network, also known as a Markov Random Field, is an undirected Dependency Graph which is a minimal I-map of a probability distribution. Because it is a minimal I-map, deleting any edge destroys the I-map property: the resulting graph implies an independence the distribution does not satisfy.

Unlike a Bayesian network, a Markov network is undirected and may contain cycles.

The Markov blanket of a variable is a set of other variables which, when conditioned upon, renders that variable independent of the rest of the graph. The Markov boundary is a minimal Markov blanket.
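In an undirected graph, a variable's Markov boundary is simply its set of neighbors. A minimal sketch (the graph representation and variable names are illustrative, not from the original notes):

```python
def markov_boundary(graph, node):
    """Return the Markov boundary of `node` in an undirected graph,
    given as an adjacency dict {node: set_of_neighbors}.
    For an undirected Markov network this is just the neighbor set."""
    return set(graph[node])

# Toy chain a - b - c - d: conditioning on {a, c} separates b from d.
chain = {
    "a": {"b"},
    "b": {"a", "c"},
    "c": {"b", "d"},
    "d": {"c"},
}
print(sorted(markov_boundary(chain, "b")))  # ['a', 'c']
```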

Also see WP:Markov random field


A Markov network can be constructed for any positive probability distribution which satisfies the dependency graph properties.

  • Edge deletion
    1. Start with the fully connected graph
    2. For each edge, check whether removing it violates the I-mapness of the graph:
    3. determine whether fixing the values of all other variables in the graph renders the edge's two endpoints independent; if so, the edge can be removed
  • Markov boundary
    1. For each variable, add edges until its neighbors form a sufficient Markov boundary
  • Moralization
    1. Start with a Bayesian Network
    2. Connect the parents of all convergent nodes, then drop the edge directions
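The Bayesian-network-based construction (moralization) can be sketched as follows; the function name and the example network are illustrative, assuming the network is given as a parents dict:

```python
def moralize(parents):
    """Moralize a Bayesian network given as {node: set_of_parents}.
    Returns an undirected adjacency dict (the moral graph):
    each parent->child edge is undirected, and all parents of a
    common child are connected ("married")."""
    adj = {n: set() for n in parents}
    for child, ps in parents.items():
        # Undirect each parent -> child edge.
        for p in ps:
            adj[child].add(p)
            adj[p].add(child)
        # Marry all pairs of parents of the convergent node.
        ps = list(ps)
        for i in range(len(ps)):
            for j in range(i + 1, len(ps)):
                adj[ps[i]].add(ps[j])
                adj[ps[j]].add(ps[i])
    return adj

# Convergent structure a -> c <- b: moralization adds the edge a - b.
moral = moralize({"a": set(), "b": set(), "c": {"a", "b"}})
print(sorted(moral["a"]))  # ['b', 'c']
```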

Deviant case: $x=y=z=t$
A minimal I-map for this case is any tree, because once any single variable is conditioned on, each remaining variable is independent of the rest. This case is difficult to find with the previous construction methods because the distribution is not positive: all assignments in which the variables are not equal have probability zero.
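The deviant case can be checked numerically. A sketch with binary variables, assuming $p$ puts probability $1/2$ on each of the two all-equal assignments and zero elsewhere:

```python
from itertools import product

def p(x, y, z, t):
    """Deviant distribution: mass only where x = y = z = t."""
    return 0.5 if x == y == z == t else 0.0

def marginal(fixed):
    """Sum p over all binary assignments consistent with `fixed` (a dict)."""
    total = 0.0
    for x, y, z, t in product([0, 1], repeat=4):
        a = {"x": x, "y": y, "z": z, "t": t}
        if all(a[k] == v for k, v in fixed.items()):
            total += p(x, y, z, t)
    return total

# Conditioned on z alone, x and y are independent: p(x,y|z) = p(x|z) p(y|z).
for z in [0, 1]:
    pz = marginal({"z": z})
    for x in [0, 1]:
        for y in [0, 1]:
            lhs = marginal({"x": x, "y": y, "z": z}) / pz
            rhs = (marginal({"x": x, "z": z}) / pz) * (marginal({"y": y, "z": z}) / pz)
            assert abs(lhs - rhs) < 1e-12
print("x and y are conditionally independent given z")
```

The assertion passing for every assignment illustrates why any tree is an I-map here, while the zero-probability assignments show the distribution is not positive.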

Inference Using Join Trees

$p(a,b,c,d,e) = \frac{p(a,b,c)p(b,c,d)}{p(b,c)} \frac{p(c,e)}{p(c)}$

The joint distribution can be calculated by multiplying the marginal of each clique and dividing by the marginal of each separator (the variables shared by adjacent cliques).
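This identity can be verified numerically. A sketch on a toy distribution over binary variables that factorizes over the cliques $\{a,b,c\}$, $\{b,c,d\}$, $\{c,e\}$ (the potential values below are arbitrary, chosen only for illustration):

```python
from itertools import product

# Arbitrary positive potentials over the three cliques.
def psi1(a, b, c): return 1.0 + a + 2 * b + c   # clique {a, b, c}
def psi2(b, c, d): return 2.0 + b * c + d       # clique {b, c, d}
def psi3(c, e):    return 1.0 + 3 * c * e       # clique {c, e}

vals = [0, 1]
Z = sum(psi1(a, b, c) * psi2(b, c, d) * psi3(c, e)
        for a, b, c, d, e in product(vals, repeat=5))

def p(a, b, c, d, e):
    """Normalized joint distribution defined by the clique potentials."""
    return psi1(a, b, c) * psi2(b, c, d) * psi3(c, e) / Z

def marg(**fixed):
    """Marginal probability of the assignment in `fixed`."""
    return sum(p(a, b, c, d, e)
               for a, b, c, d, e in product(vals, repeat=5)
               if all({"a": a, "b": b, "c": c, "d": d, "e": e}[k] == v
                      for k, v in fixed.items()))

# Check p(a,b,c,d,e) = p(a,b,c) p(b,c,d) p(c,e) / (p(b,c) p(c))
# for every assignment.
for a, b, c, d, e in product(vals, repeat=5):
    lhs = p(a, b, c, d, e)
    rhs = (marg(a=a, b=b, c=c) * marg(b=b, c=c, d=d) * marg(c=c, e=e)
           / (marg(b=b, c=c) * marg(c=c)))
    assert abs(lhs - rhs) < 1e-12
print("clique-marginal factorization verified")
```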