Michael Data

Cutset conditioning is a method for performing Tree Decomposition on a dependency graph. By identifying and conditioning on the “cut set” of the graph, a tree structure is produced.

The cut set of a graph is the set of variables which, if removed, will “cut” all loops in the graph and produce an acyclic graph.

Loop-cutset decomposition (loop-cutset conditioning) uses [http://en.wikipedia.org/wiki/Decomposition_method_(constraint_satisfaction)#Cycle_cutset Cutset-Conditioning] to transform a belief graph into a tree structure by removing high-degree variables from the network. It reduces space requirements at the cost of time. The result is a forest of probability trees which can be easily processed by Belief Propagation. Unfortunately, the cutset assignments themselves form a search tree, which takes additional memory and must be enumerated.

  • Input is a belief network.
  • Identify the cycle cutset: the set of nodes which must be removed in order to produce a tree.
  • For each node in this cutset, use VEC on that node.
  • Output is a forest of probability trees.
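As a concrete sketch of the steps above (the network, its CPT numbers, and all names here are invented for illustration): in the diamond network $A \to B$, $A \to C$, $\{B, C\} \to D$, the loop cutset is $\{A\}$; fixing each value of $A$ leaves a polytree, which is solved and the results summed. Plain enumeration stands in for the tree solver:

```python
from itertools import product

# Toy diamond network A->B, A->C, {B,C}->D; its loop cutset is {A}.
# All CPT numbers are made up for illustration.
pA = {0: 0.6, 1: 0.4}                                  # P(A=a)
pB = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}        # pB[a][b] = P(B=b | A=a)
pC = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}        # pC[a][c] = P(C=c | A=a)
pD1 = {(0, 0): 0.1, (0, 1): 0.8, (1, 0): 0.7, (1, 1): 0.95}  # P(D=1 | b, c)

def prob_d1():
    total = 0.0
    for a in (0, 1):                      # one pass per cutset assignment
        # With A fixed, the remaining network is a polytree; here plain
        # enumeration stands in for a tree solver such as belief propagation.
        for b, c in product((0, 1), repeat=2):
            total += pA[a] * pB[a][b] * pC[a][c] * pD1[(b, c)]
    return total

print(prob_d1())  # ≈ 0.5019
```

Each iteration of the outer loop corresponds to one probability tree in the output forest.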

Inference work is exponential in $C$ for a cycle [http://en.wikipedia.org/wiki/Cut_set cutset] of size $C$.
$C + 1 \geq w$ for cutset size $C$ and induced width $w$. For example, in a chain of cliques, the induced width is a function of the clique size while the cutset size is a function of the total number of nodes in the graph. Finding a minimal cycle cutset is not easy.
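To make the chain-of-cliques comparison concrete (all numbers here are illustrative assumptions): in a chain of $m$ cliques of size $s$, the induced width stays near $s - 1$, while each clique contributes roughly $s - 2$ nodes to the cutset, so the cutset grows with the length of the chain:

```python
# Illustrative chain of m cliques, each over s binary variables.
m, s, k = 10, 4, 2
w = s - 1            # induced width is set by the clique size: 3
c = m * (s - 2)      # roughly s-2 removals per clique to break its loops: 20

print(k ** w)        # tree-inference cost factor: 8
print(k ** c)        # conditioning cost factor: 1048576
```

Lengthening the chain leaves $k^w$ unchanged but multiplies $k^c$ by another factor of $k^{s-2}$ per clique.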

Because full loop-cutset decomposition produces a very large search tree, it is generally better not to condition all the way down to a tree structure but to stop once some target graph width $\omega$ is met. This is called a w-cutset algorithm.

Cutset Selection

A small cycle cutset means fewer tree-solving runs (one per assignment to the cutset variables), but finding a minimum-size cycle cutset is [http://en.wikipedia.org/wiki/Feedback_vertex_set NP-complete]. One heuristic for quickly choosing a cutset is to perform a depth-first search and take a vertex from each non-tree edge encountered.
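A minimal sketch of such a DFS heuristic (the function and graph here are assumptions for illustration, not a standard routine): every loop must contain at least one non-tree edge with respect to the DFS tree, so placing the already-visited endpoint of each such edge in the cutset breaks all loops:

```python
def dfs_cutset(adj, root):
    """Heuristic cycle cutset for an undirected graph {node: set(neighbors)}.

    Each non-tree edge found during DFS closes a loop; adding its
    already-visited endpoint to the cutset breaks that loop. The result
    is a valid cutset, though generally not a minimum one.
    """
    visited, seen_edges, cutset = set(), set(), set()

    def dfs(v):
        visited.add(v)
        for u in sorted(adj[v]):           # sorted for deterministic output
            edge = frozenset((v, u))
            if edge in seen_edges:
                continue                   # examine each edge only once
            seen_edges.add(edge)
            if u in visited:
                cutset.add(u)              # non-tree edge: u closes a loop
            else:
                dfs(u)

    dfs(root)
    return cutset

# A 4-cycle needs only one vertex removed to become a tree.
square = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"A", "C"}}
print(dfs_cutset(square, "A"))  # {'A'}
```

The sketch assumes the graph is connected and reachable from `root`; a disconnected graph would need one DFS per component.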

Variable Elimination Conditioning

Variable Elimination Conditioning, or VEC, is the fundamental operation of cutset conditioning.

VEC-Evidence

This is an algorithm for using VEC to optimize Bucket Elimination inference:

  1. Use VEC to decompose the belief graph into a polytree.
     • Each sub-tree is a dependency graph of some kind.
     • The main tree is a tree of variable assignments leading to a sub-tree.
  2. Solve each sub-tree (including any evidence that applies), e.g. using Bucket Elimination.
  3. Sum over the main tree for the final probability of all evidence.

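Using notation not in the original notes ($C$ for the cutset variables, $\mathbf{c}$ for one assignment to them, $e$ for the evidence), the final summation over the main tree can be written as

$P(e) = \sum_{\mathbf{c} \in \mathrm{dom}(C)} P(e, \mathbf{c})$

where each term $P(e, \mathbf{c})$ is the value returned by solving one conditioned sub-tree in the previous step.
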
W-cutset

For a target treewidth $\omega$, run loop-cutset decomposition but stop when the target width is achieved.
Time complexity is $O(n \cdot k^{c + \omega + 1})$.
Space complexity is $O(k^{\omega})$.

  • $n$ is the number of variables
  • $c$ is the number of variables in the cutset
  • $k$ is the domain size
  • $\omega$ is the treewidth

It can be shown that $c \geq \omega - 1$. So decreasing $\omega$ results in a greater time requirement but a smaller space requirement.
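A quick numeric illustration of that trade-off, plugging made-up values into the complexity formulas above (tightening the width target typically forces a larger cutset):

```python
n, k = 100, 2
# Scenario 1 (hypothetical): loose width target, small cutset.
c1, w1 = 5, 8
time1, space1 = n * k ** (c1 + w1 + 1), k ** w1   # 1638400, 256
# Scenario 2 (hypothetical): tighter width target, larger cutset.
c2, w2 = 12, 4
time2, space2 = n * k ** (c2 + w2 + 1), k ** w2   # 13107200, 16
```

Lowering $\omega$ from 8 to 4 shrinks space by a factor of 16 but, through the larger cutset, raises time by a factor of 8.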