Introduction To Bayesian Methods

1. Objective

In our previous blog, we discussed Bayesian networks in detail. In this blog, we are going to explain the different Bayesian methods: variable elimination, dynamic programming, and approximation algorithms. We will also discuss some primary approximation methods, such as loopy belief propagation and bounded cutset conditioning, in this tutorial.

2. What are the Bayesian Methods?

There are three main inference methods for a Bayesian network:

  • Variable elimination
  • Dynamic Programming
  • Approximation algorithms

Let us discuss these Bayesian methods one by one.

i. Variable Elimination

P(X1) = Σ_{x2} Σ_{x3} … Σ_{xm} P(X1, x2, x3, …, xm)

Bayesian Methods – Formula 1: the marginal of X1, obtained by summing the joint distribution over all the other variables

To compute a marginal efficiently, you work from the joint probability distribution and sum out the irrelevant variables.
For example, to obtain P(X1) we remove the variables from the inside out, starting with variable Xm and finishing with variable X2, so that only X1 remains. We carry out the summations right to left, storing intermediate results (factors) to avoid recomputation. To eliminate a variable Xi, we gather up all the factors that mention Xi, multiply them together, sum Xi out of the product, and put the resulting factor back into our set of factors.
The complexity of variable elimination depends on the size of the largest intermediate factor, which in turn depends on the order in which the variables are eliminated. In polytree networks, the time and space complexity are linear in the size of the network.
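
Below is a minimal sketch of this idea in Python with NumPy, assuming a toy chain-structured network X1 -> X2 -> X3 -> X4 with binary variables. The network, the CPT numbers, and the query are invented purely for illustration; the point is the right-to-left elimination with stored intermediate factors.

```python
import numpy as np

# Toy chain network X1 -> X2 -> X3 -> X4; all variables binary, CPTs made up.
p_x1 = np.array([0.6, 0.4])                     # P(X1)
p_x2_x1 = np.array([[0.7, 0.3],                 # P(X2 | X1), rows indexed by X1
                    [0.2, 0.8]])
p_x3_x2 = np.array([[0.9, 0.1],                 # P(X3 | X2), rows indexed by X2
                    [0.4, 0.6]])
p_x4_x3 = np.array([[0.5, 0.5],                 # P(X4 | X3), rows indexed by X3
                    [0.1, 0.9]])

# Query: P(X1 | X4 = 1), eliminating the inner variables from right to left.
evidence = p_x4_x3[:, 1]                        # P(X4 = 1 | X3), a factor over X3

# Eliminate X3: tau3(x2) = sum_x3 P(x3 | x2) * P(X4 = 1 | x3)
tau3 = p_x3_x2 @ evidence                       # stored factor over X2

# Eliminate X2: tau2(x1) = sum_x2 P(x2 | x1) * tau3(x2)
tau2 = p_x2_x1 @ tau3                           # stored factor over X1

# Combine with the remaining factor P(X1) and normalize.
unnormalized = p_x1 * tau2                      # proportional to P(X1, X4 = 1)
posterior = unnormalized / unnormalized.sum()   # P(X1 | X4 = 1)
print(posterior)
```

The largest factor created here is over a single variable, which is why elimination on a chain (a polytree) stays linear in the size of the network.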

ii. Dynamic Programming

Suppose you want to compute several marginals. If you run the variable elimination algorithm separately for each query, many calculations are repeated. To avoid this redundancy, you can use the dynamic programming (DP) method. How the DP method works depends on the type of graphical model.
When the BN graph is acyclic (that is, a tree), you can use a local message passing algorithm; for HMM chains, this is the familiar forward-backward algorithm. When the BN has undirected cycles, local message passing risks double counting evidence. To avoid this, you can convert the BN into a tree by clustering nodes together; the resulting tree is called a junction tree.
The major disadvantage of the DP algorithm is that it is slow when the clusters are large: the running time grows exponentially with the induced width of the graph, and minimizing the induced width is an NP-hard problem. (NP is a class of computational problems, namely those solvable in polynomial time by a non-deterministic Turing machine.)
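
The following is a small sketch of the forward-backward message passing mentioned above for an HMM chain, written in Python with NumPy. The two-state HMM, its transition and emission matrices, and the observation sequence are all invented for illustration.

```python
import numpy as np

# Toy HMM: two hidden states, three observable symbols; parameters are illustrative.
init = np.array([0.5, 0.5])                      # P(Z1)
trans = np.array([[0.8, 0.2],                    # P(Z_t | Z_{t-1})
                  [0.3, 0.7]])
emit = np.array([[0.6, 0.3, 0.1],                # P(X_t | Z_t)
                 [0.1, 0.4, 0.5]])
obs = [0, 2, 1, 0]                               # an observed sequence

T, K = len(obs), len(init)
alpha = np.zeros((T, K))                         # forward messages
beta = np.ones((T, K))                           # backward messages

# Forward pass: alpha[t, k] = P(x_1..x_t, Z_t = k)
alpha[0] = init * emit[:, obs[0]]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ trans) * emit[:, obs[t]]

# Backward pass: beta[t, k] = P(x_{t+1}..x_T | Z_t = k)
for t in range(T - 2, -1, -1):
    beta[t] = trans @ (emit[:, obs[t + 1]] * beta[t + 1])

# Every smoothed marginal P(Z_t | x_1..x_T) falls out of one forward and one
# backward sweep, so nothing is recomputed per query.
gamma = alpha * beta
gamma /= gamma.sum(axis=1, keepdims=True)
print(gamma)
```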

iii. Approximation Algorithms

Models with repetitive structure, such as multivariate time-series models or models used for image analysis, have high induced width, so inference with variable elimination or the DP algorithm takes a very long time.
For these models, you can use approximation techniques. Let us see some primary approximation techniques:

Approximation Techniques in Bayesian Methods

a) Variational method

The simplest of these is the mean-field approximation, which exploits the law of large numbers to approximate large sums of random variables by their means.
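
The snippet below is only a toy illustration of that underlying idea, not a variational inference routine: the sum of many random draws is close to the number of draws times the mean. The distribution and sample size are arbitrary.

```python
import numpy as np

# Illustration of the idea behind the mean-field approximation: a large sum of
# random variables is well approximated by the sum of their means.
rng = np.random.default_rng(0)
n = 10_000
samples = rng.exponential(scale=2.0, size=n)   # arbitrary distribution with mean 2.0

actual_sum = samples.sum()
mean_based_approximation = n * 2.0             # replace the random sum by its expectation

print(actual_sum, mean_based_approximation)    # the two numbers agree closely for large n
```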

b) Sampling (Monte Carlo) method

The simplest kind is importance sampling. The general method is:

  • Draw random samples x from P(x), the prior distribution over the hidden variables.
  • Weight each sample by its likelihood P(y | x), where y is the evidence.
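
Here is a minimal sketch of that recipe in Python with NumPy, using an invented one-dimensional Gaussian model (standard normal prior, Gaussian likelihood with standard deviation 0.5, a single observation y = 1.0) so the answer can be checked against the exact posterior mean of 0.8.

```python
import numpy as np

# Importance sampling with the prior as the proposal distribution.
rng = np.random.default_rng(1)

y_observed = 1.0
n_samples = 100_000

x = rng.normal(loc=0.0, scale=1.0, size=n_samples)        # samples from the prior P(x)
weights = np.exp(-0.5 * ((y_observed - x) / 0.5) ** 2)     # likelihood P(y | x), up to a constant

# Self-normalized estimate of the posterior mean E[X | y].
posterior_mean = np.sum(weights * x) / np.sum(weights)
print(posterior_mean)    # close to the exact value 0.8 for this toy Gaussian model
```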

c) Loopy belief propagation

In this method, you apply Pearl's algorithm to the original graph, even if it has loops. This can result in double counting of evidence, but it has been observed that in practice the double-counted effects often cancel out and the method gives accurate results.
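
Below is a minimal sketch of sum-product loopy belief propagation on an invented four-node cycle of binary variables, written in Python with NumPy. The unary and pairwise potentials, the fixed number of iterations, and the flooding update schedule are all illustrative choices, not part of any particular library.

```python
import numpy as np

# Pairwise model with a loop: binary variables connected as 0 - 1 - 2 - 3 - 0.
nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

phi = {0: np.array([0.7, 0.3]),      # unary potentials phi_i(x_i), made up
       1: np.array([0.5, 0.5]),
       2: np.array([0.2, 0.8]),
       3: np.array([0.6, 0.4])}
psi = np.array([[2.0, 1.0],          # shared pairwise potential favouring agreement
                [1.0, 2.0]])

def neighbours(i):
    return [b for a, b in edges if a == i] + [a for a, b in edges if b == i]

# One message per directed edge, initialised uniformly.
messages = {(i, j): np.ones(2) / 2 for a, b in edges for (i, j) in [(a, b), (b, a)]}

for _ in range(50):                  # iterate message updates on the loopy graph
    new_messages = {}
    for (i, j) in messages:
        # m_{i->j}(x_j) = sum_{x_i} phi_i(x_i) psi(x_i, x_j) prod_{k != j} m_{k->i}(x_i)
        product = phi[i].copy()
        for k in neighbours(i):
            if k != j:
                product = product * messages[(k, i)]
        msg = product @ psi
        new_messages[(i, j)] = msg / msg.sum()   # normalise for stability
    messages = new_messages

# Approximate marginals (beliefs): b_i(x_i) proportional to phi_i(x_i) times incoming messages.
for i in nodes:
    belief = phi[i].copy()
    for k in neighbours(i):
        belief = belief * messages[(k, i)]
    print(i, belief / belief.sum())
```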

d) Bounded cutset conditioning

You can break the loops in the graph by instantiating subsets of the variables, called a cutset. This process is very slow when the cutset is large. By instantiating only a subset of the cutset values, you can compute lower bounds on the probabilities of interest.

e) Parametric approximation method

These methods express the intermediate summands in a simpler, more compact form. Examples include mini-buckets and the Boyen-Koller algorithm.

So, this was all about Bayesian methods. We hope you liked our explanation.

3. Conclusion – Bayesian Methods

Hence, we have discussed the complete concept of Bayesian methods. Still, if you have any query related to Bayesian methods, you can leave a comment in the section given below.