# Probabilistic Graphical Models (PGMs) Algorithm | Machine Learning

In this article, we are going to discuss the PGM (probabilistic graphical model) algorithm in machine learning.
Submitted by Bharti Parmar, on March 13, 2019

## What is PGM?

So, what exactly is a PGM? P → Probabilistic, G → Graphical, M → Model

### Probabilistic

The nature of the problems we are generally interested in solving, and of the queries we want to make, is probabilistic because of uncertainty. Several factors contribute to this uncertainty:

1. Incomplete knowledge
2. Noisy observations
3. Attributes that influence the problem but are not present in the model

### Graphical

The graph helps us visualize the model better, and we use graph theory to reduce the number of relevant combinations of the participating variables, so that a high-dimensional probability distribution can be represented more compactly.

### Model

A model is a declarative representation of a real-world scenario or problem that we want to analyze. Declarative means it is declared and defined rather than derived: either by a domain expert using their domain knowledge, or by learning algorithms applied to historical datasets using statistical knowledge. It is represented using mathematical tools such as a graph, or simply an equation.

### PGM

It is a technique for compactly representing a joint distribution (a rich framework for encoding probability distributions over complex domains) by exploiting dependencies between the random variables. PGMs are used to model real-world scenarios and represent them in a compact graphical form. They also allow us to perform inference on the joint distribution in a computationally cheaper way than traditional methods: queries that would otherwise require manipulating an exponentially large table can often be answered with local computations on the graph.
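The compactness claim above can be made concrete with a parameter count. The sketch below (an illustrative comparison, not part of the original article) counts the independent parameters needed for a full joint distribution over n binary variables versus a chain-structured factorization where each variable depends only on its predecessor:

```python
# Parameter counting: full joint table vs. a chain-structured PGM
# over n binary variables. The chain X1 -> X2 -> ... -> Xn is an
# assumed example structure, chosen only to illustrate the savings.

def full_joint_params(n):
    # A table over n binary variables has 2**n entries; they must
    # sum to 1, leaving 2**n - 1 free parameters.
    return 2 ** n - 1

def chain_params(n):
    # P(X1) needs 1 parameter; each P(Xi | Xi-1) needs 2
    # (one per value of the parent).
    return 1 + 2 * (n - 1)

for n in (5, 10, 20):
    print(n, full_joint_params(n), chain_params(n))
# The full joint grows exponentially; the chain grows linearly.
```

For n = 20 the full table needs over a million parameters while the chain needs only 39, which is why exploiting dependencies matters.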

This representation sits at the intersection of statistics and computer science. It relies on ideas from probability theory, graph algorithms, and machine learning. It has a wide variety of applications, like medical diagnosis, image understanding, speech recognition, NLP, and many more. PGMs are also an essential tool for framing ML problems.

Depending on whether the graph is directed or undirected, graphical models are classified into two types: Bayesian networks (directed) and Markov networks (undirected). By knowing the PGM algorithm, we can easily understand what a Bayesian network, a graphical model, and a Markov random field are.

*(Diagram: example of probability. Diagram: example of conditional probability.)*

Different kinds of distribution:

1. Joint distribution: It describes how two or more variables are distributed simultaneously. To get a probability from the joint distribution of A and B, you would consider P(A=a and B=b).
2. Conditional probability distribution: It looks at how the probabilities of A are distributed given a certain value for B, written P(A=a | B=b).
3. Marginal distribution: It is the distribution that results from summing (or integrating) over one variable to get the probability distribution of the other. For example, the marginal distribution of A, when A and B are related as above, is given by:
P(a) = ∫ P(a | b) P(b) db
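The three kinds of distribution above can be demonstrated on a small discrete example. The toy joint table below is invented for illustration; marginalization uses a sum rather than the integral, since the variables here are discrete:

```python
# Toy joint distribution P(A, B) over two binary variables,
# stored as a dict of (a, b) -> probability. Values are
# illustrative and sum to 1.
joint = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.20, (1, 1): 0.40,
}

def marginal_A(a):
    # Marginal: P(A=a) = sum over b of P(A=a, B=b)
    return sum(p for (ai, b), p in joint.items() if ai == a)

def conditional_A_given_B(a, b):
    # Conditional: P(A=a | B=b) = P(A=a, B=b) / P(B=b)
    p_b = sum(p for (ai, bi), p in joint.items() if bi == b)
    return joint[(a, b)] / p_b

print(marginal_A(0))                # 0.30 + 0.10 = 0.40
print(conditional_A_given_B(1, 1))  # 0.40 / 0.50 = 0.80
```

Note how the conditional renormalizes only the column where B=1, while the marginal collapses B away entirely.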

## Conclusion

In this article, we have learned what a PGM is, with examples, and its different kinds of distributions. We will learn more about ML in the upcoming articles. Have a nice day! Happy learning!
