# Find the probability of a state at a given time in a Markov chain - Set 1 in Python

Suppose we have a Markov chain graph g; we have to find the probability of reaching state F at time T if we start from state S at time t = 0. A Markov chain is a random process consisting of various states and the probabilities of moving from one state to another. It can be represented as a directed graph: the nodes are states, and each edge carries the probability of moving from one node to another. Moving from one state to another takes one unit of time. For every node, the probabilities on its outgoing edges sum to one.
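Such a graph can be stored as an adjacency list of (state, probability) pairs. The snippet below is a minimal sketch with a made-up 3-state chain (not the chain from the example) that checks the property stated above: each node's outgoing probabilities sum to one.

```python
# A Markov chain as a directed graph: for each node, a list of
# (successor, probability) outgoing edges. Toy 3-state chain,
# invented purely for illustration.
chain = {
    1: [(2, 0.4), (3, 0.6)],
    2: [(1, 1.0)],
    3: [(2, 0.5), (3, 0.5)],
}

# Every node's outgoing probabilities must sum to 1.
for node, edges in chain.items():
    total = sum(p for _, p in edges)
    assert abs(total - 1.0) < 1e-9, f"node {node} sums to {total}"
```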

So, if the input is like N = 6, S = 4, F = 2, T = 100 (with the graph used in the example below), then the output will be 0.28499144801478526

To solve this, we will follow these steps −

• table := a matrix of size (N+1) x (T+1), filled with 0.0

• table[S, 0] := 1.0

• for i in range 1 to T, do

• for j in range 1 to N, do

• for each incoming edge (u, p) in G[j], do

• table[j, i] := table[j, i] + p * table[u, i - 1]

• return table[F, T]

## Example

Let us see the following implementation to get a better understanding −


```python
def get_probability(G, N, F, S, T):
    # table[j][i] = probability of being in state j at time i
    table = [[0.0 for j in range(T + 1)] for i in range(N + 1)]
    table[S][0] = 1.0
    for i in range(1, T + 1):
        for j in range(1, N + 1):
            # G[j] lists the incoming edges of j as (predecessor, probability)
            for k in G[j]:
                table[j][i] += k[1] * table[k[0]][i - 1]
    return table[F][T]

# graph[j] holds the incoming edges of node j (index 0 is unused)
graph = []
graph.append([])
graph.append([(2, 0.09)])
graph.append([(1, 0.23), (6, 0.62)])
graph.append([(2, 0.06)])
graph.append([(1, 0.77), (3, 0.63)])
graph.append([(4, 0.65), (6, 0.38)])
graph.append([(2, 0.85), (3, 0.37), (4, 0.35), (5, 1.0)])

N = 6
S, F, T = 4, 2, 100
print(get_probability(graph, N, F, S, T))
```

## Input

N = 6, S = 4, F = 2, T = 100

## Output

0.28499144801478526
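The same result can be cross-checked with a matrix power: the probability of being in state F at time T, starting from S, is the (S, F) entry of the T-th power of the transition matrix P, where P[u][v] is the probability of the edge u → v. The sketch below (assuming NumPy is available) rebuilds the example's chain as a transition matrix; the edge lists are the outgoing-edge view of the same graph used above.

```python
import numpy as np

# Transition matrix P[u][v] = probability of moving u -> v in one step.
# Index 0 is unused so that states are numbered 1..6, as in the example.
P = np.zeros((7, 7))
outgoing = {
    1: [(2, 0.23), (4, 0.77)],
    2: [(1, 0.09), (3, 0.06), (6, 0.85)],
    3: [(4, 0.63), (6, 0.37)],
    4: [(5, 0.65), (6, 0.35)],
    5: [(6, 1.0)],
    6: [(2, 0.62), (5, 0.38)],
}
for u, edges in outgoing.items():
    for v, p in edges:
        P[u][v] = p

S, F, T = 4, 2, 100
# Row S of P**T gives the time-T distribution when starting from S.
prob = np.linalg.matrix_power(P, T)[S][F]
print(prob)  # ≈ 0.28499144801478526
```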
Updated on 27-Aug-2020 06:36:52