1 of 19

Stochastic Processes

Dr. V. Senthilkumar

Assistant Professor

CPA College, Bodinayakanur

Aug.2024

2 of 19

Stochastic Processes

  • Topics
  • Definitions
  • Review of probability
  • Realization of a stochastic process
  • Continuous vs. discrete systems
  • Examples

3 of 19

Basic Definitions

Stochastic process: System that changes over time in an uncertain manner

Examples

  • Automated teller machine (ATM)
  • Printed circuit board assembly operation
  • Runway activity at airport

State: Snapshot of the system at some fixed point in time

Transition: Movement from one state to another

4 of 19

Elements of Probability Theory

Experiment: Any situation where the outcome is uncertain.

Sample Space, S: The set of all possible outcomes of an experiment (we will call it the “state space”).

Event: Any collection of outcomes (points) in the sample space. A collection of events E1, E2, …, En is said to be mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j, i, j = 1, …, n.

Random Variable (RV): Function or procedure that assigns a real number to each outcome in the sample space.

Cumulative Distribution Function (CDF), F(·): Probability distribution function for the random variable X such that

F(a) ≡ Pr{X ≤ a}
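As an illustration of F(a) = Pr{X ≤ a} — using a fair die as the random variable, an example of ours rather than one from the slides — the CDF of a discrete random variable can be tabulated directly:

```python
# Sketch: the CDF of a single fair die, F(a) = Pr{X <= a}.
# The fair die is an illustrative choice, not an example from the slides.
from fractions import Fraction

def die_cdf(a: int) -> Fraction:
    """F(a) = Pr{X <= a} for X uniform on {1, ..., 6}."""
    favorable = sum(1 for x in range(1, 7) if x <= a)
    return Fraction(favorable, 6)

print(die_cdf(3))  # 1/2
print(die_cdf(6))  # 1
```

Note that F is nondecreasing, F(a) = 0 below the smallest outcome, and F(a) = 1 at and above the largest — the defining properties of any CDF.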

5 of 19

Components of Stochastic Model

Time: Either continuous or discrete parameter.

State: Describes the attributes of a system at some point in time.

s = (s1, s2, . . . , sv); for ATM example s = (n)

  • It is convenient to assign a unique nonnegative integer index X to each possible value of the state vector, i.e., each s maps to a unique X (s → X).
  • For the ATM example, X = n.
  • In general, Xt is a random variable.

6 of 19

Model Components (continued)

Activity: Takes some amount of time – duration. Culminates in an event.

For the ATM example, the event is a service completion.

Stochastic Process: A collection of random variables {Xt}, where t ∈ T = {0, 1, 2, . . .}.

Transition: Caused by an event and results in movement from one state to another. For the ATM example:

[State-transition diagram omitted; legend: # = state, a = arrival, d = departure]

7 of 19

Realization of the Process

Deterministic Process

Time between arrivals

Pr{ ta ≤ τ } = 0, τ < 1 min
             = 1, τ ≥ 1 min

Arrivals occur every minute.

Time for servicing customer

Pr{ ts ≤ τ } = 0, τ < 0.75 min
             = 1, τ ≥ 0.75 min

Processing takes exactly 0.75 minutes.

[Plot omitted: number in system, n, over time — the realization repeats immediately, with no transient response.]
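The deterministic realization above can be generated directly. A minimal sketch (function name is ours): arrivals every 1.0 min, service exactly 0.75 min, so each customer is served immediately and the system alternates busy/idle forever.

```python
# Sketch of the deterministic ATM realization: arrivals every 1.0 min,
# service exactly 0.75 min. Since 0.75 < 1.0, the server is always free
# when a customer arrives, so n alternates 1 (busy) / 0 (idle).
def realize(n_customers: int):
    """Return (time, n) event points for the number in system."""
    events = []
    for k in range(n_customers):
        arrival = float(k)            # customer k arrives at t = k minutes
        departure = arrival + 0.75    # served immediately, done 0.75 min later
        events.append((arrival, 1))   # n jumps to 1 at the arrival
        events.append((departure, 0)) # n drops back to 0 at the departure
    return events

for t, n in realize(3):
    print(f"t = {t:.2f}  n = {n}")
```

Every cycle is identical, which is why the realization shows no transient response.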

8 of 19

Realization of the Process (continued)

Stochastic Process

Time for servicing a customer

Pr{ ts ≤ τ } = 0,   τ < 0.75 min
             = 0.6, 0.75 ≤ τ < 1.5 min
             = 1,   τ ≥ 1.5 min

[Plot omitted: number in system, n, over time for this stochastic realization.]
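A realization of this stochastic version can be sampled by simulation. A minimal sketch, assuming deterministic arrivals every 1.0 min (as on the previous slide), service times drawn from the two-point distribution above (0.75 min with probability 0.6, else 1.5 min), and a hypothetical single-server FIFO discipline in which a queue is allowed to form:

```python
# Sketch: sample service times from the two-point CDF on this slide
# (0.75 min w.p. 0.6, 1.5 min w.p. 0.4) with deterministic arrivals
# every 1.0 min. A FIFO queue is assumed; these modeling choices are
# ours, filling in details the slide leaves to its (omitted) plot.
import random

def simulate(n_customers: int, seed: int = 42):
    random.seed(seed)
    t_free = 0.0              # time at which the server next becomes free
    departures = []
    for k in range(n_customers):
        arrival = float(k)
        service = 0.75 if random.random() < 0.6 else 1.5
        start = max(arrival, t_free)   # wait in queue if the server is busy
        t_free = start + service
        departures.append(t_free)
    return departures

print(simulate(5))
```

Because a 1.5-min service exceeds the 1.0-min interarrival time, customers can now queue up — unlike the deterministic case, each run of the process is a different realization.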

9 of 19

Markovian Property

Given that the present state is known, the conditional probability of the next state is independent of the states prior to the present state.

Present state at time t is i: Xt = i

Next state at time t + 1 is j: Xt+1 = j

Conditional Probability Statement of Markovian Property:

Pr{Xt+1 = j | X0 = k0, X1 = k1, …, Xt–1 = kt–1, Xt = i } = Pr{Xt+1 = j | Xt = i }

  for t = 0, 1,…, and all possible sequences i, j, k0, k1, . . . , kt–1

Interpretation: Given the present, the past is irrelevant in determining the future.

10 of 19

Transitions for Markov Processes

State space: S = {1, 2, . . . , m}

Probability of going from state i to state j in one move: pij

Theoretical requirements: 0 ≤ pij ≤ 1, Σj pij = 1, i = 1,…,m

State-transition matrix: P = (pij), the m × m matrix whose row i lists the probabilities of moving from state i to each state j in one step.
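The two theoretical requirements — 0 ≤ pij ≤ 1 and each row summing to 1 — can be checked mechanically. A minimal sketch (function name is ours):

```python
# Sketch: verify the transition-matrix requirements from the slide,
# 0 <= p_ij <= 1 for every entry and sum_j p_ij = 1 for every row i.
def is_stochastic(P, tol=1e-9):
    """Return True iff P is a valid state-transition (stochastic) matrix."""
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

print(is_stochastic([[0.5, 0.5], [0.0, 1.0]]))  # True
print(is_stochastic([[0.5, 0.6], [0.0, 1.0]]))  # False (row sums to 1.1)
```

The small tolerance guards against floating-point rounding in row sums.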

11 of 19

Discrete-Time Markov Chain

  • A discrete state space
  • Markovian property for transitions
  • One-step transition probabilities, pij, remain constant over time (stationary)

Simple Example

State-transition matrix (row = present state i, column = next state j):

          0     1     2
     0 [ 0.6   0.3   0.1 ]
P =  1 [ 0.8   0.2   0   ]
     2 [ 1     0     0   ]

[State-transition diagram omitted.]
12 of 19

Game of Craps

  • Roll 2 dice
  • Outcomes
    • Win = 7 or 11
    • Lose = 2, 3, 12
    • Point = 4, 5, 6, 8, 9, 10
  • If point, then roll again.
    • Win if point
    • Lose if 7
    • Otherwise roll again, and so on
  • (There are other possible bets not included here.)
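The rules above can be played out directly by simulation. A Monte Carlo sketch (the known exact win probability for this basic bet is 244/495 ≈ 0.493):

```python
# Sketch: Monte Carlo estimate of the probability of winning the basic
# craps bet described above (win on 7/11, lose on 2/3/12, otherwise
# roll until the point repeats, losing if a 7 comes first).
import random

def play(rng: random.Random) -> bool:
    roll = rng.randint(1, 6) + rng.randint(1, 6)
    if roll in (7, 11):
        return True            # win on the come-out roll
    if roll in (2, 3, 12):
        return False           # lose on the come-out roll
    point = roll
    while True:                # keep rolling until point or 7
        roll = rng.randint(1, 6) + rng.randint(1, 6)
        if roll == point:
            return True
        if roll == 7:
            return False

rng = random.Random(0)
wins = sum(play(rng) for _ in range(100_000))
print(wins / 100_000)  # close to 0.493
```

The game is a Markov chain: the only information the next roll needs is the current state (come-out, a particular point, win, or lose) — exactly the structure of the state-transition network on the next slide.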

13 of 19

State-Transition Network for Craps

14 of 19

Transition Matrix for Game of Craps

Probability of win on the first roll = Pr{ 7 or 11 } = 6/36 + 2/36 = 8/36 ≈ 0.222

Probability of loss on the first roll = Pr{ 2, 3, or 12 } = 1/36 + 2/36 + 1/36 = 4/36 ≈ 0.111

Sum     2      3      4      5      6      7      8      9      10     11     12
Prob.   0.028  0.056  0.083  0.111  0.139  0.167  0.139  0.111  0.083  0.056  0.028
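The probabilities in the table above can be recomputed exactly by enumerating the 36 equally likely outcomes of the two dice:

```python
# Sketch: exact dice-sum probabilities by enumerating all 36 outcomes,
# then the first-roll win and loss probabilities for craps.
from fractions import Fraction
from itertools import product

prob = {s: Fraction(0) for s in range(2, 13)}
for d1, d2 in product(range(1, 7), repeat=2):
    prob[d1 + d2] += Fraction(1, 36)

win = prob[7] + prob[11]             # 8/36 = 2/9
lose = prob[2] + prob[3] + prob[12]  # 4/36 = 1/9
print(float(win), float(lose))       # ~0.222 and ~0.111
```

Working in exact fractions avoids the small rounding drift that appears when the three-decimal table entries are added directly.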

15 of 19

Examples of Stochastic Processes

Multistage assembly process with single worker, no queue

State = 0, worker is idle

State = k, worker is performing operation k = 1, . . . , 5

Single stage assembly process with single worker, no queue

State = 0, worker is idle

State = 1, worker is busy

16 of 19

Examples (continued)

Multistage assembly process with single worker and queue

(Assume 3 stages only; i.e., 3 operations)

s = (s1, s2) where

Operations

k = 1, 2, 3

17 of 19

Single Stage Process with Two Servers and Queue

s = (s1, s2 , s3) where

State-transition network

i = 1, 2

18 of 19

Series System with No Queues

Component     Notation                                            Definition

State         s = (s1, s2, s3)

State space   S = { (0,0,0), (1,0,0), . . . , (0,1,1), (1,1,1) }  All possible binary vectors with 3 components

Events        Y = {a, d1, d2, d3}                                 a = arrival at operation 1;
                                                                  dj = completion of operation j, j = 1, 2, 3

19 of 19

What You Should Know About Stochastic Processes

  • Definition of a state and an event.
  • Meaning of realization of a system (stationary vs. transient).
  • Definition of the state-transition matrix.
  • How to draw a state-transition network.
  • Difference between a continuous and discrete-time system.
  • Common applications with multiple stages and servers.