  1. Queuing Analysis
Gregory (Grisha) Chockler, Zinovi Rabinovich, Ittai Abraham
Operating Systems Course, Spring 2003
Hebrew University of Jerusalem, Israel

  2. Plan
• Review of basic probability.
• Markov processes and Poisson processes.
• Models in Queuing Theory.
• Analysis of the M/M/1 model.
• Analysis of the M/M/1/b model.
• Proof of Poisson process definition equivalence.

  3. Random Variables
• A discrete random variable X can take values in some countable set S = {x_1, x_2, x_3, ...}.
• A function f : ℝ → [0, 1] such that Σ_{x∈S} f(x) = 1, and f(x) = 0 for all x ∉ S, defines the probability function of X: Pr[X = x] = f(x).
• A continuous random variable X can take values in some intervals of numbers.
• A function f : ℝ → ℝ⁺ such that ∫_{−∞}^{∞} f(x) dx = 1 defines the probability density function of X. Thus Pr[X ≥ x] = ∫_{x}^{∞} f(t) dt.
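The two integral identities above can be checked numerically. The sketch below uses a left Riemann sum and the exponential density with an illustrative rate θ = 2 (the specific rate and grid are assumptions, not from the slide) to verify that the density integrates to 1 and that Pr[X ≥ x] matches the tail integral.

```python
import math

# Exponential density f(x) = theta * exp(-theta * x) for x >= 0, else 0.
theta = 2.0

def f(x):
    return theta * math.exp(-theta * x) if x >= 0 else 0.0

# Left Riemann sum over [0, 50]; the tail beyond 50 is negligible.
dx = 1e-4
total = sum(f(i * dx) * dx for i in range(int(50 / dx)))

# Tail probability Pr[X >= x0] = integral from x0 to infinity of f,
# which for the exponential equals exp(-theta * x0).
x0 = 1.0
tail = sum(f(x0 + i * dx) * dx for i in range(int(50 / dx)))
```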

  4. Conditional Probability, Independence and Sums
• The conditional probability that A occurs given that B occurs is defined to be Pr[A | B] = Pr[A ∩ B] / Pr[B].
• Bayes' theorem: Pr[A | B] = Pr[B | A] Pr[A] / Pr[B].
• Events A and B are independent if Pr[A ∩ B] = Pr[A] Pr[B].
• X, Y are independent if the events X ∈ x and Y ∈ y are independent for all sets x, y.
• If X, Y are independent, with probability functions f_X, f_Y, then the sum Z = X + Y has f_Z(z) = Σ_x f_X(x) f_Y(z − x) in the discrete case, and f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx in the continuous case.
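The discrete convolution formula for the sum of independent variables can be made concrete with a small example. The sketch below (using two fair six-sided dice, an illustration not taken from the slide) computes f_Z exactly with rational arithmetic.

```python
from fractions import Fraction

# Probability function of one fair die: f(x) = 1/6 for x in 1..6.
f = {x: Fraction(1, 6) for x in range(1, 7)}

def convolve(fX, fY):
    # f_Z(z) = sum over x of f_X(x) * f_Y(z - x), for independent X, Y.
    fZ = {}
    for x, px in fX.items():
        for y, py in fY.items():
            fZ[x + y] = fZ.get(x + y, Fraction(0)) + px * py
    return fZ

fZ = convolve(f, f)  # distribution of the sum of two dice
```

As expected, the most likely sum is 7 with probability 6/36 = 1/6, and the probabilities sum to 1.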

  5. Expectation
• The expectation of a random variable X with probability function f(x) is defined as E(X) = Σ_{x | f(x)>0} f(x) x in the discrete case, and E(X) = ∫_{−∞}^{∞} f(x) x dx in the continuous case.
• Linearity of expectation: E(aX + bY) = aE(X) + bE(Y).
• Example: the exponential distribution with parameter θ: f(x) = θe^{−θx} for x ≥ 0, and f(x) = 0 for x < 0.
• Recall integration by parts: ∫ v du = vu − ∫ u dv; take v = x and du = θe^{−θx} dx, so u = −e^{−θx}.
• E(X) = ∫_0^∞ θe^{−θx} x dx = [−xe^{−θx}]_0^∞ + ∫_0^∞ e^{−θx} dx = 0 + [−(1/θ) e^{−θx}]_0^∞ = 1/θ.
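The integration-by-parts result E(X) = 1/θ can also be checked by sampling. The sketch below (seed and rate are illustrative choices, not from the slide) draws exponential samples and compares the empirical mean with 1/θ.

```python
import random

# Monte Carlo check of E[X] = 1/theta for X ~ exp(theta).
random.seed(0)
theta = 4.0
n = 200_000
# random.expovariate(theta) samples from the exponential distribution
# with rate theta, i.e. mean 1/theta.
mean = sum(random.expovariate(theta) for _ in range(n)) / n
```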

  6. Variance
• Variance is defined as var(X) = E((X − E(X))²) = E(X²) − E(X)² (if it converges).
• Standard deviation: σ_X = √(var(X)); coefficient of variation: cv(X) = σ_X / E(X).
• Covariance: cov(X, Y) = E(XY) − E(X)E(Y); cov(X, X) = var(X); if X and Y are independent then cov(X, Y) = 0.
• var(aX) = a² var(X), and var(X + Y) = var(X) + var(Y) + 2 cov(X, Y).
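For the exponential distribution from the previous slide, a standard fact (not derived on the slide) is var(X) = 1/θ², so cv(X) = (1/θ)/(1/θ) = 1. The sketch below checks this by sampling; seed and rate are illustrative assumptions.

```python
import random

# Sampling check that the exponential distribution has var = 1/theta^2
# and coefficient of variation cv = 1.
random.seed(1)
theta = 2.0
xs = [random.expovariate(theta) for _ in range(200_000)]

mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)  # E(X^2) - E(X)^2 estimate
cv = var ** 0.5 / mean                            # sigma_X / E(X)
```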

  7. Markov Processes
• A stochastic process is a series of random variables X_1, X_2, ....
• A Markov process is a stochastic process in which the distribution of X_t is determined only by X_{t−1}.
• Formally, Pr(X_t = i_t | X_{t−1} = i_{t−1}) = Pr(X_t = i_t | X_{t−1} = i_{t−1}, X_{t−2} = i_{t−2}, ..., X_1 = i_1).
• A Markov process can be viewed as a weighted directed graph in which, for every u ∈ V, Σ_{(u→v)∈E} d(u, v) = 1 (the weights leaving each node sum to 1).
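The weighted-digraph view can be sketched directly in code. The two-state chain below is an illustrative assumption (not the five-state example on the next slide): d[u][v] holds the transition probability, each row sums to 1, and a step depends only on the current state.

```python
import random

# A Markov process as a weighted digraph: d[u][v] = Pr(next = v | current = u).
d = {"A": {"A": 0.5, "B": 0.5},
     "B": {"A": 0.25, "B": 0.75}}

def step(state, rng):
    # Sample the next state; only the current state is consulted
    # (the Markov property).
    r = rng.random()
    acc = 0.0
    for v, p in d[state].items():
        acc += p
        if r < acc:
            return v
    return v  # fallback for floating-point rounding

# Every node's outgoing weights must sum to 1.
rows_ok = all(abs(sum(row.values()) - 1.0) < 1e-12 for row in d.values())
```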

  8. Example of a Markov Process
[Diagram: a five-state Markov chain on states A, B, C, D, E; the edge weights shown are 1, 1/2 (on several edges), and 1/3 (on the three edges leaving one state).]

  9. Stationary Distribution
• Under certain conditions, a Markov process (V, E, d) has a stationary distribution π with Σ_{v∈V} π_v = 1: lim_{t→∞} X_t ∼ π, and if X_t ∼ π then X_{t+1} ∼ π.
• Formally, since Pr(X_{t+1} = v) = Σ_{(u→v)} Pr(X_t = u) d(u, v), the stationary distribution satisfies π_v = Σ_{(u→v)} π_u d(u, v).
• Balance property: given a cut (U, Ū), the flow of probability π across the cut is balanced.
• Formally, Σ_{(u→v) | u∈U, v∈Ū} π_u d(u, v) = Σ_{(v→u) | u∈U, v∈Ū} π_v d(v, u).
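One way to find π numerically is to apply π_v = Σ_u π_u d(u, v) repeatedly until it converges (power iteration). The sketch below uses an illustrative two-state chain, for which π = (1/3, 2/3) can also be solved by hand from 0.5·π_A = 0.25·π_B and π_A + π_B = 1.

```python
# Illustrative two-state chain (an assumption, not the slides' example).
d = {"A": {"A": 0.5, "B": 0.5},
     "B": {"A": 0.25, "B": 0.75}}
states = ["A", "B"]

# Power iteration: start uniform, apply pi_v = sum_u pi_u * d(u, v).
pi = {s: 1.0 / len(states) for s in states}
for _ in range(200):
    pi = {v: sum(pi[u] * d[u].get(v, 0.0) for u in states) for v in states}
```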

  10. Example of a Stationary Distribution
[Diagram: the same five-state chain annotated with its stationary distribution π_A = 1/4, π_B = 1/4, π_C = π_D = π_E = 1/6. The flow across the cut is balanced: 1/6(1/2) + 1/6(1/2) + 1/6(1/2) = 1/4(1/3) + 1/4(1/3) + 1/4(1/3).]

  11. Poisson Process
• Consider a stochastic process {N(t) | t ≥ 0} with N(0) = 0, where N(t) counts the number of occurrences of an event, so N(t) ∈ ℕ.
• N(t) = n means that there have been n occurrences by time t. If N(t) = n now, the process can only move to state n + 1, which happens as soon as the next event takes place.
• Let T_n denote the time the process spends in state n.
• Poisson process: the T_n are independent and exponentially distributed with the same mean, say 1/θ, for all n.

  12. Poisson Process - Equivalent Definitions
• Poisson process 1: all T_n ∼ exp(θ), and all the T_n's are independent.
• Poisson process 2: N(t) ∼ Poisson(θt), i.e. Pr[N(t) = k] = ((θt)^k / k!) e^{−θt}, and the numbers of events in non-overlapping intervals are independent.
• Poisson process 3: N(0) = 0, and for any k, as t → 0:
  Pr[N(k + t) = N(k)] = 1 − θt + o(t),
  Pr[N(k + t) = 1 + N(k)] = θt + o(t),
  Pr[N(k + t) ≥ 2 + N(k)] = o(t),
  and increments in non-overlapping intervals are independent.
• (Recall that f(x) = o(g(x)) means lim_{x→0} f(x)/g(x) = 0.)
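The equivalence of definitions 1 and 2 can be probed empirically: generate the process from independent exp(θ) holding times and check that the count N(t) has mean close to θt, as definition 2 predicts. The rate, horizon, and seed below are illustrative assumptions.

```python
import random

theta, t = 3.0, 2.0

def count_events(rng):
    # Definition 1: sum independent exp(theta) gaps until time t is
    # exceeded; the number of completed gaps is N(t).
    clock, n = 0.0, 0
    while True:
        clock += rng.expovariate(theta)
        if clock > t:
            return n
        n += 1

rng = random.Random(0)
samples = [count_events(rng) for _ in range(50_000)]
# Definition 2 says N(t) ~ Poisson(theta * t), so E[N(t)] = theta * t.
mean_count = sum(samples) / len(samples)
```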

  13. Queuing Models
Things we need to decide about our model:
• Job arrival process, job service time.
• Number of processors and their speeds.
• Queue size, dispatch discipline, number of queues.
• Kendall's notation: B/D/n/k/dd, where B = arrival process, D = service time distribution, n = number of processors, k = queue length, dd = dispatch discipline.

  14. The M/M/1 Model
• Single FIFO queue, single processor.
• Jobs arrive at rate λ, so job births form a Poisson process with parameter λ.
• The service time of a job is exponentially distributed with parameter µ, so while the server is busy, job deaths form a Poisson process with parameter µ.

  15. Analysis of M/M/1
• Examine the time scale at small intervals δ, 2δ, 3δ, ....
• Analyze as δ → 0, so omit o(δ) factors.
• Apply the balance property over a cut (U, Ū): Σ_{(u→v) | u∈U, v∈Ū} π_u d(u, v) = Σ_{(v→u) | u∈U, v∈Ū} π_v d(v, u).

  16. Analysis of M/M/1 (part 2)
• Let p be the stationary distribution. For any i, p_i is the probability of having i jobs in the system at the steady state.
• By the balance property, at the steady state: p_j (µδ + o(δ)) = p_{j−1} (λδ + o(δ)).
• Taking δ → 0: p_1 = lim_{δ→0} (λδ + o(δ)) / (µδ + o(δ)) · p_0 = lim_{δ→0} ((λδ + o(δ))/δ) / ((µδ + o(δ))/δ) · p_0 = (λ/µ) p_0.
• In the same way, for every j > 0: p_j = (λ/µ) p_{j−1}.
• Denote the traffic intensity ρ = λ/µ.
• Since p_j = ρ p_{j−1}, we get p_j = ρ^j p_0.
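The recurrence p_j = ρ^j p_0 can be checked numerically against the balance property it came from: across the cut between states j−1 and j, the flows λ·p_{j−1} and µ·p_j must agree. The rates λ = 2, µ = 5 are illustrative assumptions, and p_0 = 1 − ρ is the normalization derived on the next slide.

```python
# Illustrative rates (assumptions, not from the slides).
lam, mu = 2.0, 5.0
rho = lam / mu                    # traffic intensity

p0 = 1 - rho                      # normalization, derived on the next slide
p = [p0 * rho ** j for j in range(50)]

# Flow balance across the cut between states j-1 and j:
# lambda * p_{j-1} = mu * p_j for every j > 0.
balanced = all(abs(lam * p[j - 1] - mu * p[j]) < 1e-12 for j in range(1, 50))
```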

  17. Analysis (part 3)
• Recall Σ_{i=0}^k a^i = (a^{k+1} − 1)/(a − 1); Σ_{i=0}^∞ a^i = 1/(1 − a) and Σ_{i=0}^∞ i a^i = a/(1 − a)² for |a| < 1.
• Now 1 = Σ_{i=0}^∞ p_i. Using p_j = ρ^j p_0 we have 1 = Σ_{i=0}^∞ ρ^i p_0 = p_0 / (1 − ρ), so p_0 = 1 − ρ.
• The expected number of jobs in the system: E[n] = Σ_{i=0}^∞ i (1 − ρ) ρ^i = (1 − ρ) · ρ/(1 − ρ)² = ρ/(1 − ρ).
• The expected number of waiting jobs in the system is E[w] = Σ_{i=0}^∞ i (1 − ρ) ρ^{i+1} = ρ²/(1 − ρ).
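The two closed forms above can be sanity-checked by summing the series directly (truncated after enough terms that the tail is negligible; ρ = 0.75 is an illustrative choice).

```python
rho = 0.75  # illustrative traffic intensity, rho < 1

# E[n] = sum_i i * (1 - rho) * rho^i, closed form rho / (1 - rho).
en_series = sum(i * (1 - rho) * rho ** i for i in range(2000))
en_closed = rho / (1 - rho)

# E[w] = sum_i i * (1 - rho) * rho^(i+1), closed form rho^2 / (1 - rho).
ew_series = sum(i * (1 - rho) * rho ** (i + 1) for i in range(2000))
ew_closed = rho ** 2 / (1 - rho)
```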

  18. Analysis (part 4)
• Response time: r = time waiting + time served.
• Little's law: E[r] = E[n]/λ (exercise).
• So E[r] = ρ / ((1 − ρ) λ) = (λ/µ) / ((1 − λ/µ) λ) = 1 / (µ − λ).
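The result E[r] = 1/(µ − λ) can be checked with a minimal discrete-event simulation of the M/M/1 queue: Poisson arrivals, exponential service, FIFO order. The rates λ = 1, µ = 2 and the seed are illustrative assumptions.

```python
import random

lam, mu = 1.0, 2.0      # illustrative arrival and service rates (lam < mu)
n_jobs = 200_000

rng = random.Random(42)
arrival = 0.0
server_free = 0.0       # time at which the server next becomes idle
total_response = 0.0

for _ in range(n_jobs):
    arrival += rng.expovariate(lam)          # Poisson arrivals: exp(lam) gaps
    start = max(arrival, server_free)        # FIFO: wait if the server is busy
    service = rng.expovariate(mu)            # exp(mu) service time
    server_free = start + service
    total_response += server_free - arrival  # response = waiting + service

er_sim = total_response / n_jobs
er_theory = 1.0 / (mu - lam)
```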

  19. The M/M/1/b Model
• Same as M/M/1, but now the queue has bounded size b.
• When the queue is full, arriving jobs are dropped.
• This makes the model a bit more realistic.

  20. Analysis of M/M/1/b
• As in M/M/1 we have p_j = ρ^j p_0.
• Recall Σ_{i=0}^k a^i = (a^{k+1} − 1)/(a − 1).
• 1 = Σ_{i=0}^b ρ^i p_0 = p_0 (ρ^{b+1} − 1)/(ρ − 1), so p_0 = (1 − ρ)/(1 − ρ^{b+1}).
• Recall Σ_{i=0}^k i a^{i−1} = (k+1)a^k/(a − 1) − (a^{k+1} − 1)/(a − 1)².
• The expected number of jobs: E[n] = Σ_{i=0}^b i · ((1 − ρ)/(1 − ρ^{b+1})) ρ^i = ((1 − ρ)/(1 − ρ^{b+1})) · ρ · ((b+1)ρ^b/(ρ − 1) − (ρ^{b+1} − 1)/(ρ − 1)²) = ρ/(1 − ρ) − (b+1)ρ^{b+1}/(1 − ρ^{b+1}).
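Because the state space is finite, everything above can be checked by direct summation. The sketch below (ρ = 0.8 and b = 10 are illustrative choices) builds the distribution, confirms it normalizes, and compares the direct sum for E[n] with the simplified closed form ρ/(1 − ρ) − (b+1)ρ^{b+1}/(1 − ρ^{b+1}).

```python
rho, b = 0.8, 10  # illustrative traffic intensity and queue bound

# p_j = rho^j * p_0 for j = 0..b, with p_0 = (1 - rho) / (1 - rho^(b+1)).
p0 = (1 - rho) / (1 - rho ** (b + 1))
p = [p0 * rho ** j for j in range(b + 1)]

# Expected number of jobs: direct finite sum vs. the closed form.
en_sum = sum(j * pj for j, pj in enumerate(p))
en_closed = rho / (1 - rho) - (b + 1) * rho ** (b + 1) / (1 - rho ** (b + 1))
```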

  21. Poisson Process - Equivalent Definitions (restated)
• Poisson process 1: all T_n ∼ exp(θ), and all the T_n's are independent.
• Poisson process 2: N(t) ∼ Poisson(θt), i.e. Pr[N(t) = k] = ((θt)^k / k!) e^{−θt}, and the numbers of events in non-overlapping intervals are independent.
• Poisson process 3: N(0) = 0, and for any k, as t → 0:
  Pr[N(k + t) = N(k)] = 1 − θt + o(t),
  Pr[N(k + t) = 1 + N(k)] = θt + o(t),
  Pr[N(k + t) ≥ 2 + N(k)] = o(t),
  and increments in non-overlapping intervals are independent.
• (Recall that f(x) = o(g(x)) means lim_{x→0} f(x)/g(x) = 0.)
