
Monte Carlo Analysis of Uncertain Digital Circuits



  1. Monte Carlo Analysis of Uncertain Digital Circuits Houssain Kettani, Ph.D. Department of Computer Science, Jackson State University, Jackson, MS houssain.kettani@jsums.edu http://www.jsums.edu/~houssain.kettani September 2004

  2. General Setup • Consider the following digital network: [block diagram: inputs x1, x2, …, xn feed a digital network whose output is f(xn, xn-1, …, x1)]

  3. Assumptions • The inputs xi’s and the output f are binary variables taking the values 0 and 1. • The xi’s are independent Bernoulli random variables with P(xi = 1) = E[xi] = pi.

  4. Mission • Let P = P(f(xn, xn-1, …, x1) = 1) = E[f(xn, xn-1, …, x1)] • Questions: • Given a logic function, f(xn, xn-1, . . . , x1), with known probabilities pi’s, what can we say about the probability P? • How can we address the problem of maximizing or minimizing P?

  5. Motivating Example • Consider the following simple digital circuit: [circuit diagram: inputs x1 and x2 feed an AND gate whose output is f(x2, x1) = x2.x1] • P = p2p1
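As a quick illustration (mine, not from the slides), here is a minimal Monte Carlo sketch that estimates P for this AND-gate example by sampling independent Bernoulli inputs and compares the estimate with the exact value p2p1. The probability values used below are illustrative.

```python
# Minimal Monte Carlo sketch for the AND-gate example: estimate
# P = P(f(x2, x1) = 1) with xi ~ Bernoulli(pi), then compare to p2*p1.
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_and(p1, p2, n_samples=100_000):
    """Estimate P(x2 AND x1 = 1) by sampling the inputs."""
    x1 = rng.random(n_samples) < p1      # Bernoulli(p1) samples (as booleans)
    x2 = rng.random(n_samples) < p2      # Bernoulli(p2) samples (as booleans)
    return (x2 & x1).mean()

p1, p2 = 0.6, 0.3
print("Monte Carlo estimate:", monte_carlo_and(p1, p2))   # close to 0.18
print("Exact value p2*p1:   ", p2 * p1)                   # ≈ 0.18
```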

  6. Theorem 1 Let f(xn, xn-1, . . . , x1) be a binary function of n independent binary random variables with P(xj = 1) = pj. Let I be the set of minterm indices for which f(xn, xn-1, . . . , x1) is 1. Then P = Σi∈I Πj=1..n pj^bj(i) (1 − pj)^(1−bj(i)), where bj(i) is the value of xj in minterm i (x1 being the least significant bit of the index i).
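A sketch of the minterm-sum formula of Theorem 1 as stated above; the function name prob_of_one, the bit convention (x1 is the least significant bit of the minterm index), and the test values are illustrative.

```python
def prob_of_one(minterm_indices, p):
    """Exact P(f = 1) for independent Bernoulli inputs.

    p[j] = P(x_{j+1} = 1); bit j of a minterm index is the value of x_{j+1}.
    """
    n = len(p)
    total = 0.0
    for i in minterm_indices:
        term = 1.0
        for j in range(n):
            bit = (i >> j) & 1                    # value of x_{j+1} in minterm i
            term *= p[j] if bit else (1.0 - p[j])
        total += term
    return total

# AND-gate check: f(x2, x1) = x2.x1 has the single minterm 3 (binary 11).
print(prob_of_one({3}, [0.6, 0.3]))               # 0.18 = p2 * p1
```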

  7. Stochastic Optimization • Suppose that the probabilities pi can be picked from intervals Ii = [pi−, pi+]. • Consequently, the tuple (p1, p2, . . . , pn) can be picked from the hypercube I = I1 × I2 × . . . × In. • Then, what values should the probabilities pi be set to in order to maximize or minimize P?

  8. Essential Variables • A binary variable xk is said to be essential if there do not exist admissible values of the (n−1) remaining probabilities pj ∈ Ij, j ≠ k, that make the probability P independent of pk ∈ Ik. • If xk is essential, then the partial derivative ∂P / ∂pk is non-zero over I. • Hence, since ∂P / ∂pk is continuous and I is connected, if the variable xk is essential, the partial derivative ∂P / ∂pk has one sign over I.

  9. Essential Variables (Cont.) • Let us denote this invariant sign by sk = sign(∂P / ∂pk). • Hence, sk is constant over I, having the value sk = −1 or sk = 1.
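Because P is multilinear (Theorem 1), ∂P/∂pk is multilinear in the remaining pj’s and attains its extremes at the corners of the box I, so essentiality and the sign sk can be decided by inspecting ∂P/∂pk at those corners. The sketch below does this; partial_sign and its arguments are illustrative names, not from the slides.

```python
from itertools import product

def partial_sign(P, k, intervals, eps=1e-12):
    """Return -1 or +1 if dP/dp_k has a constant non-zero sign over the box
    defined by `intervals` (i.e. x_k is essential); return None otherwise."""
    n = len(intervals)
    others = [j for j in range(n) if j != k]
    signs = set()
    for corner in product(*(intervals[j] for j in others)):
        p = [0.0] * n
        for j, v in zip(others, corner):
            p[j] = v
        lo, hi = intervals[k]
        p[k] = lo
        f_lo = P(p)
        p[k] = hi
        f_hi = P(p)
        d = (f_hi - f_lo) / (hi - lo)   # exact dP/dp_k, since P is linear in p_k
        if abs(d) < eps:
            return None                 # derivative vanishes: x_k not essential
        signs.add(1 if d > 0 else -1)
    return signs.pop() if len(signs) == 1 else None

# Example: an AND gate, P = p1*p2, with p1 in [0.4, 0.6] and p2 in [0.1, 0.5]
P_and = lambda p: p[0] * p[1]
print(partial_sign(P_and, 0, [(0.4, 0.6), (0.1, 0.5)]))   # +1: x1 is essential
```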

  10. Theorem 2 Let P be a function of some pj’s. Then, • For the case of maximizing P, if the variable xk is essential, then pick pk = pk− when sk = −1, and pick pk = pk+ when sk = 1. • For the case of minimizing P, if the variable xk is essential, then pick pk = pk+ when sk = −1, and pick pk = pk− when sk = 1. • If xk is not essential, then for either case pick pk = pk− or pk = pk+.
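A sketch of the selection rule in Theorem 2, assuming the signs sk are already known (for instance via partial_sign above); signs[k] is +1 or −1 for an essential variable and None otherwise, in which case both endpoints are tried as the theorem prescribes. Names are illustrative, not from the slides.

```python
from itertools import product

def optimize_P(P, intervals, signs, maximize=True):
    """Return (optimal value of P, tuple of probabilities achieving it)."""
    choices = []
    for (lo, hi), s in zip(intervals, signs):
        if s is None:                   # not essential: try both endpoints
            choices.append((lo, hi))
        elif (s > 0) == maximize:       # s=+1 & maximize, or s=-1 & minimize
            choices.append((hi,))
        else:                           # s=-1 & maximize, or s=+1 & minimize
            choices.append((lo,))
    best = max if maximize else min
    return best(((P(list(c)), c) for c in product(*choices)), key=lambda t: t[0])

# Checked against the example that follows: minimizing
# P2 = (1 - p2)*p1 + (1 - p1)*p3, where x1 is not essential.
P2 = lambda p: (1 - p[1]) * p[0] + (1 - p[0]) * p[2]
print(optimize_P(P2, [(0.4, 0.6), (0.1, 0.5), (0.2, 0.8)],
                 [None, -1, +1], maximize=False))   # ≈ (0.32, (0.4, 0.5, 0.2))
```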

  11. Numerical Examples (1/3) • f1(x3, x2, x1) = x3 + x1', and f2(x3, x2, x1) = x2'x1 + x3x1'. • p1 ∈ [0.4, 0.6], p2 ∈ [0.1, 0.5], and p3 ∈ [0.2, 0.8]. • We have I1 = {0, 2, 4, 5, 6, 7}, and I2 = {1, 4, 5, 6}. • Hence, we have from Theorem 1: • P1 = (1 − p3)(1 − p1) + p3, and • P2 = (1 − p2)p1 + (1 − p1)p3.
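A small symbolic check that the minterm sets I1 and I2, fed through Theorem 1, give exactly the closed forms quoted on this slide. Using sympy here is my assumption (any computer-algebra tool would do), and theorem1 is an illustrative name.

```python
from sympy import symbols, expand

p1, p2, p3 = symbols("p1 p2 p3")
p = [p1, p2, p3]

def theorem1(I):
    """Sum of minterm probabilities over I (bit j of a minterm is x_{j+1})."""
    total = 0
    for i in I:
        term = 1
        for j in range(3):
            term *= p[j] if (i >> j) & 1 else 1 - p[j]
        total += term
    return expand(total)

print(theorem1({0, 2, 4, 5, 6, 7}))   # expansion of (1 - p3)*(1 - p1) + p3
print(theorem1({1, 4, 5, 6}))         # expansion of (1 - p2)*p1 + (1 - p1)*p3
```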

  12. Numerical Examples (2/3) Suppose we would like to maximize P1. Then ∂P1/∂p1 = −(1 − p3) < 0 and ∂P1/∂p3 = p1 > 0 over I. • Note that both x1 and x3 are essential, with s1(1) = −1 and s3(1) = 1. • Thus, the maximum P1+ is obtained with p1 = 0.4 and p3 = 0.8. • Consequently, P1+ = 0.92.

  13. Numerical Examples (3/3) Suppose that we would like to minimize P2. Then ∂P2/∂p2 = −p1 < 0, ∂P2/∂p3 = 1 − p1 > 0, and ∂P2/∂p1 = (1 − p2) − p3, which changes sign over I. • Note that both x2 and x3 are essential, with s2(2) = −1 and s3(2) = 1. • Thus, the minimum P2− is obtained with p2 = 0.5 and p3 = 0.2. • However, the variable x1 is not essential. • Thus, we try both values 0.4 and 0.6 for p1. • This results in P2 = 0.32 and P2 = 0.38, respectively. • Consequently, P2− = 0.32.
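As a sanity check (mine, not the authors'), both answers can be confirmed by brute force over the corners of the box I, which suffices because P is multilinear and therefore attains its extremes at corners. The names below are illustrative.

```python
from itertools import product

I = [(0.4, 0.6), (0.1, 0.5), (0.2, 0.8)]          # intervals for p1, p2, p3
P1 = lambda p1, p2, p3: (1 - p3) * (1 - p1) + p3
P2 = lambda p1, p2, p3: (1 - p2) * p1 + (1 - p1) * p3

corners = list(product(*I))
print("max P1 =", max(P1(*c) for c in corners))   # ≈ 0.92
print("min P2 =", min(P2(*c) for c in corners))   # ≈ 0.32
```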

  14. Summary • Considered the case of digital circuits with uncertain input variables. • Presented a probabilistic measure of the output function in terms of the probabilities of the inputs. • The result is a multilinear function, which simplifies the problem of optimizing the probability of the output.

  15. Further Research • What if the input variables are dependent? • What if we consider b-ary logic instead of binary? • What if we broaden the concept of uncertain digital networks to include uncertain logic gates and extend our results to such a case?
