The Stable Marriage Problem Original author: S. C. Tsai (Professor 蔡錫鈞, NCTU) Revised by Chuang-Chieh Lin and Chih-Chieh Hung
His master's diploma. A partner from NCTU ADSL (Advanced Database System Lab) • Chih-Chieh Hung (洪智傑) • Bachelor's degree, Department of Applied Mathematics, National Chung-Hsing University (1996–2000) • Master's degree, Department of Computer Science and Information Engineering, NCTU (2003–2005) • Ph.D. student (first year) in the Department of Computer Science and Information Engineering, NCTU (since 2005) • Will be a Ph.D. candidate.
A partner from NCTU ADSL (Advanced Database System Lab) • Research topics: • Data mining • Mobile data management • Data management on sensor networks • Advisor: • Professor Wen-Chih Peng (彭文志) • Personal status: • The master of the CSexam.Math forum on Jupiter BBS. • Single, but he has a girlfriend now.
Consider a society with n men (denoted by capital letters) and n women (denoted by lower case letters). • A marriage M is a 1-1 correspondence between the men and women. • Each person has a preference list of the members of the opposite sex organized in a decreasing order of desirability.
A marriage is said to be unstable if there exist two married couples X-x and Y-y such that X desires y more than x and y desires X more than Y. • The pair X-y is then said to be "dissatisfied." • A marriage M is called a "stable marriage" if there is no dissatisfied pair.
Assume a monogamous, heterosexual society. • For example, n = 4, with men's lists A: abcd, B: bacd, C: adcb, D: dcab and women's lists a: ABCD, b: DCBA, c: ABCD, d: CDAB. • Consider the marriage M: A-a, B-b, C-c, D-d. • C-d is dissatisfied. Why?
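To make the definition of a dissatisfied pair concrete, here is a minimal Python sketch (the function name and data layout are our own, not from the slides) that scans the example instance above for dissatisfied pairs:

# Preference lists of the example instance (most desirable first).
men = {'A': 'abcd', 'B': 'bacd', 'C': 'adcb', 'D': 'dcab'}
women = {'a': 'ABCD', 'b': 'DCBA', 'c': 'ABCD', 'd': 'CDAB'}
marriage = {'A': 'a', 'B': 'b', 'C': 'c', 'D': 'd'}   # the marriage M: A-a, B-b, C-c, D-d

def dissatisfied_pairs(marriage):
    """All pairs (X, y) where X prefers y to his wife and y prefers X to her husband."""
    husband_of = {w: m for m, w in marriage.items()}
    pairs = []
    for X, x in marriage.items():
        for y, Y in husband_of.items():
            if y == x:
                continue
            X_prefers_y = men[X].index(y) < men[X].index(x)
            y_prefers_X = women[y].index(X) < women[y].index(Y)
            if X_prefers_y and y_prefers_X:
                pairs.append((X, y))
    return pairs

print(dissatisfied_pairs(marriage))   # [('C', 'd')]: C prefers d to c, and d prefers C to D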
Proposal Algorithm: Assume that the men are numbered in some arbitrary order. • The lowest-numbered unmarried man X proposes to the most desirable woman on his list who has not already rejected him; call her x.
The woman x will accept the proposal if she is currently unmarried, or if her current mate Y is less desirable to her than X (Y is jilted and reverts to the unmarried state). • The algorithm repeats this process, terminating when every person has married. • (This algorithm is used by hospitals in North America in the match program that assigns medical graduates to residency positions.)
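A minimal Python sketch of the Proposal Algorithm described above; the data layout and the function name are illustrative choices, not part of the original slides:

def propose_algorithm(men_pref, women_pref):
    """men_pref[m] and women_pref[w] are preference lists, most desirable first."""
    rank = {w: {m: i for i, m in enumerate(prefs)} for w, prefs in women_pref.items()}
    next_choice = {m: 0 for m in men_pref}     # index of the next woman m will propose to
    husband_of = {}
    unmarried = set(men_pref)
    while unmarried:
        X = min(unmarried)                     # lowest-numbered unmarried man
        x = men_pref[X][next_choice[X]]        # best woman who has not yet rejected him
        next_choice[X] += 1
        Y = husband_of.get(x)
        if Y is None:                          # x is unmarried: she accepts
            husband_of[x] = X
            unmarried.remove(X)
        elif rank[x][X] < rank[x][Y]:          # x prefers X: Y is jilted
            husband_of[x] = X
            unmarried.remove(X)
            unmarried.add(Y)
        # otherwise x rejects X and he remains unmarried
    return {m: w for w, m in husband_of.items()}

men_pref = {'A': list('abcd'), 'B': list('bacd'), 'C': list('adcb'), 'D': list('dcab')}
women_pref = {'a': list('ABCD'), 'b': list('DCBA'), 'c': list('ABCD'), 'd': list('CDAB')}
print(propose_algorithm(men_pref, women_pref))   # {'A': 'a', 'B': 'b', 'C': 'd', 'D': 'c'}

On the example instance this returns the stable marriage A-a, B-b, C-d, D-c, repairing the dissatisfied pair C-d of the earlier marriage M.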
Does it always terminate with a stable marriage? • An unmarried man always has at least one woman left to whom he can propose: if all n women had rejected him, all n women would already be married, which is impossible since there are only n − 1 other men. • At each step the proposer eliminates one woman from his list, and the total size of the lists is n^2. Thus the algorithm uses at most n^2 proposals, i.e., it always terminates.
Claim that the final marriage M is stable. • Proof by contradiction: • Let X-y be a dissatisfied pair, where in M they are paired as X-x, Y-y. • Since X prefers y to x, he must have proposed to y before getting married to x.
Since y either rejected X or accepted him only to jilt him later, her mates thereafter (including Y) must be more desirable to her than X. • Therefore y must prefer Y to X, contradicting the assumption that the pair X-y is dissatisfied.
Goal: Perform an average-case analysis of this (deterministic) algorithm. • For this average-case analysis, we assume that the men’s lists are chosen independently and uniformly at random; the women’s lists can be arbitrary but must be fixed in advance.
TP denotes the number of proposals made during the execution of the Proposal Algorithm. The running time is proportional to TP. • But it seems difficult to analyze TP directly.
Principle of Deferred Decisions: • The idea is to assume that the entire set of random choices is not made in advance. • At each step of the process, we fix only the random choices that must be revealed to the algorithm. • We use it to simplify the average-case analysis of the Proposal Algorithm.
Suppose that men do not know their lists to start with. Each time a man has to make a proposal, he picks a random woman from the set of women not already propositioned by him, and proceeds to propose to her. • The only dependency that remains is that the random choice of a woman at any step depends on the set of proposals made so far by the current proposer.
However, we can eliminate the dependency by modifying the algorithm, i.e., a man chooses a woman uniformly at random from the set of all n women, including those to whom he has already proposed. • He forgets the fact that these women have already rejected him. • Call this new version the Amnesiac Algorithm.
Note that a man making a proposal to a woman who has already rejected him will be rejected again. • Thus the output of the Amnesiac Algorithm is exactly the same as that of the original Proposal Algorithm. • The only difference is that the Amnesiac Algorithm makes some wasted proposals.
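A sketch of the Amnesiac Algorithm under this view (hypothetical helper names; women's lists are arbitrary but fixed, while every proposal goes to a woman chosen uniformly at random from all n women):

import random

def amnesiac_algorithm(women_pref, rng=random):
    """Return (number of proposals T_A, marriage) for the amnesiac variant.

    women_pref[w] is woman w's preference list of the men 0..n-1, most desirable first.
    Proposals to women who have already rejected the proposer are simply wasted.
    """
    n = len(women_pref)
    rank = [{m: i for i, m in enumerate(prefs)} for prefs in women_pref]
    husband_of = [None] * n
    unmarried = set(range(n))
    proposals = 0
    while unmarried:
        X = next(iter(unmarried))            # any currently unmarried man
        x = rng.randrange(n)                 # uniform over all n women
        proposals += 1
        Y = husband_of[x]
        if Y is None or rank[x][X] < rank[x][Y]:
            husband_of[x] = X                # x accepts; Y (if any) is jilted
            unmarried.discard(X)
            if Y is not None:
                unmarried.add(Y)
        # otherwise the proposal is wasted
    return proposals, husband_of

n = 16
women_pref = [random.sample(range(n), n) for _ in range(n)]
print(amnesiac_algorithm(women_pref)[0])     # T_A; compare with n*ln(n) ≈ 44 for n = 16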
• Let TA denote the number of proposals made by the Amnesiac Algorithm. • For every m, Pr[TP > m] ≤ Pr[TA > m], i.e., TA stochastically dominates TP.
It therefore suffices to analyze the distribution of TA to obtain an upper bound for TP. • A benefit of analyzing TA is that we need only count the total number of proposals made, without regard to the name of the proposer at each stage. • This is because each proposal is made uniformly and independently to one of the n women.
The algorithm terminates with a stable marriage once all women have received at least one proposal each. • Moreover, bounding the value of TA is a special case of the coupon collector’s problem.
• Theorem ([MR95, page 57]): For any constant c ∈ R and m = n ln n + cn, lim_{n→∞} Pr[TA > m] = 1 − e^{−e^{−c}}. • The Amnesiac Algorithm terminates with a stable marriage once all women have received at least one proposal each.
The Coupon Collector's Problem • Input: n types of coupons. At each trial a coupon type is chosen uniformly at random, and the random choices are mutually independent. • Output: The minimum number of trials required to collect at least one coupon of each type.
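A quick simulation sketch of the coupon collector's problem (the helper name coupons_needed is ours), handy for sanity-checking the analysis that follows:

import random

def coupons_needed(n, rng=random):
    """Number of uniform draws from n coupon types until every type has appeared."""
    seen, trials = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        trials += 1
    return trials

n = 100
samples = [coupons_needed(n) for _ in range(2000)]
print(sum(samples) / len(samples))   # empirically close to n*H_n ≈ 519 for n = 100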
You may regard this problem as the "Hello Kitty Collector's Problem". • Let X be a random variable defined to be the number of trials required to collect at least one of each type of coupon. • Let C1, C2, …, CX denote the sequence of trials, where Ci ∈ {1, …, n} denotes the type of the coupon drawn in the ith trial.
Call the ith trial Ci a success if the type Ci was not drawn in any of the first i − 1 selections. • Clearly, C1 and CX are always successes. • We divide the sequence into epochs, where epoch i begins with the trial following the ith success and ends with the trial on which we obtain the (i+1)st success.
• Define the random variable Xi, for 0 ≤ i ≤ n − 1, to be the number of trials in the ith stage (epoch), so that X = Σ_{i=0}^{n−1} Xi. • Let pi denote the probability of success on any trial of the ith stage. This is the probability of drawing one of the n − i remaining coupon types, and so pi = (n − i)/n. • What kind of probability distribution does Xi possess?
• Recall that Xi is geometrically distributed with parameter pi, so E[Xi] = 1/pi and σ_{Xi}^2 = (1 − pi)/pi^2. (Linearity of expectation and the geometric distribution are very important here.) • Hence E[X] = Σ_{i=0}^{n−1} E[Xi] = Σ_{i=0}^{n−1} n/(n − i) = n Σ_{i=1}^{n} 1/i = n·Hn = n ln n + O(n).
• The Xi's are independent, thus σ_X^2 = Σ_{i=0}^{n−1} σ_{Xi}^2 = Σ_{i=0}^{n−1} (1 − pi)/pi^2 = Σ_{i=0}^{n−1} n·i/(n − i)^2 = n^2 Σ_{i=1}^{n} 1/i^2 − n·Hn ≈ π^2·n^2/6.
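A short numeric check of these two formulas, computing the exact sums from the epoch decomposition and comparing them with the asymptotic values (a sketch; names are illustrative):

import math

def coupon_moments(n):
    """Exact E[X] and Var[X] from the geometric-epoch decomposition above."""
    p = [(n - i) / n for i in range(n)]            # success probability in epoch i
    mean = sum(1 / pi for pi in p)                 # sum of geometric means = n * H_n
    var = sum((1 - pi) / pi ** 2 for pi in p)      # sum of geometric variances
    return mean, var

n = 1000
mean, var = coupon_moments(n)
print(mean, n * math.log(n))               # n*H_n vs. n ln n (they differ by about 0.577*n)
print(var, math.pi ** 2 * n ** 2 / 6)      # Var[X] vs. pi^2 * n^2 / 6 (close for large n)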
Exercise • Use Chebyshev's inequality to find an upper bound on the probability that X > β·n ln n, for a constant β > 1. • Try to prove that Pr[X ≥ β·n ln n] ≤ O(1/ln^2 n).
Remark (Chebyshev's inequality): Let X be a random variable with expectation μ_X and standard deviation σ_X. Then for any t ∈ R^+, Pr[|X − μ_X| ≥ t·σ_X] ≤ 1/t^2, or equivalently, Pr[|X − μ_X| ≥ t] ≤ σ_X^2/t^2.
Our next goal is to derive sharper estimates of the typical value of X. • We will show that the value of X is unlikely to deviate far from its expectation; that is, it is sharply concentrated around its expected value.
• Let E_i^r denote the event that coupon type i is not collected in the first r trials. • For r = β·n ln n with β > 1: Pr[E_i^r] = (1 − 1/n)^r ≤ e^{−r/n} = e^{−β ln n} = n^{−β}. • Thus Pr[X > r] = Pr[∪_{i=1}^{n} E_i^r] ≤ Σ_{i=1}^{n} Pr[E_i^r] ≤ n·n^{−β} = n^{−(β−1)}. It is still only polynomially small.
So that's it? • Is the analysis good enough? • Not yet! • Let's consider the following heuristic argument, which will help to establish some intuition.
Poisson Heuristic: the crucial idea! • Let N_i^r denote the number of times a coupon of type i is chosen during the first r trials. • The event E_i^r is the same as the event {N_i^r = 0}. • N_i^r has the binomial distribution with parameters r and p = 1/n: Pr[N_i^r = x] = C(r, x)·p^x·(1 − p)^{r−x}.
Recall of the Poisson distribution • Let λ be a positive real number and let Y be a non-negative integer random variable. • Y has the Poisson distribution with parameter λ if, for every non-negative integer y, Pr[Y = y] = λ^y·e^{−λ}/y!.
• For p suitably small and r → ∞, the Poisson distribution with λ = r·p is a good approximation to the binomial distribution with parameters r and p. • Approximate N_i^r by the Poisson distribution with parameter λ = r/n, since p = 1/n. • Thus Pr[E_i^r] = Pr[N_i^r = 0] ≈ e^{−λ}·λ^0/0! = e^{−r/n}.
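A tiny sketch comparing the exact probability Pr[N_i^r = 0] = (1 − 1/n)^r with the Poisson approximation e^{−r/n} in the regime r ≈ n ln n (illustrative code, not from the slides):

import math

def zero_probabilities(n, r):
    """Exact Pr[N_i^r = 0] = (1 - 1/n)^r next to the Poisson approximation exp(-r/n)."""
    return (1 - 1 / n) ** r, math.exp(-r / n)

n = 1000
r = int(n * math.log(n))            # the regime r ~ n ln n used in the analysis
exact, approx = zero_probabilities(n, r)
print(exact, approx)                # both about n**(-1); they differ by a factor ~ exp(r / (2*n*n))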
• Claim: the events E_i^r, for 1 ≤ i ≤ n, are almost independent; i.e., for any index set {j_1, …, j_k} not containing i, Pr[E_i^r | ∩_{l=1}^{k} E_{j_l}^r] ≈ e^{−r/n}. • Proof: Pr[E_i^r | ∩_{l=1}^{k} E_{j_l}^r] = Pr[E_i^r ∩ (∩_{l=1}^{k} E_{j_l}^r)] / Pr[∩_{l=1}^{k} E_{j_l}^r] = (1 − (k+1)/n)^r / (1 − k/n)^r ≈ e^{−r(k+1)/n} / e^{−rk/n} = e^{−r/n}.
• Thus Pr[X > r] = Pr[∪_{i=1}^{n} E_i^r] = 1 − Pr[∩_{i=1}^{n} ¬E_i^r] ≈ 1 − (1 − e^{−r/n})^n. • Remark: let m = n(ln n + c), for any constant c. Then Pr[X > m] ≈ 1 − (1 − e^{−m/n})^n = 1 − e^{−e^{−c}}, which tends to 0 for large positive c and to 1 for large negative c.
More explanations for the previous equation: with m = n(ln n + c) we have
1 − (1 − e^{−m/n})^n = 1 − (1 − e^{−(ln n + c)})^n = 1 − (1 − e^{−ln n}·e^{−c})^n = 1 − (1 − e^{−c}/n)^n ≈ 1 − e^{−e^{−c}}.
It is exponentially close to 0 as the value of positive c increases.
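A sketch that compares the heuristic limit 1 − e^{−e^{−c}} with the empirical tail probability Pr[X > n ln n + cn] from simulation (modest n and run counts so it stays quick; all names are our own):

import math, random

def coupons_needed(n, rng=random):
    """Number of uniform draws from n coupon types until every type has appeared."""
    seen, trials = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        trials += 1
    return trials

def tail_estimate(n, c, runs=1000):
    """Empirical Pr[X > n ln n + cn] next to the heuristic limit 1 - exp(-exp(-c))."""
    m = n * math.log(n) + c * n
    hits = sum(coupons_needed(n) > m for _ in range(runs))
    return hits / runs, 1 - math.exp(-math.exp(-c))

for c in (-1.0, 0.0, 1.0, 2.0):
    print(c, tail_estimate(300, c))   # the two numbers get closer as n grows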
The Power of Poisson Heuristic • It gives a quick back-of-the-envelope type estimation of probabilistic quantities, which hopefully provides some insight into the true behavior of those quantities. • Poisson heuristic can help us do the analysis better.
But… • However, this argument is not rigorous, since it only approximates the distribution of N_i^r. • We can convert the previous argument into a rigorous proof using the Boole–Bonferroni inequalities. (The analysis becomes more complex, though.)
Are you ready to be rigorous? • Tighten your seat belt!
Take a break! (Thanks to Professor Huang of the Physics Department for this story.) • The origin of the place name "Tien-Mu" (天母): • It is said that American troops were once stationed in Taipei. One day, passing through a certain area, they asked the local residents: • "Where is it?" • The locals, seeing the foreigners and not understanding a word, kept replying: • "Thia-bo la!" (Taiwanese for "I can't understand you!") • The Americans had a sudden realization, and from then on they called the place "Tien-Mu".
A Rigorous Analysis • Theorem 1: Let X be a random variable defined to be the number of trials for collecting each of the n types of coupons. Then, for any constant c ∈ R and m = n ln n + cn, lim_{n→∞} Pr[X > m] = 1 − e^{−e^{−c}}. • Proof: Let E_i^m denote the event that coupon type i is not collected in the first m trials, and for 1 ≤ k ≤ n let P_k^n = Σ_{1 ≤ i_1 < i_2 < ⋯ < i_k ≤ n} Pr[∩_{j=1}^{k} E_{i_j}^m].
• Remark: E_i^r denotes the event that coupon type i is not collected in the first r trials. • Note that the event {X > m} = ∪_{i=1}^{n} E_i^m. • By the principle of inclusion-exclusion, Pr[∪_{i=1}^{n} E_i^m] = Σ_{k=1}^{n} (−1)^{k+1}·P_k^n. • Let S_k^n = P_1^n − P_2^n + P_3^n − ⋯ + (−1)^{k+1}·P_k^n denote the partial sum formed by the first k terms of this series.
• Boole–Bonferroni inequalities: For arbitrary events Y_1, …, Y_n, let P_k = Σ_{1 ≤ i_1 < ⋯ < i_k ≤ n} Pr[∩_{j=1}^{k} Y_{i_j}]. Then for every odd k, Pr[∪_{i=1}^{n} Y_i] ≤ Σ_{j=1}^{k} (−1)^{j+1}·P_j, and for every even k, Pr[∪_{i=1}^{n} Y_i] ≥ Σ_{j=1}^{k} (−1)^{j+1}·P_j.
Illustration for the Boole–Bonferroni inequalities: for odd k, Pr[∪_{i=1}^{n} Y_i] ≤ S_k, while for even k, Pr[∪_{i=1}^{n} Y_i] ≥ S_k.
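A tiny numerical illustration of the alternating bounds, using n independent events of equal probability so that P_j = C(n, j)·p^j can be computed in closed form (a sketch; not from the slides):

from math import comb

# n independent events, each of probability p, so P_j = C(n, j) * p**j exactly.
n, p = 6, 0.3
union_exact = 1 - (1 - p) ** n

def partial_sum(k):
    """S_k = sum of the first k terms of the inclusion-exclusion series."""
    return sum((-1) ** (j + 1) * comb(n, j) * p ** j for j in range(1, k + 1))

for k in range(1, n + 1):
    bound = ">=" if k % 2 == 1 else "<="
    print(f"S_{k} = {partial_sum(k):.4f}  {bound}  Pr[union] = {union_exact:.4f}")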
• By symmetry, all the k-wise intersections of the events E_i^m are equally likely, i.e., Pr[∩_{j=1}^{k} E_{i_j}^m] = (1 − k/n)^m. • More precisely, P_k^n = C(n, k)·(1 − k/n)^m. • For every positive integer k, define P_k = e^{−ck}/k!. According to Lemma 1, lim_{n→∞} P_k^n = P_k = e^{−ck}/k!.
• Define S_k = Σ_{j=1}^{k} (−1)^{j+1}·P_j = Σ_{j=1}^{k} (−1)^{j+1}·e^{−cj}/j!, the partial sum of the first k terms of the power series expansion of f(c) = 1 − e^{−e^{−c}} = Σ_{j=1}^{∞} (−1)^{j+1}·e^{−cj}/j! (consider g(x) = 1 − e^{−x} with x = e^{−c}). • Hence, for every ε > 0 there exists k* such that for all k > k*, |f(c) − S_k| < ε.
• Remark: we have S_k^n = P_1^n − P_2^n + P_3^n − ⋯ + (−1)^{k+1}·P_k^n and S_k = Σ_{j=1}^{k} (−1)^{j+1}·P_j. • Since lim_{n→∞} P_j^n = P_j for each j, for every ε > 0 and every fixed k we have |S_k^n − S_k| < ε for all sufficiently large n. • Thus, given ε > 0, choose k with 2k > k*; for n large enough, |S_{2k} − f(c)| < ε, |S_{2k+1} − f(c)| < ε, |S_{2k}^n − S_{2k}| < ε, and |S_{2k+1}^n − S_{2k+1}| < ε, which implies |S_{2k}^n − f(c)| < 2ε and |S_{2k+1}^n − f(c)| < 2ε. • By the Boole–Bonferroni inequalities, S_{2k}^n ≤ Pr[X > m] ≤ S_{2k+1}^n, so |Pr[X > m] − f(c)| < 2ε for all large n. Since ε is arbitrary, lim_{n→∞} Pr[X > m] = f(c) = 1 − e^{−e^{−c}}.