\(\newcommand{\P}{\mathbb{P}}\) \(\newcommand{\E}{\mathbb{E}}\) \(\newcommand{\R}{\mathbb{R}}\) \(\newcommand{\N}{\mathbb{N}}\) \(\newcommand{\bs}{\boldsymbol}\) \(\newcommand{\var}{\text{var}}\) \(\newcommand{\cov}{\text{cov}}\) \(\newcommand{\cor}{\text{cor}}\)
  1. Virtual Laboratories
  2. 11. Finite Sampling Models

5. The Matching Problem

Definitions and Notation

The Matching Experiment

The matching experiment is a random experiment that can be formulated in a number of colorful ways: for example, a secretary stuffs letters at random into envelopes, or couples are randomly paired for a dance.

These experiments are clearly equivalent from a mathematical point of view: each corresponds to selecting a random permutation \(\bs{X} = (X_1, X_2, \ldots, X_n)\) of the population \(D_n = \{1, 2, \ldots, n\}\).

Our modeling assumption, of course, is that \(\bs{X}\) is uniformly distributed on the sample space of permutations of \(D_n\). The number of objects \(n\) is the basic parameter of the experiment. We will also consider the case of sampling with replacement from the population \(D_n\), because the analysis is much easier but still provides insight. In this case, \(\bs{X}\) is a sequence of independent random variables, each uniformly distributed over \(D_n\).

Matches

We will say that a match occurs at position \(j\) if \(X_j = j\). Thus, the number of matches is the random variable \(N_n\) defined mathematically by

\[ N_n = \sum_{j=1}^n I_j\]

where \(I_j = \bs{1}(X_j = j)\) is the indicator variable for the event of a match at position \(j\). Our problem is to compute the probability distribution of the number of matches. This is an old and famous problem in probability that was first considered by Pierre Rémond de Montmort; it is sometimes referred to as Montmort's matching problem in his honor.
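The experiment is straightforward to simulate. The sketch below (Python; the helper name `num_matches` is ours, not from the text) draws a uniformly random permutation of \(\{1, \ldots, n\}\) and counts the matches:

```python
import random

def num_matches(perm):
    """Count positions j (1-based) where perm[j-1] == j."""
    return sum(1 for j, x in enumerate(perm, start=1) if x == j)

# One run of the matching experiment with n = 5:
# random.sample of the whole range is a uniformly random permutation.
n = 5
perm = random.sample(range(1, n + 1), n)
matches = num_matches(perm)

# Sanity checks: the identity permutation has n matches;
# a cyclic shift has none.
assert num_matches([1, 2, 3, 4, 5]) == 5
assert num_matches([2, 3, 4, 5, 1]) == 0
```

Repeating this many times and tabulating `matches` gives the empirical density functions discussed below.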

Sampling With Replacement

First let's solve the matching problem in the easy case, when the sampling is with replacement.

\((I_1, I_2, \ldots, I_n)\) is a sequence of \(n\) Bernoulli trials, with success probability \(\frac{1}{n}\).

Proof:

The variables are independent since the sampling is with replacement. Since \(X_j\) is uniformly distributed, \(\P(I_j = 1) = \P(X_j = j) = \frac{1}{n}\).

The number of matches \(N_n\) has the binomial distribution with trial parameter \(n\) and success parameter \(\frac{1}{n}\).

\[ \P(N_n = k) = \binom{n}{k} \left(\frac{1}{n}\right)^k \left(1 - \frac{1}{n}\right)^{n-k}, \quad k \in \{0, 1, \ldots, n\} \]
Proof:

This follows immediately from Exercise 1.

The mean and variance of the number of matches are

  1. \(\E(N_n) = 1\)
  2. \(\var(N_n) = \frac{n-1}{n}\)
Proof:

These results follow from Exercise 2. Recall that the binomial distribution with parameters \(n\) and \(p\) has mean \(n \, p\) and variance \(n \, p (1 - p)\).

The distribution of the number of matches converges to the Poisson distribution with parameter 1 as \(n \to \infty\):

\[ \P(N_n = k) \to \frac{e^{-1}}{k!} \text{ as } n \to \infty \text{ for } k \in \N \]
Proof:

This is a special case of the convergence of the binomial distribution to the Poisson. For a direct proof, note that

\[ \P(N_n = k) = \frac{1}{k!} \frac{n^{(k)}}{n^k} \left(1 - \frac{1}{n}\right)^{n-k} \]

But \(\frac{n^{(k)}}{n^k} \to 1\) as \(n \to \infty\) and \(\left(1 - \frac{1}{n}\right)^{n-k} \to e^{-1}\) as \(n \to \infty\) by a famous limit from calculus.
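The convergence can also be checked numerically; a minimal sketch (the function names are illustrative, not from the text):

```python
from math import comb, exp, factorial

def pmf_with_replacement(n, k):
    """P(N_n = k) when sampling with replacement: binomial(n, 1/n)."""
    return comb(n, k) * (1 / n) ** k * (1 - 1 / n) ** (n - k)

def poisson1(k):
    """Limiting Poisson(1) probability e^{-1} / k!."""
    return exp(-1) / factorial(k)

# The binomial(n, 1/n) probabilities approach e^{-1} / k! as n grows.
for k in range(4):
    assert abs(pmf_with_replacement(10_000, k) - poisson1(k)) < 1e-3
```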

Sampling Without Replacement

Now let's consider the case of real interest, when the sampling is without replacement, so that \(\bs{X}\) is a random permutation of the elements of \(D_n = \{1, 2, \ldots, n\}\).

Counting Permutations with Matches

To find the probability density function of \(N_n\), we need to count the number of permutations of \(D_n\) with a specified number of matches. This will turn out to be easy once we have counted the number of permutations with no matches; these are called derangements of \(D_n\). We will denote the number of permutations of \(D_n\) with exactly \(k\) matches by \(b_n(k) = \#\{N_n = k\}\) for \(k \in \{0, 1, \ldots, n\}\). In particular, \(b_n(0)\) is the number of derangements of \(D_n\).

The number of derangements is

\[ b_n(0) = n! \sum_{j=0}^n \frac{(-1)^j}{j!} \]
Proof:

By the complement rule for counting measure, \(b_n(0) = n! - \#\left(\bigcup_{i=1}^n \{X_i = i\}\right)\). From the inclusion-exclusion formula,

\[ b_n(0) = n! - \sum_{j=1}^n (-1)^{j-1} \sum_{J \subseteq D_n, \; \#(J) = j} \#\{X_i = i \text{ for all } i \in J\} \]

But if \(J \subseteq D_n\) with \(\#(J) = j\) then \(\#\{X_i = i \text{ for all } i \in J\} = (n - j)!\). Finally, the number of subsets \(J\) of \(D_n\) with \(\#(J) = j\) is \(\binom{n}{j}\). Substituting into the displayed equation and simplifying gives the result.

The number of permutations with exactly \(k\) matches is

\[ b_n(k) = \frac{n!}{k!} \sum_{j=0}^{n-k} \frac{(-1)^j}{j!}, \quad k \in \{0, 1, \ldots, n\} \]
Proof:

The following is a two-step procedure that generates all permutations with exactly \(k\) matches: first, select the \(k\) integers that will match; the number of ways of performing this step is \(\binom{n}{k}\). Second, select a permutation of the remaining \(n - k\) integers with no matches; the number of ways of performing this step is \(b_{n-k}(0)\). By the multiplication principle of combinatorics, \(b_n(k) = \binom{n}{k} b_{n-k}(0)\). Using the result in Exercise 5 and simplifying gives the result.
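For small \(n\), the counting formula can be verified by brute force over all \(n!\) permutations; a sketch in Python (helper names are ours):

```python
from itertools import permutations
from math import comb, factorial

def b(n, k):
    """Number of permutations of {1,...,n} with exactly k matches,
    via b_n(k) = C(n, k) * b_{n-k}(0) and the derangement formula."""
    m = n - k
    derangements = sum((-1) ** j * factorial(m) // factorial(j)
                       for j in range(m + 1))
    return comb(n, k) * derangements

# Brute-force check for n = 5: classify all 120 permutations
# by their number of matches.
n = 5
counts = [0] * (n + 1)
for p in permutations(range(1, n + 1)):
    matches = sum(1 for j, x in enumerate(p, 1) if x == j)
    counts[matches] += 1
assert counts == [b(n, k) for k in range(n + 1)]
```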

The Probability Density Function

The probability density function of the number of matches is

\[ \P(N_n = k) = \frac{1}{k!} \sum_{j=0}^{n-k} \frac{(-1)^j}{j!}, \quad k \in \{0, 1, \ldots, n\} \]
Proof:

This follows directly from Exercise 6, since \(\P(N_n = k) = \#\{N_n = k\} / n!\).

In the matching experiment, vary the parameter \(n\) and note the shape and location of the probability density function. For selected values of \(n\), run the simulation 1000 times and note the apparent convergence of empirical density function to the true probability density function.

\(\P(N_n = n - 1) = 0\).

Proof:

A simple probabilistic proof is to note that the event is impossible: if there are \(n - 1\) matches, then the one remaining position must also be a match, so in fact there are \(n\) matches. An algebraic proof can also be constructed from the probability density function in Exercise 7.

The distribution of the number of matches converges to the Poisson distribution with parameter 1 as \(n \to \infty\):

\[ \P(N_n = k) \to \frac{e^{-1}}{k!} \text{ as } n \to \infty, \quad k \in \N \]
Proof:

From the power series for the exponential function,

\[ \sum_{j=0}^{n-k} \frac{(-1)^j}{j!} \to \sum_{j=0}^\infty \frac{(-1)^j}{j!} = e^{-1} \text{ as } n \to \infty \]

So the result follows from the probability density function in Exercise 7.

The convergence is remarkably rapid.

In the matching experiment, increase \(n\) and note how the probability density function stabilizes rapidly. For selected values of \(n\), run the simulation 1000 times and note the apparent convergence of the relative frequency function to the probability density function.
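The speed of the convergence can be checked numerically. In the sketch below, `pdf` implements the density formula above (the name is ours); already at \(n = 10\) the density agrees with the Poisson(1) limit to four decimal places or better for small \(k\):

```python
from math import exp, factorial

def pdf(n, k):
    """P(N_n = k) for a random permutation of {1, ..., n}."""
    return sum((-1) ** j / factorial(j) for j in range(n - k + 1)) / factorial(k)

# Agreement with the Poisson(1) limit is already very close at n = 10.
for k in range(5):
    assert abs(pdf(10, k) - exp(-1) / factorial(k)) < 1e-4

# P(N_n = n - 1) = 0, as shown earlier.
assert pdf(10, 9) == 0.0
```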

Moments

The mean and variance of the number of matches could be computed directly from the distribution. However, it is much better to use the representation in terms of indicator variables. The exchangeable property is an important tool in this section.

\(\E(I_j) = \frac{1}{n}\) for \(j \in \{1, 2, \ldots, n\}\).

Proof:

\(X_j\) is uniformly distributed on \(D_n\) for each \(j\), so \(\P(I_j = 1) = \P(X_j = j) = \frac{1}{n}\).

\(\E(N_n) = 1\) for each \(n\)

Proof:

This follows from Exercise 12 and basic properties of expected value.

Thus, the expected number of matches is 1, regardless of \(n\), just as when the sampling is with replacement.

\(\var(I_j) = \frac{n-1}{n^2}\) for \(j \in \{1, 2, \ldots, n\}\).

Proof:

This follows from \(\P(I_j = 1) = \frac{1}{n}\): since \(I_j\) is an indicator variable, \(\var(I_j) = \frac{1}{n}\left(1 - \frac{1}{n}\right) = \frac{n-1}{n^2}\).

A match in one position would seem to make it more likely that there would be a match in another position. Thus, we might guess that the indicator variables are positively correlated.

For distinct \(j, \; k \in \{1, 2, \ldots, n\}\),

  1. \(\cov(I_j, I_k) = \frac{1}{n^2 (n - 1)}\)
  2. \(\cor(I_j, I_k) = \frac{1}{(n - 1)^2}\)
Proof:

Note that \(I_j I_k\) is the indicator variable of the event of a match in position \(j\) and a match in position \(k\). Hence by the exchangeable property \(\P(I_j I_k = 1) = \P(I_j = 1) \P(I_k = 1 \mid I_j = 1) = \frac{1}{n} \frac{1}{n-1}\). As before, \(\P(I_j = 1) = \P(I_k = 1) = \frac{1}{n}\). The results now follow from standard computational formulas for covariance and correlation.

From Exercise 15, when \(n = 2\), the event that there is a match in position 1 is perfectly correlated with the event that there is a match in position 2. This makes sense, since there will either be 0 matches or 2 matches.

\(\var(N_n) = 1\) for every \(n \in \{2, 3, \ldots\}\).

Proof:

This follows from the previous two exercises and basic properties of covariance. Recall that \(\var(N_n) = \sum_{j=1}^n \sum_{k=1}^n \cov(I_j, I_k)\).
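A Monte Carlo check of these moment results (a sketch; the seed and run count are arbitrary choices of ours):

```python
import random

random.seed(42)  # reproducible runs
N_RUNS, n = 100_000, 10

def matches(perm):
    """Number of fixed points of the permutation."""
    return sum(1 for j, x in enumerate(perm, 1) if x == j)

samples = [matches(random.sample(range(1, n + 1), n)) for _ in range(N_RUNS)]
mean = sum(samples) / N_RUNS
var = sum((s - mean) ** 2 for s in samples) / N_RUNS

# Both should be close to 1 (the standard error here is well under 0.01).
assert abs(mean - 1) < 0.05
assert abs(var - 1) < 0.05
```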

In the matching experiment, vary the parameter \(n\) and note the shape and location of the mean/standard deviation bar. For selected values of the parameter, run the simulation 1000 times and note the apparent convergence of sample mean and standard deviation to the distribution mean and standard deviation.

For distinct \(j, \; k \in \{1, 2, \ldots, n\}\), \(\cov(I_j, I_k) \to 0\) as \(n \to \infty\).

Thus, the event that a match occurs in position \(j\) is nearly independent of the event that a match occurs in position \(k\) if \(n\) is large. For large \(n\), the indicator variables behave nearly like \(n\) Bernoulli trials with success probability \(\frac{1}{n}\), which, of course, is what happens when the sampling is with replacement.

A Recursion Relation

In this subsection, we will give an alternate derivation of the distribution of the number of matches, in a sense by embedding the experiment with parameter \(n\) into the experiment with parameter \(n + 1\).

The probability density function of the number of matches satisfies the following recursion relation and initial condition:

  1. \(\P(N_n = k) = (k + 1) \P(N_{n+1} = k + 1), \quad k \in \{0, 1, \ldots, n\}\).
  2. \(\P(N_1 = 1) = 1\).
Proof:

First, consider the random permutation \((X_1, X_2, \ldots, X_n, X_{n+1})\) of \(D_{n+1}\). Note that \((X_1, X_2, \ldots, X_n)\) is a random permutation of \(D_n\) if and only if \(X_{n+1} = n + 1\), that is, if and only if \(I_{n+1} = 1\). It follows that

\[ \P(N_n = k) = \P(N_{n+1} = k + 1 \mid I_{n+1} = 1), \quad k \in \{0, 1, \ldots, n\} \]

From the definition of conditional probability,

\[ \P(N_n = k) = \P(N_{n+1} = k + 1) \frac{\P(I_{n+1} = 1 \mid N_{n+1} = k + 1)}{\P(I_{n+1} = 1)}, \quad k \in \{0, 1, \ldots, n\} \]

But \(\P(I_{n+1} = 1) = \frac{1}{n+1}\) and \(\P(I_{n+1} = 1 \mid N_{n+1} = k + 1) = \frac{k+1}{n+1}\). Substituting into the last displayed equation gives the recurrence relation. The initial condition is obvious, since if \(n = 1\) we must have one match.

The results of the previous two exercises can be used to obtain the probability density function of \(N_n\) recursively for any \(n\).
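A sketch of this recursive computation in exact rational arithmetic. It runs the recursion in reverse, \(\P(N_{n+1} = k + 1) = \P(N_n = k) / (k + 1)\), with \(\P(N_{n+1} = 0)\) pinned down by normalization (the function name is ours):

```python
from fractions import Fraction

def pdf_recursive(n):
    """pdf of N_n as a list [P(N_n = 0), ..., P(N_n = n)], built from
    P(N_{n+1} = k+1) = P(N_n = k) / (k+1), with P(N_{n+1} = 0)
    determined by the requirement that the probabilities sum to 1."""
    p = [Fraction(0), Fraction(1)]          # n = 1: one certain match
    for _ in range(n - 1):
        q = [p[k] / (k + 1) for k in range(len(p))]
        q.insert(0, 1 - sum(q))             # probability of no matches
        p = q
    return p

# n = 4: agrees with the closed-form pdf 1/k! * sum_{j=0}^{n-k} (-1)^j / j!.
assert pdf_recursive(4) == [Fraction(3, 8), Fraction(1, 3),
                            Fraction(1, 4), Fraction(0), Fraction(1, 24)]
```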

The Probability Generating Function

Next recall that the probability generating function of \(N_n\) is given by

\[ G_n(t) = \E\left(t^{N_n}\right) = \sum_{j=0}^n \P(N_n = j) t^j, \quad t \in \R \]

The family of probability generating functions satisfies the following differential equations and ancillary conditions:

\[ \begin{align} G_{n+1}^\prime(t) & = G_n(t), \quad t \in \R, \; n \in \N_+ \\ G_n(1) & = 1, \quad n \in \N_+ \end{align} \]
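These equations follow from the recursion relation above: differentiating the generating function term by term,

\[ G_{n+1}^\prime(t) = \sum_{k=0}^{n} (k + 1) \P(N_{n+1} = k + 1) t^k = \sum_{k=0}^{n} \P(N_n = k) t^k = G_n(t) \]

and \(G_n(1) = 1\) since the probabilities sum to 1.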

Note also that \(G_1(t) = t\) for \(t \in \R\). Thus, the system of differential equations can be used to compute \(G_n\) for any \(n \in \N_+\).

In particular, for \(t \in \R\),

  1. \(G_2(t) = \frac{1}{2} + \frac{1}{2} t^2\)
  2. \(G_3(t) = \frac{1}{3} + \frac{1}{2} t + \frac{1}{6} t^3\)
  3. \(G_4(t) = \frac{3}{8} + \frac{1}{3} t + \frac{1}{4} t^2 + \frac{1}{24} t^4\)

For \(k, \; n \in \N_+\) with \(k \lt n\),

\[ G_n^{(k)}(t) = G_{n-k}(t), \quad t \in \R \]
Proof:

This follows from Exercise 20.

For \(n \in \N_+\),

\[ \P(N_n = k) = \frac{1}{k!} \P(N_{n-k} = 0), \quad k \in \{0, 1, \ldots, n - 1\} \]
Proof:

This follows from the previous exercise and basic properties of generating functions: \(\P(N_n = k) = \frac{1}{k!} G_n^{(k)}(0) = \frac{1}{k!} G_{n-k}(0) = \frac{1}{k!} \P(N_{n-k} = 0)\).
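The identity is easy to verify against the closed-form density, here in exact rational arithmetic so the comparison needs no tolerance (the function name is ours):

```python
from fractions import Fraction
from math import factorial

def pdf(n, k):
    """P(N_n = k) as an exact rational, from the closed-form density."""
    return sum(Fraction((-1) ** j, factorial(j))
               for j in range(n - k + 1)) / factorial(k)

# P(N_n = k) = P(N_{n-k} = 0) / k! for k in {0, 1, ..., n - 1}.
for n in range(2, 10):
    for k in range(n):
        assert pdf(n, k) == pdf(n - k, 0) / factorial(k)
```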

Examples and Applications

A secretary randomly stuffs 5 letters into 5 envelopes. Find each of the following:

  1. The number of outcomes with exactly \(k\) matches, for each \(k \in \{0, 1, 2, 3, 4, 5\}\).
  2. The probability density function of the number of matches.
  3. The covariance and correlation of a match in one envelope and a match in another envelope.
Answer:
  1. \(k\) 0 1 2 3 4 5
    \(b_5(k)\) 44 45 20 10 0 1
  2. \(k\) 0 1 2 3 4 5
    \(\P(N_5 = k)\) 0.3667 0.3750 0.1667 0.0833 0 0.0083
  3. Covariance: \(\frac{1}{100}\), correlation \(\frac{1}{16}\)
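The covariance and correlation in part (c) can be confirmed by direct enumeration over all \(5! = 120\) equally likely permutations; a sketch:

```python
from itertools import permutations

# Indicators of a match in position 1 and a match in position 2,
# for every permutation of {1, ..., 5}.
n = 5
vals = [(int(p[0] == 1), int(p[1] == 2)) for p in permutations(range(1, n + 1))]
m = len(vals)
e1 = sum(a for a, _ in vals) / m          # E(I_1) = 1/5
e2 = sum(b for _, b in vals) / m          # E(I_2) = 1/5
e12 = sum(a * b for a, b in vals) / m     # E(I_1 I_2) = 1/20
cov = e12 - e1 * e2
var1 = e1 - e1 ** 2                       # variance of an indicator
cor = cov / var1                          # both indicators have equal variance
assert abs(cov - 1 / 100) < 1e-12
assert abs(cor - 1 / 16) < 1e-12
```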

Ten married couples are randomly paired for a dance. Find each of the following:

  1. The probability density function of the number of matches.
  2. The mean and variance of the number of matches.
  3. The probability of at least 3 matches.
Answer:
  1. \(k\) \(\P(N_{10} = k)\)
    0 0.3678794
    1 0.3678791
    2 0.1839409
    3 0.0613095
    4 0.0153356
    5 0.0030555
    6 0.0005208
    7 0.0000661
    8 0.0000124
    9 0
    10 0.0000003
  2. \(\E(N_{10}) = 1\), \(\var(N_{10}) = 1\)
  3. \(\P(N_{10} \ge 3) = 0.0803\)

In the matching experiment, set \(n = 10\). Run the experiment 1000 times and compare the following for the number of matches:

  1. The true probabilities
  2. The relative frequencies from the simulation
  3. The limiting Poisson probabilities
Answer:
  1. See the probability density function in the previous exercise.
  2. The relative frequencies depend on the particular run of the simulation.
  3. Limiting Poisson probabilities:
    \(k\) \(e^{-1} / k!\)
    0 0.3678794
    1 0.3678794
    2 0.1839397
    3 0.0613132
    4 0.0153283
    5 0.0030657
    6 0.0005109
    7 0.0000730
    8 0.0000091
    9 0.0000010
    10 0.0000001