If you haven’t read my previous post, I would suggest doing so before this. We denote by $\mathcal{P}_\infty$ the set of partitions of $\mathbb{N}$. The thing to keep in mind here is that we want to think of a coalescent process as a history of lineages. Suppose we start with the trivial partition $0_\infty = \{\{1\}, \{2\}, \dots\}$ and think of each block as a member of some population. A coalescent process $(\Pi(t))_{t \ge 0}$ on $\mathcal{P}_\infty$ essentially defines ancestries, in the sense that if $i$ and $j$ belong to the same block of $\Pi(t)$ for some $t \ge 0$, then we think of that block as a common ancestor of $i$ and $j$.

With this in mind, define the operator $\mathrm{Coag} : \mathcal{P}_\infty \times \mathcal{P}_\infty \to \mathcal{P}_\infty$ by

$$\mathrm{Coag}(\pi, \pi')_i = \bigcup_{j \in \pi'_i} \pi_j,$$

where $\pi_1, \pi_2, \dots$ are the blocks of $\pi$, ordered by their least elements.

With some care, we can define the same operator on $\mathcal{P}_n$, the partitions of $[n] = \{1, \dots, n\}$. So for example if $\pi = \{\{1,2\},\{3\},\{4,5\}\}$ and $\pi' = \{\{1,3\},\{2\}\}$, then $\mathrm{Coag}(\pi, \pi') = \{\{1,2,4,5\},\{3\}\}$. The partition $\pi'$ tells us in this case to merge the first and third block and leave the second block alone.
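Since the finite version of $\mathrm{Coag}$ is completely concrete, it is easy to play with on a computer. Here is a small Python sketch (the representation and the function names are my own choices, not from any library) that reproduces the example above:

```python
from itertools import chain

def blocks_sorted(part):
    """Blocks of a partition, each sorted, ordered by their least elements."""
    return sorted((sorted(b) for b in part), key=lambda b: b[0])

def coag(pi, pi_prime):
    """Coag(pi, pi_prime): block i of the result is the union of the blocks
    pi_j of pi for j ranging over the i-th block of pi_prime (1-indexed)."""
    pi = blocks_sorted(pi)
    out = []
    for idx_block in blocks_sorted(pi_prime):
        merged = sorted(chain.from_iterable(
            pi[j - 1] for j in idx_block if j <= len(pi)))
        if merged:
            out.append(merged)
    return blocks_sorted(out)

pi = [{1, 2}, {3}, {4, 5}]
pi_prime = [{1, 3}, {2}]
print(coag(pi, pi_prime))          # [[1, 2, 4, 5], [3]]

identity = [{1}, {2}, {3}]         # enough singletons for pi's three blocks
print(coag(pi, identity) == blocks_sorted(pi))   # True
```

The second print incidentally checks the identity law of the proposition below; associativity can be checked on examples the same way.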

This turns out to be the right tool for describing ancestral lineages, as families should only merge together. Let $\pi_{|[n]}$ be the restriction of $\pi$ to $[n]$ in the obvious way, i.e. if $i, j \in [n]$, then $i \overset{\pi_{|[n]}}{\sim} j$ if and only if $i \overset{\pi}{\sim} j$, and let $0_n$ denote the partition of $[n]$ into singletons. Then we have the following nice proposition, which follows from the observation that $\mathrm{Coag}(\mathrm{Coag}(\pi, \pi'), \pi'') = \mathrm{Coag}(\pi, \mathrm{Coag}(\pi', \pi''))$.

**Proposition:**

*The space $\mathcal{P}_\infty$ with the binary operation $\mathrm{Coag}$ defined on it is a monoid with the identity $0_\infty$.*

With this in mind, a natural question to ask is whether the operator $\mathrm{Coag}$ preserves exchangeability. Recall that a random partition $\pi$ is exchangeable if for each permutation $\sigma$ of $\mathbb{N}$ that fixes all but finitely many points we have that $\pi$ has the same distribution as $\sigma(\pi)$, defined by $i \overset{\sigma(\pi)}{\sim} j$ if and only if $\sigma(i) \overset{\pi}{\sim} \sigma(j)$.

**Proposition:**

*Let $\pi$ and $\pi'$ be two independent exchangeable random partitions; then $\mathrm{Coag}(\pi, \pi')$ is exchangeable.*

**Proof:**

Let $\sigma$ be a permutation and $i, j \in \mathbb{N}$. Notice that $i \overset{\sigma(\mathrm{Coag}(\pi, \pi'))}{\sim} j$ if and only if $\sigma(i) \overset{\mathrm{Coag}(\pi, \pi')}{\sim} \sigma(j)$, which happens whenever $\sigma(i) \overset{\pi}{\sim} \sigma(j)$ or $a \overset{\pi'}{\sim} b$, where $\sigma(i) \in \pi_a$ and $\sigma(j) \in \pi_b$. Consider the cases one by one; firstly we have that the event $\{\sigma(i) \overset{\pi}{\sim} \sigma(j)\}$ has the same probability as the event $\{i \overset{\pi}{\sim} j\}$ by exchangeability of $\pi$. Second, if $\sigma(i) \overset{\pi}{\nsim} \sigma(j)$, then $a \ne b$, where again $\sigma(i) \in \pi_a$ and $\sigma(j) \in \pi_b$. Let $\sigma'$ be the random permutation such that $\sigma(\pi)_k = \sigma^{-1}(\pi_{\sigma'(k)})$, i.e. $\sigma'$ records how $\sigma$ relabels the blocks of $\pi$, which will be independent of $\pi'$; then we have that $a \overset{\pi'}{\sim} b$ if and only if $\sigma'(a') \overset{\pi'}{\sim} \sigma'(b')$, where $i \in \sigma(\pi)_{a'}$ and $j \in \sigma(\pi)_{b'}$, which has the same probability as $\{a' \overset{\pi'}{\sim} b'\}$ by independence and exchangeability.

Thus putting this all together we have that the event $\{i \overset{\sigma(\mathrm{Coag}(\pi, \pi'))}{\sim} j\}$ has the same probability as $\{i \overset{\mathrm{Coag}(\pi, \pi')}{\sim} j\}$; since the same computation applies jointly to any finite collection of pairs, this proves exchangeability.
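The proposition is also easy to witness numerically. Below is a Monte Carlo sketch of my own (the Chinese restaurant process is just a convenient source of exchangeable random partitions, and the label-tuple encoding is an implementation choice): we coagulate two independent CRP partitions of $[4]$, hit the result with the transposition $\sigma = (1\,2)$, and compare the two empirical distributions.

```python
import random
from collections import Counter

def canon(labels):
    """Relabel a partition (as a label tuple) by order of first appearance,
    so block indices are ordered by their least elements."""
    seen = {}
    return tuple(seen.setdefault(l, len(seen)) for l in labels)

def crp(n):
    """Chinese restaurant process with theta = 1: a standard way to produce
    an exchangeable random partition of {1,...,n}."""
    labels = []
    for i in range(n):
        r = random.uniform(0, i + 1.0)
        acc, pick = 0.0, None
        for lab in range(len(set(labels))):
            acc += labels.count(lab)
            if r < acc:
                pick = lab
                break
        labels.append(pick if pick is not None else len(set(labels)))
    return canon(labels)

def coag(pi, rho):
    """Coag in label form: i ~ j iff the pi-blocks of i and j (indexed by
    least elements) fall in the same rho-block."""
    c = canon(pi)
    return canon(tuple(rho[ci] for ci in c))

def act(sigma, pi):
    """sigma(pi): i ~ j iff sigma(i) ~ sigma(j)."""
    return canon(tuple(pi[sigma[i]] for i in range(len(pi))))

random.seed(4)
n, N, sigma = 4, 20000, (1, 0, 2, 3)     # sigma is the transposition (1 2)
plain, permuted = Counter(), Counter()
for _ in range(N):
    plain[coag(crp(n), crp(n))] += 1
    permuted[act(sigma, coag(crp(n), crp(n)))] += 1
gap = max(abs(plain[p] - permuted[p]) / N for p in set(plain) | set(permuted))
print(gap)   # small: the two empirical distributions agree
```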

Let us now endow this space with a metric in preparation to introduce some Feller processes:

$$d(\pi, \pi') = \frac{1}{\max\{n \ge 1 : \pi_{|[n]} = \pi'_{|[n]}\}},$$

with the convention $1/\infty = 0$. The choice of this metric is due to the paintbox correspondence of exchangeable partitions (as described here). It is then an easy task to verify that $\mathrm{Coag} : \mathcal{P}_\infty \times \mathcal{P}_\infty \to \mathcal{P}_\infty$ is Lipschitz.
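The Lipschitz claim can be sanity-checked numerically. In the sketch below (my own: partitions are encoded as label tuples and the metric is truncated at a finite $N$), we verify on random finite partitions that $d(\mathrm{Coag}(\pi, \rho), \mathrm{Coag}(\pi, \rho')) \le d(\rho, \rho')$, and similarly in the first coordinate.

```python
import random

def canon(labels):
    """Relabel by order of first appearance: block indices then correspond
    to the ordering of blocks by their least elements."""
    seen = {}
    return tuple(seen.setdefault(l, len(seen)) for l in labels)

def coag(pi, rho):
    """Coag in label form: i ~ j iff the pi-blocks of i and j fall in the
    same rho-block."""
    c = canon(pi)
    return canon(tuple(rho[ci] for ci in c))

def d(a, b):
    """d = 1 / max{n : restrictions to {1,...,n} agree}; returns 0 if the
    truncations agree everywhere (a finite-N stand-in for the real metric)."""
    N = len(a)
    for n in range(N):
        for j in range(n):
            if (a[n] == a[j]) != (b[n] == b[j]):
                return 1.0 / n   # agree on {1,...,n}, differ on {1,...,n+1}
    return 0.0

random.seed(2)
N, violations = 12, 0
for _ in range(500):
    p1, p2, r1, r2 = (tuple(random.randrange(4) for _ in range(N))
                      for _ in range(4))
    if d(coag(p1, r1), coag(p1, r2)) > d(r1, r2) + 1e-12:
        violations += 1
    if d(coag(p1, r1), coag(p2, r1)) > d(p1, p2) + 1e-12:
        violations += 1
print(violations)   # 0: Coag is 1-Lipschitz in each coordinate here
```

The reason this works is that the restriction of $\mathrm{Coag}(\pi, \pi')$ to $[n]$ only depends on $\pi_{|[n]}$ and $\pi'_{|[n]}$, since the block of $\pi$ containing $i \le n$ has index at most $i$.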

We arrive now at a natural definition of a homogeneous exchangeable coalescent. This will be nothing but a Lévy process on the monoid we have.

**Definition:**

A $\mathcal{P}_\infty$-valued process $\Pi = (\Pi(t))_{t \ge 0}$ is called an **exchangeable coalescent** if for each $t, s \ge 0$ the distribution of $\Pi(t+s)$, conditioned on $\Pi(t) = \pi$, is given by $\mathrm{Coag}(\pi, \tilde\pi_s)$ where the law of $\tilde\pi_s$ only depends on $s$ and $\tilde\pi_s$ is exchangeable.

The fact that these processes carry the Feller property is immediate from the properties of $\mathrm{Coag}$, and thus we obtain the strong Markov property.

A famous example of such a process is the so-called **Kingman’s coalescent**, in which every pair of blocks merges at rate 1. Without going into too much detail about the construction of the Kingman’s coalescent, one interesting aspect of this process is that it **comes down from infinity**, that is to say, the number of blocks $\#\Pi(t)$ is finite almost surely for all $t > 0$. To see why this is true we just need to look at

$$T = \sum_{n=2}^{\infty} \binom{n}{2}^{-1} e_n,$$

where $e_n$ are i.i.d. standard exponential random variables, so that $\binom{n}{2}^{-1} e_n$ is distributed as the holding time while there are $n$ blocks. But now as the sum converges almost surely (its expectation is $\sum_{n \ge 2} \frac{2}{n(n-1)} = 2$), the tails $\sum_{m \ge n} \binom{m}{2}^{-1} e_m$ vanish, and hence we must have that $\#\Pi(t) < \infty$ as soon as $t > 0$.
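This heuristic is easy to test numerically. A minimal sketch of my own (truncating the sum at 2000 blocks, which is not part of any rigorous construction): the time to go from 2000 blocks down to one has expectation $2(1 - 1/2000) \approx 2$.

```python
import random

def time_to_one_block(nmax):
    """Total time for Kingman's coalescent to go from nmax blocks to 1:
    the holding time with n blocks is exponential with rate n*(n-1)/2."""
    t = 0.0
    for n in range(2, nmax + 1):
        t += random.expovariate(n * (n - 1) / 2)
    return t

random.seed(0)
samples = [time_to_one_block(2000) for _ in range(1000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))   # close to 2, even though we started from 2000 blocks
```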

Before we look more into this phenomenon, let us first categorize the coalescents via their generators. Without loss of generality we will henceforth assume that $\Pi(0) = 0_\infty$, and for $\pi \in \mathcal{P}_n \setminus \{0_n\}$, where $n \ge 2$, define

$$q_\pi = \lim_{t \downarrow 0} \frac{1}{t} \, P\big(\Pi(t)_{|[n]} = \pi\big).$$

Denote by $B_\pi = \{\pi' \in \mathcal{P}_\infty : \pi'_{|[n]} = \pi\}$, where $\pi \in \mathcal{P}_n$. There is a natural way to associate these jump rates with a certain measure $\mu$ on $\mathcal{P}_\infty$ which satisfies the following:

1. $\mu(B_\pi) = q_\pi$ for all $n \ge 2$ and each $\pi \in \mathcal{P}_n \setminus \{0_n\}$;
2. consequently, $\mu(\{0_\infty\}) = 0$ and $\mu(\{\pi' : \pi'_{|[n]} \ne 0_n\}) < \infty$ for every $n$;
3. $\mu$ is invariant under the action of permutations.

Indeed the converse is also true, that is, if we have a measure $\mu$ defined on $\mathcal{P}_\infty$ such that 2. and 3. above hold, then there exists an exchangeable coalescent whose jump rates are given by $q_\pi = \mu(B_\pi)$. The construction of such a process is very similar to the construction of Lévy processes from the jump measure by using a Poisson point process. Though interesting, we leave this aside and take the statement at face value.

As a concrete example, consider the Kingman’s coalescent described above. Recall that in the Kingman’s coalescent any two blocks merge at rate 1, so that if we let $\kappa_{i,j}$ be the partition which is all singletons except that it contains the block $\{i, j\}$, then the rate measure of the Kingman’s coalescent is given by $\mu = \sum_{1 \le i < j} \delta_{\kappa_{i,j}}$.
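This rate description can be checked on the finite coalescent directly. In the sketch below (my own), with $n = 5$ singleton blocks and a rate-1 exponential clock on each pair, the first merger time should be exponential with rate $\binom{5}{2} = 10$, and the merging pair should be uniform among the 10 pairs.

```python
import random
from itertools import combinations

def first_merger(n):
    """With n blocks and each pair merging at rate 1, the first merger is
    the minimum of C(n,2) independent Exp(1) clocks, hence Exp(C(n,2))."""
    clocks = {p: random.expovariate(1.0) for p in combinations(range(n), 2)}
    pair = min(clocks, key=clocks.get)
    return clocks[pair], pair

random.seed(1)
n, trials = 5, 4000
total, hits = 0.0, {p: 0 for p in combinations(range(n), 2)}
for _ in range(trials):
    t, pair = first_merger(n)
    total += t
    hits[pair] += 1
mean_time = total / trials
print(round(mean_time, 3))   # roughly 1 / C(5,2) = 0.1
```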

For the reader who is not comfortable with jump measures of Markov processes and/or is confused about the discourse above, not to worry. You do not need to understand all of that abstract nonsense; what $\mu$ describes is the rate at which each type of transition happens in an infinitesimal amount of time.

There are nice decomposition results for jump measures of coalescent processes given in the Bertoin book (see below). We will be looking at the phenomenon of coming down from infinity, that is, the process having finitely many blocks a.s. at any non-zero time. Of course there is a trivial case that we should disregard, which is the case when $\mu(\{\pi \in \mathcal{P}_\infty : \#\pi < \infty\}) > 0$. This is the case when we have positive probability of going from the state $0_\infty$ to a partition with finitely many blocks in a single jump. In this case it is easy to deduce that eventually we will have finitely many blocks (in fact, one block).

With that aside and a little adaptation of the Schweinsberg paper, I present to you a pretty cute result.

**Theorem:**

*Suppose that $\mu(\{\pi \in \mathcal{P}_\infty : \#\pi < \infty\}) = 0$; then either $\#\Pi(t) < \infty$ a.s. for all $t > 0$, or $\#\Pi(t) = \infty$ a.s. for all $t > 0$.*

Before we prove the theorem, let us see a lemma that will aid us with the main ideas of the proof. A warning I should give here is that the block-counting process $(\#\Pi(t))_{t \ge 0}$ need not be Markovian, let alone strong Markovian. As far as I am aware, this question still remains open.

We would like to first show that regardless of where you start your process, the time you come down from infinity is the same, provided your starting point has infinitely many blocks. For this notice that $(\mathrm{Coag}(\pi, \Pi(t)))_{t \ge 0}$ is the same (in distribution) as the coalescent started from $\pi$, where $\Pi$ is started from $0_\infty$.

**Lemma:**

*For $\pi \in \mathcal{P}_\infty$ let $T_\pi = \inf\{t \ge 0 : \#\mathrm{Coag}(\pi, \Pi(t)) < \infty\}$; then for all $\pi, \pi'$ with $\#\pi = \#\pi' = \infty$ we have $T_\pi = T_{\pi'} = T_{0_\infty}$.*

**Proof:**

Notice that from the definition $\#\mathrm{Coag}(\pi, \pi') < \infty$ if and only if $\min(\#\pi, \#\pi') < \infty$ (just check what happens when one of them is finite). With this observation we see that in the case when $\#\pi = \infty$, $\#\mathrm{Coag}(\pi, \Pi(t)) < \infty$ if and only if $\#\Pi(t) < \infty$, which is exactly the statement $T_\pi = T_{0_\infty}$.

**Proof of Theorem:**

Let $T = \inf\{t \ge 0 : \#\Pi(t) < \infty\}$ and $g(t) = P(T > t)$. Suppose that $P(T = \infty) > 0$; then by the above and the Markov property we have that $g(t+s) = g(t)g(s)$ for all $t$ and $s$. So now by the recurrence relation we have that $0 < P(T = \infty) \le g(nt) = g(t)^n$, but $n$ is arbitrary, so it must be that $g(t) = 1$ for all $t$, i.e. $\#\Pi(t) = \infty$ a.s. for all $t \ge 0$.

Next suppose that $P(T = \infty) = 0$; we will prove that $T = 0$ a.s. Suppose for contradiction that $P(T > 0) > 0$. There are some cases to consider, which we list and treat in order.

*Case 1: $\#\Pi(T) = \infty$.*

This follows from the strong Markov property and the lemma we have proved above. If $\#\Pi(T) = \infty$ and $T' = \inf\{s \ge 0 : \#\Pi(T+s) < \infty\}$, then on the one hand $T' = 0$ a.s. (because $T$ is an infimum), and on the other hand $T' \overset{d}{=} T$ with $P(T > 0) > 0$, which is an obvious contradiction.

*Case 2: $\#\Pi(T-) = \infty$ and $\#\Pi(T) < \infty$.*

This is ruled out by the fact that we have declared $\mu(\{\pi : \#\pi < \infty\}) = 0$. In particular this implies that we cannot merge all but finitely many blocks together at one merger time, that is, if $\#\Pi(t-) = \infty$, then $\#\Pi(t) = \infty$.

*Case 3: $\#\Pi(T-) < \infty$ while $\#\Pi(t) = \infty$ for all $t < T$.*

Suppose that infinitely many mergers accumulate just before $T$ in this fashion, and let $\tau_1 = 0$ and $i_1 = 1$. Then define recursively

$$\tau_k = \inf\{t > \tau_{k-1} : i_k \overset{\Pi(t)}{\sim} i_j \text{ for some } j < k\} \quad \text{and} \quad i_k = \min\{i : i \overset{\Pi(\tau_{k-1})}{\nsim} i_j \text{ for all } j < k\}.$$

In plain English, $\tau_k$ is the first time that the smallest integer not in any of the blocks of $\Pi(\tau_{k-1})$ containing $i_1, \dots, i_{k-1}$ joins a block containing some $i_j$.

From this description we can see that $\tau_k \le T$ for all $k$ (since only finitely many blocks remain at time $T$, the tracked integers must keep coalescing with one another before $T$), and so it is enough now to show that $\sup_k \tau_k = \infty$ a.s. for a contradiction. Notice that if $M$ denotes the total number of mergers involving some of the tracked blocks, then the condition $\#\Pi(T-) < \infty$ directly implies that $M = \infty$. But now we have an upper bound on the rate of these mergers: $i_k$ joins a block containing one of $i_1, \dots, i_{k-1}$ at rate at most $(k-1)\lambda$, where $\lambda = \mu(\{\pi : 1 \overset{\pi}{\sim} 2\}) < \infty$, so that $\tau_k - \tau_{k-1}$ stochastically dominates $e_k$, where $e_k$ are independent exponential random variables with parameter $(k-1)\lambda$. Now the claim that $\sup_k \tau_k = \infty$ a.s. directly follows from

$$\sum_{k=2}^{\infty} e_k = \infty \quad \text{almost surely},$$

which holds because the means $\sum_k ((k-1)\lambda)^{-1}$ diverge while the variances $\sum_k ((k-1)\lambda)^{-2}$ are summable.
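The divergence of this exponential series is easy to witness numerically (a sketch of mine with the arbitrary normalization $\lambda = 1$): the partial sums have means $\sum_{k \le n} 1/k \approx \log n$, so they creep off to infinity logarithmically.

```python
import random

# Partial sums of independent exponentials with parameters 1, 2, 3, ...;
# the means sum to the harmonic series, so the sums drift to infinity,
# just very slowly.
random.seed(3)
s, checkpoints = 0.0, {}
for k in range(1, 10**5 + 1):
    s += random.expovariate(k)      # e_k with parameter k (lambda = 1)
    if k in (10, 10**3, 10**5):
        checkpoints[k] = s
print({k: round(v, 2) for k, v in checkpoints.items()})  # grows like log k
```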

There are interesting results on the so-called $\Lambda$-coalescents, of which I have given a list below. The $\Lambda$-coalescents form a subclass of the coalescents that I was talking about, with the restriction that at each merger time the participating blocks can only merge into a single block.

Further Reading:

- Jean Bertoin, Random Fragmentation and Coagulation Processes
- Nathanael Berestycki, Recent Progress in Coalescent Theory
- Jason Schweinsberg, Coalescents with Simultaneous Multiple Collisions
- Jim Pitman, Coalescents with Multiple Collisions
- Jason Schweinsberg, A Necessary and Sufficient Condition for the $\Lambda$-coalescent to Come Down from Infinity
