Introduction

In this post we will be looking at probabilistic coupling, which means constructing two random variables (or processes) on the same probability space. This turns out to be a surprisingly powerful tool in probability, and this post will hopefully brainwash you into agreeing with me.

Consider first this baby problem. You and I flip coins, and X_n, Y_n denote my and your n-th coin flips respectively. Suppose that I have probability p_1 of landing heads and you have probability p_2 \geq p_1 of landing heads. Let

T_Z:=\inf\{n\geq 3: Z_nZ_{n-1}Z_{n-2}=HHH\}

be the first time Z sees three heads in a row. Now obviously you know that \mathbb{E}[T_Y]\leq \mathbb{E}[T_X], since your coin is more likely to land heads, but can you prove it?

Of course it is possible to compute both expectations and compare them directly, but this is rather messy and long. Instead this will serve as a baby example of coupling.

So we want to put X_n, Y_n on the same probability space in such a way that whenever X_n lands heads, so does Y_n, for each n\in \mathbb{N}. Let U_1,U_2,\dots be i.i.d. uniforms on [0,1] and for each n \in \mathbb{N} declare \tilde X_n heads if U_n \leq p_1 and tails otherwise, and \tilde Y_n heads if U_n \leq p_2 and tails otherwise.

What is apparent here is that whenever \tilde X_n is heads, \tilde Y_n is heads too, as we use the same source of “randomness”. Moreover the marginals are right: \tilde X_n has the law of X_n and \tilde Y_n has the law of Y_n, and hence

\mathbb{E}[T_Y]=\mathbb{E}[T_{\tilde Y}]\leq\mathbb{E}[T_{\tilde X}]=\mathbb{E}[T_X].
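To make the construction concrete, here is a small simulation sketch in Python (the probabilities 0.4 and 0.6 and the helper name coupled_flip_times are my own illustrative choices, not part of the argument above): both sequences are driven by the same uniforms U_n, so on every single run \tilde Y completes HHH no later than \tilde X, and averaging the coupled runs gives the ordering of the expectations.

```python
import random

def coupled_flip_times(p1, p2, rng):
    """Run one coupled realisation of both coin sequences from shared uniforms
    and return (T_X, T_Y), the first times each sees three heads in a row."""
    assert p1 <= p2
    t_x = t_y = None
    run_x = run_y = 0          # current streaks of heads
    n = 0
    while t_x is None or t_y is None:
        n += 1
        u = rng.random()       # the shared source of randomness U_n
        x_heads = u <= p1      # tilde X_n
        y_heads = u <= p2      # tilde Y_n is heads whenever tilde X_n is
        run_x = run_x + 1 if x_heads else 0
        run_y = run_y + 1 if y_heads else 0
        if t_x is None and run_x == 3:
            t_x = n
        if t_y is None and run_y == 3:
            t_y = n
    return t_x, t_y

rng = random.Random(0)
samples = [coupled_flip_times(0.4, 0.6, rng) for _ in range(10_000)]
assert all(t_y <= t_x for t_x, t_y in samples)    # pathwise ordering under the coupling
print(sum(t for t, _ in samples) / len(samples),  # empirical E[T_X]
      sum(t for _, t in samples) / len(samples))  # empirical E[T_Y]
```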

Random Walk

I really like this example of coupling because everyone learns it without learning coupling. We prove that, using the transience of the simple symmetric random walk (SSRW) on \mathbb{Z}^3, the SSRW on \mathbb{Z}^d is transient for any d \geq 3. The proof goes by induction on d, so we will only look at the step d=4.

Suppose X=(X_n:n \geq 0) is a SSRW on \mathbb{Z}^4 and let Y_n\in \mathbb{Z}^3 be defined by projecting X_n onto the first three co-ordinates, i.e. Y_n=(X^1_n,X^2_n,X^3_n). This makes Y a lazy random walk on \mathbb{Z}^3; to get rid of the laziness we define T_0=0 and

T_n:=\inf\{m> T_{n-1}: Y_m\neq Y_{T_{n-1}}\}.

It is easy to see that T_n < \infty almost surely and so Z_n:=Y_{T_n} is a random walk on \mathbb{Z}^3.

Now one could make a contradiction argument here and say that if X were recurrent then Z would also be recurrent, but the construction is hidden in that statement. Explicitly, we have constructed a random walk X on \mathbb{Z}^4 and a random walk Z on \mathbb{Z}^3 on the same probability space such that

|Y_n|\leq |X_n|

for every n, where | \cdot | denotes the Euclidean norm and Y simply repeats the values of Z between the times T_n. Hence if |Z_n| \rightarrow \infty as n \rightarrow \infty, which the transience of the SSRW on \mathbb{Z}^3 guarantees almost surely, then |Y_n| \rightarrow \infty too, and therefore so does |X_n|, i.e. X is transient.
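Here is a minimal simulation sketch of this construction in Python (the variable names and the choice of 10,000 steps are mine): generate a SSRW on \mathbb{Z}^4, project onto the first three co-ordinates to get the lazy walk Y, discard the lazy steps to recover Z, and check the norm comparison |Y_n| \leq |X_n| along the path.

```python
import math
import random

rng = random.Random(1)
d, steps = 4, 10_000

# Simple symmetric random walk X on Z^4: pick a coordinate and a sign uniformly.
x = [0] * d
X_path = [tuple(x)]
for _ in range(steps):
    i = rng.randrange(d)
    x[i] += rng.choice((-1, 1))
    X_path.append(tuple(x))

# Y is the projection of X onto the first three coordinates: a lazy walk on Z^3.
Y_path = [p[:3] for p in X_path]

# Z discards the lazy steps of Y, i.e. keeps Y only at the times it actually moves.
Z_path = [Y_path[0]]
for y in Y_path[1:]:
    if y != Z_path[-1]:
        Z_path.append(y)

# The coupling inequality |Y_n| <= |X_n| holds pathwise (projection shrinks norms).
origin3, origin4 = (0, 0, 0), (0, 0, 0, 0)
assert all(math.dist(y, origin3) <= math.dist(p, origin4)
           for y, p in zip(Y_path, X_path))
print(len(Z_path) - 1, "non-lazy steps out of", steps)
```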

Liouville’s Theorem

A while ago I wrote a post (found here) that covers a probabilistic proof of Liouville’s theorem in \mathbb{R}^2. Here we look at \mathbb{R}^d, but the argument can be extended to manifolds as well (see here for an example).

So we suppose that u:\mathbb{R}^d \rightarrow \mathbb{R} is a bounded harmonic function. Liouville’s theorem states that u is constant.

Let B=(B_t:t \geq 0) be a Brownian motion on \mathbb{R}^d and B^x_t:=B_t+x. It is easy to show that

u(x)=\mathbb{E}[u(B_t^x)].
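One quick way to see this, sketched via Itô’s formula (harmonic functions are smooth, so the formula applies): since \Delta u = 0,

u(B^x_t)=u(x)+\int_0^t \nabla u(B^x_s)\cdot dB_s+\frac{1}{2}\int_0^t \Delta u(B^x_s)\,ds=u(x)+\int_0^t \nabla u(B^x_s)\cdot dB_s,

so u(B^x_t) is a bounded local martingale, hence a true martingale, and taking expectations gives the identity above.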

Now we let x\neq y be arbitrary and let H be the hyperplane of points equidistant from x and y, so that reflecting x about H gives y. Define

\tau:=\inf\{t\geq 0: B_t^x \in H\}.

Before we use coupling it is worth convincing yourself that \tau < \infty almost surely. Loosely speaking, this is because the component of B^x along the vector normal to H is a one-dimensional Brownian motion, and one-dimensional Brownian motion hits every level almost surely.

Now we construct a new Brownian motion W=(W_t:t\geq 0) by reflecting B^x about H up to time \tau and setting W_t:=B^x_t afterwards. By the strong Markov property and the symmetry of Brownian motion, W has the same law as B^y, and by construction

W_t = B^x_t \quad \forall t \geq \tau.

But now, since u(y)=\mathbb{E}[u(W_t)] and the two processes agree on the event \{\tau \leq t\},

|u(x)-u(y)| = |\mathbb{E}[u(B^x_t)]-\mathbb{E}[u(W_t)]|\leq 2 ||u||_\infty \mathbb{P}(t < \tau)

which holds for all t \geq 0, where ||u||_\infty:=\sup_x|u(x)|<\infty. Taking t \rightarrow \infty and using the fact that \tau < \infty almost surely, we see that the right hand side tends to zero. Hence u(x)=u(y), and since x and y were arbitrary, u is constant.
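To close, here is a small Euler-discretised simulation sketch of the reflection coupling in Python (the step size, horizon, starting points and run count are my own choices, and the discretisation only approximates the hitting time of H): since only the component normal to H matters, we track one coordinate, mirror the increments of W until B^x hits H and copy them afterwards, and the fraction of runs with \tau > t estimates the probability appearing in the bound.

```python
import random

rng = random.Random(2)
dt, t_max, n_runs = 0.01, 20.0, 1_000
n_steps = int(t_max / dt)
x, y = 1.0, -1.0        # coordinates of x and y along the normal to H; H is {0}

not_coupled = 0
for _ in range(n_runs):
    b, w = x, y          # normal components of B^x and of the mirrored copy W
    coupled = False
    for _ in range(n_steps):
        dB = rng.gauss(0.0, dt ** 0.5)
        b += dB
        if coupled:
            w = b               # after tau the two motions coincide
        else:
            w -= dB             # mirror the increment until B^x hits H ...
            if b <= 0.0:        # ... i.e. until the normal coordinate crosses 0
                coupled = True
                w = b
    if not coupled:
        not_coupled += 1

# Estimate of P(t < tau) at t = t_max; it tends to zero as t_max grows,
# which is exactly what forces |u(x) - u(y)| to vanish in the bound above.
print(not_coupled / n_runs)
```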