Before moving on to scaling a random walk into a Wiener process, we must understand the notion of *convergence* of random variables.

We won't cover the notions of *uniform* and *pointwise* convergence of a sequence of functions here - these are covered in any undergraduate analysis class. We will delve right into *convergence of random variables.*

**Convergence in probability:**

Let $\left\{X_n\right\}$ be a sequence of random variables and $X$ be a random variable. $\left\{X_n\right\}$ is said to *converge in probability* to $X$ if, for every $\epsilon \gt 0$,

$$\lim_{n \rightarrow \infty} P\left(\left| X_n-X \right| \gt \epsilon \right)=0.$$
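As a quick numerical illustration (my own example, not from the post): take $X_n$ to be the mean of $n$ i.i.d. Uniform(0,1) draws, which converges in probability to the constant $X = 1/2$. The sketch below estimates $P(|X_n - 1/2| \gt \epsilon)$ by Monte Carlo and shows it shrinking as $n$ grows; the function name and parameters are illustrative choices.

```python
import random

def prob_deviation(n, eps=0.05, trials=2000, seed=0):
    """Estimate P(|X_n - 1/2| > eps), where X_n is the mean of n uniforms."""
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        x_n = sum(rng.random() for _ in range(n)) / n
        if abs(x_n - 0.5) > eps:
            count += 1
    return count / trials

# The estimated probability of a deviation larger than eps drops with n,
# as convergence in probability requires.
for n in (10, 100, 1000):
    print(n, prob_deviation(n))
```

For $\epsilon = 0.05$ the estimate is sizeable at $n = 10$ but essentially zero by $n = 1000$, matching the definition's requirement that the deviation probability vanish in the limit.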

Another notion of convergence of random variables is *convergence in distribution.*

**Convergence in distribution:**

Given random variables $\{X_1, X_2, ...\}$ with cumulative distribution functions $F_n(x)$ of $X_n$, and a random variable $X$ with cumulative distribution function $F(x)$, we say that $X_n$ *converges in distribution* to $X$ if

$$\lim_{n \rightarrow \infty} F_n(x) = F(x)$$ for every $x \in \mathbb{R}$ at which $F(x)$ is continuous.

Note that only the *distributions* of the variables matter - so the random variables themselves could in fact be defined on different probability spaces.
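A small sketch of this definition (my own example, not from the post): let $X_n$ be uniform on $\{1/n, 2/n, \ldots, 1\}$, so its CDF is $F_n(x) = \lfloor nx \rfloor / n$ on $[0,1]$. This converges pointwise to $F(x) = x$, the CDF of Uniform(0,1), so $X_n$ converges in distribution to Uniform(0,1) - even though each $X_n$ is discrete and the limit is continuous.

```python
import math

def F_n(n, x):
    """CDF of the discrete uniform on {1/n, 2/n, ..., n/n}, for x in [0, 1]."""
    return math.floor(n * x) / n

def F(x):
    """CDF of Uniform(0, 1), evaluated for x in [0, 1]."""
    return x

# At a fixed continuity point x, |F_n(x) - F(x)| <= 1/n, so the gap
# shrinks to zero as n grows.
x = 1 / math.pi
for n in (10, 100, 10000):
    print(n, abs(F_n(n, x) - F(x)))
```

This also illustrates the remark above: the $X_n$ and the limit need not live on a common probability space, since the definition only compares distribution functions.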

These may seem like fairly elementary definitions, but they hold great importance in applications of probability, namely through the *(Weak) Law of Large Numbers* and the *Central Limit Theorem*, which will be covered in the next post.
