==Bivariate Transformation==

Identify supports $\mathcal{A}$ (of $(X, Y)$) and $\mathcal{B}$ (of $(U, V)$) such that the transformation is a 1-to-1 map from $\mathcal{A}$ onto $\mathcal{B}$. If the mapping is not 1-to-1, identify a partition $A_1, \dots, A_k$ of $\mathcal{A}$ such that there is a 1-to-1 map from each $A_i$ onto $\mathcal{B}$. (C&B 162)

$f_{U,V}(u, v) = f_{X,Y}\big(h_1(u, v),\, h_2(u, v)\big)\,|J|$ (C&B 157)

where $J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix}$; for a $2 \times 2$ matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, $|J|$ is its determinant $(ad - bc)$.
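The change-of-variables recipe above can be sanity-checked numerically. The following sketch is my own illustration (the exponential example is not from these notes): for $X, Y$ i.i.d. Exponential(1), the 1-to-1 map $(U, V) = (X + Y,\, X/(X+Y))$ has $|J| = u$, and the method gives $U \sim \text{Gamma}(2, 1)$ and $V \sim \text{Uniform}(0, 1)$, independent.

```python
import random

# Monte Carlo sanity check (illustration, not from the notes):
# X, Y i.i.d. Exponential(1); (U, V) = (X + Y, X / (X + Y)).
# The transformation method predicts U ~ Gamma(2, 1) (mean 2)
# and V ~ Uniform(0, 1) (mean 0.5).
random.seed(0)
n = 200_000
us, vs = [], []
for _ in range(n):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)
    us.append(x + y)
    vs.append(x / (x + y))

mean_u = sum(us) / n  # theory: 2
mean_v = sum(vs) / n  # theory: 0.5
```

With a fixed seed the sample means land very close to the theoretical values.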

If $X$ and $Y$ are independent random variables with known distributions, this method gives the distribution of a transformation such as $U = X + Y$; for example, if $X \sim N(\mu_X, \sigma_X^2)$ and $Y \sim N(\mu_Y, \sigma_Y^2)$ are independent, then $X + Y \sim N(\mu_X + \mu_Y,\, \sigma_X^2 + \sigma_Y^2)$.

Simplify an integral (or sum) into the form of a known distribution's density times a constant, so that the density part integrates to 1.

See Wikipedia

Used to derive the moment-generating function of the Poisson: $M_X(t) = \sum_{x=0}^{\infty} e^{tx} \frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}$.

“can be applied to completely arbitrary distributions (unknown except for mean and variance)”

Used to show that certain terms converge to zero in order to satisfy the conditions of a Central Limit Theorem.

Chebyshev's inequality: given a random variable $X$ with finite mean $\mu$ and variance $\sigma^2$,

$P(|X - \mu| \geq k\sigma) \leq \frac{1}{k^2}$ for any $k > 0$.
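Chebyshev's bound can be checked empirically. A minimal sketch (my own illustration; the choice of Exponential(1) is an assumption, not from the notes): for Exponential(1), $\mu = \sigma = 1$, so the tail probability at $k$ standard deviations should not exceed $1/k^2$.

```python
import random

# Empirical check of Chebyshev's inequality (illustration, not from
# the notes): for X ~ Exponential(1), mu = sigma = 1, and
# P(|X - mu| >= k*sigma) must be at most 1/k^2 for any k > 0.
random.seed(1)
n = 100_000
k = 2.0
mu = sigma = 1.0
hits = sum(1 for _ in range(n)
           if abs(random.expovariate(1.0) - mu) >= k * sigma)
empirical = hits / n   # roughly exp(-3) ~ 0.05 here
bound = 1 / k**2       # 0.25
```

The empirical tail probability is well under the bound, as Chebyshev guarantees for any distribution with finite variance.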

http://en.wikipedia.org/wiki/Central_limit_theorem

http://cnx.org/content/m16958/latest/

The sample mean $\bar{X}_n$ of $n$ i.i.d. samples is approximately distributed $N(\mu, \sigma^2/n)$; formally, $\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \sigma^2)$.

The sum of $n$ samples is approximately distributed $N(n\mu, n\sigma^2)$.
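The $N(\mu, \sigma^2/n)$ approximation for the sample mean can be demonstrated by simulation. A minimal sketch (my own illustration; the Uniform(0, 1) choice is an assumption): $\mu = 0.5$ and $\sigma^2 = 1/12$, so the simulated sample means should have mean near $0.5$ and variance near $(1/12)/n$.

```python
import random
from statistics import mean, pvariance

# CLT illustration (not from the notes): sample means of n = 50 draws
# from Uniform(0, 1), with mu = 0.5 and sigma^2 = 1/12, should be
# approximately N(mu, sigma^2 / n).
random.seed(2)
n, reps = 50, 20_000
xbars = [mean(random.random() for _ in range(n)) for _ in range(reps)]

avg_of_means = mean(xbars)            # theory: mu = 0.5
scaled_var = pvariance(xbars) * n     # theory: sigma^2 = 1/12
```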

The random variables have to be independent, but not necessarily identically distributed.

Lyapunov's condition: with $s_n^2 = \sum_{i=1}^n \sigma_i^2$, there exists some $\delta > 0$ such that

$\lim_{n \to \infty} \frac{1}{s_n^{2+\delta}} \sum_{i=1}^n E\big[|X_i - \mu_i|^{2+\delta}\big] = 0.$

When Lyapunov's condition is satisfied,

$\frac{1}{s_n} \sum_{i=1}^n (X_i - \mu_i) \xrightarrow{d} N(0, 1).$

The random variables do not need to be identically distributed, as long as they satisfy Lindeberg's condition.

Lindeberg's condition: for every $\varepsilon > 0$,

$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{i=1}^n E\big[(X_i - \mu_i)^2 \cdot \mathbf{1}\{|X_i - \mu_i| > \varepsilon s_n\}\big] = 0.$

If Lindeberg's condition is satisfied, then

$\frac{1}{s_n} \sum_{i=1}^n (X_i - \mu_i) \xrightarrow{d} N(0, 1)$

and

$\max_{1 \leq i \leq n} \frac{\sigma_i^2}{s_n^2} \to 0 \text{ as } n \to \infty.$
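A CLT for independent but non-identically distributed variables can be seen in simulation. A minimal sketch (my own example, not from the notes): $X_i \sim \text{Uniform}(-i, i)$ are bounded with $s_n \to \infty$, so Lindeberg's condition holds and the normalized sum should be approximately standard normal.

```python
import random
from statistics import mean, pvariance

# Lindeberg-style CLT illustration (not from the notes): independent
# but NOT identically distributed X_i ~ Uniform(-i, i), where
# Var(X_i) = (2i)^2 / 12.  Z_n = (1/s_n) * sum(X_i) ~ approx N(0, 1).
random.seed(5)
n, reps = 200, 10_000
s2 = sum((2 * i) ** 2 / 12 for i in range(1, n + 1))
sn = s2 ** 0.5
zs = [sum(random.uniform(-i, i) for i in range(1, n + 1)) / sn
      for _ in range(reps)]

z_mean = mean(zs)       # theory: 0
z_var = pvariance(zs)   # theory: 1
```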

If $\mathbf{x} \sim N_N(\boldsymbol\mu, \boldsymbol\Sigma)$ and $\boldsymbol\mu$ are partitioned as follows

$\mathbf{x} = \begin{pmatrix} \mathbf{x}_1 \\ \mathbf{x}_2 \end{pmatrix}$ with sizes $\begin{pmatrix} q \times 1 \\ (N - q) \times 1 \end{pmatrix}$

$\boldsymbol\mu = \begin{pmatrix} \boldsymbol\mu_1 \\ \boldsymbol\mu_2 \end{pmatrix}$ with sizes $\begin{pmatrix} q \times 1 \\ (N - q) \times 1 \end{pmatrix}$, with $\boldsymbol\Sigma = \begin{pmatrix} \boldsymbol\Sigma_{11} & \boldsymbol\Sigma_{12} \\ \boldsymbol\Sigma_{21} & \boldsymbol\Sigma_{22} \end{pmatrix}$ partitioned accordingly,

then the distribution of $\mathbf{x}_1$ conditional on $\mathbf{x}_2 = \mathbf{a}$ is multivariate normal, $(\mathbf{x}_1 \mid \mathbf{x}_2 = \mathbf{a}) \sim N(\bar{\boldsymbol\mu}, \bar{\boldsymbol\Sigma})$,

where

$\bar{\boldsymbol\mu} = \boldsymbol\mu_1 + \boldsymbol\Sigma_{12} \boldsymbol\Sigma_{22}^{-1} (\mathbf{a} - \boldsymbol\mu_2)$

and covariance matrix

$\bar{\boldsymbol\Sigma} = \boldsymbol\Sigma_{11} - \boldsymbol\Sigma_{12} \boldsymbol\Sigma_{22}^{-1} \boldsymbol\Sigma_{21}.$

This matrix is the Schur complement of $\boldsymbol\Sigma_{22}$ in $\boldsymbol\Sigma$. This means that to calculate the conditional covariance matrix, one inverts the overall covariance matrix, drops the rows and columns corresponding to the variables being conditioned upon, and then inverts back to get the conditional covariance matrix. Here $\boldsymbol\Sigma_{22}^{-1}$ is the generalized inverse of $\boldsymbol\Sigma_{22}$.

Note that knowing that $\mathbf{x}_2 = \mathbf{a}$ alters the variance, though the new variance does not depend on the specific value of $\mathbf{a}$. Perhaps more surprisingly, the mean is shifted by $\boldsymbol\Sigma_{12} \boldsymbol\Sigma_{22}^{-1} (\mathbf{a} - \boldsymbol\mu_2)$.

Compare this with the situation of not knowing the value of $\mathbf{x}_2$, in which case $\mathbf{x}_1$ would have distribution $N_q(\boldsymbol\mu_1, \boldsymbol\Sigma_{11})$.

An interesting fact derived in order to prove this result is that the random vectors $\mathbf{x}_2$ and $\mathbf{y} = \mathbf{x}_1 - \boldsymbol\Sigma_{12} \boldsymbol\Sigma_{22}^{-1} \mathbf{x}_2$ are independent.

The matrix $\boldsymbol\Sigma_{12} \boldsymbol\Sigma_{22}^{-1}$ is known as the matrix of regression analysis coefficients.

In the bivariate case where $\mathbf{x}$ is partitioned into $X_1$ and $X_2$, the conditional distribution of $X_1$ given $X_2 = x_2$ is

$X_1 \mid X_2 = x_2 \ \sim\ N\!\left(\mu_1 + \frac{\sigma_1}{\sigma_2} \rho (x_2 - \mu_2),\ (1 - \rho^2)\sigma_1^2\right)$

where $\rho$ is the correlation coefficient between $X_1$ and $X_2$.
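The bivariate shortcut agrees with the general Schur-complement formulas, since the "blocks" are scalars. A minimal numeric sketch (the parameter values are my own illustration, not from the notes):

```python
# Check (illustration, not from the notes) that the general conditional
# MVN formulas reduce to the bivariate shortcut when the blocks are
# scalars.  Parameters chosen arbitrarily for the demonstration.
mu1, mu2 = 1.0, 2.0
s1, s2, rho = 2.0, 3.0, 0.5
a = 4.0

# General formulas with scalar blocks: Sigma_12 = rho*s1*s2, Sigma_22 = s2^2.
cov12 = rho * s1 * s2
cond_mean = mu1 + cov12 / s2**2 * (a - mu2)  # mu1 + S12 S22^-1 (a - mu2)
cond_var = s1**2 - cov12 / s2**2 * cov12     # S11 - S12 S22^-1 S21

# Bivariate shortcut:
cond_mean2 = mu1 + (s1 / s2) * rho * (a - mu2)
cond_var2 = (1 - rho**2) * s1**2
```

Both routes give the same conditional mean and variance, as the algebra ($\rho\sigma_1\sigma_2/\sigma_2^2 = \rho\sigma_1/\sigma_2$) predicts.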

In the case

$\begin{pmatrix} X_1 \\ X_2 \end{pmatrix} \sim N\!\left(\begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}\right)$

the following result holds

$E(X_1 \mid X_2 > z) = \rho\, \frac{\phi(z)}{\Phi(-z)},$

where the final ratio here is called the inverse Mills ratio.
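The inverse Mills ratio result can be verified by simulation. A minimal sketch (my own illustration; $\rho$ and $z$ are arbitrary choices): generate a standard bivariate normal via $X_1 = \rho Z_2 + \sqrt{1-\rho^2}\,Z_1$, condition on $X_2 > z$, and compare the conditional mean with $\rho\,\phi(z)/\Phi(-z)$.

```python
import math
import random

# Numerical check (illustration, not from the notes) of
# E(X1 | X2 > z) = rho * phi(z) / Phi(-z) for a standard bivariate
# normal with correlation rho; phi(z)/Phi(-z) is the inverse Mills ratio.
def phi(z):
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):  # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

random.seed(3)
rho, z = 0.6, 1.0
samples = []
while len(samples) < 50_000:
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x1 = rho * z2 + math.sqrt(1 - rho**2) * z1  # Corr(X1, X2) = rho
    if z2 > z:                                  # condition on X2 > z
        samples.append(x1)

empirical = sum(samples) / len(samples)
theoretical = rho * phi(z) / Phi(-z)
```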

For random variables $X_1, \dots, X_n$ whose sample mean satisfies $\sqrt{n}(\bar{X}_n - \theta) \xrightarrow{d} N(0, \sigma^2)$, and a function $g$ of the random variable:

Use a Taylor-series approximation of $g(\bar{X}_n)$ without calculating the exact distribution of $g(\bar{X}_n)$.

First-order: if $g'(\theta)$ exists and is nonzero,

$\sqrt{n}\big(g(\bar{X}_n) - g(\theta)\big) \xrightarrow{d} N\big(0, \sigma^2 [g'(\theta)]^2\big).$
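The first-order delta method can be checked by simulation. A minimal sketch (my own illustration; the Uniform(0, 1) and $g(x) = x^2$ choices are assumptions): with $\theta = 0.5$, $\sigma^2 = 1/12$, and $g'(\theta) = 2\theta = 1$, the variance of $g(\bar{X}_n)$ should be approximately $\sigma^2 [g'(\theta)]^2 / n$.

```python
import random
from statistics import mean, pvariance

# Delta-method illustration (not from the notes): X_i ~ Uniform(0, 1),
# theta = 0.5, sigma^2 = 1/12, g(x) = x^2 so g'(theta) = 1.  The
# first-order approximation predicts Var(g(Xbar_n)) ~ (1/12)/n.
random.seed(4)
n, reps = 100, 20_000
gvals = [mean(random.random() for _ in range(n)) ** 2 for _ in range(reps)]

empirical_var = pvariance(gvals)
approx_var = (1 / 12) / n   # sigma^2 * g'(theta)^2 / n
```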