Help:Math formulas in the wiki

The purpose of this page is to explain how to create math formulas in the wiki, and to capture examples of potentially useful math formulas.

Math formulas are supported by MediaWiki's Extension:Math, which renders mathematical formulas in a more user-friendly format. This is the same software used by Wikipedia.

As an additional example, some of the links on this page are implemented as interwiki links.

Using math formulas
The formulas are entered using the math codes defined in Help:Displaying a formula; see that page for the full background on how this works. When the page is previewed (or saved), the wiki converts those codes into a rendered image.
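For example, a formula is written between the extension's <math> tags in the wikitext (the formula shown is an arbitrary example):

```
<math>\operatorname{E}[X] = \sum_{i=1}^n p_i x_i</math>
```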

MediaWiki uses a subset of the TeX markup language for displaying math formulas. Compare HTML, the HyperText Markup Language, used for the web.


 * Syntax table: functions, symbols, and special characters; copy and paste from here.

Formula examples
This section provides examples of formulas that may be close to a formula you want to include in your wiki article. If you find one that is close, click Edit next to the section title, copy the formula, then paste it into your wiki article and make the necessary edits.

Expected value
These examples are from Expected Value on Wikipedia.

Discrete random variable, finite case

Suppose random variable X can take value x1 with probability p1, value x2 with probability p2, and so on, up to value xk with probability pk. Then the expectation of this random variable X is defined as

$$ \operatorname{E}[X] = x_1p_1 + x_2p_2 + \ldots + x_kp_k \;. $$

Since all probabilities pi add up to one: p1 + p2 + ... + pk = 1, the expected value can be viewed as the weighted average, with the pi's being the weights:

$$ \operatorname{E}[X] = \frac{x_1p_1 + x_2p_2 + \ldots + x_kp_k}{p_1 + p_2 + \ldots + p_k} \;. $$
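As a quick numerical check of the weighted-sum definition, here is a short Python sketch using a fair six-sided die (x_i = 1..6, each with p_i = 1/6) as the example:

```python
# Expected value of a discrete random variable, finite case:
# E[X] = x1*p1 + x2*p2 + ... + xk*pk
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6  # fair die: all outcomes equally likely

expected = sum(x * p for x, p in zip(values, probs))
print(expected)  # 3.5
```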

Discrete random variable, countable case

Let X be a discrete random variable taking values x1, x2, ... with probabilities p1, p2, ... respectively. Then the expected value of this random variable is the infinite sum

$$ \operatorname{E}[X] = \sum_{i=1}^\infty x_i\, p_i \;. $$
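The infinite sum can be approximated numerically by truncating it. As a hypothetical illustration (not taken from the article above), take x_i = i with p_i = (1/2)^i, a geometric distribution whose series converges to 2:

```python
# Truncated version of E[X] = sum_{i=1}^inf x_i * p_i
# with x_i = i and p_i = (1/2)^i; the exact value is 2.
partial = sum(i * 0.5**i for i in range(1, 60))
print(partial)  # very close to 2.0
```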

Variance
These examples are from Variance on Wikipedia.

Expected value of throwing a six-sided die:
 * $$\frac 16(1+2+3+4+5+6)=3.5.$$

Its expected absolute deviation—the mean of the equally likely absolute deviations from the mean—is
 * $$\frac 16(|1-3.5|+|2-3.5|+|3-3.5|+|4-3.5|+|5-3.5|+|6-3.5|)=\frac 16(2.5+1.5+ 0.5+0.5+1.5+2.5)=1.5.$$

But its expected squared deviation—its variance (the mean of the equally likely squared deviations)—is


 * $$\frac 16 (2.5^2+1.5^2+0.5^2+0.5^2+1.5^2+2.5^2)=17.5/6\approx 2.9.$$
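The three die computations above can be reproduced in a few lines of Python:

```python
# Fair six-sided die: mean, expected absolute deviation, and variance.
faces = [1, 2, 3, 4, 5, 6]
mean = sum(faces) / len(faces)                          # 3.5
mad = sum(abs(x - mean) for x in faces) / len(faces)    # 1.5
var = sum((x - mean) ** 2 for x in faces) / len(faces)  # 17.5/6, about 2.9167
print(mean, mad, var)
```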

If a random variable X has expected value (mean) μ = E[X], then the variance of X is given by:

$$ \operatorname{Var}(X) = \operatorname{E}\left[(X - \mu)^2 \right]. $$

If the random variable X is discrete with probability mass function x1 ↦ p1, ..., xn ↦ pn, then


 * $$\operatorname{Var}(X) = \sum_{i=1}^n p_i\cdot(x_i - \mu)^2$$

where $$\mu$$ is the expected value, i.e.
 * $$\mu = \sum_{i=1}^n p_i\cdot x_i $$.
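A Python sketch of the mass-function form of the variance; the values and probabilities below are made up for illustration:

```python
# Var(X) = sum_i p_i * (x_i - mu)^2, with mu = sum_i p_i * x_i.
# Hypothetical pmf: P(X=0)=0.2, P(X=1)=0.5, P(X=2)=0.3.
xs = [0, 1, 2]
ps = [0.2, 0.5, 0.3]

mu = sum(p * x for x, p in zip(xs, ps))               # 1.1
var = sum(p * (x - mu) ** 2 for x, p in zip(xs, ps))  # 0.49
print(mu, var)
```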

Standard deviation
These examples are from Standard deviation on Wikipedia.

Consider a population consisting of the following eight values:

$$ 2,\ 4,\  4,\  4,\  5,\  5,\  7,\  9 $$

These eight data points have a mean (average) of 5:
 * $$   \frac{2 + 4 + 4 + 4 + 5 + 5 + 7 + 9}{8} = 5  $$

To calculate the population standard deviation, first compute the difference of each data point from the mean, and square the result of each:

$$ \begin{array}{lll} (2-5)^2 = (-3)^2 = 9  &&  (5-5)^2 = 0^2 = 0 \\    (4-5)^2 = (-1)^2 = 1  &&  (5-5)^2 = 0^2 = 0 \\    (4-5)^2 = (-1)^2 = 1  &&  (7-5)^2 = 2^2 = 4 \\    (4-5)^2 = (-1)^2 = 1  &&  (9-5)^2 = 4^2 = 16 \\    \end{array} $$

Next, compute the average of these values and take the square root:

$$ \sqrt{ \frac{(9 + 1 + 1 + 1 + 0 + 0 + 4 + 16)}{8} } = 2 $$
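The same population standard deviation in Python, with the standard library's statistics.pstdev as a cross-check:

```python
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]
mu = sum(data) / len(data)  # mean of the eight values: 5.0

# Average the squared deviations from the mean, then take the square root.
sigma = math.sqrt(sum((x - mu) ** 2 for x in data) / len(data))
print(mu, sigma)  # 5.0 2.0
```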

In the case where X takes random values from a finite data set x1, x2, …, xN, with each value having the same probability, the standard deviation is


 * $$\sigma = \sqrt{\frac{1}{N}\left[(x_1-\mu)^2 + (x_2-\mu)^2 + \cdots + (x_N - \mu)^2\right]}, {\rm \ \ where\ \ } \mu = \frac{1}{N} (x_1 + \cdots + x_N),$$

or, using summation notation,


 * $$\sigma = \sqrt{\frac{1}{N} \sum_{i=1}^N (x_i - \mu)^2}, {\rm \ \ where\ \ } \mu = \frac{1}{N} \sum_{i=1}^N x_i.$$

If, instead of having equal probabilities, the values have different probabilities, let x1 have probability p1, x2 have probability p2, ..., xN have probability pN. In this case, the standard deviation will be
 * $$\sigma = \sqrt{\sum_{i=1}^N p_i(x_i - \mu)^2}, {\rm \ \ where\ \ } \mu = \sum_{i=1}^N p_i x_i.$$

Correlation coefficient
If there are two random variables $$X,Y$$ with means $$\mu_X,\mu_Y$$ and standard deviations $$\sigma_X,\sigma_Y$$, then their correlation coefficient is


 * $$\frac{E(XY)-\mu_X\mu_Y}{\sigma_X\sigma_Y}$$

If there is a linear relation $$Y=aX+b$$, the correlation coefficient is 1 (or -1 if $$a$$ is negative). A coefficient close to 1 or -1 indicates a very strong linear correlation.
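A Python sketch of this formula for equally likely paired outcomes; the data below are hypothetical, chosen to satisfy a linear relation so the coefficient should come out as 1:

```python
# Correlation coefficient: (E(XY) - mu_X*mu_Y) / (sigma_X * sigma_Y)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]  # exact linear relation Y = 2X + 1

n = len(xs)
mu_x = sum(xs) / n
mu_y = sum(ys) / n
sd_x = (sum((x - mu_x) ** 2 for x in xs) / n) ** 0.5
sd_y = (sum((y - mu_y) ** 2 for y in ys) / n) ** 0.5
e_xy = sum(x * y for x, y in zip(xs, ys)) / n

rho = (e_xy - mu_x * mu_y) / (sd_x * sd_y)
print(rho)  # 1 (up to floating-point rounding)
```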

Complementary error function
Demonstrates use of the integral and series summation, from Wikipedia's Error function:

$$ \operatorname{erfc}(x) = \frac{2}{\sqrt{\pi}} \int_x^{\infty} e^{-t^2}\,dt = \frac{e^{-x^2}}{x\sqrt{\pi}}\sum_{n=0}^\infty (-1)^n \frac{(2n)!}{n!(2x)^{2n}} $$
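Python's math module provides erfc directly, which can be checked against a crude numerical approximation of the defining integral (the truncation point x+10 and the step count below are arbitrary choices for illustration):

```python
import math

x = 1.0
direct = math.erfc(x)  # standard-library complementary error function

# Approximate (2/sqrt(pi)) * integral from x to infinity of e^{-t^2} dt
# with a midpoint rule on the truncated interval [x, x+10]; the tail
# beyond x+10 is negligibly small.
n = 200_000
h = 10.0 / n
integral = sum(math.exp(-((x + (i + 0.5) * h) ** 2)) for i in range(n)) * h
approx = 2.0 / math.sqrt(math.pi) * integral
print(direct, approx)
```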

Math symbols (special characters)
Many math symbols are in the editing toolbar under Special characters --> Symbols (or Greek). However, it may be easier to copy and paste the displayed character directly from the table below.

From Help:Displaying a formula: The codes on the left produce the symbols on the right, but the latter can also be put directly in the wikitext, except for '='.