# Help:Math formulas in the wiki

This page explains how to create math formulas in the wiki and collects examples of potentially useful math formulas.

Math formulas are supported by MediaWiki's Extension:Math, which renders mathematical formulas in a more user-friendly format. This is the same software used by Wikipedia.

## Using math formulas

The formulas are entered using the math codes defined in Help:Displaying a formula. When the page is previewed (or saved), the wiki converts those codes into MathML (if your browser supports it) or a graphic image.[note 1]
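For example, a minimal sketch of the wikitext for one of the formulas used later on this page (the `<math>` tag is provided by Extension:Math):

```
<math>\operatorname{Var}(X) = \operatorname{E}\left[ (X - \mu)^2 \right]</math>
```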

To purge a page with math formulas (that is, refresh the server's cache), append `?action=purge&mathpurge=true` to the page's URL.
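For example, for a hypothetical page named Some_page on a hypothetical wiki at example.org, the full URL would look like:

```
https://www.example.org/w/index.php?title=Some_page&action=purge&mathpurge=true
```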

## Formula examples

This section provides examples of formulas that may be close to a formula you want to include in your wiki article. If you find one that is close, click Edit next to the section title, copy the formula, then paste it into your wiki article and make the necessary edits.

### Inline formulas

**Inline fractions**

Fractions can be displayed inline using `\tfrac` (text-style fraction), as here ${\displaystyle {\tfrac {1}{2}}}$, instead of using `\frac`, which displays them like this ${\displaystyle {\frac {1}{2}}}$. Note also that this example illustrates that braces are not required around simple arguments to `\frac`. The `\dfrac` (display-style fraction) version of `\frac` forces normal display size: ${\displaystyle {\dfrac {1}{2}}}$, which in this case generates the same result as `\frac`.
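A sketch of the markup for the three variants (the comments are only for illustration):

```
<math>\tfrac{1}{2}</math>   <!-- text-style (small) fraction -->
<math>\frac{1}{2}</math>    <!-- default fraction -->
<math>\dfrac{1}{2}</math>   <!-- display-style (full-size) fraction -->
<math>\tfrac12</math>       <!-- braces may be omitted around single-character arguments -->
```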

### Multiline formulas

This example shows the use of text within a formula and the alignment of equations.

${\displaystyle {\begin{aligned}{\text{Net worth}}&=({\text{Total Assets}})-({\text{Total Liabilities}})\\\$170,000&=\$397,000-\$227,000\end{aligned}}}$
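The wikitext behind this example looks roughly like this; note the `aligned` environment, the `&` alignment markers, and the escaped dollar signs (`\$`):

```
<math>\begin{aligned}
\text{Net worth} &= (\text{Total Assets}) - (\text{Total Liabilities}) \\
\$170,000        &= \$397,000 - \$227,000
\end{aligned}</math>
```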

The font size of a displayed formula may sometimes be too large to fit the overall context; no size adjustment was used in the Net worth example above.

### Expected value

These examples are from Expected Value on Wikipedia.

**Discrete random variable, finite case**

Suppose random variable X can take value x1 with probability p1, value x2 with probability p2, and so on, up to value xk with probability pk. Then the expectation of this random variable X is defined as

${\displaystyle \operatorname {E} [X]=x_{1}p_{1}+x_{2}p_{2}+\ldots +x_{k}p_{k}\;.}$
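A sketch of the markup for this formula; `\operatorname{E}` produces the upright E and `\ldots` the ellipsis:

```
<math>\operatorname{E}[X] = x_1 p_1 + x_2 p_2 + \ldots + x_k p_k \;.</math>
```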

Since all of the probabilities pi add up to one (p1 + p2 + ... + pk = 1), the expected value can be viewed as the weighted average, with the pi being the weights:

${\displaystyle \operatorname {E} [X]={\frac {x_{1}p_{1}+x_{2}p_{2}+\ldots +x_{k}p_{k}}{p_{1}+p_{2}+\ldots +p_{k}}}\;.}$

**Discrete random variable, countable case**

Let X be a discrete random variable taking values x1, x2, ... with probabilities p1, p2, ... respectively. Then the expected value of this random variable is the infinite sum

${\displaystyle \operatorname {E} [X]=\sum _{i=1}^{\infty }x_{i}\,p_{i},}$
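A sketch of the markup, showing `\sum` with lower and upper limits and `\infty`:

```
<math>\operatorname{E}[X] = \sum_{i=1}^{\infty} x_i \, p_i ,</math>
```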

### Variance

This is used in our wiki Risk and Return article. It applies to historical returns, or a set of returns with each return having equal probability:

${\displaystyle \operatorname {Var} (r)=\sigma ^{2}={\frac {1}{n}}\sum _{i=1}^{n}(r_{i}-\operatorname {E} (r))^{2}}$
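To reuse this formula, the markup is roughly:

```
<math>\operatorname{Var}(r) = \sigma^2 = \frac{1}{n} \sum_{i=1}^{n} (r_i - \operatorname{E}(r))^2</math>
```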

The following examples are from Variance on Wikipedia.

Expected value of throwing a six-sided die:

${\displaystyle {\frac {1}{6}}(1+2+3+4+5+6)=3.5.}$

Its expected absolute deviation—the mean of the equally likely absolute deviations from the mean—is

${\displaystyle {\frac {1}{6}}(|1-3.5|+|2-3.5|+|3-3.5|+|4-3.5|+|5-3.5|+|6-3.5|)={\frac {1}{6}}(2.5+1.5+0.5+0.5+1.5+2.5)=1.5.}$

But its expected squared deviation—its variance (the mean of the equally likely squared deviations)—is

${\displaystyle {\frac {1}{6}}(2.5^{2}+1.5^{2}+0.5^{2}+0.5^{2}+1.5^{2}+2.5^{2})=17.5/6\approx 2.9.}$

If a random variable X has expected value (mean) μ = E[X], then the variance of X is given by:

${\displaystyle \operatorname {Var} (X)=\operatorname {E} \left[(X-\mu )^{2}\right].\,}$

If the random variable X is discrete with probability mass function ${\displaystyle x_{1}\mapsto p_{1},\ \ldots ,\ x_{n}\mapsto p_{n}}$, then

${\displaystyle \operatorname {Var} (X)=\sum _{i=1}^{n}p_{i}\cdot (x_{i}-\mu )^{2}}$

where ${\displaystyle \mu }$ is the expected value, i.e.

${\displaystyle \mu =\sum _{i=1}^{n}p_{i}\cdot x_{i}}$ .

### Standard deviation

These examples are from Standard deviation on Wikipedia.

Consider a population consisting of the following eight values:

${\displaystyle 2,\ 4,\ 4,\ 4,\ 5,\ 5,\ 7,\ 9}$

These eight data points have a mean (average) of 5:

${\displaystyle {\frac {2+4+4+4+5+5+7+9}{8}}=5}$

To calculate the population standard deviation, first compute the difference of each data point from the mean, and square the result of each:

${\displaystyle {\begin{array}{lll}(2-5)^{2}=(-3)^{2}=9&&(5-5)^{2}=0^{2}=0\\(4-5)^{2}=(-1)^{2}=1&&(5-5)^{2}=0^{2}=0\\(4-5)^{2}=(-1)^{2}=1&&(7-5)^{2}=2^{2}=4\\(4-5)^{2}=(-1)^{2}=1&&(9-5)^{2}=4^{2}=16\\\end{array}}}$

Next compute the average of these values, and take the square root:

${\displaystyle {\sqrt {\frac {(9+1+1+1+0+0+4+16)}{8}}}=2}$
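The two-column block of squared differences above was laid out with the `array` environment; a sketch of the first rows of the markup (the remaining rows follow the same pattern):

```
<math>\begin{array}{lll}
(2-5)^2 = (-3)^2 = 9 && (5-5)^2 = 0^2 = 0 \\
(4-5)^2 = (-1)^2 = 1 && (5-5)^2 = 0^2 = 0 \\
\end{array}</math>
```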

In the case where X takes random values from a finite data set x1, x2, …, xN, with each value having the same probability, the standard deviation is

${\displaystyle \sigma ={\sqrt {{\frac {1}{N}}\left[(x_{1}-\mu )^{2}+(x_{2}-\mu )^{2}+\cdots +(x_{N}-\mu )^{2}\right]}},{\rm {\ \ where\ \ }}\mu ={\frac {1}{N}}(x_{1}+\cdots +x_{N}),}$

or, using summation notation,

${\displaystyle \sigma ={\sqrt {{\frac {1}{N}}\sum _{i=1}^{N}(x_{i}-\mu )^{2}}},{\rm {\ \ where\ \ }}\mu ={\frac {1}{N}}\sum _{i=1}^{N}x_{i}.}$
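A sketch of the markup for the summation form, showing `\sqrt` wrapping the whole expression and `{\rm{\ \ where\ \ }}` embedding text in the formula:

```
<math>\sigma = \sqrt{ \frac{1}{N} \sum_{i=1}^{N} (x_i - \mu)^2 },
      {\rm{\ \ where\ \ }} \mu = \frac{1}{N} \sum_{i=1}^{N} x_i .</math>
```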

If, instead of having equal probabilities, the values have different probabilities, let x1 have probability p1, x2 have probability p2, ..., xN have probability pN. In this case, the standard deviation will be

${\displaystyle \sigma ={\sqrt {\sum _{i=1}^{N}p_{i}(x_{i}-\mu )^{2}}},{\rm {\ \ where\ \ }}\mu =\sum _{i=1}^{N}p_{i}x_{i}.}$

### Correlation coefficient

If there are two random variables ${\displaystyle X,Y}$ with means ${\displaystyle \mu _{X},\mu _{Y}}$ and standard deviations ${\displaystyle \sigma _{X},\sigma _{Y}}$, then their correlation coefficient is

${\displaystyle {\frac {E(XY)-\mu _{X}\mu _{Y}}{\sigma _{X}\sigma _{Y}}}}$

If there is a linear relation ${\displaystyle Y=aX+b}$, the correlation coefficient is 1 (or -1 if ${\displaystyle a}$ is negative). If it is close to 1 or -1, the correlation is very strong.
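A quick check of this claim: under ${\displaystyle Y=aX+b}$, we have ${\displaystyle \operatorname {E} (XY)-\mu _{X}\mu _{Y}=a\left(\operatorname {E} (X^{2})-\mu _{X}^{2}\right)=a\sigma _{X}^{2}}$ and ${\displaystyle \sigma _{Y}=|a|\sigma _{X}}$, so (assuming ${\displaystyle a\neq 0}$ and ${\displaystyle \sigma _{X}>0}$) the coefficient reduces to ${\displaystyle a\sigma _{X}^{2}/(|a|\sigma _{X}^{2})=a/|a|=\pm 1}$.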

### Complementary error function

This example demonstrates use of an integral and a series summation; it is from Wikipedia's Error function article:

${\displaystyle \operatorname {erfc} (x)={\frac {2}{\sqrt {\pi }}}\int _{x}^{\infty }e^{-t^{2}}\,dt={\frac {e^{-x^{2}}}{x{\sqrt {\pi }}}}\sum _{n=0}^{\infty }(-1)^{n}{\frac {(2n)!}{n!(2x)^{2n}}}}$
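A sketch of the markup, showing `\int` with limits and the factorial in the series term:

```
<math>\operatorname{erfc}(x) = \frac{2}{\sqrt{\pi}} \int_x^{\infty} e^{-t^2} \, dt
    = \frac{e^{-x^2}}{x \sqrt{\pi}} \sum_{n=0}^{\infty} (-1)^n \frac{(2n)!}{n! (2x)^{2n}}</math>
```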

## Math symbols (special characters)

Many math symbols are in the editing toolbar under Special characters → Symbols (or Greek). However, it may be easier to copy and paste the displayed character directly from the table below.

From Help:Displaying a formula: The codes on the left produce the symbols on the right, but the latter can also be put directly in the wikitext, except for ‘=’.

| Syntax | Rendering |
| --- | --- |
| `&alpha; &beta; &gamma; &delta; &epsilon; &zeta;` | α β γ δ ε ζ |
| `&eta; &theta; &iota; &kappa; &lambda; &mu; &nu;` | η θ ι κ λ μ ν |
| `&xi; &omicron; &pi; &rho; &sigma; &sigmaf;` | ξ ο π ρ σ ς |
| `&tau; &upsilon; &phi; &chi; &psi; &omega;` | τ υ φ χ ψ ω |
| `&Gamma; &Delta; &Theta; &Lambda; &Xi; &Pi;` | Γ Δ Θ Λ Ξ Π |
| `&Sigma; &Phi; &Psi; &Omega;` | Σ Φ Ψ Ω |
| `&int; &sum; &prod; &radic; &minus; &plusmn; &infin;` | ∫ ∑ ∏ √ − ± ∞ |
| `&asymp; &prop; {{=}} &equiv; &ne; &le; &ge;` | ≈ ∝ = ≡ ≠ ≤ ≥ |
| `&times; &middot; &divide; &part; &prime; &Prime;` | × · ÷ ∂ ′ ″ |
| `&nabla; &permil; &deg; &there4; &Oslash; &oslash;` | ∇ ‰ ° ∴ Ø ø |
| `&isin; &notin; &cap; &cup; &sub; &sup; &sube; &supe;` | ∈ ∉ ∩ ∪ ⊂ ⊃ ⊆ ⊇ |
| `&not; &and; &or; &exist; &forall;` | ¬ ∧ ∨ ∃ ∀ |
| `&rArr; &hArr; &rarr; &harr; &uarr;` | ⇒ ⇔ → ↔ ↑ |
| `&alefsym; - &ndash; &mdash;` | ℵ - – — |

## Notes

1. Firefox is the only major browser to support MathML directly, meaning that the math formulas will appear as text on the web page. Other browsers, such as Chrome and Internet Explorer, do not support MathML; in those browsers, the math formulas are converted to a Scalable Vector Graphics (SVG) image file.