
Variance

Definition
Variance is a measure of the dispersion of a set of data points around their mean value. In other words, variance is the mathematical expectation of the squared deviations from the mean: it is computed by finding the probability-weighted average of squared deviations from the expected value. Variance measures volatility, in other words, the variability around an average. It is a measure of risk and can help professionals assess the riskiness of an investment when making financial decisions.

Once an investment’s variance is calculated, it can be compared to that of similar investments to see which is riskier: the higher the variance, the greater the investment’s volatility and, therefore, the greater the risk. A variance of zero indicates that all the values are identical. A short comparison of two hypothetical return series is sketched below.
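
As a minimal sketch, Python’s standard-library statistics module can make that comparison; the return figures here are invented purely for illustration:

    import statistics

    # Hypothetical monthly returns (%) for two investments -- illustrative numbers only
    fund_a = [1.2, 0.8, 1.1, 0.9, 1.0, 1.3]
    fund_b = [3.5, -2.0, 4.1, -1.5, 2.8, -0.9]

    # Population variance of each return series
    var_a = statistics.pvariance(fund_a)
    var_b = statistics.pvariance(fund_b)

    # The series with the higher variance is the more volatile, and riskier, one
    print(f"Fund A variance: {var_a:.4f}")  # small, steady returns -> low variance
    print(f"Fund B variance: {var_b:.4f}")  # large swings -> high variance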

Calculating variance
Individual points in the data set are compared to the mean of the dataset. These differences are then squared in order to make them all positive, and then added together. Finally, this sum is divided by the number of items in the data set (for a sample, it is commonly divided by one less than that number). The steps are sketched below.
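
A minimal step-by-step sketch of that calculation in plain Python, using a made-up data set:

    # Hypothetical data set
    data = [4, 8, 6, 5, 3, 7]

    # Step 1: compare each point to the mean
    mean = sum(data) / len(data)

    # Step 2: square the differences so they are all positive, then add them up
    squared_deviations = [(x - mean) ** 2 for x in data]

    # Step 3: divide the sum by the number of items
    # (population variance; use len(data) - 1 for a sample)
    variance = sum(squared_deviations) / len(data)

    print(f"Mean: {mean:.4f}, Variance: {variance:.4f}")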

Pros of using Variance

- Never gives the false appearance of no variability: because the differences are squared, all the terms are positive and cannot truly add up to zero unless the values are identical
- Great tool to see how individual numbers relate to each other within a data set

Cons of using Variance

- Gives added weight to numbers far from the mean: squaring the deviations can skew the actual interpretation of the data
- Not always easily interpreted, depending on how much variability is observed. Often the square root of its value is taken to get the standard deviation, as it is easier to analyze and report

Standard Deviation
Definition
The standard deviation is also a measure of the dispersion of a dataset relative to its mean, similar to variance. However, it is calculated as the square root of the variance. It is a key financial tool used by finance professionals to measure investment risk. When applying this concept to portfolios and funds, a large dispersion indicates how far the return on the fund deviates from the expected normal returns. Since it is easier to understand, standard deviation is often reported to end users or clients instead of variance; the square-root relationship is sketched below.
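
Continuing the hypothetical data set from earlier, a short sketch of that relationship using Python’s standard library:

    import math
    import statistics

    data = [4, 8, 6, 5, 3, 7]  # same hypothetical data as above

    # Standard deviation is the square root of the variance
    variance = statistics.pvariance(data)
    std_dev = math.sqrt(variance)

    # statistics.pstdev computes the same population standard deviation directly
    assert math.isclose(std_dev, statistics.pstdev(data))

    print(f"Variance: {variance:.4f}, Standard deviation: {std_dev:.4f}")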

Standard Deviation vs Variance

Presentation (picturing) –
Often, when plotting the points in a dataset, variances are not as easy to present. Because the values are squared, the new data points may look skewed compared to the original data points, depending on the volume and size of the data.

Application –
Standard deviation is expressed in the same unit of measurement as the original data, which isn’t always the case with variance. Using the standard deviation, professionals can also run a rough check of whether a dataset is approximately normally distributed: in a normal distribution, about 68% of the data falls within one standard deviation of the mean. That check is sketched below.
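
A minimal sketch of that check, using randomly generated (hypothetical) normal data from Python’s standard library:

    import random
    import statistics

    # Hypothetical sample drawn from a normal distribution (mean 0, std dev 1)
    random.seed(42)
    data = [random.gauss(0, 1) for _ in range(10_000)]

    mean = statistics.mean(data)
    std_dev = statistics.pstdev(data)

    # Fraction of points within one standard deviation of the mean;
    # for normally distributed data this should be close to 68%
    within_one = sum(1 for x in data if abs(x - mean) <= std_dev) / len(data)
    print(f"Within one standard deviation: {within_one:.1%}")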
