There are five main indicators of investment risk that apply to the analysis of stocks, bonds and mutual fund portfolios. They are alpha, beta, r-squared, standard deviation and the Sharpe ratio. These statistical measures are historical predictors of investment risk/volatility and are all major components of modern portfolio theory (MPT). MPT is a standard financial and academic methodology used for assessing the performance of equity, fixed-income and mutual fund investments by comparing them to market benchmarks. All of these risk measurements are intended to help investors determine the risk-reward parameters of their investments. In this article, we'll give a brief explanation of each of these commonly used indicators.
Alpha is a measure of an investment's performance on a risk-adjusted basis. It takes the volatility (price risk) of a security or fund portfolio and compares its risk-adjusted performance to a benchmark index. The excess return of the investment relative to the return of the benchmark index is its "alpha." Simply stated, alpha is often considered to represent the value that a portfolio manager adds or subtracts from a fund portfolio's return. An alpha of 1.0 means the fund has outperformed its benchmark index by 1%. Correspondingly, an alpha of -1.0 would indicate an underperformance of 1%. For investors, the higher the alpha the better.
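One common way to put this idea into numbers is Jensen's alpha, which compares a portfolio's actual return to the return the capital asset pricing model (CAPM) would predict given its beta. The sketch below uses hypothetical figures; the function name and inputs are illustrative, not a standard API.

```python
# Minimal sketch of Jensen's alpha (CAPM form). All figures are
# hypothetical annual returns expressed in percent.
def jensens_alpha(portfolio_return, benchmark_return, risk_free_rate, beta):
    # CAPM-expected return given the portfolio's beta
    expected = risk_free_rate + beta * (benchmark_return - risk_free_rate)
    # Alpha is the return earned above (or below) that expectation
    return portfolio_return - expected

# A fund that returned 12% when CAPM predicted 11% has an alpha of 1.0,
# i.e. roughly 1% of manager-added outperformance.
print(jensens_alpha(12.0, 10.0, 2.0, 1.125))  # → 1.0
```

A positive result suggests the manager added value beyond what the fund's market exposure alone would explain; a negative result suggests the opposite.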
Beta, also known as the "beta coefficient," is a measure of the volatility, or systematic risk, of a security or a portfolio in comparison to the market as a whole. Beta is calculated using regression analysis, and you can think of it as the tendency of an investment's return to respond to movements in the market. By definition, the market has a beta of 1.0. Individual security and portfolio values are measured according to how they deviate from the market.
A beta of 1.0 indicates that the investment's price will move in lockstep with the market. A beta of less than 1.0 indicates that the investment will be less volatile than the market. Correspondingly, a beta of more than 1.0 indicates that the investment's price will be more volatile than the market. For example, if a fund portfolio's beta is 1.2, it's theoretically 20% more volatile than the market.
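The regression mentioned above reduces to a simple formula: beta is the covariance of the investment's returns with the market's returns, divided by the variance of the market's returns. A sketch with hypothetical monthly return data:

```python
# Sketch: estimating beta from historical returns (hypothetical data).
# Beta = Cov(asset, market) / Var(market), which is the slope of the
# regression of the asset's returns on the market's returns.
def beta(asset_returns, market_returns):
    n = len(market_returns)
    mean_a = sum(asset_returns) / n
    mean_m = sum(market_returns) / n
    cov = sum((a - mean_a) * (m - mean_m)
              for a, m in zip(asset_returns, market_returns)) / (n - 1)
    var_m = sum((m - mean_m) ** 2 for m in market_returns) / (n - 1)
    return cov / var_m

# Hypothetical monthly returns (as decimals). The fund tends to amplify
# the market's moves, so we expect a beta above 1.0.
market = [0.010, -0.020, 0.030, 0.015, -0.010]
fund   = [0.012, -0.025, 0.037, 0.018, -0.013]

print(round(beta(fund, market), 2))  # → 1.24
```

A result of about 1.24 would mean this fund has been roughly 24% more volatile than the market over the sample period.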
Conservative investors looking to preserve capital should focus on securities and fund portfolios with low betas, while those investors willing to take on more risk in search of higher returns should look for high beta investments.
Standard deviation measures the dispersion of data from its mean. In plain English, the more widely the data are spread out, the greater the deviation from the average. In finance, standard deviation is applied to the annual rate of return of an investment to measure its volatility (risk). A volatile stock would have a high standard deviation. With mutual funds, the standard deviation tells us how much a fund's return is deviating from the expected return implied by its historical performance.
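To make the dispersion idea concrete, the sketch below computes the sample standard deviation of two hypothetical annual return series: one that clusters near its average and one that swings widely around the same average.

```python
import math

# Sketch: sample standard deviation of annual returns (hypothetical data,
# in percent). Both series average 7.6%, but their dispersion differs.
def std_dev(returns):
    n = len(returns)
    mean = sum(returns) / n
    # Sample variance uses n - 1 in the denominator
    variance = sum((r - mean) ** 2 for r in returns) / (n - 1)
    return math.sqrt(variance)

steady   = [7.0, 8.0, 7.5, 8.5, 7.0]        # clusters near its mean
volatile = [25.0, -12.0, 30.0, -8.0, 3.0]   # widely dispersed

print(round(std_dev(steady), 2))    # → 0.65
print(round(std_dev(volatile), 2))  # → 19.06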
Developed by Nobel laureate economist William F. Sharpe, the Sharpe ratio measures risk-adjusted performance. It is calculated by subtracting the risk-free rate of return (typically proxied by the yield on a U.S. Treasury security) from the rate of return for an investment and dividing the result by the standard deviation of the investment's return. The Sharpe ratio tells investors whether an investment's returns are due to wise investment decisions or the result of excess risk. This measurement is very useful because, while one portfolio or security may generate higher returns than its peers, it is only a good investment if those higher returns do not come with too much additional risk. The greater an investment's Sharpe ratio, the better its risk-adjusted performance.
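The calculation described above can be sketched directly. With hypothetical data, the example below shows how a fund with higher average returns can still score worse on a risk-adjusted basis than a steadier fund:

```python
import math

# Sketch: Sharpe ratio = (mean return - risk-free rate) / standard deviation
# of returns. All figures are hypothetical annual returns in percent.
def sharpe_ratio(returns, risk_free_rate):
    n = len(returns)
    mean = sum(returns) / n
    variance = sum((r - mean) ** 2 for r in returns) / (n - 1)
    return (mean - risk_free_rate) / math.sqrt(variance)

fund_a = [9.0, 11.0, 10.0, 12.0, 8.0]     # steady returns, averaging 10%
fund_b = [25.0, -10.0, 30.0, -5.0, 10.0]  # erratic returns, also averaging 10%
risk_free = 2.0

# Both funds earn the same excess return on average, but Fund A's
# steadier path gives it the far better risk-adjusted score.
print(round(sharpe_ratio(fund_a, risk_free), 2))  # → 5.06
print(round(sharpe_ratio(fund_b, risk_free), 2))  # → 0.45
```

The design point is that the denominator penalizes volatility: identical average excess returns produce very different Sharpe ratios when one path is much bumpier than the other.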