Suggestions for Modern Security Analysts
Economics is the social science that most identifies itself with the natural sciences. There is much that can be written about this statement in light of the events that unfolded in the 2007–2008 credit crisis, but this article focuses on the consequences pertaining to the field of Security Analysis, which is an economics-based discipline.
Security Analysis seeks to value firms based on the goods and services sold to customers via the assets (tangible and intangible) and obligations (liabilities) generated to support those sales. Despite the simplicity of this exposition, and the related simplicity of cash flow-based valuation, assessing value can be extremely difficult. The difficulty stems from the well-known fact that value is subjective, and from the equally well-known fact that the future is uncertain. Subjectivity and uncertainty mean that Security Analysis requires many working assumptions, which is important because modern economics is currently grounded in mathematics that accommodates only a limited number of assumptions. As a purely theoretical, science-like endeavor this may (or may not) work, but Security Analysis is a profession, and professions are concerned with decision-making, in contrast to science, which is concerned with prediction.
There are differences between making predictions and making decisions, some of which are significant. Nevertheless, as a profession, security analysts have, over the past decades, generally come to embrace the theories of modern economics, seemingly without adequate consideration for how to bridge the gap between theory and practice. The consequences of this, most recently experienced in the 2007–2008 credit crisis, have been significant. To help rectify this, the present article provides practical suggestions for modern security analysts to consider in the course of their work.
MODERN SECURITY ANALYSIS
First and foremost, the practice of Security Analysis requires sound professional judgment, which is as important today as it ever was. For example, in 1921, Chicago economist Frank Knight noted that the “capacity for forming correct judgments (in a more or less extended or restricted field) is the principal fact which makes a man serviceable in business; it is the characteristic human activity, the most important endowment for which wages are received.”
The study of history has long been known to provide context and perspective, both of which are critical elements of judgment, and yet financial/economic history courses or programs of study are in decline today, if they are offered at all. A possible reason for this is that science-based curricula seemingly do not have much need for history; whatever the reason(s), modern security analysts should be knowledgeable about financial/economic history. As Benjamin Graham, who helped found the profession of Security Analysis, observed:
"To invest intelligently in securities one should be forearmed with an adequate knowledge of how the various types of bonds and stocks have actually behaved under varying conditions — some of which, at least, one is likely to meet again in one’s own experience. No statement is more true and better applicable to Wall Street than the famous warning of Santayana: 'Those who do not remember the past are condemned to repeat it.'"
Consider the recent credit crisis: many descriptions of it indicate that it was “unprecedented,” but that is not entirely accurate: financial crises, panics and contagions have occurred over time, and for many of the same reasons. This is not to suggest that the magnitude of the recent crisis was not severe; only that similar crises have occurred in the past, and that in recent decades those occurrences have seemed to increase steadily in magnitude. Such a pattern results in “predictable surprises,” not “black swans.” For example, consider the financial crisis that occurred 10 years prior to the most recent one, which resulted in the failure of the first modern firm deemed “too big to fail”: the hedge fund Long-Term Capital Management (LTCM). LTCM’s plight was eloquently described in a popular case history as follows:
"Long-Term fooled itself into thinking it had diversified in substance when, in fact, it had done so only in form. Basically, the fund made the same bet on lower-rated bonds in every imaginable permutation. It is hardly a surprise that when the cycle turned and credit tightened — as, throughout recorded time, it periodically has — Long-Term’s trades fell in lockstep."
Substitute the name “Long-Term” in the above quote with Bear Stearns, Lehman Brothers, AIG, etc., and it is uncanny how closely the causes of loss from the two crises track. That said, knowledge of the LTCM case would not have enabled someone to predict specific events or failures experienced during the recent credit crisis; rather, it would have put the dynamics of the 2004–2007 economy into historical context, which was important for decision-making purposes. Consider, for example, the infamous statement that Citigroup’s then-CEO made to the Financial Times in July of 2007 — right before the onset of the credit crisis — that “As long as the music is playing, you’ve got to get up and dance. We’re still dancing.” Security analysts aware of LTCM’s historic losses 10 years earlier could have forcefully challenged this CEO on both his statement and strategy using the very same case history we cited above, which presciently observed that:
"Investors have a pretty good idea about balance sheet risks [but] they are completely befuddled with regard to derivative risks. Some of the reporting standards are being changed (over the opposition of both [then Federal Reserve Chairman Alan] Greenspan and the banks), but gaping holes remain. As the use of derivatives grows, this deficiency will return to haunt us."
LTCM is not the only example we can cite: beginning with the financial panic of 1907, there has generally been at least one financial crisis every decade or so — from the “great inflation” of the late 1970s to the stock market “crash” of 1987, and from the “Asian Contagion” of 1997–1998 to the most recent credit crisis of 2007–2008. Despite this pattern it seems that the recurring nature of financial crises was generally ignored by mainstream economists and analysts.
Contrary thinking, or doing the opposite of the mainstream, has a long history in both trading and investing. Applied to economics prior to the recent crisis, contrary thinking could have generated prescient Security Analysis-related insights such as the following:
- Markets not only are not efficient, at times they can seem mindfully inefficient
- Capital structure not only matters, during times of crisis it is one of the very few things that does matter
- Asset and option pricing models not only understate the possibility of extreme events, they can actually be counter-productive before, during and immediately after such events
For a humorous example of contrary thinking, consider a season five episode of the popular sitcom Seinfeld. In this episode, one of the show’s characters laments how every decision he ever made was not only wrong, but resulted in an outcome directly opposite of what he wanted. To rectify this, he decided to “do the opposite” of whatever his instincts told him, which humorously resulted in a beautiful girlfriend and getting a high-profile job with the New York Yankees. In practice, contrary thinking is not so easy to implement because a large percentage of the time conventional wisdom is correct: the trick is to judge when that is no longer the case and then to act conservatively to take advantage of that judgment. The difficulty in doing this is likely a reason why, prior to the recent credit crisis, so few analysts “did the opposite” of mainstream economic theory.
A few distinguished economic minds were beginning to recommend change, though. For example, in 2005, Robert Merton and Zvi Bodie argued that economics should involve a synthesis of disciplines, including behavioral considerations. Building on these thoughts it seems a multi-discipline orientation is a base-level requirement for modern Security Analysis. Significantly, the various disciplines that are used should interlock — like a set of Lego building blocks — subject to recombination as may be required for any specific valuation or market analysis. Learning how to do this is not easy, and it is also generally not taught in business schools (aside from a few possible exceptions).
One discipline that must be included in any analytical approach is a commonsense approach to risk management. Risk in this context is simply the possibility of loss. Despite the simplicity of this definition, the concept of risk management has grown increasingly complicated and impractical. To help rectify this, Peter Bernstein, author of the bestselling book Against the Gods: The Remarkable Story of Risk, published an article in 2004 — after LTCM’s failure and the “new economy” bust — that noted in part:
"If more things can happen than will happen, and if we are denied precise knowledge of the range of possible outcomes, some decisions we make are going to be wrong. How many, how often, how seriously? We have no way of knowing even that. Even the most elegant model, as Leibniz reminded Jacob Bernouilli in 1703, is going to work “only for the most part.” What lurks in that smaller part is hidden from us, but it could be loaded with dynamite. (Emphasis original)"
Bernstein’s warning was, unfortunately, generally ignored. For example, consider the huge expenditures the banking industry incurred to implement Basel II risk management programs: despite the extent of those expenditures, and the alleged sophistication of Basel-based models, risk management in the banking industry was less than optimal prior to the recent credit crisis. This is curious because many potential causes of financial loss (or risks) are fairly well-known. Consider, for example, the following sample:
- Not adequately understanding the dynamics of asset and liability holdings
- Aggressive valuation assumptions and inadequate pricing
- Excessive leverage
- Not understanding exposure aggregations, etc.
Actively seeking to identify and mitigate potential losses is the management part of risk management; it is also the part of risk management that is most frequently ignored, especially when compared to the plethora of material produced on risk modeling. A model is useful in risk management only if it helps to shed light on, or identify, a potential cause of loss. This is important because it implies that risk management is a bottom-up exercise designed to identify and mitigate potential causes of loss, not a portfolio-level exercise designed to minimize inefficient exposure frontiers. If a model augments or facilitates loss analyses it could prove useful for risk management purposes, but modeled output is only one of many inputs that security analysts must consider. This is as it must be: given the inherent complexity of financial markets, they cannot be definitively modeled. Complexity in this context is not simply a synonym for complicated; rather, it is the name given to the study of highly interrelated systems such as financial markets. A full description of complexity theory is beyond the scope of this article, but there is one aspect of complexity theory that deserves mention here: feedback.
Feedback refers to a system’s outputs influencing its inputs and processes in either a positive/reinforcing or negative manner. Positive feedback can be prevalent in financial markets; for example, speculator George Soros refers to it as reflexivity, or the feedback between security analysis and market prices. Soros noted that, given the positive nature of the reflexive feedback loop, it can influence the fundamentals that security analysts seek to value, thereby potentially corrupting their analyses and corresponding investment activities. We found that during business (or boom-bust) cycles this feedback loop can close, resulting in feedback-based prices rather than prices reflecting fundamentals. This phenomenon is a consequence of what Benjamin Graham referred to in 1934 as “new era” thinking, which was powerfully experienced during the “new economy” NASDAQ boom of the late 1990s, and the “new paradigm” credit boom that recently ended in crisis. In essence, “new era”-inspired buying feeds on itself, leading to a dramatic run-up in prices, which eventually top out and then reverse. This market behavior takes time to unfold, which gives business cycles their bubble-like shape, as Soros depicts graphically in his writings.
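The reflexive dynamic can be illustrated with a deliberately simple toy simulation — our own sketch, not a model Soros proposes, and every parameter in it is an arbitrary assumption chosen for illustration. Perceived fundamentals drift toward the market price while buying pressure chases perceived fundamentals, so the two reinforce each other and prices run up on their own momentum until a confidence shock throws the loop into reverse:

```python
# Toy sketch of a reflexive (positive) feedback loop in prices.
# Illustrative only: the coefficients (0.5, 0.6, etc.) and the timing of
# the confidence shock are hypothetical assumptions, not estimates.

def simulate_reflexive_cycle(steps=60, shock_at=40):
    price, perceived_value = 100.0, 100.0
    sentiment = 1.0  # positive: "new era" buying reinforces itself
    history = []
    for t in range(steps):
        if t == shock_at:
            sentiment = -1.0  # confidence breaks; the loop runs in reverse
        # Buying pressure chases the gap between perceived value and price,
        # amplified (or reversed) by sentiment.
        price += sentiment * 0.5 * max(perceived_value - price, 1.0)
        # Reflexivity: rising prices raise perceived fundamentals, so the
        # "fundamentals" being analyzed are partly a product of the price.
        perceived_value += 0.6 * (price - perceived_value) + sentiment * 1.0
        history.append(price)
    return history

prices = simulate_reflexive_cycle()
```

Run as written, the series rises steadily for 40 steps and then declines after the shock, tracing the boom-bust, bubble-like shape described above; note that nothing about the firm's actual goods or services ever enters the loop.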
Despite the practical logic of Soros’ theory it has generally not been addressed academically, which in January of 2001 (during the “new economy” bust) caused Soros to publicly question why:
"Why is my reasoning dismissed out of hand, without giving it serious consideration? Because it leads to the conclusion that financial markets are inherently unpredictable. And what is the value of a scientific theory that does not yield usable predictions? I contend that it would be better to recognize the uncertainties inherent in the behavior of financial markets than to cling to a purportedly scientific theory that distorts reality."
Security Analysis-based research should, at least in part, reconcile with what security analysts and investment professionals, especially highly-successful ones, actually do rather than simply mirror theoretical research. This is not to say that practice should be divorced from theory, only that it should not be dominated by it, especially when theory is based on limited assumptions that do not conform to reality.
Modern financial economic theory effectively began in 1952, and was promoted by influential early adopters before spreading more widely. As that theory has once again broken down, modern security analysts should begin to shift their attention, at least in part, away from modern economic theory to more practical subjects such as those identified above. Who knows, doing so may even inspire Security Analysis-oriented economists: as science historian Thomas Kuhn observed, “the emergence of new theories is generally preceded by a period of pronounced professional insecurity. As one might expect, that insecurity is generated by the persistent failure of the puzzles of normal science to come out as they should. Failure of existing rules is the prelude to a search for new ones.”
Given the failures of modern economics over the past 20 years, i.e., from the 1987 stock market crash to the 2007–2008 credit crisis, social scientists are, appropriately, beginning the search for new and better theories. Where this research will lead is not currently known; however, what is known is that tried and true subjects like economic/financial history and contrary thinking are just as relevant today as they ever were. And when combined with a multi-discipline analytical framework and an understanding of, and appreciation for, market complexity, they could lead to more insightful analyses and profitable investments at lower levels of risk over time.
Frank Knight, Risk, Uncertainty and Profit (NY: Cosimo, 2005): 229.
Benjamin Graham, The Intelligent Investor (NY: Harper & Row, 1973): ix.
Max Bazerman and Michael Watkins, Predictable Surprises: The Disasters You Should Have Seen Coming, and How to Prevent Them (Boston: HBS Press, 2004).
Nassim Taleb, The Black Swan: The Impact of the Highly Improbable (NY: Random House, 2007).
Roger Lowenstein, When Genius Failed: The Rise and Fall of Long-Term Capital Management (NY: Random House, 2000): 233–234.
“Investment Banking Revenues,” Financial Times (7/10/2007).
Peter Bernstein, “Risk: The Whole Versus the Parts,” CFA Magazine, March–April (2004): 5.
George Soros, “Letters to the Editor: My Market Theory? Forget Theories,” Wall Street Journal (1/8/2001): A33.
Thomas Kuhn, The Structure of Scientific Revolutions (Chicago: UC Press, 1996): 67–68.
–Joseph Calandro, Jr., is the author of Applied Value Investing (NY: McGraw-Hill, 2009), the Enterprise Risk Manager of a global financial services firm, and a finance department faculty member of the University of Connecticut. He can be contacted at firstname.lastname@example.org. The author would like to thank Mitchell Julis for his insightful and helpful questions, comments and suggestions based on ideas developed in his forthcoming book. The usual caveat applies.