Refactoring Research: The New Model of Information
The computer science process of “refactoring” makes an interesting parallel to certain recent developments in investment research. In the software world, engineers will refactor a computer program when they want to overhaul the interior—change the internal code, reduce cost, increase efficiency, swap in new modules—but leave the external behavior, the functionality, the same.
Something similar is now under way in the redesign of research for the buy side. The end goal is the same as it has always been—excellent and differentiated insight into investment strategies—but the research process is evolving in the face of changes in the industry structure, regulations, and information access.
The investment research process in the US has been under assault since the early years of the decade. First, the Global Research Analyst Settlement and Regulation Fair Disclosure drove the revenue model of sellside research into the ground. Regulation Fair Disclosure prevented differentiated disclosure by companies to their analysts, while the Global Research Analyst Settlement’s mandated separation of research activities and investment banking led to a steep decline in both the quantity and quality of sellside research.
Disruptions like these shook up the traditional process of steering trading business to the analyst who provided the best research, access to company experts, and ideas. On the technology front, platforms like Capital IQ or FactSet began radically simplifying the laborious processes of combing through filings and hand-building company models.
The devastation of the sell side has only increased since 2008. The bear market and bankruptcies have suppressed the use of leverage, further pressuring margins. Client withdrawals from the most profitable components of broker–dealers’ businesses—and from hedge funds and mutual funds—have pressured asset management firms to lower fees and differentiate themselves, or go out of business. As more and more companies are overlooked or inadequately covered, the research burden is shifting to the buy side, and key talent is headed there too (though, in the current market, many buy-side firms have slowed new hiring or simply gone under). Taken together, these developments have exerted severe pressure on the research profession, increasing systemic market risk in a world without much of a sell-side bench to backstop coverage.
The environment that’s emerging is one in which access to on-demand tools will be crucial—for generating ideas, performing research, evaluating risk and reward, and promoting rigorous discipline. New “alternative research” models have arisen within the increasingly competitive landscape. Fee-based expert networks are multiplying as the marginal value of an incremental insight on a market grows. Standalone channel-checking services spot-monitor conditions on the ground in retail, distribution, and manufacturing.
Amidst the confusion, another phenomenon has been quietly accelerating: the liberation of information. The effect of the information revolution on the traditional newspaper business has been much publicized, the impact on investment research somewhat less so. Nevertheless, over the last 20 years, the Internet has grown from a repository for stock prices and e-mail to the most complete global database of investment information, only a small percentage of which is trapped behind subscription login pages. Just as YouTube and iTunes forcibly reinvented the television and music industries, the Web is restructuring the investment research marketplace, taking information that used to be sold and making it widely available, on the house.
With the research process in a state of flux, then, the moment is ripe for pulling apart, upgrading, and recombining its fundamental elements. The end result could be shocking: research and market efficiency may emerge from this bear market in better shape than ever.
THE END OF DO-IT-YOURSELF
“Ten years ago, you could find a good idea, read a couple of reports from different brokers, have each of those analysts give you a one-on-one call and share their models, and then have the bank set you up a meeting with management where you could ask your questions,” recalls Charles Frumberg, principal at Emancipation Capital. “Now they’ll set you up with the meetings, but as for the research, you’re on your own.” For the buy side, that means creating or paying for access to research assets and data.
Asset managers first responded to this shifting status quo with a surge of analyst hiring. While that came with significant costs, the burden was bearable while the market was up. Now that the market is down, budgets are falling with it, and the research budget, especially analyst head count, is the largest target on the screen. This has broken the fundamental funding model for research.
So the buy side has had to consider how to outsource some of its research and how to develop strategies to minimize research costs. On March 2, 2009, Michael Mayhew posted an entry on the Integrity Research Associates blog that pointed out: “While not a widespread phenomenon, in the past few months we have heard of a small but growing number of buy-side firms both here in the US and in Europe that have decided to radically reduce their internal research staffs and focus instead on purchasing the best research from sell-side and alternative research providers.”
This isn’t the first time the industry has faced componentization. When David Williams broke away from SAC to start Williams Trading, he showed that portfolio managers could get the best trading available not by spending the millions (and sharing the equity) to hire the best, but by outsourcing. Now the research sector is confronted with a similar question: Is building up a complete research team a core competence? Many companies must be answering in the negative, as the last 12 months have seen an explosion in new outsourced research firms. Broad-based shops like Research Edge have top talent for each sector and for macro research. Boutiques like Ed Wolfe’s Wolfe Research focus on specific industries (in the case of Wolfe Research, transportation).
ALL FOR FREE
Bigger than the global financial crisis, scarier than the dynamism that’s been remaking the structure of investment research as we know it: a broad secular shift has transpired, dragging old industries down and upending established orders. Information is free.
While the Wall Street Journal and the New York Times continue to flirt with charging for access to slivers of their information, the vast majority of media providers have already raised the white flag, and the investment world has surrendered, too. The most obvious proof of that is analysts’ new ability to get their hands on any given sell-side research report, regardless of their trading activities. Even in the more structured world of financial data feeds and opinion, innovative firms like StreetAccount satisfy their clients not by providing the news, but by selecting and commenting on the key developments, providing analysis and extracting significance.
The real threat in this new model is to data providers, of course, and it’s due in large part to the phenomenon of the “long tail.” In October 2004, Wired published Chris Anderson’s now legendary article on the long-tail distribution of most elements of the Web. A long tail is a non-bell-curve distribution in which most of the area under the curve comes from many small, dispersed elements rather than a few large ones. The long tail is manifest in the investment professional’s focus on “incremental data points” that are “on the margin” as a way of obtaining an edge when studying a company.
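The arithmetic behind the long tail is easy to sketch. Assuming (purely for illustration) a Zipf-like distribution over 50,000 information sources, where the kth-most-valuable source carries weight 1/k, the handful of head sources everyone already reads accounts for only about a quarter of the total value:

```python
# Illustrative sketch (not from the article): a Zipf-like "long tail"
# of 50,000 hypothetical information sources, where source k has weight 1/k.
# Most of the total weight sits outside the handful of top sources.

N = 50_000
weights = [1.0 / k for k in range(1, N + 1)]
total = sum(weights)

head = sum(weights[:10]) / total   # share held by the 10 biggest sources
tail = sum(weights[10:]) / total   # share held by the other 49,990

print(f"top 10 sources: {head:.0%} of total value")
print(f"long tail:      {tail:.0%} of total value")
```

Under these assumed weights, the top 10 sources hold roughly a quarter of the value and the remaining 49,990 hold the rest—which is why incremental data points, in aggregate, matter more than the headline facts.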
For analysts, the fundamental economic model of a given company or a large-grain analysis of its business has a huge impact on their understanding of a firm—but these facts are so widely known that they provide little benefit. Analysts get a bigger bang for their buck when they focus on long-tail data points such as confidence among a firm’s major suppliers or middle-manager turnover in the employee base.
The long, widely distributed tail of incremental investment data swats data providers’ fundamental business model right out the window. Even the largest investment manager isn’t staffed to find, much less vet and contract with, 50,000 different sources. So, under the new rules, the providers aren’t even attempting to monetize their content in a traditional way. From David Jackson’s financial blog aggregator Seeking Alpha (advertising based) to StockTwits (still evolving), the long tail has content that investment professionals need.
Every day, the fluctuating data set that describes market environments, customer trends, competitive moves, and official disclosures becomes larger, more complex, harder to stay on top of, and more and more critical to driving a systematic, repeatable research process. The essential challenge of the new information reality lies in collecting and managing the widely dispersed data. Investment professionals have begun shifting their spending toward firms that detect patterns, extract insightful data points, and help navigate suddenly immense data sets in a cost-effective manner.
The Web has changed all the rules, as usual. As a massive source of qualitative data on markets, industries, and companies, it can make generational improvements in the research process, led by search-driven companies like FirstRain, Capital IQ, and FactSet. [Full disclosure: Penny Herscher is president and CEO of FirstRain.]
For example, a catalog of thousands of investment topics can help pull critical information from the long tail. Pattern detection technology can extract fundamental events from the myriad of sector-specific sources, premium subscriptions, influential blogs, and local and global news. With enriched data, analysts construct personalized adaptive market models surrounding a company, industry, or sector, tailored to their particular investment strategies and views of the market. Search-driven research technologies allow investment professionals to spend more time developing ideas and less time compiling information.
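The catalog-plus-pattern-detection idea can be reduced to a toy sketch. The catalog, topic names, and keywords below are all invented for illustration—no vendor's actual technology works this simply—but they show the basic mechanism: incoming text is matched against a topic catalog so that relevant long-tail items surface automatically:

```python
# Toy illustration (assumed, not any vendor's actual system): tag incoming
# headlines against a small catalog of investment topics by keyword match.

TOPIC_CATALOG = {
    "supply chain": ["supplier", "inventory", "shortage"],
    "management turnover": ["resigns", "appoints", "steps down"],
    "demand signal": ["orders", "bookings", "backlog"],
}

def tag_headline(headline: str) -> list[str]:
    """Return the catalog topics whose keywords appear in the headline."""
    text = headline.lower()
    return [topic for topic, keys in TOPIC_CATALOG.items()
            if any(k in text for k in keys)]

print(tag_headline("Chip maker warns of supplier shortage as backlog grows"))
# tags the headline with both "supply chain" and "demand signal"
```

A production system would of course replace the keyword lists with thousands of topics and statistical pattern detection, but the workflow is the same: the machine does the collection and tagging, and the analyst spends time on the tagged output rather than the raw feed.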
Where search-driven research is really pushing the envelope is in analytical algorithms that extract targeted streams of data, add value to the data with analysis, then deliver the information to users. Increasingly, the data set to which these technologies are applied is simply the Web itself.
TODAY IS THE NEW TOMORROW
Alternative research is no longer alternative—it’s fundamental. By cost-effectively employing expert networks, portfolio managers can generate incremental and differentiated investment ideas. With channel checks as an external service, risk-conscious firms have a broader universe in which to invest. Search-driven research can open up a wider range of investable ideas by reducing the time-to-conviction on a new name.
Now that so many components of the research process are primed for upgrades, it’s essential that investment professionals take a long, hard look at some strategy shifts that can increase the impact and reduce the costs of research. And topping the list, from the perspective of both impact and cost, is the deployment of human talent.
Many asset managers will find that they’re about to enjoy unprecedented freedom when it comes to making decisions about strategy, growth, and execution. They should consider that:
- New funds can be introduced by hiring a portfolio manager and outsourcing research input.
- In-house analysts can cover a broader range of companies by using new automated tools such as search-driven research, supplemented by channel checks and expert networks, thereby reducing the need to pay and manage a supporting research staff.
- Large firms evaluating costs can opt to invest in areas where they can build the highest competency in the industry, and to adjust strategy and spending levels where they cannot do so.
Those asset managers are also going to be offered the chance to understand information in a whole new way. They have always operated on the belief that, although the facts are a necessary input, it’s the analysis of those facts that creates value and makes stock picking possible. Even with that innate privileging of analysis over information, however, they’ve continued to spend significant amounts, wasteful amounts, acquiring data.
Spending money on information shouldn’t stop, even if it is increasingly free on the Web. What does need to happen is a redirection of that money toward funding technology for extracting meaningful patterns and trends from data. Financial data feeds are important, but alpha screening directly impacts stock picking. In the same way, asset managers must come to appreciate the importance of the fact that analytics on qualitative data can provide insight into unfolding events—not just documentation of a story that’s already over. Those who recognize and embrace the refactoring at work in the investment research process will create more alpha, deploy capital more efficiently, and reap the rewards of streamlined access to high-quality research.
–Penny Herscher is currently the CEO and president of FirstRain. She was previously CEO and chairman of Simplex Solutions and a chief marketing officer and general manager at Cadence. She serves on the boards of FirstRain, JDSU, Rambus, the Anita Borg Institute, Planned Parenthood Mar Monte, and California Community Partners for Youth.
This article was originally published in the Spring 2009 issue of the Investment Professional.