This chapter explains how individuals, business processes, and sensors produce alternative data. It also provides a framework to navigate and evaluate the proliferating supply of alternative data for investment purposes.
It demonstrates the workflow from acquisition to preprocessing and storage, using Python for data obtained through web scraping, to set the stage for the application of ML. It concludes with examples of sources, providers, and applications. In particular, it covers:
- The Alternative Data Revolution
- Sources of Alternative Data
- Criteria for Evaluating Alternative Datasets
- The Market for Alternative Data
- Working with Alternative Data
For algorithmic trading, new data sources offer an informational advantage if they provide access to information unavailable from traditional sources, or provide access sooner. Following global trends, the investment industry is rapidly expanding beyond market and fundamental data to alternative sources to reap alpha through an informational edge. Annual spending on data, technological capabilities, and related talent is expected to increase from the current $3 billion by 12.8% annually through 2020.
Today, investors can access macro or company-specific data in real-time that historically has been available only at a much lower frequency. Use cases for new data sources include the following:
- Online price data on a representative set of goods and services can be used to measure inflation (see the sketch after this list)
- The number of store visits or purchases permits real-time estimates of company or industry-specific sales or economic activity
- Satellite images can reveal agricultural yields, or activity at mines or on oil rigs before this information is available elsewhere
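To make the first use case concrete, the following sketch computes an equal-weighted index from online prices and the month-over-month inflation it implies. The prices are made up for illustration; a real application would scrape a much broader, representative basket.

```python
import pandas as pd

# Hypothetical daily online prices for a small, fixed basket of items
prices = pd.DataFrame(
    {"milk": [1.00, 1.02, 1.05], "bread": [2.00, 2.00, 2.06], "fuel": [3.00, 3.12, 3.18]},
    index=pd.to_datetime(["2018-01-01", "2018-02-01", "2018-03-01"]),
)

# Equal-weighted price index relative to the first observation (base = 100)
price_index = prices.div(prices.iloc[0]).mean(axis=1).mul(100)

# Month-over-month inflation implied by the online-price index
inflation = price_index.pct_change()
print(price_index.round(2))
print(inflation.round(4))
```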
Resources on the broader data revolution:
- The Digital Universe in 2020
- Big data: The next frontier for innovation, competition, and productivity, McKinsey, 2011
- McKinsey on Artificial Intelligence
Alternative datasets are generated by many sources but can be classified at a high level as predominantly produced by:
- Individuals who post on social media, review products, or use search engines
- Businesses that record commercial transactions, in particular, credit card payments, or capture supply-chain activity as intermediaries
- Sensors that, among many other things, capture economic activity through images such as satellites or security cameras, or through movement patterns such as cell phone towers
The nature of alternative data continues to evolve rapidly as new data sources become available and sources previously labeled “alternative” become part of the mainstream. The Baltic Dry Index (BDI), for instance, assembles data from several hundred shipping companies to approximate the supply/demand of dry bulk carriers and is now available on the Bloomberg Terminal.
Alternative data sources differ in crucial respects that determine their value or signal content for algorithmic trading strategies.
The ultimate objective of alternative data is to provide an informational advantage in the competitive search for trading signals that produce alpha, namely positive, uncorrelated investment returns. In practice, the signals extracted from alternative datasets can be used on a standalone basis or combined with other signals as part of a quantitative strategy.
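As a stylized illustration of combining an alternative-data signal with a traditional one, each signal can be standardized cross-sectionally and averaged into a composite score. The tickers, values, and equal weights below are assumptions made purely for illustration, not estimated from data.

```python
import pandas as pd

# Hypothetical cross-sectional signals for a set of tickers on a single date:
# an alternative-data signal (e.g., store-visit growth) and a traditional factor (e.g., momentum)
signals = pd.DataFrame(
    {"store_visits": [0.12, -0.03, 0.25, 0.08], "momentum": [0.05, 0.10, -0.02, 0.07]},
    index=["AAA", "BBB", "CCC", "DDD"],
)

# Standardize each signal across tickers so they are on a comparable scale
zscores = (signals - signals.mean()) / signals.std()

# Combine into a single composite score; equal weights are illustrative only
composite = 0.5 * zscores["store_visits"] + 0.5 * zscores["momentum"]
print(composite.sort_values(ascending=False))
```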
- Big Data and AI Strategies, Kolanovic, M. and Krishnamachari, R., JP Morgan, May 2017
The investment industry is expected to spend an estimated $2-3 billion on data services in 2018, and this number is projected to grow at double-digit rates per year, in line with other industries. This expenditure includes the acquisition of alternative data, investments in related technology, and the hiring of qualified talent.
This section illustrates the acquisition of alternative data using web scraping, targeting first OpenTable restaurant data and then moving on to earnings call transcripts hosted by Seeking Alpha.
Related research:
- Quantifying Trading Behavior in Financial Markets Using Google Trends, Preis, Moat and Stanley, Nature, 2013
- Quantifying StockTwits semantic terms’ trading behavior in financial markets: An effective application of decision tree algorithms, Al Nasseri et al, Expert Systems with Applications, 2015
Note: unlike all other examples, the code that uses Selenium is written to run on the host rather than inside the Docker image because it relies on a browser. It has been tested on Ubuntu and macOS only.
The subfolder 01_opentable contains the script opentable_selenium to scrape OpenTable data using Scrapy and Selenium.
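A minimal sketch of the Selenium part of that workflow is shown below. The URL and CSS selectors are placeholders (OpenTable changes its markup regularly), and a local Chrome installation with a matching chromedriver is assumed; see the actual script for the full logic, including pagination.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
import pandas as pd

# Launch a local Chrome instance; this is why the example runs on the host
# rather than inside the Docker image.
driver = webdriver.Chrome()
driver.get('https://www.opentable.com/new-york-restaurant-listings')

results = []
# The selectors below are placeholders; inspect the page and adjust them
# before running, since OpenTable's markup changes frequently.
for card in driver.find_elements(By.CSS_SELECTOR, 'div.rest-row-info'):
    results.append({
        'name': card.find_element(By.CSS_SELECTOR, 'span.rest-row-name-text').text,
        'bookings': card.find_element(By.CSS_SELECTOR, 'div.booking').text,
    })

driver.quit()

# Persist the scraped snapshot for later preprocessing
pd.DataFrame(results).to_csv('opentable_snapshot.csv', index=False)
```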
Update: unfortunately, Seeking Alpha has added a captcha to its website, so the automated downloads described here are no longer possible.
As with the OpenTable example, this code runs on the host rather than inside the Docker image because it relies on a browser, and it has been tested on Ubuntu and macOS only.
The subfolder 02_earnings_calls contains the script sa_selenium to scrape earnings call transcripts from the Seeking Alpha website.
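For reference, here is a minimal sketch of the original Selenium workflow. The URL and selectors are assumptions about the site's structure, and the captcha mentioned above now prevents this from running end to end; treat it as documentation of the approach rather than working code.

```python
from pathlib import Path
from selenium import webdriver
from selenium.webdriver.common.by import By

# Landing page for earnings-call transcripts; URL and selectors are placeholders
transcript_page = 'https://seekingalpha.com/earnings/earnings-call-transcripts'

driver = webdriver.Chrome()
driver.get(transcript_page)

# Collect links to individual transcripts from the listing page
links = [a.get_attribute('href')
         for a in driver.find_elements(By.CSS_SELECTOR, 'a[href*="transcript"]')]

# Fetch the first transcript and save its visible text for later parsing
if links:
    driver.get(links[0])
    body = driver.find_element(By.TAG_NAME, 'body').text
    Path('transcripts').mkdir(exist_ok=True)
    (Path('transcripts') / 'sample_transcript.txt').write_text(body)

driver.quit()
```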