I just finished reading Gregory Zuckerman’s biography of Jim Simons. The best answers I could pull from the book about what their solution to the market was:
1. Markov chains and hidden Markov models (HMMs)
2. Linear factor models, followed by high dimensional kernel regression methods
3. Big monolithic models and lots of data
4. A well orchestrated engineering effort

1 was the start of their bonds, commodities, and currencies team; 2 was the start of their stocks team; and 3 & 4 were guiding principles for both teams (a toy sketch of the Markov chain idea in 1 follows).
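Here’s a minimal sketch of item 1, entirely my own illustration (not anything from the book): treat daily price moves as states of a Markov chain and estimate the transition matrix from historical data.

```python
# Toy Markov chain on up/down days: estimate P(next state | current state).
# The two-state setup and the simulated returns are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, size=1000)        # placeholder daily returns
states = (returns > 0).astype(int)              # 0 = down day, 1 = up day

# Count transitions state[t] -> state[t+1] and normalize each row.
counts = np.zeros((2, 2))
for s, s_next in zip(states[:-1], states[1:]):
    counts[s, s_next] += 1
transition = counts / counts.sum(axis=1, keepdims=True)

print(transition)   # row i = estimated P(next state | current state i)
```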
Quantitative bonds, commodities, and currencies team:
James Ax, a number theorist and a strong believer in Markov chains for financial markets, was Simons’s first quant; he headed the effort under the name Axcom in California. By ’86 Axcom traded 21 different futures contracts. René Carmona then joined to try to incorporate SDEs, his field of expertise. When that failed, Carmona suggested replacing their existing linear regression approaches with nonlinear, high dimensional kernel regression methods, and having the model directly suggest buy/sell orders. These improved results on trending models. Elwyn Berlekamp (who had worked with Kelly) took over Axcom in ’89 and brought in Henry Laufer in ’92; they worked on mean reversion strategies and on correlations between time periods. They started using data in 5 minute bars, used pairs trading, and ran an online learner that constantly looked for trading signals (one auspicious one they called “Henry’s signal”). By ’97 “more than half of the trading signals Simons’s team was discovering were nonintuitive” and they ignored them. But they figured out the right day and time to make their trades. Later Simons said, “we’re the best at estimating the cost of a trade.” In ’98 this team generated the majority of the firm’s profits, with stock trading contributing only 10%.
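Here’s a hedged sketch of the kind of nonlinear, high dimensional kernel regression Carmona reportedly pushed for: predict the next move as a kernel-weighted average of past outcomes whose feature vectors resemble the current one. The feature choice (lagged returns), the Gaussian kernel, and the bandwidth are my assumptions, not details from the book.

```python
# Nadaraya-Watson kernel regression: a nonparametric, nonlinear alternative to
# the earlier linear regressions. Feature set and bandwidth are assumptions.
import numpy as np

def kernel_regression(X_train, y_train, x_query, bandwidth=0.02):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    d2 = np.sum((X_train - x_query) ** 2, axis=1)   # squared distance to each past day
    w = np.exp(-d2 / (2 * bandwidth ** 2))          # Gaussian kernel weights
    return np.dot(w, y_train) / (w.sum() + 1e-12)   # weighted average of past outcomes

rng = np.random.default_rng(1)
returns = rng.normal(0, 0.01, size=2000)            # placeholder futures returns

lags = 10                                           # use 10 lagged returns as features
X = np.stack([returns[i:i + lags] for i in range(len(returns) - lags)])
y = returns[lags:]                                  # next-period return to predict

signal = kernel_regression(X[:-1], y[:-1], X[-1])
print("predicted next return:", signal)             # its sign could drive a buy/sell order
```

The appeal of this setup is that it makes no linearity assumption at all: the forecast is just “what happened historically on days that looked like today,” which fits the book’s description of letting the model suggest trades directly.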
Quantitative stocks team:
This team started to make progress in ’93, when Nick Patterson contacted Brown and Mercer at IBM’s speech group (which used algorithms like Baum-Welch in its HMMs for speech-to-text). Brown and Mercer took over Robert Frey’s factor stock trading fund, named Kepler (later Nova). Frey (previously in statistical arbitrage at Morgan Stanley) had been identifying various independent variables for factor trading models. Brown & Mercer retained Frey’s model and elaborated it to cover real-life technicalities he had been ignoring. They built a single adaptive trading system for the whole portfolio, self-correcting when the trades it suggested were unexecutable. The system reran in a loop several times an hour. It was a well engineered product, and usually bet on mean reversion strategies. By ’03 their profits were 2x Laufer’s team’s, and they worked on a model to replace the futures team’s. Alexey Kononenko rose through the ranks of this team. Two members, Belopolsky & Volfbeyn, bolted to Israel Englander’s Millennium Management, allegedly with (millions of lines of) code and ideas, and had great success there (“some of the most successful traders Englander had encountered”). By 2010 the system was huge, executing thousands of simultaneous trades throughout the day, with lots of factors and interrelations.
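For concreteness, here’s a minimal sketch of a linear factor model in the Frey/Kepler spirit, under my own assumptions (a made-up two-factor set, simulated data): regress each stock’s returns on common factors and treat the residual as the idiosyncratic piece a mean reversion trade might target. Nothing here reflects the actual Nova system.

```python
# Toy cross-sectional factor model: explain stock returns with common factors,
# then z-score the residuals as a crude mean-reversion candidate signal.
import numpy as np

rng = np.random.default_rng(2)
n_stocks, n_days, n_factors = 50, 250, 2

factor_returns = rng.normal(0, 0.01, size=(n_days, n_factors))   # e.g. market, value (assumed)
betas = rng.normal(1.0, 0.3, size=(n_stocks, n_factors))          # true exposures (simulated)
idio = rng.normal(0, 0.02, size=(n_days, n_stocks))               # idiosyncratic noise
stock_returns = factor_returns @ betas.T + idio                    # simulated return panel

# Estimate each stock's factor exposures by ordinary least squares.
beta_hat, *_ = np.linalg.lstsq(factor_returns, stock_returns, rcond=None)
residuals = stock_returns - factor_returns @ beta_hat

# Today's residual, z-scored against history: large positive -> candidate sell,
# large negative -> candidate buy, if you believe residuals mean-revert.
z = (residuals[-1] - residuals.mean(axis=0)) / residuals.std(axis=0)
print(z[:5])
```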
Some thoughts:
HMMs, factor models, nonlinear kernel regression, correlations between various time periods: these are all separately well known pieces. But the book says many times that a single model was handing out trading instructions. My first guess for how to put these different elements together would be that everything else was generating features for the HMM.
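To make that guess concrete, here’s a speculative sketch in which the outputs of the other models (simulated stand-ins below) form the observation vector of a Gaussian HMM, and the inferred hidden state is read as a market regime. The use of the hmmlearn library and the three-feature setup are my own choices; the book does not say this is how the pieces were combined.

```python
# My guess, sketched: feed "everything else" (trend, mean-reversion, and factor
# signals - all faked here) into an HMM and treat hidden states as regimes.
# Requires the third-party hmmlearn package (pip install hmmlearn).
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(3)
n_days = 1000

# Placeholder features standing in for the other models' outputs.
features = np.column_stack([
    rng.normal(0, 1, n_days),   # e.g. kernel-regression trend forecast (simulated)
    rng.normal(0, 1, n_days),   # e.g. pairs/mean-reversion score (simulated)
    rng.normal(0, 1, n_days),   # e.g. factor-model residual z-score (simulated)
])

# Fit an HMM whose hidden states play the role of unobserved market regimes.
hmm = GaussianHMM(n_components=3, covariance_type="full", n_iter=200, random_state=0)
hmm.fit(features)

regimes = hmm.predict(features)   # most likely regime per day (Viterbi path)
print(np.bincount(regimes))       # how often each regime was visited
print(hmm.means_)                 # average feature profile of each regime
```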
RenTech is having its best years right now! $63B of its $104B in trading profits have come since 2010, with just 300 employees. Surely it has updated its techniques, and probably cashed in on the deep learning revolution raging since (at least) 2014!
Additional details with links/figures are here:
Submitted January 04, 2020 at 10:54PM by ka-cirt-bu-bo-fo https://ift.tt/2STZmHC