Leinweber heads the Computational Research Division's Center for Innovative Financial Technology, created to help build a bridge between the computational science and financial markets communities. Leinweber, author of "Nerds on Wall Street: Math, Machines and Wired Markets" (Wiley, 2009), was a Haas Fellow in Finance at the University of California, Berkeley from 2008 to 2010. His professional interests focus on how modern information technologies are best applied in trading and investing. As the founder of two financial technology companies and a quantitative investment manager, he is an active participant in today's transformation of markets. He earned his Ph.D. in applied mathematics from Harvard University.
Other professional accomplishments include a stint at the RAND Corporation, where he directed research on real-time applications of artificial intelligence that led to the founding of Integrated Analytics Corporation (IAC). IAC was acquired by the Investment Technology Group, and, with the addition of electronic order execution, its product became QuantEx, an electronic execution system used for millions of institutional equity transactions daily.
Prior to coming to Berkeley Lab, Leinweber served as Managing Director at First Quadrant, where he was responsible for institutional quantitative global equity portfolios totaling $6 billion. These long and market-neutral strategies used a wide range of computerized techniques for stock selection and efficient trading. Quantitative investing is driven by electronic information, and the Internet dramatically transformed the financial information landscape.
This led to the founding of Codexa Corporation, a Net-based information collection, aggregation, and filtering service for institutional investors and traders. The company's clients included many of the world's largest brokerage and investment firms. As a visiting faculty member at Caltech, Leinweber worked on practical applications of ideas at the juncture of technology and finance. Over the years, he has advanced the state of the art in applying information technology in both the sell-side world of trading and the buy-side world of quantitative investment, and he has published and spoken widely in both fields. He has gone five rounds against the Wall Street Journal dartboard. He is an advisor to investment firms, stock exchanges, brokerages, and technology firms in areas related to financial markets, and a frequent speaker and author on these subjects, including a recent Google Tech Talk. Leinweber graduated from MIT in physics and computer science, where he was one of the first 5,000 people on the Internet, back when it was called the ARPAnet and wasn't cool.
Kesheng Wu, Wes Bethel, Ming Gu, David Leinweber, Oliver Rubel, "A Big Data Approach to Analyzing Market Volatility", Algorithmic Finance, 2(3-4): 241-267, 2013, LBNL-6382E, doi: 10.3233/AF-13030
Understanding the microstructure of financial markets requires processing a vast amount of data on individual trades, and sometimes multiple levels of quotes. Analyzing such a large volume of data requires tremendous computing power that is not easily available to financial academics and regulators. Fortunately, publicly funded High Performance Computing (HPC) power is widely available at the US National Laboratories. In this paper we demonstrate that HPC resources and techniques from the data-intensive sciences can greatly accelerate the computation of an early-warning indicator called the Volume-synchronized Probability of Informed Trading (VPIN). The test data used in this study contain five and a half years' worth of trading data for roughly the 100 most liquid futures contracts, comprising about 3 billion trades and occupying 140 GB as text files. By using (1) a more efficient file format for storing the trading records, (2) more effective data structures and algorithms, and (3) parallelized computations, we are able to explore 16,000 different ways of computing VPIN in less than 20 hours on a 32-core IBM DataPlex machine. Our test demonstrates that a modest computer is sufficient to monitor a vast number of trading activities in real time, an ability that could be valuable to regulators.
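For readers unfamiliar with the indicator, the VPIN calculation referenced above can be sketched roughly as follows: trades are pooled into equal-volume buckets, each bucket's volume is split into buy and sell portions via bulk volume classification (the standard normal CDF of the standardized price change), and VPIN is a moving average of the absolute order-flow imbalance over a window of buckets. This is a minimal, unoptimized Python sketch of that idea, not the paper's HPC implementation; the function and parameter names are hypothetical.

```python
import math
from statistics import pstdev

def norm_cdf(x):
    # standard normal CDF, expressed via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def vpin_series(prices, volumes, bucket_volume, window=50):
    """Sketch of a VPIN series from raw trade prices and volumes.

    1. Pool trades into equal-volume buckets of size bucket_volume.
    2. Split each bucket's volume into buy/sell via bulk volume
       classification: buy share = Phi(dP / sigma_dP).
    3. VPIN = moving average of |V_buy - V_sell| / bucket_volume
       over `window` consecutive buckets.
    """
    # trade-to-trade price changes, and their standard deviation
    dps = [prices[i] - prices[i - 1] for i in range(1, len(prices))]
    sigma = pstdev(dps) or 1.0  # guard against constant prices

    imbalances = []
    vbuy = vsell = filled = 0.0
    for dp, vol in zip([0.0] + dps, volumes):
        remaining = vol
        while remaining > 0:
            # fill the current bucket, splitting a trade if needed
            take = min(remaining, bucket_volume - filled)
            buy_frac = norm_cdf(dp / sigma)
            vbuy += take * buy_frac
            vsell += take * (1.0 - buy_frac)
            filled += take
            remaining -= take
            if filled >= bucket_volume:
                imbalances.append(abs(vbuy - vsell) / bucket_volume)
                vbuy = vsell = filled = 0.0

    # moving average of the order-flow imbalance
    return [sum(imbalances[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(imbalances))]
```

The "16,000 different ways of computing VPIN" in the abstract corresponds to sweeping parameters such as the bucket size and window length; each point in that sweep is one call like the above over the full trade history.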
Our test results also confirm that VPIN is a strong predictor of liquidity-induced volatility. With appropriate parameter choices, the false positive rate is about 7%, averaged over all the futures contracts in the test data set. More specifically, when VPIN values rise above a threshold (CDF > 0.99), the volatility in the subsequent time window is higher than average in 93% of the cases.
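The false-positive measurement described above can be illustrated with a small sketch: flag an event whenever the empirical CDF of the VPIN series exceeds the threshold, and count the event as a false positive when volatility in the following window is not above its overall average. This is a hypothetical illustration of the evaluation logic, not the paper's code; the function name and the simple "next window" convention are assumptions.

```python
import bisect

def false_positive_rate(vpin, volatility, cdf_threshold=0.99):
    """Fraction of VPIN threshold crossings NOT followed by
    above-average volatility in the next window.

    vpin and volatility are aligned per-window series.
    """
    ranked = sorted(vpin)
    n = len(vpin)

    def ecdf(x):
        # fraction of VPIN observations <= x
        return bisect.bisect_right(ranked, x) / n

    avg_vol = sum(volatility) / len(volatility)
    # events: windows (except the last) where VPIN's CDF exceeds the threshold
    events = [i for i, v in enumerate(vpin[:-1]) if ecdf(v) > cdf_threshold]
    if not events:
        return 0.0
    false_pos = sum(1 for i in events if volatility[i + 1] <= avg_vol)
    return false_pos / len(events)
```

Under this convention, the abstract's result reads as: with well-chosen VPIN parameters, `false_positive_rate` comes out near 0.07 when averaged across the futures contracts studied.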