Buell Lane Press
 

Purpose

Signal Processing for Algorithmic Trading: Interdisciplinary Quantitative Finance adapts the rigorous treatment of signal processing from electrical engineering and telecommunications to financial time-series analysis. The discipline of signal processing has attracted the attention of giants such as Wiener, Kolmogorov, Khinchin, Chebyshev, Shannon, Mandelbrot, and Akaike, and its foundation is built upon the dual pillars of temporal and spectral analysis. This text bridges the gap between this foundation and its application to quantitative finance.

Applications in this work are geared toward the algorithmically driven electronic markets, including the foreign exchange, treasury and sovereign debt, equity, and futures marketplaces. Optimal estimators for rates of change, returns- and jump-based realized volatilities, and cross-asset/cross-market relationships help drive contemporaneous and predictive decisions, all in a framework that is adjustable to the temporal scales of interest.

The intended audience includes quantitative finance professionals, graduate students with a technical background, and anyone who is curious about a cross-disciplinary approach to the financial markets.

Progress Update

I now have a candidate for the first edition. The text runs a bit over 800 pages on a 6″ × 9″ layout, with about 225 references and about 440 figures. I am in the process of proofing.


Managing detail: the market is viewed on longer time scales than the underlying update scale.

Motivation

The liquid, electronically traded capital markets are awash in data, with update interarrival times routinely on the sub-microsecond scale. While there is a place for high-frequency trading, which seeks to manage orders directly at the tempo of the market, other trading styles operate at a slower pace. For instance, the round-trip time to and from a marketplace is bounded below by physical limits and is much longer than update interarrival times. Likewise, execution of block orders requires a tactical, high-speed component coupled with a strategic, lower-speed component. In these cases and others, a means of managing detail is necessary to keep the pulse of the market while viewing it on longer horizons.

 
 

Approach

Central to the signal-processing approach is the design and application of a filter. For real-time markets, essential filter properties include

  • causality,

  • implementation in discrete time,

  • optimal design, and

  • O(1) update complexity.

A well-designed filter can consume, say, prices, and report the current levels of price change and volatility in a precisely calibrated manner, without bias, and with a consistent, sub-microsecond compute time. Such measurements can be used as factors for real-time prediction.
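As a minimal sketch of these properties (my own illustration, not one of the designs from the text), the snippet below implements a causal, discrete-time, single-pole low-pass filter, an exponentially weighted moving average, whose state update is O(1) per price. The smoothing parameter and the synthetic price stream are assumptions made for the example.

    import numpy as np

    class OnePoleLowPass:
        """Causal, discrete-time, single-pole low-pass (EWMA) filter.

        The update touches a single state variable, so the per-tick cost is
        O(1) regardless of how much history has been consumed.
        """

        def __init__(self, alpha: float):
            if not 0.0 < alpha <= 1.0:
                raise ValueError("alpha must lie in (0, 1]")
            self.alpha = alpha   # weight on the newest observation
            self.level = None    # current filtered level

        def update(self, price: float) -> float:
            # y[n] = y[n-1] + alpha * (x[n] - y[n-1])
            if self.level is None:
                self.level = price   # seed the state with the first observation
            else:
                self.level += self.alpha * (price - self.level)
            return self.level

    # Hypothetical usage on a synthetic price path.
    rng = np.random.default_rng(0)
    prices = 100.0 + np.cumsum(0.01 * rng.standard_normal(1_000))
    filt = OnePoleLowPass(alpha=0.05)
    levels = [filt.update(p) for p in prices]

Here the smoothing parameter, not the length of the retained history, sets the effective time scale of the measurement, which is what makes such a filter adjustable to the horizons of interest.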

Signal Processing for Quantitative Finance covers the rigorous mathematical treatment of continuous- and discrete-time filter analysis and design, and details a range of applications that the author has used for pricing and trading in the global capital markets.

Outline

Signal Processing for Quantitative Finance is written in three main parts.

Front Matter

  • Introduction

  • The filters atlas: a catalog of the filters and applications covered in this volume

Part 1: Continuous Time

  • Separation of Filter and Data: Convolution and System Correlation

  • Spectral Representation of a Filter: Fourier Transform

  • Random Data and Random Processes

  • Representation of Causal Analog Filters: Laplace Transform

  • Level Filters

  • Slope and Curvature Filters

  • Appendix: The Ubiquitous Delta Function

  • Appendix: Design Details of the KT-Bessel Filter

Part 2: Discrete Time

  • From Continuous Time to Discrete Time

  • Representation of Causal Discrete-Time Filters: z-Transform

  • Design of Digital Filters

Part 3: Applications

  • An Anatomy of Making Markets

  • A Filter’s Capacity to Capture Dynamics

  • Measurements of Change

  • Measurements of Volatility

  • Covariance, Eigenanalysis, and Prediction

  • Appendix: Student-t Robust Parametric Estimation

Principal Bibliography

Index

Logical Chapter Layout

The diagram to the right shows one perspective on the interrelationships between the chapters in this work. Convolution, the bedrock of signals analysis, allows for the separation of signal from filter so that each can be studied in its own right. Ultimately, discrete-time (DT) filter designs are required for real-time applications, but the continuous-time (CT) designs are required to scale the DT filters according to the application.
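As a small aside (my own illustration, not material from the text), that separation is visible in the discrete, causal convolution y[n] = sum over k of h[k] x[n-k]: the filter h and the signal x remain distinct objects, so either can be swapped without touching the other.

    import numpy as np

    def causal_convolve(h: np.ndarray, x: np.ndarray) -> np.ndarray:
        """y[n] = sum_k h[k] * x[n-k], truncated to the length of x."""
        return np.convolve(x, h)[: len(x)]

    # Hypothetical example: one filter applied to two different signals.
    boxcar = np.full(20, 1.0 / 20.0)                  # 20-sample moving average
    ramp = np.arange(100, dtype=float)                # deterministic test signal
    noise = np.random.default_rng(1).standard_normal(100)

    smoothed_ramp = causal_convolve(boxcar, ramp)
    smoothed_noise = causal_convolve(boxcar, noise)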

The description of the action of a filter depends on the type of input signal. For market data, which is of finite duration, the convolution of filter with signal applies directly; for stochastic signals, the framework has to be recast to recover convolution. Additionally, fractional Gaussian noise and fractional Brownian motion are derived by passing white noise through a suitably designed long-memory filter.
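As a rough sketch of that last point (an ARFIMA-style approximation of my own, not necessarily the construction used in the text), long-memory noise with Hurst exponent H can be approximated by convolving white noise with fractional-differencing weights of order d = H - 1/2; the cumulative sum then approximates a fractional Brownian motion path.

    import numpy as np

    def fractional_weights(d: float, n: int) -> np.ndarray:
        """Coefficients of (1 - L)^(-d): psi[0] = 1, psi[k] = psi[k-1] * (k - 1 + d) / k."""
        psi = np.empty(n)
        psi[0] = 1.0
        for k in range(1, n):
            psi[k] = psi[k - 1] * (k - 1 + d) / k
        return psi

    def fractional_gaussian_noise(hurst: float, n: int, seed: int = 0) -> np.ndarray:
        """Approximate fGn by filtering white noise with a long-memory FIR kernel."""
        d = hurst - 0.5                        # fractional-differencing order
        psi = fractional_weights(d, n)         # truncated long-memory impulse response
        white = np.random.default_rng(seed).standard_normal(n)
        return np.convolve(psi, white)[:n]     # causal convolution, truncated to n samples

    fgn = fractional_gaussian_noise(hurst=0.7, n=5_000)
    fbm = np.cumsum(fgn)                       # approximate fractional Brownian motion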

Chapter layout

Publications to Date

Left- and right-handed bases in R³