Abstract

Recently, there has been an increased interest in the development of kernel methods for learning with sequential data. The signature kernel is a learning tool with the potential to handle irregularly sampled, multivariate time series. In [F. J. Király and H. Oberhauser, J. Mach. Learn. Res., 20 (2019), 31] the authors introduced a kernel trick for the truncated version of this kernel, avoiding the exponential complexity that would be involved in a direct computation. Here we show that for continuously differentiable paths the signature kernel solves a hyperbolic PDE, and we identify the connection with a well-known class of differential equations known in the literature as Goursat problems. This Goursat PDE depends only on the increments of the input sequences, does not require the explicit computation of signatures, and can be solved efficiently using state-of-the-art hyperbolic PDE numerical solvers. This yields a kernel trick for the untruncated signature kernel with the same raw complexity as the method of Király and Oberhauser, but with the advantage that the PDE numerical scheme is well suited for GPU parallelization, which effectively reduces the complexity by a full order of magnitude in the length of the input sequences. In addition, we extend the previous analysis to the space of geometric rough paths and establish, using classical results from rough path theory, that the rough version of the signature kernel solves a rough integral equation analogous to the aforementioned Goursat problem. Finally, we empirically demonstrate the effectiveness of this PDE kernel as a machine learning tool in various data science applications dealing with sequential data. We make the library sigkernel publicly available at https://github.com/crispitagorico/sigkernel.
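
In concrete terms, for continuously differentiable paths x and y the signature kernel k(s, t) = <S(x)_s, S(y)_t> satisfies the Goursat problem

  d^2 k / (ds dt) = <x'(s), y'(t)> k(s, t),   with k(0, .) = k(., 0) = 1.

As an illustration only, the following NumPy sketch applies a simple first-order explicit finite-difference update to piecewise-linear interpolations of two sequences. The function name and the exact update rule are illustrative assumptions on our part, not the scheme shipped in the sigkernel library, which implements its own refined solver with GPU parallelization.

    import numpy as np

    def signature_kernel_goursat(x, y):
        # x: array of shape (m + 1, d); y: array of shape (n + 1, d).
        # Rows are samples of two piecewise-linear paths in R^d.
        dx = np.diff(x, axis=0)          # increments of x, shape (m, d)
        dy = np.diff(y, axis=0)          # increments of y, shape (n, d)
        inner = dx @ dy.T                # <dx_i, dy_j> for every grid cell
        m, n = inner.shape
        k = np.ones((m + 1, n + 1))      # boundary data k(0, .) = k(., 0) = 1
        for i in range(m):
            for j in range(n):
                # explicit first-order update of the Goursat solution on cell (i, j)
                k[i + 1, j + 1] = (k[i + 1, j] + k[i, j + 1] - k[i, j]
                                   + 0.5 * inner[i, j] * (k[i + 1, j] + k[i, j + 1]))
        return k[m, n]                   # approximation of <S(x), S(y)>

Because each grid cell depends only on its left, lower, and lower-left neighbours, all cells on an antidiagonal i + j = const can be updated simultaneously; this is the property that makes such schemes well suited to GPU parallelization.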

Keywords

  1. kernel
  2. path signature
  3. Goursat PDE
  4. sequential data
  5. geometric rough path
  6. rough integration

MSC codes

  1. 60L10
  2. 60L20

Supplementary Material


Index of Supplementary Materials

Title of paper: The Signature Kernel is the solution of a Goursat PDE

Authors: Cristopher Salvi, Thomas Cass, James Foster, Terry Lyons, and Weixin Yang

File: supplement.pdf

Type: PDF

Contents: A short summary of rough path theory.

References

1.
G. E. Andrews, R. Askey, and R. Roy, Special Functions, Encyclopedia Math. Appl. 71, Cambridge University Press, Cambridge, UK, 2001.
2.
I. P. Arribas, G. M. Goodwin, J. R. Geddes, T. Lyons, and K. E. Saunders, A signature-based machine learning model for distinguishing bipolar disorder and borderline personality disorder, Transl. Psychiatry, 8 (2018), pp. 1--7.
3.
F. Bach, R. Jenatton, J. Mairal, and G. Obozinski, Optimization with sparsity-inducing penalties, Found. Trends Mach. Learn., 4 (2012), pp. 1--106.
4.
F. R. Bach, S. Lacoste-Julien, and G. Obozinski, On the equivalence between herding and conditional gradient algorithms, in Proceedings of the 29th International Conference on Machine Learning (ICML), Omnipress, 2012, pp. 1355--1362.
5.
A. Bagnall, J. Lines, A. Bostrom, J. Large, and E. Keogh, The great time series classification bake off: A review and experimental evaluation of recent algorithmic advances, Data Min. Knowl. Discov., 31 (2017), pp. 606--660.
6.
C. Bayer, P. Friz, and J. Gatheral, Pricing under rough volatility, Quant. Finance, 16 (2016), pp. 887--904.
7.
T. Cass and C. Salvi, private communication, 2020.
8.
Y. Chen, M. Welling, and A. Smola, Super-samples from kernel herding, in Proceedings of the 26th Conference on Uncertainty in Artificial Intelligence, AUAI Press, 2010, pp. 109--116.
9.
I. Chevyrev and H. Oberhauser, Signature Moments to Characterize Laws of Stochastic Processes, preprint, https://arxiv.org/abs/1810.10971, 2018.
10.
K. Chouk and M. Gubinelli, Rough Sheets, preprint, https://arxiv.org/abs/1406.7748, 2014.
11.
T. Cochrane, P. Foster, V. Chhabra, M. Lemercier, C. Salvi, and T. Lyons, SK-Tree: A Systematic Malware Detection Algorithm on Streaming Trees via the Signature Kernel, preprint, https://arxiv.org/abs/2102.07904, 2021.
12.
CoRoPa: Computational Rough Paths, software library, 2010, http://coropa.sourceforge.net/.
13.
F. Cosentino, H. Oberhauser, and A. Abate, A Randomized Algorithm to Reduce the Support of Discrete Measures, preprint, https://arxiv.org/abs/2006.01757, 2020.
14.
C. Cuchiero, W. Khosrawi, and J. Teichmann, A generative adversarial network approach to calibration of local stochastic volatility models, Risks, 8 (2020), 101.
15.
M. Cuturi, Fast global alignment kernels, in Proceedings of the 28th International Conference on Machine Learning (ICML-11), Omnipress, 2011, pp. 929--936.
16.
J. T. Day, A Runge-Kutta method for the numerical solution of the Goursat problem in hyperbolic partial differential equations, Comput. J., 9 (1966), pp. 81--83.
17.
T. S. Furey, N. Cristianini, N. Duffy, D. W. Bednarski, M. Schummer, and D. Haussler, Support vector machine classification and validation of cancer tissue samples using microarray expression data, Bioinform., 16 (2000), pp. 906--914.
18.
J. Gatheral, T. Jaisson, and M. Rosenbaum, Volatility is rough, Quant. Finance, 18 (2018), pp. 933--949.
19.
E. Goursat, A Course in Mathematical Analysis: Part 2. Differential Equations, Vol. 2, Dover, 1916.
20.
T. Hofmann, B. Schölkopf, and A. J. Smola, Kernel methods in machine learning, Ann. Statist., 36 (2008), pp. 1171--1220.
21.
W. Huang, Y. Nakamori, and S.-Y. Wang, Forecasting stock market movement direction with support vector machine, Comput. Oper. Res., 32 (2005), pp. 2513--2522.
22.
P. Kidger, P. Bonnier, I. Perez Arribas, C. Salvi, and T. Lyons, Deep signature transforms, in Advances in Neural Information Processing Systems, Vol. 32, 2019, pp. 3099--3109.
23.
P. Kidger, J. Foster, X. Li, H. Oberhauser, and T. Lyons, Neural SDEs as Infinite-Dimensional GANs, preprint, https://arxiv.org/abs/2102.03657, 2021.
24.
P. Kidger and T. Lyons, Signatory: Differentiable Computations of the Signature and Logsignature Transforms, on Both CPU and GPU, preprint, https://arxiv.org/abs/2001.00706, 2020; see also https://github.com/patrick-kidger/signatory.
25.
F. J. Király and H. Oberhauser, Kernels for Sequentially Ordered Data, preprint, https://arxiv.org/abs/1601.08169, 2016.
26.
F. J. Király and H. Oberhauser, Kernels for sequentially ordered data, J. Mach. Learn. Res., 20 (2019), 31.
27.
M. Lees, The Goursat problem, J. Soc. Indust. Appl. Math., 8 (1960), pp. 518--530, https://doi.org/10.1137/0108036.
28.
M. Lemercier, C. Salvi, T. Damoulas, E. V. Bonilla, and T. Lyons, Distribution Regression for Continuous-Time Processes via the Expected Signature, preprint, https://arxiv.org/abs/2006.05805, 2020.
29.
X. Li, T.-K. L. Wong, R. T. Chen, and D. K. Duvenaud, Scalable gradients and variational inference for stochastic differential equations, in Symposium on Advances in Approximate Bayesian Inference, PMLR, 2020, pp. 1--28.
30.
C. Litterer and T. Lyons, High order recombination and an application to cubature on Wiener space, Ann. Appl. Probab., 22 (2012), pp. 1301--1327.
31.
T. Lyons, Rough paths, signatures and the modelling of functions on streams, in International Congress of Mathematicians, Seoul, 2014.
32.
T. Lyons, M. Caruana, and T. Lévy, Differential equations driven by rough paths, Ecole d'été de Probabilités de Saint-Flour XXXIV, Lectures from the 34th Summer School on Probability Theory held in Saint-Flour, Springer, 2004, pp. 1--93.
33.
T. Lyons, S. Nejad, and I. Perez Arribas, Numerical method for model-free pricing of exotic derivatives in discrete time using rough path signatures, Appl. Math. Finance, 26 (2019), pp. 583--597.
34.
T. Lyons and N. Victoir, An extension theorem to rough paths, Ann. Inst. H. Poincaré Anal. Non Linéaire, 24 (2007), pp. 835--847.
35.
T. J. Lyons, Differential equations driven by rough signals, Rev. Mat. Iberoamericana, 14 (1998), pp. 215--310.
36.
P. Moore, T. Lyons, and J. Gallacher for the Alzheimer's Disease Neuroimaging Initiative, Using path signatures to predict a diagnosis of Alzheimer's disease, PloS ONE, 14 (2019), e0222212.
37.
J. H. Morrill, A. Kormilitzin, A. J. Nevado-Holgado, S. Swaminathan, S. D. Howison, and T. J. Lyons, Utilization of the signature method to identify the early onset of sepsis from multivariate physiological time series in critical care monitoring, Critical Care Medicine, 48 (2020), pp. e976--e981.
38.
A. D. Polyanin and V. E. Nazaikinskii, Handbook of Linear Partial Differential Equations for Engineers and Scientists, 2nd ed., Chapman and Hall/CRC, 2015.
39.
J. Reizenstein and B. Graham, The iisignature Library: Efficient Calculation of Iterated-Integral Signatures and Log Signatures, preprint, https://arxiv.org/abs/1802.08252, 2018.
40.
N. I. Sapankevych and R. Sankar, Time series prediction using support vector machines: A survey, IEEE Comput. Intell. Mag., 4 (2009), pp. 24--38.
41.
M. Schmidt, N. L. Roux, and F. R. Bach, Convergence rates of inexact proximal-gradient methods for convex optimization, in Advances in Neural Information Processing Systems, 2011, pp. 1458--1466.
42.
J. Shawe-Taylor and N. Cristianini, Kernel Methods for Pattern Analysis, Cambridge University Press, 2004.
43.
A. Smola, A. Gretton, L. Song, and B. Schölkopf, A Hilbert space embedding for distributions, in International Conference on Algorithmic Learning Theory, Springer, 2007, pp. 13--31.
44.
R. Tavenard, J. Faouzi, G. Vandewiele, F. Divo, G. Androz, C. Holtz, M. Payne, R. Yurchak, M. Rußwurm, K. Kolar, and E. Woods, Tslearn, a machine learning toolkit for time series data, J. Mach. Learn. Res., 21 (2020), pp. 1--6, http://jmlr.org/papers/v21/20-091.html.
45.
S. Tong and E. Chang, Support vector machine active learning for image retrieval, in Proceedings of the 9th ACM International Conference on Multimedia, ACM, 2001, pp. 107--118.
46.
S. Tong and D. Koller, Support vector machine active learning with applications to text classification, J. Mach. Learn. Res., 2 (2001), pp. 45--66.
47.
C. Toth and H. Oberhauser, Bayesian learning from sequential data using Gaussian processes with signature covariances, in Proceedings of the 37th International Conference on Machine Learning (ICML), 2020, pp. 9548--9560.
48.
B. Tzen and M. Raginsky, Neural Stochastic Differential Equations: Deep Latent Gaussian Models in the Diffusion Limit, preprint, https://arxiv.org/abs/1905.09883, 2019.
49.
V. Vapnik, The support vector method of function estimation, in Nonlinear Modeling, Springer, 1998, pp. 55--85.
50.
A. M. Wazwaz, On the numerical solution for the Goursat problem, Appl. Math. Comput., 59 (1993), pp. 89--95.
51.
W. Yang, T. Lyons, H. Ni, C. Schmid, and L. Jin, Developing the Path Signature Methodology and Its Application to Landmark-Based Human Action Recognition, preprint, https://arxiv.org/abs/1707.03993, 2017.

Information

Published In

SIAM Journal on Mathematics of Data Science
Pages: 873 - 899
ISSN (online): 2577-0187

History

Submitted: 14 September 2020
Accepted: 3 June 2021
Published online: 9 September 2021

Funding Information

Alan Turing Institute (https://doi.org/10.13039/100012338): EP/N510129/1
Engineering and Physical Sciences Research Council (https://doi.org/10.13039/501100000266): EP/S026347/1, EP/R513295/1
