By Sebastian Cassel, Head of Valuation Model Risk, BNP Paribas
A core application of tensor networks is function approximation. Tensors can be interpreted as multi-dimensional data structures that generalise vectors (V_i) and matrices (M_ij) to objects with an arbitrary number of indices (T_i…k). Tensor networks then correspond to tensor product expressions in which certain indices are contracted, i.e. connected and summed, between terms.
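For instance, as a minimal illustration (the specific decomposition and rank symbols below are my own choice of example, not taken from the article), a third-order tensor in tensor-train form is built from three factors joined by two contracted indices:

$$
T_{ijk} \;\approx\; \sum_{a=1}^{r_1} \sum_{b=1}^{r_2} A_{ia}\, B_{ajb}\, C_{bk},
$$

where the summed indices a and b are the connections between neighbouring terms, and the ranks r_1, r_2 control the accuracy of the approximation.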
The decomposition of high-dimensional functions into products of lower-dimensional functions is the foundation of tensor network methods. The nodal functions may be deduced by regression, but constructive algorithms also exist for certain network structures. Tensor networks can support interpolation with respect to line (or hypersurface) constraints, instead of just point constraints.
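One common concrete form (the functional tensor-train format; the notation here is my own sketch rather than the article's) decomposes a d-dimensional function as

$$
f(x_1, \dots, x_d) \;\approx\; G_1(x_1)\, G_2(x_2) \cdots G_d(x_d),
$$

where each G_k(x_k) is matrix-valued of size r_{k-1} × r_k (with r_0 = r_d = 1), so the product collapses to a scalar. These G_k are the nodal functions that can be fitted by regression or built constructively, e.g. by cross-approximation schemes.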
Decompositions may otherwise be obtained by discretising integral relations: applying a quadrature rule to an integral representation of a function directly yields a sum of separable terms.
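A standard example of this route (my own illustration, not drawn from the article): for x, y > 0,

$$
\frac{1}{x+y} \;=\; \int_0^{\infty} e^{-t(x+y)}\, dt \;\approx\; \sum_{k=1}^{n} w_k\, e^{-t_k x}\, e^{-t_k y},
$$

so discretising the integral with nodes t_k and weights w_k immediately produces a rank-n separable approximation of the two-dimensional function.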
Once a fully separable representation is formed, e.g. f(x,y,z) ~ Σ_a A_a(x) B_a(y) C_a(z), the approximants can be efficiently integrated or differentiated using one-dimensional methods. Furthermore, the computational complexity can be controlled to scale linearly in the number of dimensions instead of exponentially.
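As a minimal sketch of why separability helps (the rank-2 factors below are hypothetical stand-ins for a fitted decomposition, not the article's method), the triple integral of such an approximant over the unit cube reduces to products of one-dimensional quadratures:

```python
import numpy as np

# One-dimensional Gauss-Legendre rule, mapped from [-1, 1] to [0, 1].
nodes, weights = np.polynomial.legendre.leggauss(20)
x = 0.5 * (nodes + 1.0)
w = 0.5 * weights

# Hypothetical rank-2 separable factors A_a, B_a, C_a (illustrative only).
A = [np.sin, np.cos]
B = [np.exp, lambda t: t**2]
C = [lambda t: 1.0 + t, np.sqrt]

# The 3-d integral factorises term by term into 1-d quadratures: each
# factor costs O(n) function evaluations, so the total work grows
# linearly with the number of dimensions rather than exponentially.
integral = sum((w @ a(x)) * (w @ b(x)) * (w @ c(x)) for a, b, c in zip(A, B, C))
print(f"Separable quadrature estimate: {integral:.6f}")
```

Differentiation factorises the same way: differentiating one factor at a time gives partial derivatives without ever touching a high-dimensional grid.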
"Tensor networks can support interpolation with respect to line (or hypersurface) constraints, instead of just point constraints."
Tensor network methods are generally useful for numerically solving differential or integral equations, as well as for calculating expectations and associated parameter sensitivities. The figure and table below demonstrate results for d-dimensional European basket option valuation under the Black-Scholes model:

[Figure and table: d-dimensional European basket option valuation results under the Black-Scholes model; not reproduced here]
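To make the expectation claim concrete (a sketch under assumptions of my own: independent standard normal risk factors and a rank-1 separable integrand, far simpler than a basket option payoff), the expectation of a separable function factorises into one-dimensional Gauss-Hermite quadratures, so the cost grows linearly with the dimension d:

```python
import numpy as np

# Gauss-Hermite rule transformed to integrate against the N(0, 1) density.
t, w = np.polynomial.hermite.hermgauss(30)
x = np.sqrt(2.0) * t
p = w / np.sqrt(np.pi)

d = 10
c = np.linspace(0.1, 0.5, d)  # hypothetical per-dimension loadings

# E[prod_k exp(c_k X_k)] = prod_k E[exp(c_k X_k)] for independent X_k,
# so a d-dimensional expectation costs only d one-dimensional quadratures.
estimate = np.prod([p @ np.exp(ck * x) for ck in c])
exact = np.exp(0.5 * np.sum(c**2))  # since E[exp(cX)] = exp(c^2 / 2)
print(f"quadrature: {estimate:.10f}  exact: {exact:.10f}")
```

Parameter sensitivities follow the same pattern: a parameter that enters only one factor only changes that factor's one-dimensional quadrature.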
Tensor network convergence properties are problem- and network-specific, but compelling advantages over traditional methods can emerge.