Academic Log
January 2024
Building on the groundwork laid in the preceding month, I delved deeper into the works discovered previously. In addition, I began drafting a manuscript that aims to bridge the weighted function spaces of [1] with the classical theory of universal kernels. The contribution is modest, but I believe it is pertinent and genuinely interesting.
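As a point of reference, here is the standard definition, recalled for context rather than anything specific to the manuscript: a continuous kernel $k$ on a compact metric space $\mathcal{X}$, with reproducing kernel Hilbert space $\mathcal{H}_k$, is called universal if $\mathcal{H}_k$ is dense in the continuous functions with respect to the uniform norm,

$$ \overline{\mathcal{H}_k}^{\,\|\cdot\|_\infty} \;=\; C(\mathcal{X}). $$

Roughly speaking, the bridge mentioned above concerns what becomes of this density statement when $C(\mathcal{X})$ is replaced by a weighted function space in the sense of [1].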
My exploration remained motivated by the goal of applying these theoretical constructs to optimal stopping problems, so I continued investigating adapted topologies, the higher rank signatures of [2], and the Wasserstein spaces of [3]. These works resonate strongly with [1]. At the core of all of them lies the idea of modifying the underlying topology of a space: strengthening it, so that certain mappings become continuous, or weakening it, so as to gain other desirable properties such as compactness of balls. With enough ingenuity one can strike the right balance and take advantage of both scenarios; a sketch of the strengthening direction follows below.
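To make the strengthening direction concrete, here is a hedged sketch in my own notation (conventions may differ from [2] and [3]). For laws $\mu, \nu$ of discrete-time processes, the adapted Wasserstein distance restricts the couplings appearing in the usual Wasserstein distance to the bicausal ones,

$$ \mathcal{AW}_p(\mu, \nu) \;=\; \inf_{\pi \in \mathrm{Cpl}_{\mathrm{bc}}(\mu, \nu)} \left( \int d(x, y)^p \, \mathrm{d}\pi(x, y) \right)^{1/p}, $$

where $\mathrm{Cpl}_{\mathrm{bc}}(\mu, \nu)$ denotes the couplings under which the mass sent from $(x_1, \dots, x_t)$ may not depend on the future coordinates of the other marginal, and vice versa. The induced topology is strictly stronger than the weak topology, and under suitable assumptions the value of an optimal stopping problem is (Lipschitz) continuous with respect to it, which is precisely the kind of continuity gain alluded to above.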
Admittedly, January afforded me the luxury of indulging in mathematical exploration, allowing me to immerse myself fully in the intricacies of the subject matter at hand.
Main works of January:
- Cuchiero, Christa, Philipp Schmocker, and Josef Teichmann. ‘Global Universal Approximation of Functional Input Maps on Weighted Spaces’. arXiv, 21 September 2023. link.
- Bonnier, Patric, Chong Liu, and Harald Oberhauser. ‘Adapted Topologies and Higher Rank Signatures’. arXiv, 13 April 2021. link.
- Bartl, Daniel, Mathias Beiglböck, and Gudmund Pammer. ‘The Wasserstein Space of Stochastic Processes’. arXiv, 11 August 2023. link.
- Eder, Manu. ‘Compactness in Adapted Weak Topologies’. arXiv, 2019. link.
- Pammer, Gudmund. ‘A Note on the Adapted Weak Topology in Discrete Time’. arXiv, 2022. link.
December 2023
Early December marked the discovery of a pivotal paper, [1]. Without delving excessively into theoretical intricacies, it shows that the well-established universality property of signatures, namely the density of linear functionals of the signature among continuous functions on compact sets of paths, admits a significant generalization: by working with weighted spaces, one can approximate certain functions on the entire path space. Intuitively, a weighted space is a topological space together with a weight function that regulates the growth of functions outside the compact sets of that space. This discovery resonated deeply with me and immediately had me contemplating potential applications in various contexts; a sketch of the definition is given below.
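To fix ideas, here is a hedged sketch of the notion as I currently understand it from [1] (normalization conventions aside). A weighted space is a pair $(X, \psi)$, where $X$ is a completely regular Hausdorff topological space and $\psi \colon X \to [1, \infty)$ is an admissible weight function, meaning that every sublevel set $K_R := \{ x \in X : \psi(x) \le R \}$ is compact. The associated weighted function space is the closure of the bounded continuous functions in the weighted supremum norm,

$$ \mathcal{B}_\psi(X) \;:=\; \overline{C_b(X)}^{\,\|\cdot\|_\psi} \;\subseteq\; \Big\{ f \colon X \to \mathbb{R} \;:\; \|f\|_\psi := \sup_{x \in X} \tfrac{|f(x)|}{\psi(x)} < \infty \Big\}, $$

so the weight $\psi$ controls how quickly functions in $\mathcal{B}_\psi(X)$ may grow outside the compact sets $K_R$.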
On another front, I did some work on optimal stopping problems, paying special attention to [2] and [3]. The former operates on the space of stopped rough paths, while the latter works on the space of naturally adapted processes. My engagement with [3] introduced me to Wasserstein spaces and adapted topologies, which are, in essence, stronger topologies on filtered processes that incorporate the information embedded in their filtrations. Simultaneously, I continued exploring kernel theory, attempting proofs and noticing the emergence of novel ideas with promising potential. For now, I won't go into more detail.
Main works of December:
- Cuchiero, Christa, Philipp Schmocker, and Josef Teichmann. ‘Global Universal Approximation of Functional Input Maps on Weighted Spaces’. arXiv, 21 September 2023. link.
- Bayer, Christian, Luca Pelizzari, and John Schoenmakers. ‘Primal and Dual Optimal Stopping with Signatures’. arXiv, 6 December 2023. link.
- Horvath, Blanka, Maud Lemercier, Chong Liu, Terry Lyons, and Cristopher Salvi. ‘Optimal Stopping via Distribution Regression: A Higher Rank Signature Approach’. arXiv, 3 April 2023. link.
- Bartl, Daniel, Mathias Beiglböck, and Gudmund Pammer. ‘The Wasserstein Space of Stochastic Processes’. arXiv, 11 August 2023. link.
- Bonnier, Patric, Chong Liu, and Harald Oberhauser. ‘Adapted Topologies and Higher Rank Signatures’. arXiv, 13 April 2021. link.
- Backhoff-Veraguas, Julio, Daniel Bartl, Mathias Beiglböck, and Manu Eder. ‘All Adapted Topologies Are Equal’. Probability Theory and Related Fields 178, no. 3–4 (December 2020): 1125–72. link.
November 2023
In light of the previous month's discussions, I made a strategic decision to shift my focus and discontinue the exploration of the Neural Tangent Kernel (NTK) of controlled ResNets. I remained committed, however, to working with signatures in the realm of kernel methods, and a significant portion of my effort went into deepening my understanding of signature kernels through the relevant literature, in particular [1], [2], and [4].
In the course of this reading I encountered the concepts of universality and characteristicness within the broader scope of kernel learning. Through [5] I also came across kernel scoring rules and the associated notion of strict properness, which led me to work on what I believe is a more straightforward proof of the strict properness of the scoring rule used in [5].
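For reference, a hedged sketch of the objects involved (sign and normalization conventions may differ from [5]). Given a bounded measurable kernel $k$, the kernel score of a forecast $P$ against an observation $y$ is

$$ S_k(P, y) \;=\; \tfrac{1}{2}\, \mathbb{E}_{X, X' \sim P}\big[ k(X, X') \big] \;-\; \mathbb{E}_{X \sim P}\big[ k(X, y) \big], $$

and a short computation shows that $\mathbb{E}_{Y \sim Q}[S_k(P, Y)] - \mathbb{E}_{Y \sim Q}[S_k(Q, Y)] = \tfrac{1}{2}\, \mathrm{MMD}_k(P, Q)^2$. Hence, modulo integrability issues, the score is strictly proper exactly when the maximum mean discrepancy separates probability measures, that is, when $k$ is characteristic.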
In parallel, I found myself pondering the reproducing kernel Hilbert space (RKHS) associated with the signature kernel, which added a layer of depth to my exploration of kernel methods. Moreover, November proved to be a fruitful month for expanding my knowledge of analysis in Banach spaces, enriching the mathematical foundations underpinning my research.
Main works of November:
- Király, Franz J., and Harald Oberhauser. ‘Kernels for Sequentially Ordered Data’. arXiv, 29 January 2016. link.
- Lee, Darrick, and Harald Oberhauser. ‘The Signature Kernel’. arXiv, 8 May 2023. link.
- Salvi, Cristopher, Thomas Cass, James Foster, Terry Lyons, and Weixin Yang. ‘The Signature Kernel Is the Solution of a Goursat PDE’. SIAM Journal on Mathematics of Data Science 3, no. 3 (January 2021): 873–99. link.
- Cass, Thomas, Terry Lyons, and Xingcheng Xu. ‘General Signature Kernels’. arXiv, 1 July 2021. link.
- Issa, Zacharia, Blanka Horvath, Maud Lemercier, and Cristopher Salvi. ‘Non-Adversarial Training of Neural SDEs with Signature Kernel Scores’. arXiv, 25 May 2023. link.
- Steinwart, Ingo, and Johanna F. Ziegel. ‘Strictly Proper Kernel Scores and Characteristic Kernels on Compact Spaces’. Applied and Computational Harmonic Analysis 51 (1 March 2021): 510–42. link.
October 2023
In the initial part of October, my primary focus was on the applications of path signatures to portfolio optimization [1]. It was during this exploration that I became acquainted with the universality property of signatures, a concept likely to feature prominently in my thesis. Despite lacking a clear thesis direction, and fueled by curiosity, I delved into Stochastic Portfolio Theory and signature-based methods within this domain, aiming to grasp their implications and potential advantages [2].
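For later reference, here is a hedged statement of the universality property as I understand it (precise hypotheses on the path space aside). On a compact set $K$ of, say, time-augmented bounded-variation paths, linear functionals of the signature are uniformly dense in the continuous functions: for every $f \in C(K)$ and $\varepsilon > 0$ there is a (truncated) linear functional $\ell$ with

$$ \sup_{x \in K} \big| f(x) - \langle \ell, \mathrm{Sig}(x) \rangle \big| \;<\; \varepsilon. $$

This follows from the Stone-Weierstrass theorem, since linear functionals of the signature separate points and form a subalgebra thanks to the shuffle identity, which rewrites products of such functionals as linear functionals again.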
However, a significant shift occurred as my attention turned towards kernel learning. Being introduced to the Neural Tangent Kernel (NTK) was a pivotal moment and prompted further exploration of neural signature kernels [3]. In an effort to deepen my comprehension, I undertook the derivation of an explicit expression for the NTK of the controlled ResNets studied in [3], and concurrently delved into the tensor programs framework [4,5]. The direct derivation led to intricate recursions, while the tensor programs approach left me with reservations about some of the arguments presented. Seeking clarification, I reached out to the principal author of [4] and learned that progress had been made on deriving an NTK expression, with certain technicalities yet to be resolved. It is also worth mentioning that most things related to deep learning and kernel learning were entirely new to me, which made this month particularly enriching in terms of knowledge acquisition.
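As a brief recap of the central object (the generic definition, not the controlled-ResNet-specific expression I was after): for a parametric model $f(\cdot\,; \theta)$, the empirical neural tangent kernel is

$$ \Theta_\theta(x, x') \;=\; \big\langle \nabla_\theta f(x; \theta),\; \nabla_\theta f(x'; \theta) \big\rangle, $$

and the results in [3,4] concern the limit of such kernels as the width (and, for controlled ResNets, also the depth) tends to infinity, where the kernel becomes deterministic under suitable parameter scalings.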
Main works of October:
- Futter, Owen, Blanka Horvath, and Magnus Wiese. ‘Signature Trading: A Path-Dependent Extension of the Mean-Variance Framework with Exogenous Signals’. SSRN Scholarly Paper. Rochester, NY, 24 August 2023. link.
- Cuchiero, Christa, and Janka Möller. ‘Signature Methods in Stochastic Portfolio Theory’. arXiv, 3 October 2023. link.
- Cirone, Nicola Muca, Maud Lemercier, and Cristopher Salvi. ‘Neural Signature Kernels as Infinite-Width-Depth-Limits of Controlled ResNets’. arXiv, 4 June 2023. link.
- Yang, Greg. ‘Tensor Programs II: Neural Tangent Kernel for Any Architecture’. arXiv, 29 November 2020. link.
- Yang, Greg. ‘Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture Are Gaussian Processes’. arXiv, 8 May 2021. link.
- Karatzas, Ioannis, and Robert Fernholz. ‘Stochastic Portfolio Theory: An Overview’. In Handbook of Numerical Analysis, 15:89–167. Elsevier, 2009. link.
September 2023
I initiated the process of selecting supervision for my thesis this month. By the end of September, I had confirmed my academic team, consisting of Prof. Fenghui Yu as primary supervisor and Prof. Christa Cuchiero as co-supervisor. Despite the lack of a clearly defined thesis direction, I expressed a keen interest in exploring something related to Rough Path Theory, specifically path signatures. The allure of signatures, discovered in the previous semester, stemmed from their blend of Analysis and Algebra, a compelling intersection that resonated with my academic pursuits. Beyond their mathematical elegance, signatures also offer many practical applications in fields such as Machine Learning and Financial Mathematics. Limited time during the month allowed for only light reading, but this did not hinder progress, as I found myself comfortably ahead of schedule.
