The juxtaposition of "machine learning" and "pure mathematics and theoretical physics" may at first appear a contradiction in terms. The rigour of proofs and derivations in the latter seems to reside in a different world from the randomness of data and statistics in the former. Yet an often under-appreciated component of mathematical discovery, typically absent from the final draft, is experimentation: both with ideas and with mathematical data. Think of the teenage Gauss, who conjectured the Prime Number Theorem by plotting the prime-counting function, about a century before complex analysis was developed far enough to furnish a proof.
Can modern technology in part mimic Gauss's intuition? The past five years have seen an explosion of activity in using AI to assist the human mind in uncovering new mathematics: finding patterns, accelerating computations, and raising conjectures via the machine learning of pure, noiseless data. The aim of this book, the first of its kind, is to collect research and survey articles from experts in this emerging dialogue between theoretical mathematics and machine learning. It does not dwell on the well-known multitude of mathematical techniques underlying deep learning, but focuses on the reverse relationship: how machine learning helps with mathematics. Taking a panoramic approach, the topics range from combinatorics to number theory, and from geometry to quantum field theory and string theory. Aimed at PhD students as well as seasoned researchers, each self-contained chapter offers a glimpse of an exciting future for this symbiosis.
Sample Chapter(s): Preface
Contents:
- Machine Learning Meets Number Theory: The Data Science of Birch–Swinnerton-Dyer (Laura Alessandretti, Andrea Baronchelli and Yang-Hui He)
- On the Dynamics of Inference and Learning (David S Berman, Jonathan J Heckman and Marc Klinger)
- Machine Learning: The Dimension of a Polytope (Tom Coates, Johannes Hofscheier and Alexander M Kasprzyk)
- Intelligent Explorations of the String Theory Landscape (Andrei Constantin)
- Deep Learning: Complete Intersection Calabi–Yau Manifolds (Harold Erbin and Riccardo Finotello)
- Deep-Learning the Landscape (Yang-Hui He)
- hep-th (Yang-Hui He, Vishnu Jejjala and Brent D Nelson)
- Symmetry-via-Duality: Invariant Neural Network Densities from Parameter-Space Correlators (Anindita Maiti, Keegan Stoner and James Halverson)
- Supervised Learning of Arithmetic Invariants (Thomas Oliver)
- Calabi–Yau Volumes, Reflexive Polytopes and Machine Learning (Rak-Kyeong Seong)
Readership: Researchers in mathematics, theoretical physics, and machine learning who are interested in the interactions between these fields, and students curious about how AI can help with research in mathematics.
Professor Yang-Hui He is a Fellow of the London Institute for Mathematical Sciences, Professor of Mathematics at City, University of London, Lecturer in Mathematics at Merton College, Oxford, and Chang-Jiang Chair of Physics at Nankai University in China. He obtained his BA at Princeton (summa cum laude, Shenstone Prize and Kusaka Prize), MA at Cambridge (Distinction, Tripos), and PhD at MIT. After a postdoc at the University of Pennsylvania, he joined Oxford as the FitzJames Fellow and an STFC Advanced Fellow. He works at the interface of string theory, algebraic and combinatorial geometry, and machine learning.
Professor He is the Editor-in-Chief of the International Journal of Data Science in the Mathematical Sciences (World Scientific), and has over 200 journal publications and invited chapters. He is the author of the recent textbook The Calabi–Yau Landscape: From Geometry, to Physics, to Machine Learning (Springer, LNM, 2021) and co-editor, with the Nobel Laureate C N Yang and M-L Ge, of Topology and Physics (World Scientific, 2019; listed by BookAuthority as one of the top 100 quantum field theory books of all time).