Connections between Neural Networks and Pure Mathematics

In the article Connections between Neural Networks and Pure Mathematics, I argue that a few powerful theorems about nested (composite) functions, proved by Kolmogorov (1957), Arnold (1958), and Sprecher (1965), help explain why neural networks can represent almost any process found in nature.
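
For reference, the key result, the Kolmogorov–Arnold representation (superposition) theorem, says that every continuous function of n variables on the unit cube can be built from sums and compositions of continuous functions of a single variable. The statement below is the standard textbook form, not a quotation from the article:

```latex
f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
```

The resemblance to a feed-forward network is suggestive: the inner univariate functions \phi_{q,p} play the role of a first hidden layer, and the outer functions \Phi_q of a second one.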

Deep Learning Explainability: Hints from Physics

In the article Deep Learning Explainability: Hints from Physics, I show that deep learning and the renormalization group of theoretical physics are deeply interconnected. More specifically, I describe in some detail a recent article showing that deep neural networks seem to “mimic” the successive “zooming out” that characterizes the renormalization group procedure.
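
As a toy illustration of that “zooming out” step (a sketch of my own, not code from the article I discuss), here is one round of block-spin decimation on a 2D Ising configuration, the basic coarse-graining move of real-space renormalization; the function name block_spin and the majority rule are my choices for the example:

```python
import numpy as np

def block_spin(config, b=2):
    """One real-space RG step: replace each b x b block of +1/-1 spins
    by the sign of its majority, shrinking the lattice by a factor of b."""
    L = config.shape[0]
    assert L % b == 0, "lattice size must be divisible by the block size"
    block_sums = config.reshape(L // b, b, L // b, b).sum(axis=(1, 3))
    coarse = np.sign(block_sums)
    ties = coarse == 0
    coarse[ties] = np.random.choice([-1, 1], size=ties.sum())  # break ties at random
    return coarse

# Example: coarse-grain a random 8x8 configuration down to 4x4.
spins = np.random.choice([-1, 1], size=(8, 8))
print(block_spin(spins))
```

Iterating this map discards short-distance detail while keeping large-scale structure, which is the sense in which deep networks are said to “zoom out” layer by layer.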

Neural Quantum States

In the article Neural Quantum States, I discuss some recent research at the interface between machine learning and theoretical physics. I describe how Restricted Boltzmann Machines (RBMs), building blocks of deep neural networks, can be used to compute, with extremely high accuracy, the lowest-energy state of many-particle quantum systems, among other things.
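
To make this concrete, here is a minimal real-valued sketch of my own (actual neural quantum states typically use complex parameters) of how an RBM assigns an amplitude to a spin configuration once the hidden units are summed out analytically:

```python
import numpy as np

def rbm_amplitude(s, a, b, W):
    """Unnormalized RBM wavefunction psi(s) for spins s in {-1, +1}^N,
    with visible biases a, hidden biases b, and weights W. Tracing out
    the hidden units gives the closed form
    psi(s) = exp(a . s) * prod_j 2 cosh(b_j + sum_i s_i W_ij)."""
    theta = b + s @ W              # effective field felt by each hidden unit
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# Example: amplitude of one configuration of 4 spins with 3 hidden units.
rng = np.random.default_rng(1)
N, M = 4, 3
a = rng.normal(scale=0.1, size=N)
b = rng.normal(scale=0.1, size=M)
W = rng.normal(scale=0.1, size=(N, M))
s = np.array([1, -1, 1, 1])
print(rbm_amplitude(s, a, b, W))
```

In variational approaches of this kind, the parameters a, b, and W are then tuned to minimize the energy of the state the RBM represents.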

Machine Learning and Particle Motion in Liquids: An Elegant Link

In this article, I argue, based on recent findings, that viewing the stochastic gradient descent algorithm (or mini-batch gradient descent) as a Langevin stochastic process with an extra level of randomization (implemented via the learning rate) helps explain why stochastic gradient descent works so remarkably well as a global optimizer.
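
To make the analogy concrete, here is a toy sketch of my own (quadratic loss, made-up hyperparameters) contrasting a mini-batch SGD step, where the noise comes from sampling the batch, with an overdamped Langevin step, where the noise is injected explicitly and the step size sets an effective temperature:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: loss(w) = mean over the data of 0.5 * (w - x_i)^2,
# so the full-batch gradient is simply w - mean(data).
data = rng.normal(loc=3.0, size=1000)

def minibatch_sgd_step(w, lr, batch_size=10):
    """Mini-batch SGD: sampling the batch makes the gradient noisy."""
    batch = rng.choice(data, size=batch_size)
    return w - lr * (w - batch.mean())

def langevin_step(w, lr, temperature=0.05):
    """Overdamped Langevin dynamics on the full loss:
    w <- w - lr * grad + sqrt(2 * lr * T) * Gaussian noise."""
    grad = w - data.mean()
    return w - lr * grad + np.sqrt(2 * lr * temperature) * rng.normal()

w_sgd = w_lan = 0.0
for _ in range(500):
    w_sgd = minibatch_sgd_step(w_sgd, lr=0.1)
    w_lan = langevin_step(w_lan, lr=0.1)
print(w_sgd, w_lan)  # both fluctuate around the minimum at w = 3.0
```

Seen this way, the learning rate plays the role of a temperature: the noise it controls lets the iterate hop out of poor local minima, which is one intuition for SGD's effectiveness as a global optimizer.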