A Differentiable Programming System to Bridge Machine Learning and Scientific Computing
Scientific computing is increasingly incorporating the advancements in machine learning and the ability to work with large amounts of data. At the same time, machine learning models are becoming increasingly sophisticated and exhibit many features often seen in scientific computing, stressing the capabilities of machine learning frameworks. Just as the disciplines of scientific computing and machine learning have shared common underlying infrastructure in the form of numerical linear algebra, we now have the opportunity to further share new computational infrastructure, and thus ideas, in the form of Differentiable Programming. We describe Zygote, a Differentiable Programming system that is able to take gradients of general program structures. We implement this system in the Julia programming language. Our system supports almost all language constructs (control flow, recursion, mutation, etc.) and compiles high-performance code without requiring any user intervention or refactoring to stage computations. This enables an expressive programming model for deep learning, but more importantly, it enables us to incorporate a large ecosystem of libraries in our models in a straightforward way. We discuss our approach to automatic differentiation, including its support for advanced techniques such as mixed-mode, complex and checkpointed differentiation, and present several examples of differentiating programs.
Forward citations
Cited by 5 Pith papers
- VertAX: a differentiable vertex model for learning epithelial tissue mechanics
  VertAX supplies a differentiable JAX implementation of vertex models for confluent epithelia that enables forward simulation, mechanical parameter inference, and inverse design of tissue-scale behaviors.
- Decision-Focused Federated Learning Under Heterogeneous Objectives and Constraints
  New bounds on SPO+ loss heterogeneity in federated predict-then-optimize with varying objectives and constraints indicate federation benefits when statistical gains exceed heterogeneity costs, with robustness in stron...
- Physics-informed reservoir characterization from bulk and extreme pressure events with a differentiable simulator
  A physics-informed ML method embeds a differentiable flow simulator into neural network training to infer permeability from sparse pressure data, halving inference error versus data-driven baselines across scenarios a...
- Learning Non-Markovian Noise via Ensemble Optimal Control
  Machine learning trains an ensemble optimal control scheme to pick optimal measurement times for non-Markovian quantum noise parameters, reaching near Cramér-Rao bound precision.
- Neural Computers
  Neural Computers are introduced as a new machine form where computation, memory, and I/O are unified in a learned runtime state, with initial video-model experiments showing acquisition of basic interface primitives f...