MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems
Abstract
MXNet is a multi-language machine learning (ML) library designed to ease the development of ML algorithms, especially deep neural networks. Embedded in the host language, it blends declarative symbolic expressions with imperative tensor computation, and offers automatic differentiation to derive gradients. MXNet is computation- and memory-efficient and runs on a range of heterogeneous systems, from mobile devices to distributed GPU clusters. This paper describes both the API design and the system implementation of MXNet, and explains how the embeddings of symbolic expressions and tensor operations are handled in a unified fashion. Our preliminary experiments show promising results on large-scale deep neural network applications using multiple GPU machines.
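The abstract mentions automatic differentiation as the mechanism for deriving gradients from imperative tensor computation. As a rough illustration of the idea (not MXNet's actual implementation), a minimal reverse-mode autodiff can be sketched in plain Python, where each operation records its inputs and local derivatives so gradients can later be propagated backward via the chain rule:

```python
# Minimal sketch of reverse-mode automatic differentiation.
# Illustrative plain Python only; MXNet's real autograd is far more
# general (tensors, many operators, GPU kernels, graph optimization).

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate the incoming gradient, then push it to each
        # parent scaled by the recorded local derivative (chain rule).
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

# Example: z = x*y + x, so dz/dx = y + 1 and dz/dy = x.
x, y = Var(2.0), Var(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # → 4.0 2.0
```

The same recorded-tape principle underlies deep learning frameworks' backward passes, though production systems operate on computation graphs of tensor operators rather than scalars.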
Forward citations
Cited by 3 Pith papers
- MONAI: An open-source framework for deep learning in healthcare — a community-supported PyTorch framework that extends deep learning to medical data with domain-specific architectures, transforms, and deployment tools.
- MediaPipe: A Framework for Building Perception Pipelines — an open-source framework that lets developers assemble, prototype, and deploy ML-based perception pipelines across platforms with reproducible performance measurements.
- A Survey of Large Language Models — a survey of the background, key techniques, and evaluation methods for large language models, emphasizing emergent abilities that appear at large scales.