pith. machine review for the scientific record.

arxiv: 1412.7024 · v5 · submitted 2014-12-22 · 💻 cs.LG · cs.CV · cs.NE

Recognition: unknown

Training deep neural networks with low precision multiplications

Authors on Pith: no claims yet
classification 💻 cs.LG · cs.CV · cs.NE
keywords networks · multiplications · neural · point · precision · training · datasets · deep
read the original abstract

Multipliers are the most space and power-hungry arithmetic operators of the digital implementation of deep neural networks. We train a set of state-of-the-art neural networks (Maxout networks) on three benchmark datasets: MNIST, CIFAR-10 and SVHN. They are trained with three distinct formats: floating point, fixed point and dynamic fixed point. For each of those datasets and for each of those formats, we assess the impact of the precision of the multiplications on the final error after training. We find that very low precision is sufficient not just for running trained networks but also for training them. For example, it is possible to train Maxout networks with 10 bits multiplications.
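The fixed point and dynamic fixed point formats mentioned in the abstract amount to rounding weights and activations to a small number of bits before each multiplication. The sketch below is a minimal illustration of that idea, not the paper's implementation; the function names, the 10-bit width, and the per-tensor choice of radix point are assumptions made for this example.

```python
import numpy as np

def quantize_fixed_point(x, total_bits=10, frac_bits=8):
    """Round an array to a signed fixed-point format with `total_bits` bits,
    `frac_bits` of which are fractional. Out-of-range values are saturated.
    Illustrative only; the bit widths here are hypothetical."""
    scale = 2.0 ** frac_bits
    max_val = (2.0 ** (total_bits - 1) - 1) / scale
    min_val = -(2.0 ** (total_bits - 1)) / scale
    return np.clip(np.round(x * scale) / scale, min_val, max_val)

def quantize_dynamic_fixed_point(x, total_bits=10):
    """Dynamic fixed point: the radix point (a shared per-tensor exponent)
    is picked from the current magnitude of the values, then the tensor is
    rounded with the fixed-point routine above."""
    max_abs = np.max(np.abs(x)) + 1e-12
    # Integer bits needed to cover the largest magnitude; the rest are fractional.
    int_bits = max(0, int(np.ceil(np.log2(max_abs))) + 1)
    frac_bits = max(0, total_bits - 1 - int_bits)
    return quantize_fixed_point(x, total_bits, frac_bits)

# Example: simulate one low-precision multiplication in a forward pass.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 3))   # hypothetical weight matrix
a = rng.normal(size=(3,))                # hypothetical activations
y = quantize_dynamic_fixed_point(W) @ quantize_dynamic_fixed_point(a)
print(y)
```

In this sketch only the multiplication inputs are rounded; the paper's experiments study how far such precision can be lowered during full training, not just inference.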

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. LLM.int8(): 8-bit Matrix Multiplication for Transformers at Scale

cs.LG · 2022-08 · conditional · novelty 7.0

    LLM.int8() performs 8-bit inference for transformers up to 175B parameters with no accuracy loss by combining vector-wise quantization for most features with 16-bit mixed-precision handling of systematic outlier dimensions.

  2. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications

cs.CV · 2017-04 · accept · novelty 7.0

    MobileNets introduce depthwise separable convolutions plus width and resolution multipliers to produce efficient CNNs that trade off latency and accuracy for mobile and embedded vision applications.