pith. machine review for the scientific record.

arxiv: 1805.11801 · v1 · submitted 2018-05-30 · 💻 cs.ET · physics.app-ph


Long short-term memory networks in memristor crossbars

keywords: lstm, memory, memristor, bottleneck, capability, computing, large, long
Original abstract

Recent breakthroughs in recurrent deep neural networks with long short-term memory (LSTM) units have led to major advances in artificial intelligence. State-of-the-art LSTM models with significantly increased complexity and a large number of parameters, however, have a bottleneck in computing power resulting from limited memory capacity and data communication bandwidth. Here we demonstrate experimentally that LSTM can be implemented with a memristor crossbar, which has a small circuit footprint to store a large number of parameters and in-memory computing capability that circumvents the 'von Neumann bottleneck'. We illustrate the capability of our system by solving real-world problems in regression and classification, which shows that memristor LSTM is a promising low-power and low-latency hardware platform for edge inference.
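The core idea in the abstract is that the vector-matrix multiplications inside an LSTM cell map naturally onto an analogue memristor crossbar: weights are stored as conductances, inputs are applied as voltages, and Ohm's and Kirchhoff's laws compute the dot products in place. A minimal sketch of that mapping (not the authors' implementation; all shapes, names, and the `crossbar_vmm` stand-in are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 4, 8  # hypothetical sizes, not from the paper

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate (input, forget, output, candidate), acting on the
# stacked vector [x; h]. In the crossbar, these entries would be conductances.
W = {g: rng.standard_normal((n_hid, n_in + n_hid)) * 0.1 for g in "ifog"}
b = {g: np.zeros(n_hid) for g in "ifog"}

def crossbar_vmm(W_g, v):
    """Digital stand-in for the analogue crossbar: output currents I = G @ V."""
    return W_g @ v

def lstm_step(x, h, c):
    v = np.concatenate([x, h])                       # voltages on crossbar rows
    i = sigmoid(crossbar_vmm(W["i"], v) + b["i"])    # input gate
    f = sigmoid(crossbar_vmm(W["f"], v) + b["f"])    # forget gate
    o = sigmoid(crossbar_vmm(W["o"], v) + b["o"])    # output gate
    g = np.tanh(crossbar_vmm(W["g"], v) + b["g"])    # candidate cell state
    c_new = f * c + i * g                            # cell-state update
    h_new = o * np.tanh(c_new)                       # hidden-state output
    return h_new, c_new

h, c = np.zeros(n_hid), np.zeros(n_hid)
for _ in range(3):                                   # a short input sequence
    x = rng.standard_normal(n_in)
    h, c = lstm_step(x, h, c)
print(h.shape)
```

Because every `crossbar_vmm` call happens where the weights are stored, the crossbar avoids the memory-to-processor data movement that the abstract identifies as the von Neumann bottleneck; only the element-wise gate nonlinearities remain for peripheral circuitry.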

This paper has not been read by Pith yet.
