pith. machine review for the scientific record.

arxiv: 1704.04110 · v3 · submitted 2017-04-13 · 💻 cs.AI · cs.LG · stat.ML

Recognition: unknown

DeepAR: Probabilistic Forecasting with Autoregressive Recurrent Networks

Authors on Pith: no claims yet
classification 💻 cs.AI cs.LG stat.ML
keywords forecasting · probabilistic · right · time · deepar · recurrent · series · accuracy
original abstract

Probabilistic forecasting, i.e. estimating the probability distribution of a time series' future given its past, is a key enabler for optimizing business processes. In retail businesses, for example, forecasting demand is crucial for having the right inventory available at the right time at the right place. In this paper we propose DeepAR, a methodology for producing accurate probabilistic forecasts, based on training an autoregressive recurrent network model on a large number of related time series. We demonstrate how by applying deep learning techniques to forecasting, one can overcome many of the challenges faced by widely-used classical approaches to the problem. We show through extensive empirical evaluation on several real-world forecasting data sets accuracy improvements of around 15% compared to state-of-the-art methods.
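The abstract's core idea — an autoregressive recurrent model that emits a probability distribution at each step, with forecasts obtained by ancestral sampling — can be illustrated with a toy sketch. This is a minimal illustration in pure Python, not the paper's trained model: the recurrence, its fixed weights, and the Gaussian output head are all hypothetical stand-ins for a learned network.

```python
import math
import random

def emit_params(prev, h):
    """One recurrent step: update hidden state from the previous value,
    then emit Gaussian parameters (mu, sigma) for the next value.
    The weights here are hypothetical constants, not learned."""
    h = math.tanh(0.5 * h + 0.8 * prev)
    mu = 0.9 * prev + 0.1 * h
    sigma = 0.1 + 0.05 * abs(h)   # scale kept strictly positive
    return mu, sigma, h

def forecast(history, horizon, n_samples=200, seed=0):
    """Draw Monte Carlo sample paths: each path conditions on the
    observed history, then feeds its own samples back in autoregressively."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_samples):
        h, prev = 0.0, history[-1]
        for x in history:              # condition state on observations
            _, _, h = emit_params(x, h)
        path = []
        for _ in range(horizon):       # ancestral sampling into the future
            mu, sigma, h = emit_params(prev, h)
            prev = rng.gauss(mu, sigma)
            path.append(prev)
        paths.append(path)
    return paths

def quantile(vals, q):
    """Empirical quantile of a list of samples."""
    vals = sorted(vals)
    return vals[int(q * (len(vals) - 1))]

history = [1.0, 1.2, 0.9, 1.1]
paths = forecast(history, horizon=5)
# Per-step forecast intervals recovered from the sample paths.
q10 = [quantile([p[t] for p in paths], 0.1) for t in range(5)]
q90 = [quantile([p[t] for p in paths], 0.9) for t in range(5)]
```

The distributional output is what distinguishes this setup from point forecasting: quantile bands like `q10`/`q90` fall directly out of the sampled paths, which is the property the abstract highlights for inventory-style decisions.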

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 3 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. ParaRNN: An Interpretable and Parallelizable Recurrent Neural Network for Time-Dependent Data

    stat.ML 2026-05 unverdicted novelty 6.0

    ParaRNN decouples RNN dynamics into interpretable additive components, enabling parallelization and nonparametric regression bounds while matching vanilla RNN performance on sequential tasks.

  2. Exploring the Potential of Probabilistic Transformer for Time Series Modeling: A Report on the ST-PT Framework

    cs.LG 2026-04 unverdicted novelty 6.0

    ST-PT turns transformers into explicit factor graphs for time series, enabling structural injection of symbolic priors, per-sample conditional generation, and principled latent autoregressive forecasting via MFVI iterations.

  3. MR-ImagenTime: Multi-Resolution Time Series Generation through Dual Image Representations

    cs.LG 2026-03 unverdicted novelty 4.0

    MR-CDM uses hierarchical multi-resolution decomposition and multi-scale conditional diffusion to generate forecasts that reduce MAE and RMSE by 6-10% versus baselines like CSDI and Informer on four datasets.