pith. machine review for the scientific record.

arxiv: 1612.01452 · v2 · submitted 2016-12-05 · 💻 cs.CV


ImageNet pre-trained models with batch normalization

keywords: models, pre-trained, imagenet, networks, state-of-the-art, alexnet, approaches, architecture
abstract

Convolutional neural networks (CNNs) pre-trained on ImageNet are the backbone of most state-of-the-art approaches. In this paper, we present a new set of pre-trained models with popular state-of-the-art architectures for the Caffe framework. The first release includes Residual Networks (ResNets) with a generation script, as well as the batch-normalization variants of AlexNet and VGG19. All models outperform previous models with the same architecture. The models and training code are available at http://www.inf-cv.uni-jena.de/Research/CNN+Models.html and https://github.com/cvjena/cnn-models.

This paper has not been read by Pith yet.
