pith. machine review for the scientific record.

arxiv: 1511.03908 · v4 · submitted 2015-11-12 · 💻 cs.LG · cs.CV · cs.NE

Recognition: unknown

Learning Human Identity from Motion Patterns

Authors on Pith: no claims yet
classification 💻 cs.LG · cs.CV · cs.NE
keywords authentication · human · identity · kinematics · learning · multi-modal · neural · temporal
read the original abstract

We present a large-scale study exploring the capability of temporal deep neural networks to interpret natural human kinematics and introduce the first method for active biometric authentication with mobile inertial sensors. At Google, we have created a first-of-its-kind dataset of human movements, passively collected by 1500 volunteers using their smartphones daily over several months. We (1) compare several neural architectures for efficient learning of temporal multi-modal data representations, (2) propose an optimized shift-invariant dense convolutional mechanism (DCWRNN), and (3) incorporate the discriminatively-trained dynamic features in a probabilistic generative framework taking into account temporal characteristics. Our results demonstrate that human kinematics convey important information about user identity and can serve as a valuable component of multi-modal authentication systems.
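The DCWRNN mentioned in point (2) builds on the clockwork RNN idea, in which the hidden state is partitioned into modules that update at different fixed clock rates, so slow modules capture long-range temporal structure cheaply. As a rough, hypothetical sketch of the underlying clockwork mechanism only (not the paper's dense shift-invariant variant), with made-up sizes and random weights:

```python
import numpy as np

def cwrnn_step(t, h, x, W_h, W_x, periods, block):
    """One clockwork-RNN step.

    The hidden state h is split into len(periods) blocks of size `block`.
    Block i is recomputed only when t is a multiple of periods[i];
    otherwise its previous value is carried over unchanged.
    """
    h_new = h.copy()
    for i, T in enumerate(periods):
        if t % T == 0:
            rows = slice(i * block, (i + 1) * block)
            h_new[rows] = np.tanh(W_h[rows] @ h + W_x[rows] @ x)
    return h_new

# Toy run: 2 blocks of 2 units, periods 1 (fast) and 2 (slow).
rng = np.random.default_rng(0)
W_h = rng.standard_normal((4, 4)) * 0.1
W_x = rng.standard_normal((4, 3)) * 0.1   # 3-dim input, e.g. an accelerometer sample
h = np.zeros(4)
for t in range(4):
    x = rng.standard_normal(3)
    h_prev = h
    h = cwrnn_step(t, h, x, W_h, W_x, periods=[1, 2], block=2)
    if t % 2 == 1:
        # slow block is held fixed at odd timesteps
        assert np.array_equal(h[2:], h_prev[2:])
```

The paper's DCWRNN densifies this scheme so that updates are shift-invariant in time, which matters when an authentication window may start at an arbitrary point in a continuous sensor stream.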

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. HARMES: A Multi-Modal Dataset for Wearable Human Activity Recognition with Motion, Environmental Sensing and Sound

    cs.LG 2026-05 conditional novelty 7.0

    HARMES is the first large-scale dataset to combine wrist IMU, environmental, and audio sensors for recognizing 15 household activities across over 80 hours of data from 20 participants.