pith. machine review for the scientific record.

arxiv: 1812.01484 · v1 · submitted 2018-12-04 · 💻 cs.LG · stat.ML

Recognition: unknown

Privacy-Preserving Distributed Deep Learning for Clinical Data

classification 💻 cs.LG stat.ML
keywords data · distributed · privacy · training · clinical · deep · information · learning
Abstract

Deep learning with medical data often requires larger sample sizes than are available at single providers. While data sharing among institutions is desirable to train more accurate and sophisticated models, it can lead to severe privacy concerns due to the sensitive nature of the data. This problem has motivated a number of studies on distributed training of neural networks that do not require direct sharing of the training data. However, simple distributed training does not offer provable privacy guarantees to satisfy technical safe standards and may reveal information about the underlying patients. We present a method to train neural networks for clinical data in a distributed fashion under differential privacy. We demonstrate these methods on two datasets that include information from multiple independent sites, the eICU Collaborative Research Database and The Cancer Genome Atlas.
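The core mechanism behind training "under differential privacy," as the abstract describes, is typically DP-SGD: each example's gradient is clipped to a fixed norm, the clipped gradients are averaged, and calibrated Gaussian noise is added before the update. A minimal sketch, assuming per-example gradients are already computed; the function name and parameters are illustrative, not the paper's exact training loop:

```python
import numpy as np

def dp_sgd_step(weights, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.1, lr=0.1, rng=None):
    """One differentially private SGD step (DP-SGD sketch):
    clip each example's gradient to `clip_norm`, average,
    add Gaussian noise scaled to the clipping norm, then update."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Noise standard deviation is proportional to the sensitivity
    # (clip_norm) divided by the batch size.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                       size=mean_grad.shape)
    return weights - lr * (mean_grad + noise)

# Example: two clients' per-example gradients, noise disabled for clarity.
w = np.zeros(3)
grads = [np.array([10.0, 0.0, 0.0]), np.array([0.0, 2.0, 0.0])]
w_new = dp_sgd_step(w, grads, clip_norm=1.0, noise_multiplier=0.0, lr=0.1)
# Both gradients are clipped to unit norm, so the update is -0.1 * [0.5, 0.5, 0].
```

In a distributed setting, each site would compute and clip its local gradients before aggregation, so the noise bounds what any single patient's record can reveal.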

This paper has not been read by Pith yet.


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Distributed Deep Variational Approach for Privacy-preserving Data Release

    cs.CR 2026-05 unverdicted novelty 5.0

    GPP trains local variational encoders in federated settings to release representations that keep utility within 1% of an autoencoder baseline while driving adversary AUC on sensitive attributes to near-random levels o...