Informational Divergence Approximations to Product Distributions
classification
cs.IT, math.IT
keywords
distribution, divergence, information, informational, product, result, accurately, approximate
abstract
The minimum rate needed to accurately approximate a product distribution based on an unnormalized informational divergence is shown to be a mutual information. This result subsumes results of Wyner on common information and Han-Verdú on resolvability. The result also extends to cases where the source distribution is unknown but the entropy is known.
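As a point of reference for the abstract's claim, the (normalized) informational divergence of a joint distribution from the product of its marginals equals the mutual information, I(X;Y) = D(P_XY || P_X P_Y). The sketch below illustrates this identity numerically on an arbitrary example pmf; the distribution and function names are illustrative, not from the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Informational divergence D(p || q) in bits, assuming supp(p) is
    contained in supp(q)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# An example joint pmf P_XY on a 2x2 alphabet (illustrative choice).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)        # marginal of X
p_y = p_xy.sum(axis=0)        # marginal of Y
product = np.outer(p_x, p_y)  # product distribution P_X x P_Y

# Mutual information as the divergence from the product distribution.
mi = kl_divergence(p_xy.ravel(), product.ravel())
```

For this pmf, `mi` evaluates to roughly 0.278 bits; it is zero exactly when X and Y are independent, since then P_XY already is a product distribution.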