TY - GEN
T1 - Flat metric minimization with applications in generative modeling
AU - Möllenhoff, Thomas
AU - Cremers, Daniel
N1 - Publisher Copyright:
© 2019 by the author(s).
PY - 2019
Y1 - 2019
N2 - We take the novel perspective of viewing data not as a probability distribution but as a current. Primarily studied in the field of geometric measure theory, k-currents are continuous linear functionals acting on compactly supported smooth differential forms and can be understood as a generalized notion of an oriented k-dimensional manifold. By moving from distributions (which are 0-currents) to k-currents, we can explicitly orient the data by attaching a k-dimensional tangent plane to each sample point. Based on the flat metric, which is a fundamental distance between currents, we derive FlatGAN, a formulation in the spirit of generative adversarial networks but generalized to k-currents. As our theoretical contribution, we prove that the flat metric between a parametrized current and a reference current is Lipschitz continuous in the parameters. In experiments, we show that the proposed shift to k > 0 leads to interpretable and disentangled latent representations which behave equivariantly to the specified oriented tangent planes.
AB - We take the novel perspective of viewing data not as a probability distribution but as a current. Primarily studied in the field of geometric measure theory, k-currents are continuous linear functionals acting on compactly supported smooth differential forms and can be understood as a generalized notion of an oriented k-dimensional manifold. By moving from distributions (which are 0-currents) to k-currents, we can explicitly orient the data by attaching a k-dimensional tangent plane to each sample point. Based on the flat metric, which is a fundamental distance between currents, we derive FlatGAN, a formulation in the spirit of generative adversarial networks but generalized to k-currents. As our theoretical contribution, we prove that the flat metric between a parametrized current and a reference current is Lipschitz continuous in the parameters. In experiments, we show that the proposed shift to k > 0 leads to interpretable and disentangled latent representations which behave equivariantly to the specified oriented tangent planes.
UR - http://www.scopus.com/inward/record.url?scp=85077959875&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85077959875
T3 - 36th International Conference on Machine Learning, ICML 2019
SP - 8137
EP - 8148
BT - 36th International Conference on Machine Learning, ICML 2019
PB - International Machine Learning Society (IMLS)
T2 - 36th International Conference on Machine Learning, ICML 2019
Y2 - 9 June 2019 through 15 June 2019
ER -