pith. machine review for the scientific record.

arxiv: 1811.01900 · v3 · submitted 2018-11-05 · 💻 cs.LG · stat.ML

Recognition: unknown

Janossy Pooling: Learning Deep Permutation-Invariant Functions for Variable-Size Inputs

Authors on Pith: no claims yet
classification 💻 cs.LG stat.ML
keywords: functions, permutation-invariant, janossy, pooling, consider, function, literature, permutation-sensitive
0 comments
Original abstract

We consider a simple and overarching representation for permutation-invariant functions of sequences (or multiset functions). Our approach, which we call Janossy pooling, expresses a permutation-invariant function as the average of a permutation-sensitive function applied to all reorderings of the input sequence. This allows us to leverage the rich and mature literature on permutation-sensitive functions to construct novel and flexible permutation-invariant functions. If carried out naively, Janossy pooling can be computationally prohibitive. To allow computational tractability, we consider three kinds of approximations: canonical orderings of sequences, functions with $k$-order interactions, and stochastic optimization algorithms with random permutations. Our framework unifies a variety of existing work in the literature, and suggests possible modeling and algorithmic extensions. We explore a few in our experiments, which demonstrate improved performance over current state-of-the-art methods.
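The core idea from the abstract can be illustrated directly: average a permutation-sensitive function over every reordering of the input, and approximate that average with random permutations when the factorial number of reorderings is prohibitive. The sketch below is an assumption-laden illustration, not the authors' code; the function `f` and the sample count are made up for the example.

```python
import itertools
import numpy as np

def janossy_pool(f, x):
    """Exact Janossy pooling: average the permutation-sensitive
    function f over all |x|! reorderings of the input sequence."""
    perms = list(itertools.permutations(x))
    return sum(f(list(p)) for p in perms) / len(perms)

def janossy_pool_sampled(f, x, n_samples=10, rng=None):
    """Tractable estimate: average f over random permutations only,
    in the spirit of the paper's stochastic-optimization variant."""
    rng = rng or np.random.default_rng(0)
    return sum(f(list(rng.permutation(x))) for _ in range(n_samples)) / n_samples

# A toy permutation-SENSITIVE function: weights each position differently.
f = lambda seq: sum((i + 1) * v for i, v in enumerate(seq))

# The exact pooled function is permutation-INVARIANT:
assert janossy_pool(f, [1, 2, 3]) == janossy_pool(f, [3, 1, 2])
```

In practice `f` would be a learned sequence model (e.g. an RNN) rather than a toy weighted sum, and the sampled estimator is what makes training feasible for long inputs.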

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 3 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. How Powerful are Graph Neural Networks?

    cs.LG 2018-10 accept novelty 9.0

    GIN is provably as expressive as the Weisfeiler-Lehman graph isomorphism test, while GCN and GraphSAGE have strictly weaker discriminative power on some graphs.

  2. Neural Operator: Graph Kernel Network for Partial Differential Equations

    cs.LG 2020-03 unverdicted novelty 7.0

    Graph Kernel Networks learn PDE solution operators that generalize across discretization methods and grid resolutions using graph-based kernel integration.

  3. Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges

    cs.LG 2021-04 accept novelty 6.0

    Geometric deep learning provides a unified mathematical framework based on grids, groups, graphs, geodesics, and gauges to explain and extend neural network architectures by incorporating physical regularities.