1 code implementation • 22 Jul 2022 • Rongzhe Wei, Haoteng Yin, Junteng Jia, Austin R. Benson, Pan Li
Graph neural networks (GNNs) have shown superiority in many prediction tasks over graphs due to their impressive capability of capturing nonlinear relations in graph-structured data.
1 code implementation • 23 May 2022 • Kiran Tomlinson, Austin R. Benson
We show that incorporating social network structure can improve the predictions of the standard econometric choice model, the multinomial logit.
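The multinomial logit mentioned here models choice probabilities as a softmax over per-alternative utilities. A minimal sketch of that standard formulation (names and values are illustrative, not from the paper, which extends the model with social network structure):

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit: P(i) = exp(u_i) / sum_j exp(u_j) over the choice set."""
    m = max(utilities)  # subtract the max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Three alternatives with utilities 1.0, 2.0, 0.5: the highest-utility
# alternative gets the largest choice probability.
probs = mnl_probabilities([1.0, 2.0, 0.5])
```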
1 code implementation • NeurIPS 2021 • Nate Veldt, Austin R. Benson, Jon Kleinberg
We develop the first approximation algorithms for this problem, where the approximations can be quickly computed via reduction to a sparse graph cut problem, with graph sparsity controlled by the desired approximation factor.
2 code implementations • 30 Jun 2021 • Abhay Singh, Qian Huang, Sijia Linda Huang, Omkar Bhalerao, Horace He, Ser-Nam Lim, Austin R. Benson
Here, we demonstrate how simply adding a set of edges, which we call a \emph{proposal set}, to the graph as a pre-processing step can improve the performance of several link prediction algorithms.
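One way to picture the pre-processing step: score candidate non-edges and add the top-scoring ones before running any link predictor. The sketch below uses a common-neighbor score as the candidate heuristic, which is an assumption for illustration, not necessarily how the paper builds its proposal sets:

```python
from itertools import combinations

def augment_with_proposal_set(adj, k):
    """Add the k highest-scoring non-edges (common-neighbor score) to the graph
    as a pre-processing step before link prediction."""
    candidates = []
    for u, v in combinations(sorted(adj), 2):
        if v not in adj[u]:
            score = len(adj[u] & adj[v])  # number of common neighbors
            if score > 0:
                candidates.append((score, u, v))
    candidates.sort(reverse=True)
    for _, u, v in candidates[:k]:
        adj[u].add(v)
        adj[v].add(u)
    return adj

# Path graph 0-1-2-3: the two-hop pairs (0,2) and (1,3) are candidates.
g = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
g = augment_with_proposal_set(g, 1)
```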
Ranked #1 on Link Property Prediction on ogbl-ddi
1 code implementation • 6 Jun 2021 • Junteng Jia, Cenk Baykal, Vamsi K. Potluru, Austin R. Benson
With the widespread availability of complex relational data, semi-supervised node classification in graphs has become a central machine learning problem.
1 code implementation • 2 Jun 2021 • Nate Veldt, Austin R. Benson, Jon Kleinberg
Finding dense subgraphs of a large graph is a standard problem in graph mining that has been studied extensively both for its theoretical richness and its many practical applications.
1 code implementation • 17 May 2021 • Kiran Tomlinson, Johan Ugander, Austin R. Benson
Standard methods in preference learning involve estimating the parameters of discrete choice models from data of selections (choices) made by individuals from a discrete set of alternatives (the choice set).
no code implementations • 27 Mar 2021 • Francesco Tudisco, Konstantin Prokopchik, Austin R. Benson
Hypergraphs are a common model for multiway relationships in data, and hypergraph semi-supervised learning is the problem of assigning labels to all nodes in a hypergraph, given labels on just a few nodes.
2 code implementations • 24 Jan 2021 • Philip S. Chodrow, Nate Veldt, Austin R. Benson
Many graph algorithms for this task are based on variants of the stochastic blockmodel, a random graph with flexible cluster structure.
1 code implementation • 19 Jan 2021 • Junteng Jia, Austin R. Benson
Semi-supervised learning on graphs is a widely applicable problem in network science and machine learning.
no code implementations • 29 Oct 2020 • Austin R. Benson, Anil Damle, Alex Townsend
We draw connections between simple neural networks and under-determined linear systems to comprehensively explore several interesting theoretical questions in the study of neural networks.
7 code implementations • ICLR 2021 • Qian Huang, Horace He, Abhay Singh, Ser-Nam Lim, Austin R. Benson
Graph Neural Networks (GNNs) are the predominant technique for learning over graphs.
Node Classification on Non-Homophilic (Heterophilic) Graphs • Node Property Prediction
2 code implementations • 7 Sep 2020 • Kiran Tomlinson, Austin R. Benson
Using our models, we identify new context effects in widely used choice datasets and provide the first analysis of choice set context effects in social network growth.
1 code implementation • 5 Sep 2020 • Vasileios Charisopoulos, Austin R. Benson, Anil Damle
Spectral methods are a collection of such problems, where the solutions are orthonormal bases of the leading invariant subspace of an associated data matrix; these bases are unique only up to rotations and reflections.
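The rotation/reflection ambiguity can be seen directly: if `U` is an orthonormal basis of the leading invariant subspace, then `U @ Q` for any orthogonal `Q` is an equally valid basis, and both define the same orthogonal projector. A small numpy sketch (the matrix is random and illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(6, 6))
M = M + M.T                      # symmetric "data matrix"

# Orthonormal basis of the leading 2-dimensional invariant subspace.
w, V = np.linalg.eigh(M)         # eigh returns eigenvalues in ascending order
U = V[:, -2:]

# Any rotation/reflection Q yields another equally valid basis U @ Q.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
U2 = U @ Q

# Both bases span the same subspace: their projectors coincide exactly,
# since (U Q)(U Q)^T = U Q Q^T U^T = U U^T.
P1 = U @ U.T
P2 = U2 @ U2.T
```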
1 code implementation • 15 Jun 2020 • Derek Lim, Austin R. Benson
For example, expertise on song annotations follows a "U shape" where experts are both early and late contributors with non-experts contributing intermediately; we develop a user utility model that captures such behavior.
1 code implementation • 10 Jun 2020 • Ilya Amburg, Nate Veldt, Austin R. Benson
In contrast to related problems on fair or balanced clustering, we model diversity in terms of variety of past experience (instead of, e.g., protected attributes), with the goal of forming groups that have both experience and diversity with respect to participation in edge types.
1 code implementation • 8 Jun 2020 • Francesco Tudisco, Austin R. Benson, Konstantin Prokopchik
Label spreading is a general technique for semi-supervised learning with point cloud or network data, which can be interpreted as a diffusion of labels on a graph.
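The diffusion interpretation can be made concrete with the standard (linear) label spreading iteration, Y ← αSY + (1−α)Y₀, where S is the symmetrically normalized adjacency matrix; this is a sketch of the classical baseline, not the paper's nonlinear variant:

```python
import numpy as np

def label_spreading(A, Y0, alpha=0.8, iters=50):
    """Classical label spreading: Y <- alpha * S @ Y + (1 - alpha) * Y0,
    with S = D^{-1/2} A D^{-1/2} the symmetrically normalized adjacency."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    S = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    Y = Y0.copy()
    for _ in range(iters):
        Y = alpha * (S @ Y) + (1 - alpha) * Y0
    return Y.argmax(axis=1)

# Two triangles joined by the edge (2, 3), one labeled node per triangle.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], float)
Y0 = np.zeros((6, 2))
Y0[0, 0] = 1.0   # node 0 labeled class 0
Y0[5, 1] = 1.0   # node 5 labeled class 1
labels = label_spreading(A, Y0)
```

The diffusion assigns each triangle to the class of its labeled member.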
no code implementations • 7 Mar 2020 • Katherine Van Koevering, Austin R. Benson, Jon Kleinberg
These binomials are common across many areas of speech, in both formal and informal text.
1 code implementation • 21 Feb 2020 • Nate Veldt, Austin R. Benson, Jon Kleinberg
However, there are only a few specialized approaches for localized clustering in hypergraphs.
1 code implementation • NeurIPS 2020 • Vasileios Charisopoulos, Austin R. Benson, Anil Damle
Several problems in machine learning, statistics, and other fields rely on computing eigenvectors.
2 code implementations • 19 Feb 2020 • Junteng Jia, Austin R. Benson
A graph neural network transforms features in each vertex's neighborhood into a vector representation of the vertex.
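A single message-passing layer in that spirit: each vertex's new representation is a learned transformation of the (mean-aggregated) features in its neighborhood. A minimal numpy sketch under that standard formulation, with illustrative shapes and weights:

```python
import numpy as np

def gnn_layer(A, X, W):
    """One mean-aggregation message-passing layer:
    h_v = ReLU(W applied to the mean of {x_u : u in N(v) or u = v})."""
    A_hat = A + np.eye(A.shape[0])          # include self-loops
    deg = A_hat.sum(axis=1, keepdims=True)
    H = (A_hat / deg) @ X                   # mean over each vertex's neighborhood
    return np.maximum(H @ W, 0.0)           # linear transform + ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], float)            # path graph on 3 vertices
X = rng.normal(size=(3, 4))                 # 4-dimensional input features
W = rng.normal(size=(4, 2))                 # maps to 2-dimensional representations
H = gnn_layer(A, X, W)
```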
1 code implementation • ICML 2020 • Kiran Tomlinson, Austin R. Benson
The way that people make choices or exhibit preferences can be strongly affected by the set of available alternatives, often called the choice set.
1 code implementation • 22 Oct 2019 • Ilya Amburg, Nate Veldt, Austin R. Benson
Here, we develop a computational framework for the problem of clustering hypergraphs with categorical edge labels (i.e., different interaction types), where clusters correspond to groups of nodes that frequently participate in the same type of interaction.
2 code implementations • NeurIPS 2019 • Junteng Jia, Austin R. Benson
Many time series are effectively generated by a combination of deterministic continuous flows along with discrete jumps sparked by stochastic events.
1 code implementation • 23 May 2019 • Kun Dong, Austin R. Benson, David Bindel
Much of spectral graph theory descends directly from spectral geometry, the study of differentiable manifolds through the spectra of associated differential operators.
Social and Information Networks • Numerical Analysis
1 code implementation • 17 May 2019 • Junteng Jia, Michael T. Schaub, Santiago Segarra, Austin R. Benson
The first strategy selects edges to minimize the reconstruction error bound and works well on flows that are approximately divergence-free.
1 code implementation • 14 May 2019 • Ilya Amburg, Jon Kleinberg, Austin R. Benson
In various application areas, networked data is collected by measuring interactions involving some specific set of core nodes.
1 code implementation • 6 Feb 2019 • Xiang Fu, Shangdi Yu, Austin R. Benson
Large Question-and-Answer (Q&A) platforms support diverse knowledge curation on the Web.
1 code implementation • 28 Nov 2018 • Austin R. Benson, Jon Kleinberg
However, we find that this is not true; in fact, there is substantial variability in the value of the fringe nodes for prediction.
1 code implementation • 20 Aug 2018 • Junteng Jia, Austin R. Benson
The core-periphery structure, which decomposes a network into a densely connected core and a sparsely connected periphery, commonly emerges in spatial networks such as traffic, biological, and social networks.
Social and Information Networks • Physics and Society
1 code implementation • 25 Jul 2018 • Austin R. Benson
Eigenvector centrality is a standard network analysis tool for determining the importance of (or ranking of) entities in a connected system that is represented by a graph.
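Eigenvector centrality is the leading eigenvector of the adjacency matrix, which power iteration computes directly. A short sketch (the star graph is an illustrative example):

```python
import numpy as np

def eigenvector_centrality(A, iters=200):
    """Power iteration for the leading eigenvector of the adjacency matrix.
    Iterating with A + I (same eigenvectors, eigenvalues shifted by 1) ensures
    convergence even on bipartite graphs, where iterating with A can oscillate."""
    M = A + np.eye(A.shape[0])
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x = M @ x
        x /= np.linalg.norm(x)
    return x

# Star graph on 4 nodes: the hub (node 0) should rank highest,
# and the three leaves should tie.
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], float)
c = eigenvector_centrality(A)
```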
1 code implementation • 13 Jul 2018 • Michael T. Schaub, Austin R. Benson, Paul Horn, Gabor Lippner, Ali Jadbabaie
Simplicial complexes, a mathematical object common in topological data analysis, have emerged as a model for multi-nodal interactions that occur in several complex systems; for example, biological interactions occur between a set of molecules rather than just two, and communication systems can have group messages and not just person-to-person messages.
Social and Information Networks • Discrete Mathematics • Algebraic Topology • Physics and Society
1 code implementation • NeurIPS 2018 • Austin R. Benson, Jon Kleinberg
A typical way in which network data is recorded is to measure all the interactions among a specified set of core nodes; this produces a graph containing this core together with a potentially larger set of fringe nodes that have links to the core.
2 code implementations • 20 Feb 2018 • Austin R. Benson, Rediet Abebe, Michael T. Schaub, Ali Jadbabaie, Jon Kleinberg
Networks provide a powerful formalism for modeling complex systems through pairwise interactions.
1 code implementation • 19 Feb 2018 • Austin R. Benson
Networks are a fundamental model of complex systems throughout the sciences, and network datasets are typically analyzed through lower-order connectivity patterns described at the level of individual nodes and edges.
no code implementations • 12 Apr 2017 • Hao Yin, Austin R. Benson, Jure Leskovec
Here we introduce higher-order clustering coefficients that measure the closure probability of higher-order network cliques and provide a more comprehensive view of how the edges of complex networks cluster.
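The ordinary (second-order) clustering coefficient is the base case being generalized: the probability that a wedge (an edge plus an adjacent edge) closes into a triangle. A minimal sketch of that base case; the paper extends the same closure probability from edges to larger cliques:

```python
from itertools import combinations

def global_clustering(adj):
    """Second-order global clustering coefficient: closed wedges / all wedges,
    counting each wedge at its center vertex."""
    wedges = closed = 0
    for v in adj:
        for u, w in combinations(adj[v], 2):
            wedges += 1
            if w in adj[u]:   # the wedge (u, v, w) closes into a triangle
                closed += 1
    return closed / wedges if wedges else 0.0

# Triangle 0-1-2 with a pendant node 3 attached to node 2:
# 5 wedges in total, 3 of them closed, so the coefficient is 0.6.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
cc = global_clustering(adj)
```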
no code implementations • 29 Dec 2016 • Ashwin Paranjape, Austin R. Benson, Jure Leskovec
Networks are a fundamental tool for modeling complex systems in a variety of domains including social and communication networks as well as biology and neuroscience.
no code implementations • 26 Dec 2016 • Austin R. Benson, David F. Gleich, Jure Leskovec
Many networks are known to exhibit rich, lower-order connectivity patterns that can be captured at the level of individual nodes and edges.
Social and Information Networks • Discrete Mathematics • Physics and Society
1 code implementation • NeurIPS 2016 • Tao Wu, Austin R. Benson, David F. Gleich
Spectral clustering and co-clustering are well-known techniques in data analysis, and recent work has extended spectral clustering to square, symmetric tensors and hypermatrices derived from a network.
1 code implementation • NeurIPS 2014 • Austin R. Benson, Jason D. Lee, Bartek Rajwa, David F. Gleich
We demonstrate the efficacy of these algorithms on terabyte-sized synthetic matrices and real-world matrices from scientific computing and bioinformatics.