We propose a balanced coarsening scheme for multilevel hypergraph partitioning.
For many of today's data-centric applications, the underlying data is streaming: new items arrive continuously, and the dataset grows over time.
We also show improvement for the min-cut solution on 2-uniform hypergraphs (graphs) over the standard spectral partitioning algorithm.
The acyclic hypergraph partitioning problem is to partition the hypernodes of a directed acyclic hypergraph into a given number of blocks of roughly equal size such that the corresponding quotient graph is acyclic while minimizing an objective function on the partition.
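The acyclicity constraint in this problem definition can be made concrete with a small sketch. The representation below (hyperedges as `(tails, heads)` pairs and a node-to-block map) is illustrative, not taken from any of the listed papers:

```python
from collections import defaultdict

def quotient_is_acyclic(hyperedges, block):
    """Check whether a partition of a directed hypergraph yields an
    acyclic quotient graph.

    hyperedges: list of (tails, heads) pairs of hypernode ids
    block: dict mapping each hypernode id to its block id
    """
    # Build the quotient graph: an arc from block A to block B whenever
    # some hyperedge directs a tail in A toward a head in B.
    adj = defaultdict(set)
    for tails, heads in hyperedges:
        for t in tails:
            for h in heads:
                if block[t] != block[h]:
                    adj[block[t]].add(block[h])

    # Kahn's algorithm: the quotient graph is acyclic iff every block
    # can be removed once its in-degree drops to zero.
    nodes = set(block.values())
    indeg = defaultdict(int)
    for u in adj:
        for v in adj[u]:
            indeg[v] += 1
    stack = [u for u in nodes if indeg[u] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    return seen == len(nodes)
```

A full partitioner would additionally enforce the balance constraint on block sizes and minimize the objective; this sketch covers only the feasibility check that distinguishes the acyclic variant from ordinary hypergraph partitioning.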
We introduce a new convex optimization problem, termed quadratic decomposable submodular function minimization (QDSFM), which can model a number of learning tasks on graphs and hypergraphs.
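Roughly, a QDSFM objective takes the following form (the notation here is illustrative and may differ from the paper's):

\[
\min_{x \in \mathbb{R}^N} \; \|x - a\|_2^2 + \sum_{r=1}^{R} \left[ f_r(x) \right]^2,
\]

where \(a \in \mathbb{R}^N\) encodes the observed signal and each \(f_r\) is the Lovász extension of a submodular function \(F_r\) supported on a subset of the coordinates. The squared Lovász-extension terms are what distinguish QDSFM from standard decomposable submodular function minimization, where those terms enter linearly.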
Typical large-scale recommender systems use deep learning models that are stored in large amounts of DRAM.
This article presents a novel memetic algorithm which remains effective on larger initial hypergraphs.
We also remove several further bottlenecks in processing large hyperedges, develop a faster contraction algorithm, and introduce a new adaptive stopping rule for local search.
This work is motivated by two issues that arise when a hypergraph partitioning approach is used to tackle computer vision problems: (i) The uniform hypergraphs constructed for higher-order learning contain all edges, but most have negligible weights.