The lottery ticket hypothesis conjectures the existence of sparse subnetworks of large randomly initialized deep neural networks that can be successfully trained in isolation.
Modeling the time evolution of discrete sets of items (e.g., genetic mutations) is a fundamental problem in many biomedical applications.
We disprove a recent conjecture regarding discrete distributions and their generating polynomials, namely that strong log-concavity implies log-submodularity.
We consider the problem of inference in discrete probabilistic models, that is, distributions over subsets of a finite ground set.
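As a minimal sketch of what such a model looks like, the following represents a distribution over subsets of a small ground set by explicit enumeration and computes a simple marginal. The ground set and weight function are illustrative assumptions, not taken from the text.

```python
from itertools import chain, combinations

# Illustrative ground set V; the distribution lives on all 2^|V| subsets.
V = ("a", "b", "c")

def subsets(ground):
    """All subsets of the ground set, as frozensets."""
    return [frozenset(s)
            for s in chain.from_iterable(
                combinations(ground, r) for r in range(len(ground) + 1))]

def weight(S):
    """Made-up unnormalized weight: larger subsets are favored."""
    return 2.0 ** len(S)

Z = sum(weight(S) for S in subsets(V))       # partition function
p = {S: weight(S) / Z for S in subsets(V)}   # normalized distribution

# Inference example: the marginal probability that "a" is included.
marginal_a = sum(prob for S, prob in p.items() if "a" in S)
```

Brute-force enumeration is only feasible for tiny ground sets; the point of inference algorithms for such models is precisely to avoid summing over all 2^|V| subsets.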
Parameter identification and comparison of dynamical systems are challenging tasks in many fields.
Submodular and supermodular functions have found wide applicability in machine learning, capturing notions such as diversity and regularity, respectively.
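A short sketch may help make the diversity intuition concrete: coverage functions are a standard example of submodular functions, and submodularity can be verified via the diminishing-returns condition f(S ∪ {e}) − f(S) ≥ f(T ∪ {e}) − f(T) for all S ⊆ T and e ∉ T. The specific coverage sets below are made up for illustration.

```python
from itertools import chain, combinations

# Illustrative coverage instance: each ground element covers some items.
coverage_sets = {
    "a": {1, 2},
    "b": {2, 3},
    "c": {3, 4, 5},
}

def f(S):
    """Coverage function: number of items covered by the chosen elements."""
    return len(set().union(*(coverage_sets[e] for e in S))) if S else 0

def is_submodular(ground):
    """Brute-force check of diminishing returns over all pairs S ⊆ T."""
    all_subsets = [frozenset(s) for s in chain.from_iterable(
        combinations(ground, r) for r in range(len(ground) + 1))]
    for S in all_subsets:
        for T in all_subsets:
            if not S <= T:
                continue
            for e in ground:
                if e in T:
                    continue
                # Adding e to the larger set T must help no more
                # than adding it to the smaller set S.
                if f(S | {e}) - f(S) < f(T | {e}) - f(T):
                    return False
    return True
```

Negating a submodular function yields a supermodular one, which is one way the regularity side of the picture arises.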