Point Processes
121 papers with code • 0 benchmarks • 2 datasets
Most implemented papers
Explaining Machine Learning Classifiers through Diverse Counterfactual Explanations
Post-hoc explanations of machine learning models are crucial for people to understand and act on algorithmic predictions.
Determinantal point processes for machine learning
Determinantal point processes (DPPs) are elegant probabilistic models of repulsion that arise in quantum physics and random matrix theory.
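As a minimal, illustrative sketch (not code from any of these papers): in the common L-ensemble parameterization, a subset $Y$ is drawn with probability proportional to $\det(L_Y)$, the determinant of a similarity kernel restricted to $Y$, so near-duplicate items are rarely selected together. All names and values below are illustrative.

    import numpy as np

    # Minimal L-ensemble sketch: P(Y) is proportional to det(L_Y), the kernel
    # matrix restricted to the subset Y.  Similar items make det(L_Y) small,
    # so the DPP rarely selects them together -- that is the "repulsion".
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(5, 3))
    feats[1] = feats[0] + 1e-3                     # items 0 and 1 nearly identical
    sqdist = np.sum((feats[:, None] - feats[None, :]) ** 2, axis=-1)
    L = np.exp(-0.5 * sqdist)                      # Gaussian similarity kernel

    def unnormalized_prob(subset):
        idx = np.ix_(subset, subset)
        return np.linalg.det(L[idx])

    print(unnormalized_prob([0, 1]))               # ~0: near-duplicates repel
    print(unnormalized_prob([0, 2]))               # larger: dissimilar items coexist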
Kronecker Determinantal Point Processes
Determinantal Point Processes (DPPs) are probabilistic models over all subsets of a ground set of $N$ items.
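Part of the appeal of Kronecker-structured kernels is that standard linear-algebra identities make the large determinants cheap. A quick numerical check of one such identity (illustrative code, not taken from the paper):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(3, 3)); L1 = A @ A.T + np.eye(3)      # 3x3 PSD factor
    B = rng.normal(size=(4, 4)); L2 = B @ B.T + np.eye(4)      # 4x4 PSD factor
    L = np.kron(L1, L2)                                        # 12x12 kernel over N = 3*4 items

    # det(L1 kron L2) = det(L1)**n2 * det(L2)**n1, and the eigenvalues of the
    # Kronecker product are products of the factors' eigenvalues, so quantities
    # like the DPP normalizer det(L + I) never require factorizing the full
    # N x N matrix.
    lhs = np.linalg.det(L)
    rhs = np.linalg.det(L1) ** 4 * np.linalg.det(L2) ** 3
    print(np.allclose(lhs, rhs))                               # True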
Determinantal thinning of point processes with network learning applications
A new type of dependent thinning for point processes in continuous space is proposed, which leverages the advantages of determinantal point processes defined on finite spaces and, as such, is particularly amenable to statistical, numerical, and simulation techniques.
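A toy version of the construction, with all parameters chosen purely for illustration: simulate a parent Poisson process, build a similarity kernel on the realized (finite) point set, and keep a subset drawn from the L-ensemble DPP on those points. For a handful of points the subset distribution can be enumerated exactly.

    import itertools
    import numpy as np

    rng = np.random.default_rng(2)

    # 1. Parent process: homogeneous Poisson points in the unit square
    #    (capped so exhaustive enumeration below stays cheap).
    n = min(rng.poisson(8), 10)
    pts = rng.uniform(size=(n, 2))

    # 2. L-ensemble kernel defined on the realized, finite point set.
    sqdist = np.sum((pts[:, None] - pts[None, :]) ** 2, axis=-1)
    L = np.exp(-sqdist / 0.05)

    # 3. Exact determinantal thinning: P(keep Y) is proportional to det(L_Y),
    #    so clusters of nearby points are rarely retained together.
    subsets = [s for r in range(n + 1) for s in itertools.combinations(range(n), r)]
    weights = np.array([np.linalg.det(L[np.ix_(s, s)]) if s else 1.0 for s in subsets])
    kept = subsets[rng.choice(len(subsets), p=weights / weights.sum())]
    print("retained points:", kept)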
Scalable Log Determinants for Gaussian Process Kernel Learning
For applications as varied as Bayesian neural networks, determinantal point processes, elliptical graphical models, and kernel learning for Gaussian processes (GPs), one must compute a log determinant of an $n \times n$ positive definite matrix, and its derivatives - leading to prohibitive $\mathcal{O}(n^3)$ computations.
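For orientation (standard identities, not the paper's algorithm): $\log\det(A) = \mathrm{tr}(\log A)$, and stochastic trace estimation replaces that trace with an average over random probe vectors. The sketch below checks this on a small matrix; it forms $\log(A)$ explicitly only because the example is tiny, whereas scalable approaches estimate the same quantity from matrix-vector products alone.

    import numpy as np
    from scipy.linalg import logm

    rng = np.random.default_rng(3)
    X = rng.normal(size=(50, 50))
    A = X @ X.T + 50 * np.eye(50)                    # positive definite test matrix

    exact = np.linalg.slogdet(A)[1]                  # exact log-determinant

    # Hutchinson's estimator: tr(log A) is approximated by the average of
    # z^T log(A) z over random Rademacher probe vectors z.
    logA = logm(A).real
    probes = rng.choice([-1.0, 1.0], size=(30, 50))
    estimate = np.mean(np.einsum('ij,jk,ik->i', probes, logA, probes))

    print(exact, estimate)                           # the two should be close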
Intensity-Free Learning of Temporal Point Processes
The standard way of learning in such models is by estimating the conditional intensity function.
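For reference, the intensity-based likelihood that this line of work starts from is $\sum_i \log \lambda^*(t_i) - \int_0^T \lambda^*(s)\,ds$ for events $t_1 < \dots < t_n$ on $[0, T]$. A minimal sketch for an exponential-kernel Hawkes intensity, with illustrative function and parameter names:

    import numpy as np

    def hawkes_loglik(events, T, mu, alpha, beta):
        """Log-likelihood of events on [0, T] under an exponential Hawkes process
        with intensity lam(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))."""
        events = np.asarray(events)
        log_term = 0.0
        for i, t in enumerate(events):
            lam = mu + alpha * np.sum(np.exp(-beta * (t - events[:i])))
            log_term += np.log(lam)
        # Compensator: the integral of the intensity over [0, T], in closed form.
        compensator = mu * T + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (T - events)))
        return log_term - compensator

    print(hawkes_loglik([0.5, 1.2, 1.3, 4.0], T=5.0, mu=0.4, alpha=0.6, beta=1.5))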
Transformer Hawkes Process
Modern data acquisition routinely produces massive amounts of event sequence data in various domains, such as social media, healthcare, and financial markets.
Modelling Behavioural Diversity for Learning in Open-Ended Games
Promoting behavioural diversity is critical for solving games with non-transitive dynamics, where strategic cycles exist and there is no consistent winner (e.g., Rock-Paper-Scissors).
Diversity Networks: Neural Network Compression Using Determinantal Point Processes
We introduce Divnet, a flexible technique for learning networks with diverse neurons.
Modeling The Intensity Function Of Point Process Via Recurrent Neural Networks
In this paper, we model the background by a Recurrent Neural Network (RNN) with its units aligned with time series indexes, while the history effect is modeled by another RNN whose units are aligned with asynchronous events to capture the long-range dynamics.
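A rough sketch of that two-stream design, using hypothetical module names and PyTorch rather than the authors' code: one RNN consumes the evenly sampled time-series signal for the background rate, a second RNN consumes the asynchronous event features, and their final hidden states are combined into a positive intensity value.

    import torch
    import torch.nn as nn

    class TwoStreamIntensity(nn.Module):
        """Illustrative sketch: a background RNN over regular time-series steps
        plus an event RNN over asynchronous event features."""
        def __init__(self, series_dim=1, event_dim=2, hidden=16):
            super().__init__()
            self.background_rnn = nn.GRU(series_dim, hidden, batch_first=True)
            self.event_rnn = nn.GRU(event_dim, hidden, batch_first=True)
            self.head = nn.Sequential(nn.Linear(2 * hidden, 1), nn.Softplus())

        def forward(self, series, events):
            _, h_bg = self.background_rnn(series)    # (1, batch, hidden)
            _, h_ev = self.event_rnn(events)         # (1, batch, hidden)
            h = torch.cat([h_bg[-1], h_ev[-1]], dim=-1)
            return self.head(h)                      # positive intensity per sequence

    model = TwoStreamIntensity()
    series = torch.randn(4, 24, 1)                   # 4 sequences, 24 regular steps
    events = torch.randn(4, 7, 2)                    # 4 sequences, 7 async events
    print(model(series, events).shape)               # torch.Size([4, 1])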