Unsupervised feature selection aims to reduce the number of features, often using feature importance scores to quantify the relevance of individual features to the task at hand.
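As a toy sketch of score-based unsupervised feature selection (not the method the abstract proposes), one can score each feature by a label-free statistic such as its variance and keep the top-k; the function names and the variance criterion here are illustrative assumptions.

```python
import numpy as np

def variance_scores(X):
    """Score each feature by its variance (a simple, label-free importance proxy)."""
    return X.var(axis=0)

def select_top_k(X, k):
    """Keep the k features with the highest importance scores."""
    scores = variance_scores(X)
    top = np.sort(np.argsort(scores)[::-1][:k])
    return X[:, top], top

# toy data: three informative features plus one constant column
rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(0, 3, 100),
                     rng.normal(0, 1, 100),
                     rng.normal(0, 2, 100),
                     np.full(100, 5.0)])
X_sel, kept = select_top_k(X, 3)
print(kept)  # the constant (zero-variance) column is dropped
```

Any other importance score (e.g. a Laplacian-based or reconstruction-based one) can be dropped into the same select-top-k skeleton.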
Although precision and recall are standard performance measures for anomaly detection, their statistical properties in sequential detection settings are poorly understood.
Graph Neural Networks (GNNs) have achieved state-of-the-art results on many graph analysis tasks such as node classification and link prediction.
We build on recent advances in learning continuous warping functions and propose a novel family of warping functions based on the two-sided power (TSP) distribution.
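A TSP-based warping function can be sketched via the CDF of the two-sided power distribution on [0, 1], which is monotone with endpoints fixed at 0 and 1; the exact parametrization used by the paper is an assumption here, with m the mode and n the power.

```python
import numpy as np

def tsp_warp(t, m=0.5, n=2.0):
    """CDF of the two-sided power (TSP) distribution on [0, 1].

    Monotone with tsp_warp(0) == 0 and tsp_warp(1) == 1, so it is a valid
    warping function; m in (0, 1) is the mode and n > 0 the power.
    n == 1 recovers the identity warp.
    """
    t = np.asarray(t, dtype=float)
    left = m * (t / m) ** n                              # branch for t <= m
    right = 1.0 - (1.0 - m) * ((1.0 - t) / (1.0 - m)) ** n  # branch for t > m
    return np.where(t <= m, left, right)

t = np.linspace(0.0, 1.0, 5)
print(tsp_warp(t, m=0.25, n=3.0))
```

Varying (m, n) yields a two-parameter family of warps that bend time toward or away from the mode m.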
We propose a novel statistical methodology to measure, test and visualize the systematic association between rare events and peaks in a time series.
Unfortunately, it is often non-trivial to select both a time series that is informative about events and a powerful detection algorithm: detection may fail because the detection algorithm is not suitable, or because there is no shared information between the time series and the events of interest.
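A naive baseline for the association between events and peaks (a toy sketch, not the statistical methodology the abstract proposes) is to compare the fraction of events falling on high-quantile values of the series against the rate expected under no association; the function name and the quantile-threshold definition of a "peak" are assumptions.

```python
import numpy as np

def peak_event_association(series, event_idx, q=0.9):
    """Fraction of events landing on 'peaks' (values above the q-quantile),
    alongside the baseline rate (1 - q) expected if events were unrelated
    to the series."""
    threshold = np.quantile(series, q)
    hit_rate = np.mean(series[event_idx] > threshold)
    return hit_rate, 1.0 - q

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
events = np.argsort(x)[-20:]   # events planted on the 20 largest values
hit, baseline = peak_event_association(x, events)
print(hit, baseline)
```

A hit rate far above the baseline suggests the series is informative about the events; a rate near the baseline suggests no shared information, matching the failure mode described above.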
Deep approaches to anomaly detection have recently shown promising results over shallow methods on large and complex datasets.
The ability to represent and compare machine learning models is crucial for quantifying subtle model changes, evaluating generative models, and gaining insight into neural network architectures.
Representing a graph as a vector is a challenging task; ideally, the representation should be easily computable and conducive to efficient comparisons among graphs, tailored to the particular data and analytical task at hand.
Despite the great advances made by deep learning in many machine learning problems, there is a relative dearth of deep learning approaches for anomaly detection.
However, this is a hard task: the similarity measure must be expressive enough for the data, and its computation must remain efficient.
Embedding a web-scale information network into a low-dimensional vector space facilitates tasks such as link prediction, classification, and visualization.