Recent work has shown that imperceptible perturbations can be applied to craft unlearnable examples (ULEs), i.e., images whose content cannot be used to improve a classifier during training.
Fast, highly accurate, and reliable inference of the sky origin of gravitational waves would enable real-time multi-messenger astronomy.
The heavy-tail and skewness characteristics of such distributions in the spectral domain make it possible to capture long-range covariance of the signal in the time domain.
Unsupervised domain adaptation (UDA) amounts to assigning class labels to the unlabeled instances of a dataset from a target domain, using labeled instances of a dataset from a related source domain.
Under the assumption that features from pre-trained deep neural networks are transferable across related domains, domain adaptation reduces to aligning the source and target domains at the level of class-prediction uncertainty.
By assuming that the network is uniform, we can approximate the structure of unobserved parts of the network to obtain a method for local community detection.
Despite their success, state-of-the-art methods based on this approach are either intricate or unable to scale directly to data with many features.
2) How is the performance of hyperspectral image classification methods affected when using disjoint train and test sets?
Our results show that: 1) feature learning outperforms handcrafted features; 2) parameter transfer learning is beneficial in longitudinal settings; 3) using LSTMs to learn the temporal dynamics of signals enhances the detection rate, especially for skewed training data; and 4) an ensemble of LSTMs provides more accurate and stable detectors.
Unsupervised Domain Adaptation (DA) is used to automate the task of labeling data: an unlabeled dataset (target) is annotated using a labeled dataset (source) from a related domain.
We investigate the relation of conductance with weighted kernel k-means for a single community, which leads to the introduction of a new objective function, $\sigma$-conductance.
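The classical conductance measure that this sentence builds on is well defined: for a community $S$ in an undirected graph, it is the number of boundary-crossing edges divided by the smaller of the two side volumes. A minimal sketch of that standard measure follows; note it computes plain conductance only, not the paper's $\sigma$-conductance variant (whose definition is not given here), and the edge-list example graph is purely illustrative.

```python
def conductance(edges, community):
    """Standard conductance cut(S, V\\S) / min(vol(S), vol(V\\S)).

    edges: iterable of undirected (u, v) pairs; community: set of nodes.
    """
    S = set(community)
    cut = 0       # edges crossing the boundary of S
    vol_S = 0     # sum of degrees of nodes inside S
    vol_rest = 0  # sum of degrees of nodes outside S
    for u, v in edges:
        in_u, in_v = u in S, v in S
        if in_u != in_v:
            cut += 1
        vol_S += in_u + in_v            # each endpoint in S adds 1 to vol(S)
        vol_rest += (not in_u) + (not in_v)
    return cut / min(vol_S, vol_rest)

# Illustrative graph: two triangles joined by a single bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(conductance(edges, {0, 1, 2}))  # 1 cut edge / vol 7 -> ~0.1429
```

A low conductance value indicates a well-separated community (few boundary edges relative to internal volume), which is why conductance minimization is a common objective for single-community detection.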
We argue that this is a desirable property, provide conditions under which NMF quality functions are local, and propose a novel class of local probabilistic NMF quality functions for soft graph clustering.
This motivates the derivation of a new family of quality functions, adaptive scale modularity, which does satisfy the proposed axioms.