Semantic communication is envisioned as a promising technique to break through the Shannon limit.
To overcome the aforementioned limitations, in this paper, we propose a new GAN, Involution Generative Adversarial Networks (GIU-GANs).
Time-series classification approaches based on deep neural networks are prone to overfitting on UCR datasets, owing to the few-shot nature of those datasets.
Inverse design of nanoparticles for desired scattering spectra, and dynamic switching between the two opposite scattering anomalies, i.e., superscattering and invisibility, are important for realizing cloaking, sensing, and functional devices.
In sequence-to-sequence models, classical optimal transport (OT) can be applied to semantically match generated sentences with target sentences.
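As a concrete illustration of matching generated and target sentences with OT, the following is a minimal NumPy sketch using the entropic (Sinkhorn) approximation of classical OT over token embeddings. The uniform token weights, the cosine-distance cost, and the random stand-in embeddings are all assumptions for illustration, not a specific model's formulation.

```python
import numpy as np

def sinkhorn(cost, reg=0.1, n_iters=200):
    """Entropic-regularized OT plan between two uniform distributions,
    computed with Sinkhorn iterations (an approximation of classical OT)."""
    n, m = cost.shape
    a = np.full(n, 1.0 / n)   # uniform mass on generated tokens
    b = np.full(m, 1.0 / m)   # uniform mass on target tokens
    K = np.exp(-cost / reg)   # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]  # transport plan

# Toy example: cosine-distance cost between 3 "generated" and 3 "target"
# token embeddings (random stand-ins for real embeddings).
rng = np.random.default_rng(0)
gen = rng.normal(size=(3, 8))
tgt = rng.normal(size=(3, 8))
gen /= np.linalg.norm(gen, axis=1, keepdims=True)
tgt /= np.linalg.norm(tgt, axis=1, keepdims=True)
cost = 1.0 - gen @ tgt.T           # cosine distance in [0, 2]
plan = sinkhorn(cost)
ot_distance = np.sum(plan * cost)  # semantic matching score (lower = closer)
```

Minimizing `ot_distance` during training pushes generated token embeddings toward the target sentence's embeddings under the soft alignment given by the transport plan.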
High-quality dialogue-summary paired data is expensive to produce and domain-sensitive, making abstractive dialogue summarization a challenging task.
The relative importance of global versus local structure for the embeddings is learned automatically.
This paper considers a novel variational formulation of network embeddings, with special focus on textual networks.
Vector representations of sentences, trained on massive text corpora, are widely used as generic sentence embeddings across a variety of NLP problems.
Constructing highly informative network embeddings is an important task in network analysis.
We present a syntax-infused variational autoencoder (SIVAE) that integrates sentences with their syntactic trees to improve the grammar of generated sentences.
Network embeddings, which learn low-dimensional representations for each vertex in a large-scale network, have received considerable attention in recent years.
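To make "low-dimensional representations for each vertex" concrete, here is a minimal spectral sketch, assuming a simple adjacency-matrix factorization (one classical way to obtain network embeddings; the toy graph and the SVD-based recipe are illustrative assumptions, not any particular method).

```python
import numpy as np

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by the bridge edge 2-3.
A = np.zeros((6, 6))
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Spectral factorization: the top-k singular vectors of the adjacency
# matrix, scaled by sqrt of the singular values, give a k-dimensional
# embedding for each vertex.
U, s, Vt = np.linalg.svd(A)
k = 2
Z = U[:, :k] * np.sqrt(s[:k])   # vertex embeddings, shape (6, 2)

def dist(i, j):
    """Euclidean distance between two vertex embeddings."""
    return np.linalg.norm(Z[i] - Z[j])
```

Structurally equivalent vertices (e.g. 0 and 1, which play identical roles in the first triangle) receive essentially identical embeddings, which is the property downstream tasks such as link prediction and vertex classification exploit.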
Textual network embedding leverages rich text information associated with the network to learn low-dimensional vectorial representations of vertices.
Word embeddings are effective intermediate representations for capturing semantic regularities between words, when learning the representations of text sequences.
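One of the simplest ways word embeddings serve as intermediate representations for a text sequence is mean pooling: average the word vectors to get a sequence vector. The tiny vocabulary and random embedding table below are stand-ins (real systems load pretrained vectors such as GloVe); this is a baseline sketch, not a specific model.

```python
import numpy as np

# Toy embedding table: one 16-d vector per word (random stand-ins).
rng = np.random.default_rng(1)
vocab = {"the": 0, "cat": 1, "sat": 2, "mat": 3}
emb = rng.normal(size=(len(vocab), 16))

def sequence_embedding(tokens):
    """Mean of word vectors: a sequence-level representation composed
    from word-level intermediate representations."""
    vecs = np.stack([emb[vocab[t]] for t in tokens])
    return vecs.mean(axis=0)

s = sequence_embedding(["the", "cat", "sat"])  # shape (16,)
```

More expressive compositions (CNNs, RNNs, attention) replace the mean with learned functions, but all of them consume the same word-level embeddings as input.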
Low-rank signal modeling has been widely leveraged to capture non-local correlation in image processing applications.
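A minimal sketch of the low-rank idea: stack correlated (non-local) patches as columns of a matrix, then truncate its SVD to recover the shared structure and suppress noise. The synthetic rank-1 patch data and the noise level are illustrative assumptions.

```python
import numpy as np

# A matrix of similar (non-locally correlated) patches is close to
# low-rank; truncated SVD recovers the shared structure.
rng = np.random.default_rng(2)
base = rng.normal(size=(64, 1))             # shared patch pattern
patches = base @ rng.normal(size=(1, 20))   # 20 correlated patches (rank 1)
noisy = patches + 0.05 * rng.normal(size=patches.shape)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
r = 1                                        # keep the dominant component
denoised = U[:, :r] * s[:r] @ Vt[:r, :]

err_noisy = np.linalg.norm(noisy - patches)
err_denoised = np.linalg.norm(denoised - patches)  # smaller: noise outside the top subspace is discarded
```

Real image-processing pipelines group similar patches by block matching and often use soft singular-value thresholding instead of hard truncation, but the mechanism is the same.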
Since diagnoses are typically correlated, a deep residual network is employed on top of the CNN encoder, to capture label (diagnosis) dependencies and incorporate information directly from the encoded sentence vector.
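The residual-over-encoder structure can be sketched as follows: a residual block (`y = x + f(x)`) refines the encoded sentence vector while the skip connection preserves the information flowing directly from it, and independent sigmoids produce one probability per diagnosis label. The dimensions, ReLU activation, and random weights are illustrative assumptions, not the paper's architecture details.

```python
import numpy as np

def residual_block(x, W1, W2):
    """y = x + f(x): the skip connection lets the block refine the
    encoded sentence vector while passing it through unchanged."""
    h = np.maximum(0.0, x @ W1)  # ReLU hidden layer
    return x + h @ W2

rng = np.random.default_rng(3)
d, n_labels = 32, 5
x = rng.normal(size=(d,))              # encoded sentence vector (e.g. from a CNN encoder)
W1 = 0.1 * rng.normal(size=(d, d))
W2 = 0.1 * rng.normal(size=(d, d))
Wout = rng.normal(size=(d, n_labels))

h = residual_block(x, W1, W2)          # refined representation
logits = h @ Wout
probs = 1.0 / (1.0 + np.exp(-logits))  # one sigmoid per diagnosis (multi-label)
```

Stacking several such blocks lets the network model interactions among labels (correlated diagnoses) before the final multi-label prediction.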