Search Results for author: Shuai Tang

Found 22 papers, 6 papers with code

Membership Inference Attacks on Diffusion Models via Quantile Regression

no code implementations • 8 Dec 2023 • Shuai Tang, Zhiwei Steven Wu, Sergul Aydore, Michael Kearns, Aaron Roth

Our proposed MI attack learns quantile regression models that predict (a quantile of) the distribution of reconstruction loss on examples not used in training.

Image Generation · Regression
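
The attack described above lends itself to a compact sketch: fit a quantile regressor on reconstruction losses from examples known to be outside the training set, then flag test examples whose loss falls below the predicted quantile. The feature choice, the gradient-boosted quantile model, and the data below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Stand-in data: per-example features and diffusion-model reconstruction
# losses for examples known NOT to be in the training set.
rng = np.random.default_rng(0)
nonmember_x = rng.normal(size=(2000, 16))
nonmember_loss = 1.0 + 0.1 * nonmember_x[:, 0] + 0.05 * rng.gumbel(size=2000)

# Fit a conditional alpha-quantile of the non-member loss distribution;
# alpha controls the attack's false-positive rate.
alpha = 0.05
quantile_model = GradientBoostingRegressor(loss="quantile", alpha=alpha)
quantile_model.fit(nonmember_x, nonmember_loss)

def is_member(x, loss):
    # Training examples tend to be reconstructed unusually well, so a loss
    # below the predicted non-member quantile is evidence of membership.
    return loss < quantile_model.predict(x)
```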

Private Synthetic Data for Multitask Learning and Marginal Queries

no code implementations • 15 Sep 2022 • Giuseppe Vietri, Cedric Archambeau, Sergul Aydore, William Brown, Michael Kearns, Aaron Roth, Ankit Siva, Shuai Tang, Zhiwei Steven Wu

A key innovation in our algorithm is the ability to directly handle numerical features, in contrast to a number of related prior approaches, which require numerical features to first be converted into high-cardinality categorical features via a binning strategy.
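
For context, the binning baseline the abstract alludes to looks roughly like the following; the feature, bin count, and data are made up for illustration.

```python
import numpy as np

# A numerical feature that prior marginal-query mechanisms would discretise.
rng = np.random.default_rng(1)
income = rng.uniform(0, 200_000, size=1_000)

# Equal-width binning turns it into one high-cardinality categorical feature:
# finer bins preserve more signal but inflate the category count the private
# mechanism must handle, which is the trade-off this paper's approach avoids.
n_bins = 64
edges = np.linspace(income.min(), income.max(), n_bins + 1)
income_binned = np.clip(np.digitize(income, edges) - 1, 0, n_bins - 1)
```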

MAMDR: A Model Agnostic Learning Method for Multi-Domain Recommendation

1 code implementation • 25 Feb 2022 • Linhao Luo, Yumeng Li, Buyu Gao, Shuai Tang, Sinan Wang, Jiancheng Li, Tanchao Zhu, Jiancai Liu, Zhao Li, Shirui Pan

We integrate these components into a unified framework and present MAMDR, which can be applied to any model structure to perform multi-domain recommendation.

Spectrally Adaptive Common Spatial Patterns

no code implementations • 9 Feb 2022 • Mahta Mousavi, Eric Lybrand, Shuangquan Feng, Shuai Tang, Rayan Saab, Virginia de Sa

In this work, we propose a novel algorithm called Spectrally Adaptive Common Spatial Patterns (SACSP) that improves CSP by learning a temporal/spectral filter for each spatial filter so that the spatial filters are concentrated on the most relevant temporal frequencies for each user.

EEG · Motor Imagery
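
As background for SACSP, here is a minimal sketch of the standard CSP baseline it extends; the learned per-filter temporal/spectral filtering that is the paper's contribution is not shown, and shapes and names are assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=3):
    """Standard CSP: trials_* have shape (n_trials, n_channels, n_samples).
    Returns spatial filters whose projections maximise variance for one
    class while minimising it for the other."""
    def mean_cov(trials):
        return np.mean([t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)

    cov_a, cov_b = mean_cov(trials_a), mean_cov(trials_b)
    # Generalised eigendecomposition of (cov_a, cov_a + cov_b); the extreme
    # eigenvectors are the most discriminative spatial filters.
    vals, vecs = eigh(cov_a, cov_a + cov_b)
    order = np.argsort(vals)
    picks = np.r_[order[:n_filters], order[-n_filters:]]
    return vecs[:, picks].T
```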

Fast Adaptation with Linearized Neural Networks

1 code implementation • 2 Mar 2021 • Wesley J. Maddox, Shuai Tang, Pablo Garcia Moreno, Andrew Gordon Wilson, Andreas Damianou

The inductive biases of trained neural networks are difficult to understand and, consequently, to adapt to new settings.

Domain Adaptation Gaussian Processes +2
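
The linearization at the core of this approach is easy to state in code: replace the trained network by its first-order Taylor expansion in parameter space around the trained weights. Below is a minimal JAX sketch with a made-up two-layer MLP; it is not the authors' code.

```python
import jax
import jax.numpy as jnp

def mlp(params, x):
    w1, b1, w2, b2 = params
    return jnp.tanh(x @ w1 + b1) @ w2 + b2

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params0 = (jax.random.normal(k1, (8, 32)) * 0.1, jnp.zeros(32),
           jax.random.normal(k2, (32, 1)) * 0.1, jnp.zeros(1))
x = jax.random.normal(k3, (5, 8))

def linearized_mlp(params, x):
    # f_lin(params) = f(params0) + J_f(params0) @ (params - params0);
    # jax.jvp computes the Jacobian-vector product without materialising J.
    diff = jax.tree_util.tree_map(lambda p, p0: p - p0, params, params0)
    y0, tangent = jax.jvp(lambda p: mlp(p, x), (params0,), (diff,))
    return y0 + tangent

# At params0 the two models agree; adaptation then trains `params` through
# the (linear in parameters) model.
assert jnp.allclose(linearized_mlp(params0, x), mlp(params0, x))
```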

Deep Transfer Learning with Ridge Regression

no code implementations • 11 Jun 2020 • Shuai Tang, Virginia R. de Sa

The large amount of online data and vast array of computing resources enable current researchers in both industry and academia to employ the power of deep learning with neural networks.

Regression · Transfer Learning
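
The underlying recipe (fit closed-form ridge regression on features from a frozen network) can be sketched as follows; the random-feature "encoder" is a placeholder for a real pretrained model, and nothing here reproduces the paper's exact construction.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Placeholder for a frozen pretrained encoder: a fixed random projection
# followed by a ReLU. In practice this would be a deep network's features.
W = rng.normal(size=(64, 128))
features = lambda x: np.maximum(x @ W, 0.0)

x_train, y_train = rng.normal(size=(500, 64)), rng.normal(size=500)
x_test = rng.normal(size=(100, 64))

# Ridge regression has a closed-form solution, so "transfer" requires no
# fine-tuning of the encoder at all.
model = Ridge(alpha=1.0).fit(features(x_train), y_train)
preds = model.predict(features(x_test))
```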

Similarity of Neural Networks with Gradients

3 code implementations • 25 Mar 2020 • Shuai Tang, Wesley J. Maddox, Charlie Dickens, Tom Diethe, Andreas Damianou

A suitable similarity index for comparing learnt neural networks plays an important role in understanding the behaviour of these highly nonlinear functions, and can provide insights for further theoretical analysis and empirical studies.

Network Pruning
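
One way to make the gradient-based idea concrete: treat each network's per-example gradients as feature maps, form their Gram matrices, and compare the two kernels, e.g. with centered kernel alignment (CKA). This hedged sketch uses random matrices in place of real Jacobians and CKA as one plausible alignment score; it is not necessarily the paper's exact index.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for per-example gradients (Jacobians) of two trained networks,
# evaluated on the same 100 probe inputs; parameter counts may differ.
grads_net1 = rng.normal(size=(100, 5000))
grads_net2 = rng.normal(size=(100, 3000))

def cka(k, l):
    # Centered kernel alignment between two Gram matrices.
    n = k.shape[0]
    h = np.eye(n) - np.ones((n, n)) / n
    kc, lc = h @ k @ h, h @ l @ h
    return np.sum(kc * lc) / (np.linalg.norm(kc) * np.linalg.norm(lc))

similarity = cka(grads_net1 @ grads_net1.T, grads_net2 @ grads_net2.T)
```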

Improving Style Transfer with Calibrated Metrics

1 code implementation • 21 Oct 2019 • Mao-Chuang Yeh, Shuai Tang, Anand Bhattad, Chuhang Zou, David Forsyth

Style transfer methods produce a transferred image which is a rendering of a content image in the manner of a style image.

Style Transfer

An Empirical Study on Post-processing Methods for Word Embeddings

no code implementations • 27 May 2019 • Shuai Tang, Mahta Mousavi, Virginia R. de Sa

Word embeddings learnt from large corpora have been adopted in various applications in natural language processing and served as the general input representations to learning systems.

Retrieval · Sentence +1
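
As one concrete example of the kind of post-processing such a study covers, here is the widely used "all-but-the-top" step (Mu & Viswanath, 2018): centre the embeddings and project out the dominant principal components, which tend to encode corpus frequency rather than meaning. This is an illustrative choice, not necessarily the method this paper recommends.

```python
import numpy as np

def all_but_the_top(embeddings, n_components=2):
    # Centre the embedding matrix of shape (vocab_size, dim) ...
    centred = embeddings - embeddings.mean(axis=0)
    # ... then remove the top principal directions found via SVD.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    top = vt[:n_components]
    return centred - centred @ top.T @ top
```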

A Simple Recurrent Unit with Reduced Tensor Product Representations

1 code implementation • 29 Oct 2018 • Shuai Tang, Paul Smolensky, Virginia R. de Sa

Widely used recurrent units, including Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), perform well on natural language tasks, but their ability to learn structured representations is still questionable.

Natural Language Inference
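
The tensor-product representation the unit builds on can be shown in a few lines: a structure is encoded as a superposition of outer products between filler vectors (what) and role vectors (where), and orthonormal roles make unbinding exact. Dimensions below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
fillers = rng.normal(size=(4, 16))  # four 16-dim filler (content) vectors
roles = np.eye(4)                   # orthonormal role (position) vectors

# Bind each filler to its role with an outer product, then superpose.
tpr = sum(np.outer(f, r) for f, r in zip(fillers, roles))

# With orthonormal roles, multiplying by a role vector unbinds its filler.
assert np.allclose(tpr @ roles[2], fillers[2])
```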

Improving Sentence Representations with Consensus Maximisation

no code implementations • ICLR 2019 • Shuai Tang, Virginia R. de Sa

Consensus maximisation learning can provide self-supervision when different views of the same data are available.

Self-Supervised Learning · Sentence
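
A minimal sketch of the consensus objective, under the assumption that "agreement" is measured as cosine similarity between the two views' encodings; the encoders and data below are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
view1 = rng.normal(size=(32, 100))   # two different input views of the
view2 = rng.normal(size=(32, 100))   # same batch of 32 sentences
enc1 = rng.normal(size=(100, 64))    # placeholders for two encoders
enc2 = rng.normal(size=(100, 64))

def consensus(a, b):
    # Mean cosine similarity between paired representations; a training
    # objective would maximise this across the two views.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return float(np.mean(np.sum(a * b, axis=1)))

score = consensus(view1 @ enc1, view2 @ enc2)
```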

Exploiting Invertible Decoders for Unsupervised Sentence Representation Learning

no code implementations • ACL 2019 • Shuai Tang, Virginia R. de Sa

The encoder-decoder models for unsupervised sentence representation learning tend to discard the decoder after being trained on a large unlabelled corpus, since only the encoder is needed to map the input sentence into a vector representation.

Representation Learning · Sentence
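
The motivation above suggests the trick: if the decoder is a linear orthogonal map, its transpose inverts it, so the decoder need not be discarded after training. A toy sketch follows; the shapes and the orthogonality construction are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# An orthogonal linear decoder: QR of a random matrix gives decoder.T as
# its exact inverse, so decoding is invertible by construction.
decoder, _ = np.linalg.qr(rng.normal(size=(64, 64)))

z = rng.normal(size=64)                      # a sentence representation
decoded = decoder @ z
assert np.allclose(decoder.T @ decoded, z)   # the transpose undoes it
```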

Multi-view Sentence Representation Learning

no code implementations • 18 May 2018 • Shuai Tang, Virginia R. de Sa

Multi-view learning can provide self-supervision when different views of the same data are available.

Multi-view Learning · Representation Learning +1

Quantitative Evaluation of Style Transfer

no code implementations • 31 Mar 2018 • Mao-Chuang Yeh, Shuai Tang, Anand Bhattad, D. A. Forsyth

Style transfer methods produce a transferred image which is a rendering of a content image in the manner of a style image.

Style Transfer

Improved Style Transfer by Respecting Inter-layer Correlations

no code implementations • 5 Jan 2018 • Mao-Chuang Yeh, Shuai Tang

This paper demonstrates that controlling inter-layer correlations yields visible improvements in style transfer methods.

Style Transfer · Texture Synthesis
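
Concretely, where the usual style loss matches within-layer Gram matrices of CNN feature maps, this paper additionally controls correlations between layers. A rough sketch with random stand-ins for feature maps (in a real network the two layers' spatial grids would first be resized to match):

```python
import numpy as np

rng = np.random.default_rng(0)
feat_l = rng.normal(size=(64, 32 * 32))   # layer l activations: (C, H*W)
feat_m = rng.normal(size=(64, 32 * 32))   # layer l+1, spatially aligned

gram_within = feat_l @ feat_l.T           # standard within-layer statistic
gram_between = feat_l @ feat_m.T          # inter-layer correlation target

# A style loss would match both statistics between the style image and the
# synthesised image, e.g. np.mean((gram_between - gram_between_style) ** 2).
```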

Exploring Asymmetric Encoder-Decoder Structure for Context-based Sentence Representation Learning

no code implementations ICLR 2018 Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, Virginia R. de Sa

Context information plays an important role in human language understanding, and it is also useful for machines to learn vector representations of language.

Representation Learning · Sentence

Rethinking Skip-thought: A Neighborhood based Approach

no code implementations • WS 2017 • Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, Virginia R. de Sa

We train our skip-thought neighbor model on a large corpus with continuous sentences, and then evaluate the trained model on 7 tasks, which include semantic relatedness, paraphrase detection, and classification benchmarks.

General Classification

Trimming and Improving Skip-thought Vectors

no code implementations • 9 Jun 2017 • Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, Virginia R. de Sa

The skip-thought model has been proven to be effective at learning sentence representations and capturing sentence semantics.

Sentence · Text Classification +1

What Happened to My Dog in That Network: Unraveling Top-down Generators in Convolutional Neural Networks

no code implementations • 23 Nov 2015 • Patrick W. Gallagher, Shuai Tang, Zhuowen Tu

Top-down information plays a central role in human perception, but plays relatively little role in many current state-of-the-art deep networks, such as Convolutional Neural Networks (CNNs).

Data Augmentation · Zero-Shot Learning
