Search Results for author: Mozhi Zhang

Found 10 papers, 5 papers with code

Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth

no code implementations 10 May 2021 Keyulu Xu, Mozhi Zhang, Stefanie Jegelka, Kenji Kawaguchi

Our results show that the training of GNNs is implicitly accelerated by skip connections, more depth, and/or a good label distribution.
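
For readers unfamiliar with the architecture choices studied here, a minimal sketch of a graph-convolution layer with a skip (residual) connection, in NumPy. Names and shapes are illustrative, not from the paper:

```python
import numpy as np

def gcn_layer_with_skip(H, A_hat, W):
    """One graph-convolution layer with a skip connection.

    H: node features, shape (n, d)
    A_hat: normalized adjacency matrix, shape (n, n)
    W: layer weights, shape (d, d)
    """
    # ReLU over the propagated features, plus the identity skip path.
    return np.maximum(A_hat @ H @ W, 0.0) + H
```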

How Does a Neural Network's Architecture Impact Its Robustness to Noisy Labels?

no code implementations NeurIPS 2021 Jingling Li, Mozhi Zhang, Keyulu Xu, John P. Dickerson, Jimmy Ba

Our framework measures a network's robustness via the predictive power of its representations: the test performance of a linear model trained on the learned representations using a small set of clean labels.

Learning with noisy labels
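
A minimal sketch of this linear-probe measurement, assuming a frozen `encoder` callable that maps inputs to representations (all names here are hypothetical):

```python
from sklearn.linear_model import LogisticRegression

def predictive_power(encoder, X_clean, y_clean, X_test, y_test):
    # Extract representations from the frozen, trained network.
    Z_clean = encoder(X_clean)
    Z_test = encoder(X_test)
    # Fit a linear model on a small set of cleanly labeled examples.
    probe = LogisticRegression(max_iter=1000).fit(Z_clean, y_clean)
    # The probe's test accuracy serves as the robustness score
    # of the learned representations.
    return probe.score(Z_test, y_test)
```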

How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks

3 code implementations ICLR 2021 Keyulu Xu, Mozhi Zhang, Jingling Li, Simon S. Du, Ken-ichi Kawarabayashi, Stefanie Jegelka

Second, in connection with analyzing the successes and limitations of GNNs, these results suggest a hypothesis for which we provide theoretical and empirical evidence: the success of GNNs in extrapolating algorithmic tasks to new data (e.g., larger graphs or edge weights) relies on encoding task-specific non-linearities in the architecture or features.
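
As one concrete instance of this hypothesis (shortest paths are a running example in this line of work), a GNN whose aggregation is min rather than sum builds the task's non-linearity into the architecture, since the min-aggregation update mirrors Bellman-Ford. A sketch, with illustrative names:

```python
def min_aggregation_step(dist, neighbors):
    """One min-aggregation message-passing step over a weighted graph.

    dist: current distance estimate per node, e.g. [0.0, inf, inf]
    neighbors: adjacency list, neighbors[u] = [(v, weight), ...]

    Because the min non-linearity matches the Bellman-Ford update,
    the same rule extrapolates to larger graphs and edge weights.
    """
    return [min([dist[u]] + [dist[v] + w for v, w in neighbors[u]])
            for u in range(len(dist))]
```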

Interactive Refinement of Cross-Lingual Word Embeddings

1 code implementation EMNLP 2020 Michelle Yuan, Mozhi Zhang, Benjamin Van Durme, Leah Findlater, Jordan Boyd-Graber

Cross-lingual word embeddings transfer knowledge between languages: models trained on high-resource languages can predict in low-resource languages.

Active Learning • General Classification +2
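
A minimal sketch of that transfer, assuming `en_vecs` and `de_vecs` are feature vectors already aligned in one shared cross-lingual space (all names hypothetical):

```python
from sklearn.linear_model import LogisticRegression

def train_and_transfer(en_vecs, en_labels, de_vecs):
    # Train a classifier on high-resource (English) features...
    clf = LogisticRegression(max_iter=1000).fit(en_vecs, en_labels)
    # ...and predict directly on low-resource features, since the
    # aligned embedding space is shared across languages.
    return clf.predict(de_vecs)
```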

Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization

1 code implementation ACL 2019 Mozhi Zhang, Keyulu Xu, Ken-ichi Kawarabayashi, Stefanie Jegelka, Jordan Boyd-Graber

Cross-lingual word embeddings (CLWE) underlie many multilingual natural language processing systems, often through orthogonal transformations of pre-trained monolingual embeddings.

Translation • Word Embeddings
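
The Iterative Normalization step is simple enough to sketch: it alternates unit-length normalization with mean centering on each language's embeddings, so that an orthogonal transformation can align the two spaces better. A sketch under that description, not the authors' released code:

```python
import numpy as np

def iterative_normalization(X, n_iters=5):
    """Alternate unit-length normalization and mean centering.

    X: monolingual embedding matrix, shape (vocab_size, dim).
    Run on each language's embeddings before fitting the
    orthogonal transformation between them.
    """
    for _ in range(n_iters):
        X = X / np.linalg.norm(X, axis=1, keepdims=True)  # unit length
        X = X - X.mean(axis=0, keepdims=True)             # zero mean
    return X / np.linalg.norm(X, axis=1, keepdims=True)
```

After normalization, the orthogonal map itself can be fit on a seed dictionary with, for example, `scipy.linalg.orthogonal_procrustes`.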
