Search Results for author: Michael Zhang

Found 19 papers, 6 papers with code

Shoring Up the Foundations: Fusing Model Embeddings and Weak Supervision

no code implementations • 24 Mar 2022 Mayee F. Chen, Daniel Y. Fu, Dyah Adila, Michael Zhang, Frederic Sala, Kayvon Fatahalian, Christopher Ré

Despite the black-box nature of foundation models, we prove results characterizing how our approach improves performance and show that lift scales with the smoothness of label distributions in embedding space.

Triangle and Four Cycle Counting with Predictions in Graph Streams

no code implementations • ICLR 2022 Justin Y. Chen, Talya Eden, Piotr Indyk, Honghao Lin, Shyam Narayanan, Ronitt Rubinfeld, Sandeep Silwal, Tal Wagner, David P. Woodruff, Michael Zhang

We propose data-driven one-pass streaming algorithms for estimating the number of triangles and four cycles, two fundamental problems in graph analytics that are widely studied in the graph data stream literature.
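A toy sketch of the predictor-assisted idea, assuming a hypothetical `is_heavy` oracle standing in for the learned prediction: edges the oracle flags as heavy are stored exactly, the rest are subsampled, and triangles counted through sampled edges are scaled up by the inverse keep-probability. This illustrates the general recipe, not the paper's algorithms or guarantees:

```python
import random

def triangles_with_predictions(stream, is_heavy, p=0.5, seed=0):
    """Toy one-pass triangle estimator: edges predicted 'heavy' are always
    kept; the rest are kept with probability p. Each arriving edge (u, v)
    closes a triangle with every stored pair (u, w), (v, w); counts through
    sampled edges are scaled by the stored inverse keep-probabilities."""
    rng = random.Random(seed)
    kept = {}   # undirected edge -> 1 / keep-probability weight
    adj = {}    # adjacency over stored edges only
    estimate = 0.0
    for (u, v) in stream:
        # count triangles this edge closes with already-stored edges
        for w in adj.get(u, set()) & adj.get(v, set()):
            e1 = tuple(sorted((u, w)))
            e2 = tuple(sorted((v, w)))
            estimate += kept[e1] * kept[e2]
        # decide whether to store the new edge
        if is_heavy(u, v):
            weight = 1.0
        elif rng.random() < p:
            weight = 1.0 / p
        else:
            continue
        kept[tuple(sorted((u, v)))] = weight
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return estimate
```

With a keep-everything oracle the estimate is exact; subsampling the light edges trades variance for memory, and a good heaviness predictor keeps the high-variance edges exact.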

Correct-N-Contrast: A Contrastive Approach for Improving Robustness to Spurious Correlations

no code implementations • 3 Mar 2022 Michael Zhang, Nimit S. Sohoni, Hongyang R. Zhang, Chelsea Finn, Christopher Ré

As ERM models can be good spurious attribute predictors, Correct-N-Contrast (CNC) works by (1) using a trained ERM model's outputs to identify samples with the same class but dissimilar spurious features, and (2) training a robust model with contrastive learning to learn similar representations for same-class samples.

Contrastive Learning
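
Step (1) above amounts to data-side pair selection. A minimal, hypothetical sketch, using the ERM model's predicted label as a proxy for the spurious attribute (the real method then trains a contrastive model on these pairs):

```python
def cnc_pairs(labels, erm_preds):
    """Sketch of CNC-style pair selection: for each anchor, positives share
    its class but receive a *different* ERM prediction (a proxy for differing
    spurious features); negatives share the ERM prediction but differ in
    class. Returns only anchors that have both."""
    anchors, positives, negatives = [], [], []
    n = len(labels)
    for i in range(n):
        pos = [j for j in range(n) if j != i
               and labels[j] == labels[i] and erm_preds[j] != erm_preds[i]]
        neg = [j for j in range(n) if labels[j] != labels[i]
               and erm_preds[j] == erm_preds[i]]
        if pos and neg:
            anchors.append(i)
            positives.append(pos)
            negatives.append(neg)
    return anchors, positives, negatives
```

Pulling each anchor toward its positives forces the representation to ignore whatever the ERM model was (spuriously) keying on.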

The Details Matter: Preventing Class Collapse in Supervised Contrastive Learning

no code implementations • 29 Sep 2021 Daniel Yang Fu, Mayee F. Chen, Michael Zhang, Kayvon Fatahalian, Christopher Ré

Supervised contrastive learning optimizes a loss that pushes together embeddings of points from the same class while pulling apart embeddings of points from different classes.

Contrastive Learning Transfer Learning
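
The push-pull objective described above can be written down directly. A generic supervised contrastive loss in that spirit (a numpy sketch for clarity, not the paper's class-collapse-preventing variant):

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.1):
    """Generic supervised contrastive loss: for each anchor, maximize
    similarity to same-class embeddings relative to all other points."""
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature       # cosine similarities, scaled
    n = len(labels)
    total = 0.0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue
        others = [j for j in range(n) if j != i]
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        # average negative log-likelihood of the positives for this anchor
        total += -np.mean([sim[i, j] - log_denom for j in positives])
    return total / n
```

The loss is lower when same-class points are already close in embedding space, which is exactly the geometry the objective rewards.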

On the Opportunities and Risks of Foundation Models

no code implementations • 16 Aug 2021 Rishi Bommasani, Drew A. Hudson, Ehsan Adeli, Russ Altman, Simran Arora, Sydney von Arx, Michael S. Bernstein, Jeannette Bohg, Antoine Bosselut, Emma Brunskill, Erik Brynjolfsson, Shyamal Buch, Dallas Card, Rodrigo Castellon, Niladri Chatterji, Annie Chen, Kathleen Creel, Jared Quincy Davis, Dora Demszky, Chris Donahue, Moussa Doumbouya, Esin Durmus, Stefano Ermon, John Etchemendy, Kawin Ethayarajh, Li Fei-Fei, Chelsea Finn, Trevor Gale, Lauren Gillespie, Karan Goel, Noah Goodman, Shelby Grossman, Neel Guha, Tatsunori Hashimoto, Peter Henderson, John Hewitt, Daniel E. Ho, Jenny Hong, Kyle Hsu, Jing Huang, Thomas Icard, Saahil Jain, Dan Jurafsky, Pratyusha Kalluri, Siddharth Karamcheti, Geoff Keeling, Fereshte Khani, Omar Khattab, Pang Wei Koh, Mark Krass, Ranjay Krishna, Rohith Kuditipudi, Ananya Kumar, Faisal Ladhak, Mina Lee, Tony Lee, Jure Leskovec, Isabelle Levent, Xiang Lisa Li, Xuechen Li, Tengyu Ma, Ali Malik, Christopher D. Manning, Suvir Mirchandani, Eric Mitchell, Zanele Munyikwa, Suraj Nair, Avanika Narayan, Deepak Narayanan, Ben Newman, Allen Nie, Juan Carlos Niebles, Hamed Nilforoshan, Julian Nyarko, Giray Ogut, Laurel Orr, Isabel Papadimitriou, Joon Sung Park, Chris Piech, Eva Portelance, Christopher Potts, Aditi Raghunathan, Rob Reich, Hongyu Ren, Frieda Rong, Yusuf Roohani, Camilo Ruiz, Jack Ryan, Christopher Ré, Dorsa Sadigh, Shiori Sagawa, Keshav Santhanam, Andy Shih, Krishnan Srinivasan, Alex Tamkin, Rohan Taori, Armin W. Thomas, Florian Tramèr, Rose E. Wang, William Wang, Bohan Wu, Jiajun Wu, Yuhuai Wu, Sang Michael Xie, Michihiro Yasunaga, Jiaxuan You, Matei Zaharia, Michael Zhang, Tianyi Zhang, Xikun Zhang, Yuhui Zhang, Lucia Zheng, Kaitlyn Zhou, Percy Liang

AI is undergoing a paradigm shift with the rise of models (e.g., BERT, DALL-E, GPT-3) that are trained on broad data at scale and are adaptable to a wide range of downstream tasks.

Transfer Learning

Neural Partial Differential Equations with Functional Convolution

no code implementations • 1 Jan 2021 Ziqian Wu, Xingzhe He, Michael Zhang, Yijun Li, Cheng Yang, Rui Liu, Shiying Xiong, Bo Zhu

Identifying the underlying structures of a PDE system based upon a small set of data samples on the solution space is challenging for machine learning.

Personalized Federated Learning with First Order Model Optimization

1 code implementation • ICLR 2021 Michael Zhang, Karan Sapra, Sanja Fidler, Serena Yeung, Jose M. Alvarez

While federated learning traditionally aims to train a single global model across decentralized local datasets, one model may not always be ideal for all participating clients.

Personalized Federated Learning
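
One way to personalize, loosely in the spirit of the title's first-order model optimization: each client weights the models it downloads by how much they improve its own validation loss, then moves toward the helpful ones. A sketch under stated assumptions, with `val_loss` and the weighting rule as illustrative choices, not the paper's exact update:

```python
import numpy as np

def fomo_update(theta_local, client_models, val_loss, eps=1e-8):
    """Sketch of a personalized federated update: weight each downloaded
    model by the loss improvement it offers on the client's own validation
    data (normalized by parameter distance), discard unhelpful models, and
    step toward the weighted average of the rest."""
    base = val_loss(theta_local)
    weights, deltas = [], []
    for theta_j in client_models:
        gain = base - val_loss(theta_j)           # positive if theta_j helps
        w = max(gain, 0.0) / (np.linalg.norm(theta_j - theta_local) + eps)
        weights.append(w)
        deltas.append(theta_j - theta_local)
    total = sum(weights)
    if total == 0:
        return theta_local                        # no model helps; stay local
    step = sum(w * d for w, d in zip(weights, deltas)) / total
    return theta_local + step
```

Because the weights are computed per client, two clients with different data can mix the same pool of models very differently, which is the point of personalization.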

Characterizing Policy Divergence for Personalized Meta-Reinforcement Learning

no code implementations • 9 Oct 2020 Michael Zhang

Despite ample motivation from costly exploration and limited trajectory data, rapidly adapting to new environments with few-shot reinforcement learning (RL) can remain a challenging task, especially with respect to personalized settings.

Meta-Learning Meta Reinforcement Learning +1

PLATON II: New Capabilities And A Comprehensive Retrieval on HD 189733b Transit and Eclipse Data

1 code implementation • 20 Apr 2020 Michael Zhang, Yayaati Chachan, Eliza M. -R. Kempton, Heather Knutson, Wenjun Chang

Recently, we introduced PLanetary Atmospheric Tool for Observer Noobs (PLATON), a Python package that calculates model transmission spectra for exoplanets and retrieves atmospheric characteristics based on observed spectra.

Earth and Planetary Astrophysics Instrumentation and Methods for Astrophysics

Deciphering hierarchical organization of topologically associated domains through change-point testing

no code implementations • 24 Nov 2019 Haipeng Xing, Yingru Wu, Yong Chen, Michael Zhang

Background: The nucleus of eukaryotic cells spatially packages chromosomes into a hierarchical and distinct segregation that plays critical roles in maintaining transcription regulation.

Applications Methodology

Forward modelling and retrievals with PLATON, a fast open source tool

1 code implementation • 28 Nov 2018 Michael Zhang, Yayaati Chachan, Eliza M. -R. Kempton, Heather A. Knutson

PLATON also has less commonly included features, such as a Mie scattering cloud model and unocculted starspot corrections.

Earth and Planetary Astrophysics Instrumentation and Methods for Astrophysics

Deep Weighted Averaging Classifiers

2 code implementations • 6 Nov 2018 Dallas Card, Michael Zhang, Noah A. Smith

Recent advances in deep learning have achieved impressive gains in classification accuracy on a variety of types of data, including images and text.

General Classification
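
The core idea of a weighted-averaging classifier is that class probabilities for a query are a distance-weighted average of training labels in a learned embedding space. A minimal sketch, assuming the embeddings are already given and using a Gaussian kernel as an illustrative weighting choice:

```python
import numpy as np

def dwac_predict(query_emb, train_embs, train_labels, n_classes, bandwidth=1.0):
    """Sketch of weighted-averaging classification: probabilities for a query
    are a kernel-weighted average of training labels, with weights decaying
    in squared embedding distance."""
    d2 = np.sum((train_embs - query_emb) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))        # Gaussian kernel weights
    probs = np.zeros(n_classes)
    for wi, yi in zip(w, train_labels):
        probs[yi] += wi
    return probs / probs.sum()
```

A side benefit of this form is interpretability: every prediction can be attributed to the specific training points that carried the most weight.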

Detection of a Westward Hotspot Offset in the Atmosphere of a Hot Gas Giant CoRoT-2b

1 code implementation • 19 Jan 2018 Lisa Dang, Nicolas B. Cowan, Joel C. Schwartz, Emily Rauscher, Michael Zhang, Heather A. Knutson, Michael Line, Ian Dobbs-Dixon, Drake Deming, Sudarsan Sundararajan, Jonathan J. Fortney, Ming Zhao

The peculiar infrared flux map of CoRoT-2b may result from westward winds due to non-synchronous rotation, magnetic effects, or partial cloud coverage that obscures the emergent flux from the planet's eastern hemisphere.

Earth and Planetary Astrophysics

Reverse Curriculum Generation for Reinforcement Learning

no code implementations • 17 Jul 2017 Carlos Florensa, David Held, Markus Wulfmeier, Michael Zhang, Pieter Abbeel

The robot is trained in reverse, gradually learning to reach the goal from a set of start states increasingly far from the goal.

reinforcement-learning
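
The expand-then-filter loop described above can be sketched on a toy 1-D task. Everything here is a hypothetical simplification (the `attempt` callable stands in for running the current policy; the perturbation and the intermediate-difficulty band `[r_min, r_max]` follow the general recipe, not the paper's exact procedure):

```python
import random

def reverse_curriculum(goal, attempt, n_iters=20, n_new=5, step=0.5,
                       r_min=0.1, r_max=0.9, seed=0):
    """Toy reverse curriculum on a 1-D task: begin with starts at the goal,
    then repeatedly (1) perturb known-good starts to propose harder ones and
    (2) keep only starts of intermediate difficulty, i.e. whose estimated
    success rate under the current policy lies in [r_min, r_max]."""
    rng = random.Random(seed)
    starts = [goal]
    for _ in range(n_iters):
        # expand: random-walk away from existing starts (toward harder states)
        proposals = [s + rng.uniform(-step, step)
                     for s in starts for _ in range(n_new)]
        # filter: keep starts that are neither trivial nor hopeless
        kept = []
        for s in set(starts + proposals):
            successes = sum(attempt(s) for _ in range(10))
            if r_min <= successes / 10 <= r_max:
                kept.append(s)
        if kept:
            starts = kept
    return starts
```

Starts that are always solved (or never solved) carry no learning signal, so the frontier of kept starts drifts steadily away from the goal as the policy improves.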

Probabilistically Safe Policy Transfer

no code implementations • 15 May 2017 David Held, Zoe McCarthy, Michael Zhang, Fred Shentu, Pieter Abbeel

Although learning-based methods have great potential for robotics, one concern is that a robot that updates its parameters might cause large amounts of damage before it learns the optimal policy.
