Search Results for author: Yunan Zhang

Found 10 papers, 2 papers with code

Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone

no code implementations22 Apr 2024 Marah Abdin, Sam Ade Jacobs, Ammar Ahmad Awan, Jyoti Aneja, Ahmed Awadallah, Hany Awadalla, Nguyen Bach, Amit Bahree, Arash Bakhtiari, Harkirat Behl, Alon Benhaim, Misha Bilenko, Johan Bjorck, Sébastien Bubeck, Martin Cai, Caio César Teodoro Mendes, Weizhu Chen, Vishrav Chaudhary, Parul Chopra, Allie Del Giorno, Gustavo de Rosa, Matthew Dixon, Ronen Eldan, Dan Iter, Amit Garg, Abhishek Goswami, Suriya Gunasekar, Emman Haider, Junheng Hao, Russell J. Hewett, Jamie Huynh, Mojan Javaheripi, Xin Jin, Piero Kauffmann, Nikos Karampatziakis, Dongwoo Kim, Mahoud Khademi, Lev Kurilenko, James R. Lee, Yin Tat Lee, Yuanzhi Li, Chen Liang, Weishung Liu, Eric Lin, Zeqi Lin, Piyush Madan, Arindam Mitra, Hardik Modi, Anh Nguyen, Brandon Norick, Barun Patra, Daniel Perez-Becker, Thomas Portet, Reid Pryzant, Heyang Qin, Marko Radmilac, Corby Rosset, Sambudha Roy, Olatunji Ruwase, Olli Saarikivi, Amin Saied, Adil Salim, Michael Santacroce, Shital Shah, Ning Shang, Hiteshi Sharma, Xia Song, Masahiro Tanaka, Xin Wang, Rachel Ward, Guanhua Wang, Philipp Witte, Michael Wyatt, Can Xu, Jiahang Xu, Sonali Yadav, Fan Yang, ZiYi Yang, Donghan Yu, Chengruidong Zhang, Cyril Zhang, Jianwen Zhang, Li Lyna Zhang, Yi Zhang, Yue Zhang, Yunan Zhang, Xiren Zhou

We introduce phi-3-mini, a 3.8 billion parameter language model trained on 3.3 trillion tokens, whose overall performance, as measured by both academic benchmarks and internal testing, rivals that of models such as Mixtral 8x7B and GPT-3.5 (e.g., phi-3-mini achieves 69% on MMLU and 8.38 on MT-bench), despite being small enough to be deployed on a phone.

Language Modelling

Model Tells You What to Discard: Adaptive KV Cache Compression for LLMs

no code implementations3 Oct 2023 Suyu Ge, Yunan Zhang, Liyuan Liu, Minjia Zhang, Jiawei Han, Jianfeng Gao

In this study, we introduce adaptive KV cache compression, a plug-and-play method that reduces the memory footprint of generative inference for Large Language Models (LLMs).
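The abstract above does not spell out the compression policy, but the general idea of KV cache compression can be illustrated with a minimal sketch: evict the cached key/value entries that have received the least accumulated attention, while always protecting the most recent tokens. The function name, the per-token scoring heuristic, and the `keep_ratio`/`recent` parameters here are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def compress_kv_cache(keys, values, attn_scores, keep_ratio=0.5, recent=2):
    """Toy KV cache eviction: keep the trailing `recent` tokens plus the
    remaining tokens with the highest accumulated attention scores.

    keys, values : arrays of shape (seq_len, head_dim)
    attn_scores  : accumulated attention weight per cached token, shape (seq_len,)
    """
    seq_len = keys.shape[0]
    keep = max(recent, int(seq_len * keep_ratio))
    # Always retain the most recent tokens.
    protected = set(range(seq_len - recent, seq_len))
    # Rank the remaining tokens by how much attention they have received.
    candidates = [i for i in range(seq_len) if i not in protected]
    candidates.sort(key=lambda i: attn_scores[i], reverse=True)
    kept = sorted(protected | set(candidates[: keep - recent]))
    idx = np.array(kept)
    return keys[idx], values[idx]
```

A policy like this is "plug-and-play" in the sense that it only filters the cache between decoding steps; the model's forward pass is unchanged.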

A Neural Span-Based Continual Named Entity Recognition Model

1 code implementation23 Feb 2023 Yunan Zhang, Qingcai Chen

Named Entity Recognition (NER) models capable of Continual Learning (CL) are realistically valuable in areas where entity types continuously increase (e.g., personal assistants).

Continual Named Entity Recognition Knowledge Distillation +3

Towards Disentangling Relevance and Bias in Unbiased Learning to Rank

no code implementations28 Dec 2022 Yunan Zhang, Le Yan, Zhen Qin, Honglei Zhuang, Jiaming Shen, Xuanhui Wang, Michael Bendersky, Marc Najork

We give both theoretical analysis and empirical results to show the negative effects of such a correlation on the relevance tower.

Learning-To-Rank

Unifying Model Explainability and Robustness for Joint Text Classification and Rationale Extraction

1 code implementation20 Dec 2021 Dongfang Li, Baotian Hu, Qingcai Chen, Tujie Xu, Jingcong Tao, Yunan Zhang

Recent works have shown explainability and robustness are two crucial ingredients of trustworthy and reliable text classification.

Text Classification

No Feature Is An Island: Adaptive Collaborations Between Features Improve Adversarial Robustness

no code implementations1 Jan 2021 Yufeng Zhang, Yunan Zhang, ChengXiang Zhai

To classify images, neural networks extract features from raw inputs and then sum them up with fixed weights via the fully connected layer.
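The baseline this abstract describes, a classification head that sums extracted features with fixed weights, can be sketched in a few lines. This is only the standard fully connected head the paper contrasts itself against; the names and dimensions below are illustrative, not taken from the paper.

```python
import numpy as np

def fc_head(features, weights, bias):
    """Standard classification head: each class logit is a fixed
    weighted sum of the extracted features, logits = W @ f + b."""
    return weights @ features + bias

rng = np.random.default_rng(0)
features = rng.standard_normal(16)       # feature vector from a backbone
weights = rng.standard_normal((3, 16))   # fixed per-class weights
logits = fc_head(features, weights, np.zeros(3))
pred = int(np.argmax(logits))            # predicted class index
```

Because the weights are fixed at inference time, an adversarial perturbation that corrupts a few high-weight features can flip the prediction; the paper's premise is that letting features interact adaptively mitigates this.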

Adversarial Robustness counterfactual

Cooperative Reasoning on Knowledge Graph and Corpus: A Multi-agent Reinforcement Learning Approach

no code implementations4 Dec 2019 Yunan Zhang, Xiang Cheng, Heting Gao, ChengXiang Zhai

We model the question answering on KG as a cooperative task between two agents, a knowledge graph reasoning agent and an information extraction agent.

Question Answering

Macross: Urban Dynamics Modeling based on Metapath Guided Cross-Modal Embedding

no code implementations28 Nov 2019 Yunan Zhang, Heting Gao, Tarek Abdelzaher

As rapid urbanization proceeds at an ever-increasing pace, fully modeling urban dynamics becomes more and more challenging, yet it is also a necessity for socioeconomic development.
