no code implementations • 28 Sep 2017 • Yinyan Zhang, Pei Zhang, Shuai Li
Bio-inspired algorithms such as neural network algorithms and genetic algorithms have received significant attention in both the academic and engineering communities.
no code implementations • WS 2018 • Yongchao Deng, Shanbo Cheng, Jun Lu, Kai Song, Jingang Wang, Shenglan Wu, Liang Yao, Guchun Zhang, Haibo Zhang, Pei Zhang, Changfeng Zhu, Boxing Chen
We participated in 5 translation directions: English ↔ Russian and English ↔ Turkish (both directions), and English → Chinese.
no code implementations • ACL 2019 • Pei Zhang, Boxing Chen, Niyu Ge, Kai Fan
Recent advances in sequence modeling have highlighted the strengths of the transformer architecture, especially in achieving state-of-the-art machine translation results.
Automatic Speech Recognition (ASR) +3
no code implementations • 27 Dec 2019 • Pengcheng Yang, Boxing Chen, Pei Zhang, Xu sun
Further analysis demonstrates that the proposed regularized training can effectively improve the agreement of attention on the image, leading to better use of visual information.
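The snippet does not spell out the regularizer; one plausible form penalizes disagreement between two attention distributions over the same image regions, e.g. with a symmetric KL term. A minimal sketch, where the symmetric-KL choice and all names are assumptions rather than the paper's exact objective:

```python
import torch.nn.functional as F

def attention_agreement(attn_a, attn_b, eps=1e-8):
    """Symmetric KL between two attention distributions over image
    regions (shape [batch, regions]); lower means better agreement."""
    log_a, log_b = (attn_a + eps).log(), (attn_b + eps).log()
    kl_ab = F.kl_div(log_b, attn_a, reduction="batchmean")  # KL(a || b)
    kl_ba = F.kl_div(log_a, attn_b, reduction="batchmean")  # KL(b || a)
    return 0.5 * (kl_ab + kl_ba)

# total_loss = translation_loss + lam * attention_agreement(attn_a, attn_b)
```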
no code implementations • 30 Mar 2020 • Pei Zhang, Xu Zhang, Wei Chen, Jian Yu, Yan-Feng Wang, Deyi Xiong
In this paper, we propose a new framework to model cross-sentence dependencies by training neural machine translation (NMT) to predict both the target translation and surrounding sentences of a source sentence.
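In loss terms, such a framework can be read as a weighted sum of a translation objective and an auxiliary context-prediction objective. A minimal sketch, with the tensor shapes and the weight `lam` as assumptions:

```python
import torch.nn.functional as F

def joint_nmt_loss(trans_logits, trans_tgt, ctx_logits, ctx_tgt, lam=0.5):
    """Cross-entropy for translating the current sentence plus a weighted
    cross-entropy for predicting its surrounding sentences.
    Logits are [batch, seq, vocab]; targets are [batch, seq]."""
    translation = F.cross_entropy(trans_logits.transpose(1, 2), trans_tgt)
    context = F.cross_entropy(ctx_logits.transpose(1, 2), ctx_tgt)
    return translation + lam * context
```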
no code implementations • 6 Apr 2020 • Madhumitha Harishankar, Jun Han, Sai Vineeth Kalluru Srinivas, Faisal Alqarni, Shi Su, Shijia Pan, Hae Young Noh, Pei Zhang, Marco Gruteser, Patrick Tague
and yields 100% lane classification accuracy with 200 meters of driving data, achieving over 90% with just 100 m (corresponding to roughly one minute of driving).
no code implementations • 11 Sep 2020 • Ji-Yue Wang, Pei Zhang, Wen-feng Pang, Jie Li
The experimental results confirm that the TC can help LsrKD and MrKD to boost training, especially on the networks where they otherwise fail.
no code implementations • EMNLP 2020 • Pei Zhang, Boxing Chen, Niyu Ge, Kai Fan
In this paper, we investigate extensively the pros and cons of the standard transformer in document-level translation, and find that the auto-regressive property can simultaneously bring both the advantage of consistency and the disadvantage of error accumulation.
no code implementations • 12 Jan 2021 • Rui Qu, Yunlong Wang, Min An, Feiran Wang, Hongrong Li, Hong Gao, Fuli Li, Pei Zhang
One of the most frequently cited benefits of high-dimensional (HD) quantum systems is that they lead to stronger forms of correlations, featuring increased robustness to noise.
Quantum Physics
no code implementations • 1 May 2021 • Chen Zhang, Siwei Wang, Wenxuan Tu, Pei Zhang, Xinwang Liu, Changwang Zhang, Bo Yuan
Multi-view clustering is an important yet challenging task in the machine learning and data mining communities.
no code implementations • 1 May 2021 • Chen Zhang, Siwei Wang, Jiyuan Liu, Sihang Zhou, Pei Zhang, Xinwang Liu, En Zhu, Changwang Zhang
iii) The partition-level information has not been utilized in existing work.
no code implementations • NAACL 2021 • Pengcheng Yang, Pei Zhang, Boxing Chen, Jun Xie, Weihua Luo
Document machine translation aims to translate the source sentence into the target language in the presence of additional contextual information.
no code implementations • 17 Jun 2021 • Yifeng Zhao, Zicheng Liu, Pei Zhang, S. A. Galindo-Torres, Stan Z. Li
Whereas implicit ML-driven methods are black boxes by nature, explicit ML-driven methods have more potential for the prediction of LDC.
no code implementations • 16 Jul 2021 • Yifeng Zhao, Pei Zhang, S. A. Galindo-Torres, Stan Z. Li
Then, a globally optimal feature set (the channel width, the flow velocity, the channel slope, and the cross-sectional area) was identified by numerically comparing the performance of the distilled local optima across representative ML models.
no code implementations • 29 Sep 2021 • Pei Zhang, Hua Liu
The attention mechanism has been widely applied to tasks that generate an output sequence from an input image.
no code implementations • IWSLT (EMNLP) 2018 • Nguyen Bach, Hongjie Chen, Kai Fan, Cheung-Chi Leung, Bo Li, Chongjia Ni, Rong Tong, Pei Zhang, Boxing Chen, Bin Ma, Fei Huang
This work describes the En→De Alibaba speech translation system developed for the evaluation campaign of the International Workshop on Spoken Language Translation (IWSLT) 2018.
no code implementations • 22 Jul 2022 • Jong Youl Choi, Pei Zhang, Kshitij Mehta, Andrew Blanchard, Massimiliano Lupo Pasini
Graph Convolutional Neural Network (GCNN) is a popular class of deep learning (DL) models in material science to predict material properties from the graph representation of molecular structures.
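For readers unfamiliar with the acronym, a single graph-convolution layer in its textbook (Kipf-and-Welling-style) form looks like the sketch below; this is meant only to ground the term GCNN, not to reproduce the authors' architecture:

```python
import torch

class GCNLayer(torch.nn.Module):
    """One graph-convolution layer: average neighbor features,
    then apply a learned linear map."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # adj: [N, N] adjacency with self-loops; x: [N, in_dim] atom features
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return torch.relu(self.lin(adj @ x / deg))
```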
no code implementations • 1 Dec 2022 • Jingcan Duan, Siwei Wang, Pei Zhang, En Zhu, Jingtao Hu, Hu Jin, Yue Liu, Zhibin Dong
However, they neglect the subgraph-subgraph comparison information, in which normal and abnormal subgraph pairs behave differently in terms of embeddings and structures in GAD, resulting in sub-optimal task performance.
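A generic way to exploit such subgraph-pair comparisons is a contrastive score over paired subgraph embeddings; the InfoNCE-style sketch below illustrates the idea only and is not the paper's exact objective:

```python
import torch
import torch.nn.functional as F

def subgraph_contrast_loss(emb_a, emb_b, temperature=0.5):
    """Contrast paired subgraph embeddings ([batch, dim]): matched
    (normal) pairs should agree, mismatched ones should not."""
    emb_a = F.normalize(emb_a, dim=-1)
    emb_b = F.normalize(emb_b, dim=-1)
    logits = emb_a @ emb_b.T / temperature   # pairwise similarities
    labels = torch.arange(len(emb_a))        # i-th pair is the positive
    return F.cross_entropy(logits, labels)
```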
no code implementations • 7 Dec 2022 • Yiwen Dong, Jesse R Codling, Gary Rohrer, Jeremy Miles, Sudhendu Sharma, Tami Brown-Brandl, Pei Zhang, Hae Young Noh
In this paper, we introduce PigV², the first system to monitor pig heart rate and respiratory rate through ground vibrations.
no code implementations • 5 Feb 2023 • Yifeng Zhao, Xiangbo Gao, Pei Zhang, Liang Lei, S. A. Galindo-Torres, Stan Z. Li
This algorithm can capture the main contour of parental particles with a series of non-overlapping spheres and refine surface-texture details through gradient search.
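The snippet describes a fit-then-refine scheme; below is a toy version of the gradient-search stage, fitting sphere centers and radii to a surface point cloud with a soft non-overlap penalty. All hyperparameters and the penalty form are assumptions, not the paper's algorithm:

```python
import torch

def fit_spheres(points, n_spheres=8, steps=500, lr=1e-2):
    """Fit roughly non-overlapping spheres to a particle's surface
    point cloud by gradient descent. `points` has shape [P, 3]."""
    pts = torch.as_tensor(points, dtype=torch.float32)
    centers = pts[torch.randint(len(pts), (n_spheres,))].clone().requires_grad_(True)
    radii = torch.full((n_spheres,), 0.1, requires_grad=True)
    opt = torch.optim.Adam([centers, radii], lr=lr)
    eye = torch.eye(n_spheres, dtype=torch.bool)
    for _ in range(steps):
        opt.zero_grad()
        surf = torch.cdist(pts, centers) - radii          # point-to-surface distance
        fit = (surf.abs().min(dim=1).values ** 2).mean()  # pull spheres to the contour
        gap = radii[:, None] + radii[None, :] - torch.cdist(centers, centers)
        overlap = torch.relu(gap).masked_fill(eye, 0.0).sum()  # soft non-overlap
        (fit + 0.1 * overlap).backward()
        opt.step()
    return centers.detach(), radii.detach().abs()
```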
no code implementations • 6 Jun 2023 • Pei Zhang, Shuo Zhu, Edmund Y. Lam
Bio-inspired neuromorphic cameras sense illumination changes on a per-pixel basis and generate spatiotemporal streaming events within microseconds in response, offering visual information with high temporal resolution over a high dynamic range.
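Concretely, the output of such a sensor is usually handled as a stream of (x, y, timestamp, polarity) tuples rather than frames; the field layout below is a common convention, not a fixed API:

```python
import numpy as np

# Each event records where and when a pixel's brightness changed,
# and in which direction (polarity +1 or -1).
events = np.array([(12, 34, 0.000015, 1), (12, 35, 0.000021, -1)],
                  dtype=[("x", "u2"), ("y", "u2"), ("t", "f8"), ("p", "i1")])

def events_to_frame(ev, width, height):
    """Accumulate signed event polarities into a 2-D frame for visualization."""
    frame = np.zeros((height, width), dtype=np.int32)
    np.add.at(frame, (ev["y"], ev["x"]), ev["p"])
    return frame
```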
no code implementations • 27 Sep 2023 • Pei Zhang, Chutian Wang, Edmund Y. Lam
Bio-inspired neuromorphic cameras asynchronously record pixel brightness changes and generate sparse event streams.
no code implementations • 28 Sep 2023 • Pei Zhang, Haosen Liu, Zhou Ge, Chutian Wang, Edmund Y. Lam
Neuromorphic imaging reacts to per-pixel brightness changes of a dynamic scene with high temporal precision, responding with asynchronous streaming events.
no code implementations • 6 Oct 2023 • Shuaiwen Leon Song, Bonnie Kruft, Minjia Zhang, Conglong Li, Shiyang Chen, Chengming Zhang, Masahiro Tanaka, Xiaoxia Wu, Jeff Rasley, Ammar Ahmad Awan, Connor Holmes, Martin Cai, Adam Ghanem, Zhongzhu Zhou, Yuxiong He, Pete Luferenko, Divya Kumar, Jonathan Weyn, Ruixiong Zhang, Sylwester Klocek, Volodymyr Vragov, Mohammed AlQuraishi, Gustaf Ahdritz, Christina Floristean, Cristina Negri, Rao Kotamarthi, Venkatram Vishwanath, Arvind Ramanathan, Sam Foreman, Kyle Hippe, Troy Arcomano, Romit Maulik, Maxim Zvyagin, Alexander Brace, Bin Zhang, Cindy Orozco Bohorquez, Austin Clyde, Bharat Kale, Danilo Perez-Rivera, Heng Ma, Carla M. Mann, Michael Irvin, J. Gregory Pauloski, Logan Ward, Valerie Hayot, Murali Emani, Zhen Xie, Diangen Lin, Maulik Shukla, Ian Foster, James J. Davis, Michael E. Papka, Thomas Brettin, Prasanna Balaprakash, Gina Tourassi, John Gounley, Heidi Hanson, Thomas E Potok, Massimiliano Lupo Pasini, Kate Evans, Dan Lu, Dalton Lunga, Junqi Yin, Sajal Dash, Feiyi Wang, Mallikarjun Shankar, Isaac Lyngaas, Xiao Wang, Guojing Cong, Pei Zhang, Ming Fan, Siyan Liu, Adolfy Hoisie, Shinjae Yoo, Yihui Ren, William Tang, Kyle Felker, Alexey Svyatkovskiy, Hang Liu, Ashwin Aji, Angela Dalton, Michael Schulte, Karl Schulz, Yuntian Deng, Weili Nie, Josh Romero, Christian Dallago, Arash Vahdat, Chaowei Xiao, Thomas Gibbs, Anima Anandkumar, Rick Stevens
In the upcoming decade, deep learning may revolutionize the natural sciences, enhancing our capacity to model and predict natural occurrences.
no code implementations • 25 Oct 2023 • Pei Zhang, Logan Kearney, Debsindhu Bhowmik, Zachary Fox, Amit K. Naskar, John Gounley
Transformer-based large language models have remarkable potential to accelerate design optimization for applications such as drug development and materials discovery.
no code implementations • 14 Nov 2023 • Pei Zhang, Zhaobo Hua, Jinliang Ding
Designing controllers to achieve natural motor capabilities for multi-joint robots is a significant challenge.
no code implementations • 3 Jan 2024 • Qiyuan Ou, Pei Zhang, Sihang Zhou, En Zhu
Late fusion multi-view clustering (LFMVC) has become a rapidly growing class of methods in the multi-view clustering (MVC) field, owing to its excellent computational speed and clustering performance.
1 code implementation • 30 Jun 2023 • Yiming Wang, Zhuosheng Zhang, Pei Zhang, Baosong Yang, Rui Wang
Neural-symbolic methods have demonstrated efficiency in enhancing the reasoning abilities of large language models (LLMs).
1 code implementation • 11 Oct 2023 • Qiyuan Ou, Siwei Wang, Pei Zhang, Sihang Zhou, En Zhu
However, we propose Anchor-based Multi-view Subspace Clustering with Hierarchical Feature Descent (MVSC-HFD) to tackle the discrepancy among views through hierarchical feature descent, projecting all views into a common subspace (Stage 1), which reveals the dependency of different views.
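A stripped-down reading of the Stage-1 projection is a joint fit of per-view linear maps W_v toward one shared representation H; the sketch below simplifies away the anchors and the hierarchical descent itself, and the dimension, step count, and optimizer are assumptions:

```python
import torch

def common_subspace(views, dim=32, steps=200, lr=1e-2):
    """Learn per-view projections W_v so that every view X_v maps close
    to a shared representation H. `views` is a list of [n, d_v] tensors."""
    n = views[0].shape[0]
    Ws = [torch.randn(v.shape[1], dim, requires_grad=True) for v in views]
    H = torch.randn(n, dim, requires_grad=True)
    opt = torch.optim.Adam(Ws + [H], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = sum(((v @ W - H) ** 2).mean() for v, W in zip(views, Ws))
        loss.backward()
        opt.step()
    return H.detach()  # cluster H with any off-the-shelf method afterwards
```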
1 code implementation • 31 Aug 2023 • Yi Wen, Suyuan Liu, Xinhang Wan, Siwei Wang, Ke Liang, Xinwang Liu, Xihong Yang, Pei Zhang
Anchor-based multi-view graph clustering (AMVGC) has received abundant attention owing to its high efficiency and the capability to capture complementary structural information across multiple views.
1 code implementation • 25 Nov 2022 • Pei Zhang, Baosong Yang, Haoran Wei, Dayiheng Liu, Kai Fan, Luo Si, Jun Xie
The lack of competency awareness makes NMT untrustworthy.
1 code implementation • 12 Sep 2023 • Jingcan Duan, Pei Zhang, Siwei Wang, Jingtao Hu, Hu Jin, Jiaxin Zhang, Haifang Zhou, Xinwang Liu
Finally, the model is refined with only reliable normal nodes as input, and it learns a more accurate estimate of normality so that anomalous nodes can be more easily distinguished.
1 code implementation • ICLR 2022 • Siyan Liu, Pei Zhang, Dan Lu, Guannan Zhang
First, existing PI methods require retraining of neural networks (NNs) for every given confidence level and suffer from the crossing issue in calculating multiple PIs.
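The retraining and crossing issues follow from how standard quantile regression is trained: one pinball-loss model per quantile level, as in the sketch below. This illustrates the problem the paper addresses, not its solution:

```python
import torch

def pinball_loss(pred, target, q):
    """Quantile (pinball) loss for quantile level q in (0, 1). Training a
    separate network per q forces retraining for every confidence level,
    and independently trained quantiles can cross each other."""
    diff = target - pred
    return torch.mean(torch.maximum(q * diff, (q - 1.0) * diff))
```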
1 code implementation • 28 Aug 2019 • Asim Smailagic, Pedro Costa, Alex Gaudio, Kartik Khandelwal, Mostafa Mirshekari, Jonathon Fagert, Devesh Walawalkar, Susu Xu, Adrian Galdran, Pei Zhang, Aurélio Campilho, Hae Young Noh
Our online method enhances the performance of its underlying baseline deep network.
1 code implementation • 4 Feb 2022 • Massimiliano Lupo Pasini, Pei Zhang, Samuel Temple Reeve, Jong Youl Choi
We train HydraGNN on an open-source ab initio density functional theory (DFT) dataset for iron-platinum (FePt) with a fixed body centered tetragonal (BCT) lattice structure and fixed volume to simultaneously predict the mixing enthalpy (a global feature of the system), the atomic charge transfer, and the atomic magnetic moment across configurations that span the entire compositional range.
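In objective terms, this simultaneous prediction amounts to one loss over a graph-level head plus losses over per-atom heads sharing a GNN backbone. A minimal sketch, with the MSE choice and equal weights as assumptions:

```python
import torch

def multitask_loss(graph_pred, graph_tgt, node_preds, node_tgts,
                   weights=(1.0, 1.0, 1.0)):
    """Joint objective over one graph-level property (e.g. mixing enthalpy)
    and per-atom properties (e.g. charge transfer, magnetic moment)."""
    mse = torch.nn.functional.mse_loss
    loss = weights[0] * mse(graph_pred, graph_tgt)
    for w, p, t in zip(weights[1:], node_preds, node_tgts):
        loss = loss + w * mse(p, t)
    return loss
```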
1 code implementation • 12 Jul 2023 • Xiangpeng Wei, Haoran Wei, Huan Lin, TianHao Li, Pei Zhang, Xingzhang Ren, Mei Li, Yu Wan, Zhiwei Cao, Binbin Xie, Tianxiang Hu, Shangjie Li, Binyuan Hui, Bowen Yu, Dayiheng Liu, Baosong Yang, Fei Huang, Jun Xie
Large language models (LLMs) demonstrate a remarkable ability to comprehend, reason, and generate text following natural language instructions.