no code implementations • Findings (EMNLP) 2021 • Jun Gao, YuHan Liu, Haolin Deng, Wei Wang, Yu Cao, Jiachen Du, Ruifeng Xu
The emotion cause is a stimulus for human emotions.
no code implementations • 28 Jun 2023 • Huu-Thanh Nguyen, Yu Cao, Chong-Wah Ngo, Wing-Kwong Chan
At the core of the framework is a novel difficulty assessment model, which forecasts how challenging an unlabelled sample is for the latest trained instance segmentation model.
1 code implementation • 5 Jun 2023 • Yibin Lei, Liang Ding, Yu Cao, Changtong Zan, Andrew Yates, DaCheng Tao
Dense retrievers have achieved impressive performance, but their demand for abundant training data limits their application scenarios.
no code implementations • 3 Jun 2023 • Yu Cao, Jingrun Chen, Yixin Luo, Xiang Zhou
However, when the perturbation occurs earlier, the SDE model outperforms the ODE model, and we demonstrate that the error in sample generation due to the pulse-shape perturbation can be exponentially suppressed as the magnitude of the diffusion term increases to infinity.
no code implementations • 17 Apr 2023 • Hao Tian, Yu Cao, P. Y. Mok
Clothing segmentation and fine-grained attribute recognition are challenging tasks at the intersection of computer vision and fashion, which segment entire ensembles of clothing instances and recognize detailed attributes of the clothing products from any input human image.
no code implementations • 20 Mar 2023 • Yu Cao, Xiangqiao Meng, P. Y. Mok, Xueting Liu, Tong-Yee Lee, Ping Li
Through multiple quantitative metrics evaluated on our dataset and a user study, we demonstrate that AnimeDiffusion outperforms state-of-the-art GAN-based models for anime face line drawing colorization.
1 code implementation • 21 Dec 2022 • Yu Cao, Hao Tian, P. Y. Mok
Automatic colorization of anime line drawing has attracted much attention in recent years since it can substantially benefit the animation industry.
1 code implementation • 27 Oct 2022 • Yu Cao, Dianqi Li, Meng Fang, Tianyi Zhou, Jun Gao, Yibing Zhan, DaCheng Tao
We present Twin Answer Sentences Attack (TASA), an adversarial attack method for question answering (QA) models that produces fluent and grammatical adversarial contexts while maintaining gold answers.
2 code implementations • 24 Oct 2022 • Abhishek Moitra, Abhiroop Bhattacharjee, Runcong Kuang, Gokul Krishnan, Yu Cao, Priyadarshini Panda
To this end, we propose SpikeSim, a tool that can perform realistic performance, energy, latency and area evaluation of IMC-mapped SNNs.
1 code implementation • COLING 2022 • Changtong Zan, Liang Ding, Li Shen, Yu Cao, Weifeng Liu, DaCheng Tao
Pre-Training (PT) of text representations has been successfully applied to low-resource Neural Machine Translation (NMT).
1 code implementation • 16 Aug 2022 • Yu Cao, Yancheng Wang, Yifei Xue, Huiqing Zhang, Yizhen Lao
Segmentation from point cloud data is essential in many applications such as remote sensing, mobile robots, or autonomous cars.
no code implementations • 21 Jun 2022 • Wenfei Li, Qi Ou, Yixiao Chen, Yu Cao, Renxi Liu, Chunyi Zhang, Daye Zheng, Chun Cai, Xifan Wu, Han Wang, Mohan Chen, Linfeng Zhang
However, for high-level QM methods, such as density functional theory (DFT) at the meta-GGA level and/or with exact exchange, quantum Monte Carlo, etc., generating a sufficient amount of data for training a ML potential has remained computationally challenging due to their high cost.
1 code implementation • 20 Jun 2022 • Yu Cao, Eric Vanden-Eijnden
On the theory side, we discuss how to tailor the velocity field to the target and establish general conditions under which the proposed estimator is a perfect estimator with zero-variance.
1 code implementation • Findings (NAACL) 2022 • Yibin Lei, Yu Cao, Dianqi Li, Tianyi Zhou, Meng Fang, Mykola Pechenizkiy
Generating high-quality textual adversarial examples is critical for investigating the pitfalls of natural language processing (NLP) models and further promoting their robustness.
1 code implementation • NAACL 2022 • Hanhao Qu, Yu Cao, Jun Gao, Liang Ding, Ruifeng Xu
We present IBR, an Iterative Backward Reasoning model to solve the proof generation tasks on rule-based Question Answering (QA), where models are required to reason over a series of textual rules and facts to find out the related proof path and derive the final answer.
no code implementations • 15 May 2022 • Sumit K. Mandal, Gokul Krishnan, A. Alper Goksoy, Gopikrishnan Ravindran Nair, Yu Cao, Umit Y. Ogras
Besides accelerating the computation using custom compute elements (CE) and in-memory computing, COIN aims at minimizing the intra- and inter-CE communication in GCN operations to optimize the performance and energy efficiency.
1 code implementation • ACL 2022 • Yu Cao, Wei Bi, Meng Fang, Shuming Shi, DaCheng Tao
To alleviate the above data issues, we propose a data manipulation method, which is model-agnostic and can be packed with any persona-based dialogue generation model to improve its performance.
1 code implementation • 16 Apr 2022 • Changtong Zan, Liang Ding, Li Shen, Yu Cao, Weifeng Liu, DaCheng Tao
For multilingual sequence-to-sequence pretrained language models (multilingual Seq2Seq PLMs), e.g. mBART, the self-supervised pretraining task is trained on a wide range of monolingual languages, e.g. 25 languages from CommonCrawl, while the downstream cross-lingual tasks generally progress on a bilingual language subset, e.g. English-German. This creates a data discrepancy, namely domain discrepancy, and a cross-lingual learning objective discrepancy, namely task discrepancy, between the pretraining and finetuning stages.
no code implementations • 27 Jan 2022 • Xizhe Wang, Ning Zhang, Jia Wang, Jing Ni, Xinzi Sun, John Zhang, Zitao Liu, Yu Cao, Benyuan Liu
To improve the IVF success rate, we propose a knowledge-based decision support system that can provide medical advice on the treatment protocol and medication adjustment for each patient visit during an IVF treatment cycle.
no code implementations • 27 Jan 2022 • Jia Wang, Hongwei Zhu, Jiancheng Shen, Yu Cao, Benyuan Liu
It is a challenging task to predict financial markets.
no code implementations • 19 Jan 2022 • Zinan Xiong, Chenxi Wang, Ying Li, Yan Luo, Yu Cao
We are interested in exploring its capability in human pose estimation, and thus propose a novel model based on transformer architecture, enhanced with a feature pyramid fusion structure.
no code implementations • 18 Dec 2021 • Yu Cao, QiYue Yu
Analogous to the concept of communication rate, this paper defines a radar rate to unify the DFRC system.
1 code implementation • 13 Dec 2021 • Yu Cao
In this thesis we present a semantic representation formalism based on directed graphs and explore its linguistic adequacy and explanatory benefits in the semantics of plurality and quantification.
no code implementations • 18 Sep 2021 • Cheng Tan, Zhichao Li, Jian Zhang, Yu Cao, Sikai Qi, Zherui Liu, Yibo Zhu, Chuanxiong Guo
With MIG, A100 can be the most cost-efficient GPU ever for serving Deep Neural Networks (DNNs).
no code implementations • 14 Aug 2021 • Gokul Krishnan, Sumit K. Mandal, Manvitha Pannala, Chaitali Chakrabarti, Jae-sun Seo, Umit Y. Ogras, Yu Cao
In-memory computing (IMC) on a monolithic chip for deep learning faces dramatic challenges on area, yield, and on-chip interconnection cost due to the ever-increasing model sizes.
no code implementations • 28 Jul 2021 • Tomasz Luczynski, Jonatan Scharff Willners, Elizabeth Vargas, Joshua Roe, Shida Xu, Yu Cao, Yvan Petillot, Sen Wang
This paper presents a novel dataset for the development of visual navigation and simultaneous localisation and mapping (SLAM) algorithms as well as for underwater intervention tasks.
no code implementations • 6 Jul 2021 • Gokul Krishnan, Sumit K. Mandal, Chaitali Chakrabarti, Jae-sun Seo, Umit Y. Ogras, Yu Cao
In this technique, we use analytical models of NoC to evaluate end-to-end communication latency of any given DNN.
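To make the analytical-model idea above concrete, here is a minimal sketch of estimating end-to-end communication latency for a DNN on a mesh NoC, assuming a simple hop-latency-plus-serialization model; all parameters, layer shapes, and function names are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical analytical NoC latency sketch: hop latency plus serialization delay
# for moving each layer's output activations between tiles of a 2D mesh.

def noc_latency_cycles(num_bits, hops, link_width_bits=32, router_delay=2, link_delay=1):
    """Estimate cycles to move `num_bits` across `hops` mesh links."""
    flits = -(-num_bits // link_width_bits)     # ceiling division: packet serialization
    per_hop = router_delay + link_delay         # pipeline delay of one hop
    return hops * per_hop + flits               # head latency + body serialization

def dnn_communication_latency(layers, avg_hops=4):
    """Sum the inter-tile transfer latency of every layer's output activations."""
    total = 0
    for name, activation_bits in layers:
        cycles = noc_latency_cycles(activation_bits, hops=avg_hops)
        total += cycles
        print(f"{name}: {cycles} cycles")
    return total

if __name__ == "__main__":
    # Toy layer list: (name, output activation size in bits) -- illustrative only.
    toy_dnn = [("conv1", 64 * 56 * 56 * 8), ("conv2", 128 * 28 * 28 * 8), ("fc", 1000 * 8)]
    print("total:", dnn_communication_latency(toy_dnn), "cycles")
```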
no code implementations • 4 May 2021 • Xiaocong Du, Bhargav Bhushanam, Jiecao Yu, Dhruv Choudhary, Tianxiang Gao, Sherman Wong, Louis Feng, Jongsoo Park, Yu Cao, Arun Kejariwal
Our method leverages structured sparsification to reduce computational cost without hurting the model capacity at the end of offline training so that a full-size model is available in the recurring training stage to learn new data in real-time.
no code implementations • 8 Apr 2021 • Jia Wang, Tong Sun, Benyuan Liu, Yu Cao, Hongwei Zhu
Financial markets form a complex dynamical system.
no code implementations • 5 Apr 2021 • Jia Wang, Tong Sun, Benyuan Liu, Yu Cao, Degang Wang
Financial markets are difficult to predict due to their complex system dynamics.
2 code implementations • NAACL 2021 • Yinya Huang, Meng Fang, Yu Cao, LiWei Wang, Xiaodan Liang
The model encodes discourse information as a graph with elementary discourse units (EDUs) and discourse relations, and learns the discourse-aware features via a graph network for downstream QA tasks.
Ranked #24 on Reading Comprehension on ReClor
no code implementations • 22 Mar 2021 • Adnan Siraj Rakin, Li Yang, Jingtao Li, Fan Yao, Chaitali Chakrabarti, Yu Cao, Jae-sun Seo, Deliang Fan
Apart from recovering the inference accuracy, our RA-BNN after growing also shows significantly higher resistance to BFA.
1 code implementation • 2 Mar 2021 • Yu Cao, Liang Ding, Zhiliang Tian, Meng Fang
Dialogue generation models face the challenge of producing generic and repetitive responses.
no code implementations • 11 Jan 2021 • Xinzi Sun, Dechun Wang, Chenxi Zhang, Pengfei Zhang, Zinan Xiong, Yu Cao, Benyuan Liu, Xiaowei Liu, Shuijiao Chen
All these factors pose a significant challenge to effective polyp detection during colonoscopy.
no code implementations • 9 Nov 2020 • Yu Wang, Shu Jiang, Weiman Lin, Yu Cao, Longtao Lin, Jiangtao Hu, Jinghao Miao, Qi Luo
This paper presents the design of a tune-free (human-out-of-the-loop parameter tuning) control framework, aiming at accelerating large scale autonomous driving system deployed on various vehicles and driving environments.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Yu Cao, Wei Bi, Meng Fang, DaCheng Tao
In this work, we study dialogue models with multiple input sources adapted from the pretrained language model GPT2.
no code implementations • 11 Sep 2020 • Li Yang, Zhezhi He, Yu Cao, Deliang Fan
Many techniques, such as model compression, have been developed to make Deep Neural Network (DNN) inference more efficient.
no code implementations • 3 Apr 2020 • Alexander Ding, Qilei Chen, Yu Cao, Benyuan Liu
This paper builds upon the success of previous models and develops a novel architecture, which combines object segmentation and convolutional neural networks (CNN) to construct an effective classifier of ROP stages 1-3 based on neonatal retinal images.
2 code implementations • ECCV 2020 • Heming Zhu, Yu Cao, Hang Jin, Weikai Chen, Dong Du, Zhangye Wang, Shuguang Cui, Xiaoguang Han
High-fidelity clothing reconstruction is the key to achieving photorealism in a wide range of applications including human digitization, virtual try-on, etc.
no code implementations • 26 Mar 2020 • Qilei Chen, Ping Liu, Jing Ni, Yu Cao, Benyuan Liu, Honggang Zhang
The first one is that our dataset is not fully labeled, i.e., only a subset of all lesion instances is marked.
no code implementations • 29 Jan 2020 • Ning Zhang, Yu Cao, Benyuan Liu, Yan Luo
The classifier branch is equipped with Feature Aggregation and Local Magnification Layers.
no code implementations • 26 Dec 2019 • Xinzi Sun, Pengfei Zhang, Dechun Wang, Yu Cao, Benyuan Liu
The model we design consists of an encoder to extract multi-scale semantic features and a decoder to expand the feature maps to a polyp segmentation map.
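As a rough illustration of the encoder-decoder structure described above, the following is a minimal sketch in PyTorch; the depths, channel counts, and upsampling choices are hypothetical and much simpler than the actual model.

```python
# Minimal encoder-decoder sketch for binary polyp segmentation (illustrative only;
# the real architecture, depth, and multi-scale features differ).
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: downsample twice while extracting semantic features.
        self.enc = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: expand the feature maps back to input resolution.
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 2, stride=2),
        )

    def forward(self, x):
        return torch.sigmoid(self.dec(self.enc(x)))  # per-pixel polyp probability

prob = TinySegNet()(torch.randn(1, 3, 256, 256))
print(prob.shape)  # torch.Size([1, 1, 256, 256])
```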
no code implementations • 19 Nov 2019 • Qilei Chen, Xinzi Sun, Ning Zhang, Yu Cao, Benyuan Liu
We analyze the lesion-vs-image scale carefully and propose a large-size feature pyramid network (LFPN) to preserve more image details for mini lesion instance detection.
1 code implementation • 13 Nov 2019 • Yu Cao, Meng Fang, Baosheng Yu, Joey Tianyi Zhou
On the other hand, it further reduces domain distribution discrepancy through conditional adversarial learning across domains.
no code implementations • 11 Nov 2019 • Gokul Krishnan, Xiaocong Du, Yu Cao
Inspired by the observation that brain networks follow the Small-World model, we propose a novel structural pruning scheme, which includes (1) hierarchically trimming the network into a Small-World model before training, (2) training the network for a given dataset, and (3) optimizing the network for accuracy.
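A hedged sketch of one way to realize the "trim into a Small-World model before training" step, using networkx's Watts-Strogatz generator to derive a sparse connectivity mask for a linear layer; this is a simplified illustration, not the paper's exact trimming scheme.

```python
# Sketch: build a Small-World (Watts-Strogatz) connectivity mask and apply it to a
# linear layer's weights so that only small-world edges remain. Simplified illustration.
import networkx as nx
import numpy as np
import torch
import torch.nn as nn

def small_world_mask(in_features, out_features, k=4, p=0.1, seed=0):
    """Mask derived from a Watts-Strogatz graph over all input and output units."""
    n = in_features + out_features
    g = nx.watts_strogatz_graph(n, k, p, seed=seed)   # ring lattice with rewiring prob p
    mask = np.zeros((out_features, in_features), dtype=np.float32)
    for u, v in g.edges():
        # Keep only edges that connect an input unit (0..in-1) to an output unit.
        if u < in_features <= v:
            mask[v - in_features, u] = 1.0
        elif v < in_features <= u:
            mask[u - in_features, v] = 1.0
    return torch.from_numpy(mask)

layer = nn.Linear(128, 64)
mask = small_world_mask(128, 64)
with torch.no_grad():
    layer.weight *= mask                 # trim the layer before training
print(f"kept {int(mask.sum())} of {mask.numel()} connections")
```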
1 code implementation • IJCNLP 2019 • Alex Warstadt, Yu Cao, Ioana Grosu, Wei Peng, Hagen Blix, Yining Nie, Anna Alsop, Shikha Bordia, Haokun Liu, Alicia Parrish, Sheng-Fu Wang, Jason Phang, Anhad Mohananey, Phu Mon Htut, Paloma Jeretič, Samuel R. Bowman
We conclude that a variety of methods is necessary to reveal all relevant aspects of a model's grammatical knowledge in a given domain.
no code implementations • 5 Sep 2019 • Dechun Wang, Ning Zhang, Xinzi Sun, Pengfei Zhang, Chenxi Zhang, Yu Cao, Benyuan Liu
Though challenging, automated polyp detection, aided by the great advances in object detection techniques, demonstrates great potential in reducing the false negative rate while maintaining high precision.
no code implementations • 29 Aug 2019 • Ning Zhang, Dechun Wang, Xinzi Sun, Pengfei Zhang, Chenxi Zhang, Yu Cao, Benyuan Liu
The anchor mechanism is removed and lesions are formalized as single keypoints.
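A common way to formalize detections as single keypoints is to render object centers as Gaussian peaks on a heatmap; the sketch below illustrates that target encoding under this assumption, and is not necessarily the paper's exact formulation.

```python
# Sketch: render lesion centers as Gaussian peaks on a keypoint heatmap, the usual
# anchor-free target encoding (illustrative; the paper's exact encoding may differ).
import numpy as np

def keypoint_heatmap(centers, height, width, sigma=2.0):
    """centers: list of (x, y) lesion centers in heatmap coordinates."""
    ys, xs = np.mgrid[0:height, 0:width]
    heatmap = np.zeros((height, width), dtype=np.float32)
    for cx, cy in centers:
        gauss = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
        heatmap = np.maximum(heatmap, gauss)   # keep the strongest peak per pixel
    return heatmap

hm = keypoint_heatmap([(20, 30), (50, 12)], height=64, width=64)
print(hm.shape, hm.max())   # (64, 64) 1.0
```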
no code implementations • 26 Aug 2019 • Zhi Cao, Honggang Zhang, Yu Cao, Benyuan Liu
We are interested in the optimal scheduling of a collection of multi-component application jobs in an edge computing system that consists of geo-distributed edge computing nodes connected through a wide area network.
no code implementations • 22 Aug 2019 • Pengfei Zhang, Yu Cao, Benyuan Liu
We present a 3D Convolutional Neural Network (CNN)-based single-shot detector for spatio-temporal action detection tasks.
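As an illustration of operating on video clips with 3D convolutions, here is a tiny 3D-CNN backbone sketch; the layer configuration is hypothetical and omits the single-shot detection head.

```python
# Minimal 3D-CNN backbone sketch over a short video clip
# (batch, channels, frames, height, width); purely illustrative shapes.
import torch
import torch.nn as nn

backbone = nn.Sequential(
    nn.Conv3d(3, 16, kernel_size=3, stride=(1, 2, 2), padding=1), nn.ReLU(),
    nn.Conv3d(16, 32, kernel_size=3, stride=(2, 2, 2), padding=1), nn.ReLU(),
)
# A single-shot head would predict class scores and box offsets densely from these
# spatio-temporal features; only the feature extraction is shown here.
clip = torch.randn(1, 3, 16, 112, 112)        # 16-frame RGB clip
features = backbone(clip)
print(features.shape)                          # torch.Size([1, 32, 8, 28, 28])
```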
no code implementations • 15 Aug 2019 • Shreyas Kolala Venkataramanaiah, Yufei Ma, Shihui Yin, Eriko Nurvithadhi, Aravind Dasu, Yu Cao, Jae-sun Seo
Training of convolutional neural networks (CNNs) on embedded platforms to support on-device learning has become vitally important in recent years.
no code implementations • 28 May 2019 • Xiaocong Du, Gokul Krishnan, Abinash Mohanty, Zheng Li, Gouranga Charan, Yu Cao
Machine learning algorithms have made significant advances in many applications.
no code implementations • 28 May 2019 • Xiaocong Du, Gouranga Charan, Frank Liu, Yu Cao
Such a system requires learning from the data stream, training the model to preserve previous information and adapt to a new task, and generating a single-headed vector for future inference.
no code implementations • 27 May 2019 • Xiaocong Du, Zheng Li, Yufei Ma, Yu Cao
A typical training pipeline to mitigate over-parameterization is to pre-define a DNN structure first with redundant learning units (filters and neurons) under the goal of high accuracy, then to prune redundant learning units after training with the purpose of efficient inference.
no code implementations • 27 May 2019 • Xiaocong Du, Zheng Li, Yu Cao
Today a canonical approach to reduce the computation cost of Deep Neural Networks (DNNs) is to pre-define an over-parameterized model before training to guarantee the learning capacity, and then prune unimportant learning units (filters and neurons) during training to improve model compactness.
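A minimal sketch of the pruning step in this canonical pipeline, assuming a standard L1-norm criterion for ranking convolutional filters; the criterion is illustrative and not necessarily the one used in the paper.

```python
# Sketch of the prune-after-overparameterization step: rank conv filters by L1 norm
# and zero out the weakest fraction (a standard criterion, shown only to illustrate
# the pipeline described above).
import torch
import torch.nn as nn

def prune_filters_by_l1(conv: nn.Conv2d, fraction: float = 0.5):
    with torch.no_grad():
        norms = conv.weight.abs().sum(dim=(1, 2, 3))       # one L1 norm per filter
        k = int(fraction * norms.numel())
        drop = torch.argsort(norms)[:k]                    # weakest filters
        conv.weight[drop] = 0.0
        if conv.bias is not None:
            conv.bias[drop] = 0.0
    return drop

conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
removed = prune_filters_by_l1(conv, fraction=0.5)
print(f"zeroed {removed.numel()} of {conv.out_channels} filters")
```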
1 code implementation • NAACL 2019 • Yu Cao, Meng Fang, DaCheng Tao
Graph convolutional networks are used to obtain a relation-aware representation of nodes for entity graphs built from documents with multi-level features.
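A minimal sketch of the graph-convolution mechanism, with a toy entity adjacency and random node features standing in for the document-derived, multi-level features used in the paper.

```python
# One-layer graph convolution over a toy entity graph: H' = ReLU(A_hat @ H @ W),
# where A_hat is the normalized adjacency with self-loops. A sketch of the mechanism
# only, not the paper's graph construction.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, h, adj):
        a_hat = adj + torch.eye(adj.size(0))                 # add self-loops
        deg_inv_sqrt = a_hat.sum(1).pow(-0.5)
        a_hat = deg_inv_sqrt[:, None] * a_hat * deg_inv_sqrt[None, :]
        return torch.relu(a_hat @ self.lin(h))               # aggregate neighbor features

# Toy entity graph: 4 entity nodes, hypothetical co-occurrence edges.
adj = torch.tensor([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 0],
                    [1, 0, 0, 0]], dtype=torch.float32)
h = torch.randn(4, 16)                                       # node features
print(GCNLayer(16, 8)(h, adj).shape)                         # torch.Size([4, 8])
```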
no code implementations • 23 May 2018 • Chetan Singh Thakur, Jamal Molin, Gert Cauwenberghs, Giacomo Indiveri, Kundan Kumar, Ning Qiao, Johannes Schemmel, Runchun Wang, Elisabetta Chicca, Jennifer Olson Hasler, Jae-sun Seo, Shimeng Yu, Yu Cao, André van Schaik, Ralph Etienne-Cummings
Neuromorphic engineering (NE) encompasses a diverse range of approaches to information processing that are inspired by neurobiological systems, and this feature distinguishes neuromorphic systems from conventional computing systems.
no code implementations • 8 Jan 2018 • Lina Karam, Tejas Borkar, Yu Cao, Junseok Chae
The proposed generative sensing framework aims at transforming low-end, low-quality sensor data into higher quality sensor data in terms of achieved classification accuracy.
1 code implementation • 19 Sep 2017 • Shihui Yin, Shreyas K. Venkataramanaiah, Gregory K. Chen, Ram Krishnamurthy, Yu Cao, Chaitali Chakrabarti, Jae-sun Seo
We present a new back propagation based training algorithm for discrete-time spiking neural networks (SNN).
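One common way to backpropagate through the discrete spiking nonlinearity is a surrogate gradient: a hard threshold in the forward pass and a smooth approximation in the backward pass. The sketch below illustrates this general idea and is not necessarily the paper's exact derivative approximation.

```python
# Generic surrogate-gradient spike function: forward pass is a hard threshold,
# backward pass uses a smooth approximation so gradients can flow through spikes.
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()              # binary spike

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        surrogate = 1.0 / (1.0 + torch.abs(v)) ** 2          # fast-sigmoid derivative
        return grad_output * surrogate

v = torch.randn(8, requires_grad=True)
spikes = SurrogateSpike.apply(v)
spikes.sum().backward()
print(v.grad)                                                # nonzero despite the hard threshold
```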
1 code implementation • 9 Jan 2017 • Tian Zhao, Xiaobing Huang, Yu Cao
In this paper, we present DeepDSL, a domain-specific language (DSL) embedded in Scala, which compiles deep networks written in DeepDSL to Java source code.
1 code implementation • 17 Jun 2016 • Chang Liu, Yu Cao, Yan Luo, Guanling Chen, Vinod Vokkarane, Yunsheng Ma
We applied our proposed approach to two real-world food image data sets (UEC-256 and Food-101) and achieved impressive results.
no code implementations • 16 May 2016 • Ming Tu, Visar Berisha, Yu Cao, Jae-sun Seo
In this paper, we propose a method to compress deep neural networks by using the Fisher Information metric, which we estimate through a stochastic optimization method that keeps track of second-order information in the network.
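A simplified sketch of the idea: accumulate squared gradients over minibatches as a diagonal empirical Fisher estimate, then prune the parameters with the smallest values. The paper's estimator and pruning criterion differ in detail; this is only an illustration.

```python
# Sketch: diagonal empirical Fisher from squared gradients, then zero the parameters
# with the smallest Fisher values. Simplified illustration only.
import torch
import torch.nn as nn

model = nn.Linear(20, 2)
fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}

for _ in range(50):                                   # toy data stream
    x, y = torch.randn(16, 20), torch.randint(0, 2, (16,))
    loss = nn.functional.cross_entropy(model(x), y)
    model.zero_grad()
    loss.backward()
    for n, p in model.named_parameters():
        fisher[n] += p.grad ** 2 / 50                 # running mean of squared gradients

with torch.no_grad():
    for n, p in model.named_parameters():
        threshold = fisher[n].flatten().quantile(0.5) # drop the 50% least "informative"
        p[fisher[n] < threshold] = 0.0
print("remaining nonzero weights:", int(sum((p != 0).sum() for p in model.parameters())))
```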
no code implementations • 5 Sep 2015 • Yuewei Lin, Jing Chen, Yu Cao, Youjie Zhou, Lingfeng Zhang, Yuan Yan Tang, Song Wang
By adopting a natural and widely used assumption -- "the data samples from the same class should lie on a low-dimensional subspace, even if they come from different domains", the proposed method circumvents the limitation of the global domain shift, and solves cross-domain recognition by finding compact joint subspaces of the source and target domains.
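A minimal sketch of the subspace intuition, pooling source and target features and fitting a PCA to obtain a compact joint subspace; the class-wise construction and the optimization used in the paper are omitted, and the data here is synthetic.

```python
# Sketch: pool source and target features, fit PCA for a compact joint subspace,
# and project both domains into it. Simplified illustration of the subspace idea.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
source = rng.normal(size=(100, 50))          # hypothetical source-domain features
target = rng.normal(size=(80, 50)) + 0.5     # hypothetical, shifted target-domain features

pca = PCA(n_components=5).fit(np.vstack([source, target]))   # joint low-dimensional subspace
source_z, target_z = pca.transform(source), pca.transform(target)
print(source_z.shape, target_z.shape)        # (100, 5) (80, 5)
```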
no code implementations • CVPR 2013 • Yu Cao, Daniel Barrett, Andrei Barbu, Siddharth Narayanaswamy, Haonan Yu, Aaron Michaux, Yuewei Lin, Sven Dickinson, Jeffrey Mark Siskind, Song Wang
In this paper, we propose a new method that can recognize human activities from partially observed videos in the general case.