no code implementations • ACL 2022 • Juncai Guo, Jin Liu, Yao Wan, Li Li, Pingyi Zhou
In this paper, we propose CODESCRIBE to model the hierarchical syntax structure of code by introducing a novel triplet position for code summarization.
no code implementations • 24 Mar 2023 • Zhiyang Guo, Wengang Zhou, Min Wang, Li Li, Houqiang Li
We propose a novel framework to reconstruct accurate appearance and geometry with neural radiance fields (NeRF) for interacting hands, enabling the rendering of photo-realistic images and videos for gesture animation from arbitrary views.
1 code implementation • 20 Mar 2023 • Li Li, Hubert P. H. Shum, Toby P. Breckon
Whilst the availability of 3D LiDAR point cloud data has significantly grown in recent years, annotation remains expensive and time-consuming, leading to a demand for semi-supervised semantic segmentation methods with application domains such as autonomous driving.
no code implementations • 18 Feb 2023 • Hao Li, Li Li, Yunmeng Huang, Ning li, Yongtao Zhang
Few-shot learning (FSL) requires a model to classify new samples after learning from only a few samples.
1 code implementation • 12 Feb 2023 • Chi Zhang, Rui Chen, Xiangyu Zhao, Qilong Han, Li Li
In practical recommendation scenarios, users often interact with items under multi-typed behaviors (e.g., click, add-to-cart, and purchase).
no code implementations • 10 Jan 2023 • Chaopeng Shen, Alison P. Appling, Pierre Gentine, Toshiyuki Bandai, Hoshin Gupta, Alexandre Tartakovsky, Marco Baity-Jesi, Fabrizio Fenicia, Daniel Kifer, Li Li, Xiaofeng Liu, Wei Ren, Yi Zheng, Ciaran J. Harman, Martyn Clark, Matthew Farthing, Dapeng Feng, Praveen Kumar, Doaa Aboelyazeed, Farshid Rahmani, Hylke E. Beck, Tadd Bindas, Dipankar Dwivedi, Kuai Fang, Marvin Höge, Chris Rackauckas, Tirthankar Roy, Chonggang Xu, Kathryn Lawson
Here we present differentiable geoscientific modeling as a powerful pathway toward dissolving the perceived barrier between them and ushering in a paradigm shift.
1 code implementation • 23 Dec 2022 • Qiaoyu Tan, Xin Zhang, Ninghao Liu, Daochen Zha, Li Li, Rui Chen, Soo-Hyun Choi, Xia Hu
To bridge the gap, we introduce a Personalized Subgraph Selector (PS2) as a plug-and-play framework to automatically, personally, and inductively identify optimal subgraphs for different edges when performing GNNLP.
no code implementations • 28 Nov 2022 • YiXuan Wang, Wengang Zhou, Jianmin Bao, Weilun Wang, Li Li, Houqiang Li
The key idea of our CLIP2GAN is to bridge the output feature embedding space of CLIP and the input latent space of StyleGAN, which is realized by introducing a mapping network.
no code implementations • 10 Nov 2022 • Jiawei Zhang, Shen Li, Li Li
Connected and automated vehicles (CAVs) are viewed as a special kind of robots that have the potential to significantly improve the safety and efficiency of traffic.
no code implementations • 3 Nov 2022 • Li Li, Dongxing Xu, Haoran Wei, Yanhua Long
Exploiting effective target modeling units is very important and has always been a concern in end-to-end automatic speech recognition (ASR).
Automatic Speech Recognition (ASR)
+2
no code implementations • 11 Oct 2022 • Yuxi Xiao, Li Li, Xiaodi Li, Jian Yao
In addition, in order to increase the robustness of our framework, we formulate the likelihood function of the correlations of 2D image matches as a Gaussian and Uniform mixture distribution which takes the uncertainty caused by illumination changes, image noise and moving objects into account.
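A minimal sketch of such a Gaussian-plus-uniform mixture likelihood over match residuals; the parameter names and values below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def mixture_log_likelihood(residuals, sigma=1.0, inlier_weight=0.9, outlier_range=100.0):
    """Log-likelihood of 2D match residuals under a Gaussian (inlier) plus
    uniform (outlier) mixture. All parameters here are placeholders."""
    gauss = inlier_weight * np.exp(-0.5 * (residuals / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    uniform = (1.0 - inlier_weight) / outlier_range
    return np.sum(np.log(gauss + uniform))

# Example: pixel residuals of 2D image matches; the last one is likely an outlier.
residuals = np.array([0.3, -0.8, 1.2, 25.0])
print(mixture_log_likelihood(residuals))
```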
no code implementations • 13 Sep 2022 • Zhensu Sun, Xiaoning Du, Fu Song, Shangwen Wang, Mingze Ni, Li Li
The experimental results show that the proposed estimator helps save 23.3% of the computational cost, measured in floating-point operations, for code completion systems, and that 80.2% of rejected prompts lead to unhelpful completions.
no code implementations • 12 Sep 2022 • Qianqian Ma, Li Li, Junhui Shen, Haowei Guan, Guangcheng Ma, Hongwei Xia
This paper investigates the fuzzy $H_{\infty}$ filter design issue for nonlinear systems with time-varying delay.
no code implementations • 12 Sep 2022 • Qianqian Ma, Hongwei Xia, Li Li, Guangcheng Ma
This paper is concerned with the fuzzy $H_{\infty}$ filter design issue for nonlinear systems with time-varying delay.
no code implementations • 18 Aug 2022 • Hong Ren, Zhenkun Zhang, Zhangjie Peng, Li Li, Cunhua Pan
Then, we investigate the general scenario in which the RF signals are radiated during the flight, aiming to minimize the total energy consumption of the UAV by jointly optimizing the UAV's trajectory, flight time and the RIS's reflection coefficients.
no code implementations • 12 Jul 2022 • Shuai Huo, Dong Liu, Li Li, Siwei Ma, Feng Wu, Wen Gao
Our idea is to provide multiple discrete starting points in the global space and optimize the local optimum around each point by numerical algorithm efficiently.
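A generic multi-start sketch of that idea using SciPy's local optimizer; the toy objective and the grid of starting points are placeholders, not the paper's encoding model:

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Placeholder non-convex objective with several local minima.
    return np.sin(3 * x[0]) + 0.1 * x[0] ** 2

# Discrete starting points spread over the global search space.
starts = np.linspace(-5, 5, 11)
results = [minimize(objective, x0=np.array([s]), method="L-BFGS-B") for s in starts]
best = min(results, key=lambda r: r.fun)
print("best x:", best.x, "best value:", best.fun)
```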
no code implementations • 18 Jun 2022 • Qinghua Tao, Li Li, Xiaolin Huang, Xiangming Xi, Shuning Wang, Johan A. K. Suykens
To apply PWLNN methods, both the representation and the learning have long been studied.
no code implementations • Findings (NAACL) 2022 • Xin Wang, Yasheng Wang, Yao Wan, Jiawei Wang, Pingyi Zhou, Li Li, Hao Wu, Jin Liu
Specifically, we first extract multiple code views using compiler tools, and learn the complementary information among them under a contrastive learning framework.
no code implementations • 21 Apr 2022 • Jiaqi Xue, Chentian Ma, Li Li, Xuan Wen
Melanoma is the most malignant skin tumor and usually develops from normal moles, which makes it difficult to distinguish benign from malignant lesions at an early stage.
1 code implementation • 15 Apr 2022 • Yang Xu, Li Li, Haiyang Xu, Songfang Huang, Fei Huang, Jianfei Cai
This drawback motivates researchers to develop a homogeneous architecture that facilitates end-to-end training, for which the Transformer is a natural choice: it has proven its potential in both vision and language domains and can thus serve as the basic component of both the visual encoder and the language decoder in an IC pipeline.
1 code implementation • CVPR 2022 • Liang Gao, Huazhu Fu, Li Li, YingWen Chen, Ming Xu, Cheng-Zhong Xu
Federated learning (FL) allows multiple clients to collectively train a high-performance global model without sharing their private data.
no code implementations • 11 Mar 2022 • Dongmei Xue, Haichuan Ma, Li Li, Dong Liu, Zhiwei Xiong
Volumetric image compression has become an urgent task to effectively transmit and store images produced in biological research and clinical practice.
1 code implementation • 3 Mar 2022 • He Ma, Arunachalam Narayanaswamy, Patrick Riley, Li Li
Systematic development of accurate density functionals has been a decades-long challenge for scientists.
1 code implementation • 14 Feb 2022 • Zhensu Sun, Yan Liu, Xiaoning Du, Li Li
The performance of neural code search is significantly influenced by the quality of the training data from which the neural models are derived.
1 code implementation • IEEE Transactions on Intelligent Transportation Systems 2022 • Bai Li, Yakun Ouyang, Li Li, Youmin Zhang
This paper is focused on the trajectory planning task for autonomous driving on a curvy road.
no code implementations • NeurIPS 2021 • Ruoxi Sun, Hanjun Dai, Li Li, Steven Kearnes, Bo Dai
In this paper, we propose a framework that unifies sequence- and graph-based methods as energy-based models (EBMs) with different energy functions.
1 code implementation • International Conference on 3D Vision (3DV) 2021 • Li Li, Khalid N. Ismail, Hubert P. H. Shum, Toby P. Breckon
Leveraging DurLAR, with a resolution exceeding that of prior benchmarks, we consider the task of monocular depth estimation and use this increased availability of higher resolution, yet sparse ground truth scene depth information to propose a novel joint supervised/self-supervised loss formulation.
no code implementations • 1 Dec 2021 • Xihua Sheng, Li Li, Dong Liu, Zhiwei Xiong
In this paper, we propose a Multi-Scale Graph Attention Network (MS-GAT) to remove the artifacts of point cloud attributes compressed by G-PCC.
1 code implementation • 27 Nov 2021 • Xihua Sheng, Jiahao Li, Bin Li, Li Li, Dong Liu, Yan Lu
From the stored propagated features, we propose to learn multi-scale temporal contexts, and re-fill the learned temporal contexts into the modules of our compression scheme, including the contextual encoder-decoder, the frame generator, and the temporal context encoder.
no code implementations • 25 Nov 2021 • Xiaoxiao Zhao, Jinlong Lei, Li Li, Jie Chen
This paper studies a distributed policy gradient in collaborative multi-agent reinforcement learning (MARL), where agents over a communication network aim to find the optimal policy to maximize the average of all agents' local returns.
Multi-agent Reinforcement Learning
Reinforcement Learning
+2
no code implementations • 30 Oct 2021 • Miao Zhang, Miaojing Shi, Li Li
Last, to enhance the embedding space learning, an additional pixel-wise metric learning module is introduced with triplet loss formulated on the pixel-level embedding of the input image.
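A minimal PyTorch sketch of a pixel-level triplet loss; the embedding dimension and the way anchor/positive/negative pixels are sampled are illustrative assumptions, not the paper's procedure:

```python
import torch
import torch.nn.functional as F

def pixel_triplet_loss(embeddings, anchor_idx, pos_idx, neg_idx, margin=0.3):
    """embeddings: (N, D) per-pixel embeddings flattened from an image;
    the three index tensors select anchor, positive, and negative pixels."""
    a, p, n = embeddings[anchor_idx], embeddings[pos_idx], embeddings[neg_idx]
    return F.triplet_margin_loss(a, p, n, margin=margin)

# Toy example: 1000 pixels with 64-d embeddings and random triplets.
emb = F.normalize(torch.randn(1000, 64), dim=1)
idx = torch.randint(0, 1000, (3, 128))
print(pixel_triplet_loss(emb, idx[0], idx[1], idx[2]))
```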
no code implementations • 28 Oct 2021 • Bhupalee Kalita, Ryan Pederson, Jielun Chen, Li Li, Kieron Burke
Kohn-Sham regularizer (KSR) is a differentiable machine learning approach to finding the exchange-correlation functional in Kohn-Sham density functional theory (DFT) that works for strongly correlated systems.
1 code implementation • 25 Oct 2021 • Zhensu Sun, Xiaoning Du, Fu Song, Mingze Ni, Li Li
Github Copilot, trained on billions of lines of public code, has recently become the buzzword in the computer science research and practice community.
no code implementations • 30 Sep 2021 • Haichuan Ma, Dong Liu, Cunhui Dong, Li Li, Feng Wu
However, this nature was seldom considered in previous studies on image compression, which usually chose one possible image as the reconstruction, e.g., the one with the maximum a posteriori probability.
no code implementations • ICLR 2022 • Zhimeng Jiang, Kaixiong Zhou, Zirui Liu, Li Li, Rui Chen, Soo-Hyun Choi, Xia Hu
Instance-dependent label noise (IDN) widely exists in real-world datasets and usually misleads the training of deep neural networks.
no code implementations • ICLR 2022 • Zirui Liu, Kaixiong Zhou, Fan Yang, Li Li, Rui Chen, Xia Hu
Based on the implementation, we propose a memory-efficient framework called "EXACT", which for the first time demonstrates the potential and evaluates the feasibility of training GNNs with compressed activations.
no code implementations • 29 Sep 2021 • Dongping Liao, Xitong Gao, Yiren Zhao, Hao Dai, Li Li, Kafeng Wang, Kejiang Ye, Yang Wang, Cheng-Zhong Xu
Federated learning (FL) enables edge clients to train collaboratively while preserving each individual's data privacy.
no code implementations • 18 Sep 2021 • Dongmei Xue, Haichuan Ma, Li Li, Dong Liu, Zhiwei Xiong
With the rapid development of whole brain imaging technology, a large number of brain images have been produced, which puts forward a great demand for efficient brain image compression methods.
no code implementations • 7 Sep 2021 • Li Li, Raed Fayad, Adrian Taylor
Given the success of reinforcement learning (RL) in various domains, it is promising to explore the application of its methods to the development of intelligent and autonomous cyber agents.
no code implementations • 4 Sep 2021 • Xiuxian Li, Kuo-Yi Lin, Li Li, Yiguang Hong, Jie Chen
For the first two cases, it can be shown that the scaled signGD converges at a linear rate.
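A toy sketch of the (scaled) sign gradient descent update discussed here, on a simple quadratic; the scaling vector and step size are illustrative:

```python
import numpy as np

def scaled_sign_gd(grad_fn, x0, step=0.1, scale=None, iters=100):
    """Iterates x_{k+1} = x_k - step * scale * sign(grad f(x_k)); scale defaults to 1."""
    x = np.array(x0, dtype=float)
    scale = np.ones_like(x) if scale is None else scale
    for _ in range(iters):
        x = x - step * scale * np.sign(grad_fn(x))
    return x

# Minimize f(x) = 0.5 * ||x||^2, whose gradient is x itself.
print(scaled_sign_gd(lambda x: x, x0=[3.0, -2.0], step=0.05, iters=200))
```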
no code implementations • 30 Aug 2021 • Kaixiong Zhou, Ninghao Liu, Fan Yang, Zirui Liu, Rui Chen, Li Li, Soo-Hyun Choi, Xia Hu
Graph neural networks (GNNs), which learn node representations by recursively aggregating information from their neighbors, have become a predominant computational tool in many domains.
no code implementations • 10 Aug 2021 • Xin Wang, Yasheng Wang, Fei Mi, Pingyi Zhou, Yao Wan, Xiao Liu, Li Li, Hao Wu, Jin Liu, Xin Jiang
Code representation learning, which aims to encode the semantics of source code into distributed vectors, plays an important role in recent deep-learning-based models for code intelligence.
no code implementations • 4 Aug 2021 • Li Li, Yanfei Kang, Feng Li
In this work, we propose a novel framework for density forecast combination by constructing time-varying weights based on time series features, which is called Feature-based Bayesian Forecasting Model Averaging (FEBAMA).
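A minimal sketch of feature-driven, time-varying combination weights (a softmax of a linear score of the features); the features, component models, and weighting function below are illustrative assumptions, not the FEBAMA specification:

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def combine_densities(feature_vec, beta, component_log_densities):
    """feature_vec: (F,) time series features at time t; beta: (K, F) per-model
    coefficients; component_log_densities: (K,) log predictive densities."""
    weights = softmax(beta @ feature_vec)  # time-varying weights at time t
    return weights, np.log(np.sum(weights * np.exp(component_log_densities)))

features_t = np.array([0.2, 1.5, -0.3])               # e.g. trend, seasonal strength, entropy
beta = np.random.default_rng(0).normal(size=(2, 3))   # two component forecasters
log_dens = np.array([-1.1, -0.7])
w, log_pool = combine_densities(features_t, beta, log_dens)
print(w, log_pool)
```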
no code implementations • ACL 2021 • Qiuxiang He, Guoping Huang, Qu Cui, Li Li, Lemao Liu
It is generally believed that a translation memory (TM) should be beneficial for machine translation tasks.
1 code implementation • NeurIPS 2021 • Kaixiong Zhou, Xiao Huang, Daochen Zha, Rui Chen, Li Li, Soo-Hyun Choi, Xia Hu
To this end, we analyze the bottleneck of deep GNNs by leveraging the Dirichlet energy of node embeddings, and propose a generalizable principle to guide the training of deep GNNs.
1 code implementation • 24 Jun 2021 • Kahou Tam, Li Li, Bo Han, Chengzhong Xu, Huazhu Fu
Federated learning (FL) collaboratively trains a shared global model depending on multiple local clients, while keeping the training data decentralized in order to preserve data privacy.
no code implementations • 26 May 2021 • Xinran Li, Kuo-Yi Lin, Min Meng, Xiuxian Li, Li Li, Yiguang Hong, Jie Chen
Due to the growing awareness of driving safety and the development of sophisticated technologies, advanced driving assistance system (ADAS) has been equipped in more and more vehicles with higher accuracy and lower price.
1 code implementation • 19 May 2021 • Wentao Ouyang, Xiuwu Zhang, Shukui Ren, Li Li, Kun Zhang, Jinmei Luo, Zhaojie Liu, Yanlong Du
For existing old ads, GMEs first build a graph to connect them with new ads, and then adaptively distill useful information.
no code implementations • 15 May 2021 • Xinyu Peng, Jiawei Zhang, Fei-Yue Wang, Li Li
As a promising tool to better understand the learning dynamic of minibatch SGD, the information bottleneck (IB) theory claims that the optimization process consists of an initial fitting phase and the following compression phase.
no code implementations • 11 May 2021 • Huihuang Chen, Li Li, Jie Chen, Kuo-Yi Lin
In addition to aligning the global distribution, real domain adaptation should also align the meso and micro distributions.
no code implementations • 22 Apr 2021 • Jing Wu, Mingyi Zhou, Ce Zhu, Yipeng Liu, Mehrtash Harandi, Li Li
Recently, adversarial attack methods have been developed to challenge the robustness of machine learning models.
no code implementations • 21 Apr 2021 • Huaxin Pei, Yi Zhang, Qinghua Tao, Shuo Feng, Li Li
Cooperative driving at isolated intersections has attracted great interest and has been well discussed in recent years.
no code implementations • 16 Apr 2021 • Yu Zhang, Moming Duan, Duo Liu, Li Li, Ao Ren, Xianzhang Chen, Yujuan Tan, Chengliang Wang
Asynchronous FL has a natural advantage in mitigating the straggler effect, but there are threats of model quality degradation and server crash.
no code implementations • 15 Apr 2021 • Li Li, Moming Duan, Duo Liu, Yu Zhang, Ao Ren, Xianzhang Chen, Yujuan Tan, Chengliang Wang
In our framework, the server evaluates devices' value of training based on their training loss.
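A hedged sketch of loss-based client selection in a federated round; the selection rule (pick the highest-loss devices) and fraction are assumptions for illustration, not the paper's exact criterion:

```python
import random

def select_clients(reported_losses, fraction=0.3, seed=0):
    """Pick the highest-loss clients (assumed most 'valuable' to train on),
    breaking ties randomly. reported_losses: dict of client_id -> last training loss."""
    rng = random.Random(seed)
    ranked = sorted(reported_losses, key=lambda c: (-reported_losses[c], rng.random()))
    k = max(1, int(fraction * len(ranked)))
    return ranked[:k]

losses = {"dev_a": 0.42, "dev_b": 1.35, "dev_c": 0.88, "dev_d": 0.91}
print(select_clients(losses))  # e.g. ['dev_b']
```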
no code implementations • 7 Apr 2021 • Vivek Singh Bawa, Gurkirt Singh, Francis KapingA, Inna Skarga-Bandurova, Elettra Oleari, Alice Leporini, Carmela Landolfo, Pengfei Zhao, Xi Xiang, Gongning Luo, Kuanquan Wang, Liangzhi Li, Bowen Wang, Shang Zhao, Li Li, Armando Stabile, Francesco Setti, Riccardo Muradore, Fabio Cuzzolin
For an autonomous robotic system, monitoring surgeon actions and assisting the main surgeon during a procedure can be very challenging.
no code implementations • 11 Mar 2021 • Chi Zhang, Zihang Lin, Liheng Xu, Zongliang Li, Wei Tang, Yuehu Liu, Gaofeng Meng, Le Wang, Li Li
The key procedure of haze image translation through adversarial training lies in disentangling the feature involved only in haze synthesis, i.e., the style feature, from the feature representing the invariant semantic content, i.e., the content feature.
1 code implementation • 9 Mar 2021 • Yue Liu, Chakkrit Tantithamthavorn, Li Li, Yepang Liu
In this paper, we conducted a systematic literature review to search and analyze how deep learning approaches have been applied in the context of malware defenses in the Android environment.
1 code implementation • 4 Mar 2021 • Jiawei Wang, Li Li, Andreas Zeller
More than ninety percent of published Jupyter notebooks do not state dependencies on external packages.
Software Engineering
no code implementations • 23 Feb 2021 • Li-Zheng Liu, Yu-Zhe Zhang, Zheng-Da Li, Rui Zhang, Xu-Fei Yin, Yue-Yang Fei, Li Li, Nai-Le Liu, Feihu Xu, Yu-Ao Chen, Jian-Wei Pan
Distributed quantum metrology can enhance the sensitivity for sensing spatially distributed parameters beyond the classical limits.
Quantum Physics
no code implementations • 5 Feb 2021 • Wenting Zou, Li Li, Zichen Xu, Chengzhong Xu
To address the conflict between learning SLO and energy efficiency, we propose DEAL, an energy efficient learning system that saves energy and preserves privacy with a decremental learning design.
no code implementations • 17 Jan 2021 • YingJie Xu, Kai Yu, Li Li, Xianfu Lei, Li Hao, Cheng-Xiang Wang
As a potential development direction of future transportation, vacuum tube ultra-high-speed train (UHST) wireless communication systems have channel characteristics that differ from those of existing high-speed train (HST) scenarios.
no code implementations • 4 Dec 2020 • Yanan Wang, Yong Ge, Li Li, Rui Chen, Tong Xu
To improve adaptation efficiency, we learn to recover the user policy and reward from only a few interactions via an inverse reinforcement learning method to assist a meta-level recommendation agent.
Model-based Reinforcement Learning
Recommendation Systems
+2
1 code implementation • 22 Oct 2020 • Murphy Yuezhen Niu, Andrew M. Dai, Li Li, Augustus Odena, Zhengli Zhao, Vadim Smelyanskyi, Hartmut Neven, Sergio Boixo
Given a quantum circuit, a quantum computer can sample the output distribution exponentially faster in the number of bits than classical computers.
1 code implementation • 17 Sep 2020 • Li Li, Stephan Hoyer, Ryan Pederson, Ruoxi Sun, Ekin D. Cubuk, Patrick Riley, Kieron Burke
Including prior knowledge is important for effective machine learning models in physics, and is usually achieved by explicitly adding loss terms or constraints on model architectures.
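A tiny sketch of the "explicit loss term" route mentioned here: a data-fitting term plus a physics-motivated penalty; the penalty shown (a smoothness prior) is only an illustrative assumption:

```python
import torch

def total_loss(pred, target, weight=0.1):
    """Data-fitting term plus a smoothness penalty on the prediction,
    standing in for a generic prior-knowledge constraint."""
    data_term = torch.mean((pred - target) ** 2)
    smoothness = torch.mean((pred[1:] - pred[:-1]) ** 2)  # illustrative prior term
    return data_term + weight * smoothness

pred = torch.randn(64, requires_grad=True)
target = torch.sin(torch.linspace(0, 3.14, 64))
loss = total_loss(pred, target)
loss.backward()
print(loss.item())
```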
no code implementations • 21 Aug 2020 • Ninghao Liu, Yong Ge, Li Li, Xia Hu, Rui Chen, Soo-Hyun Choi
Different from previous work, in our model, factor discovery and representation learning are simultaneously conducted, and we are able to handle extra attribute information and knowledge.
no code implementations • 12 Aug 2020 • Wenqing Liu, Miaojing Shi, Teddy Furon, Li Li
This paper presents a DNN bottleneck reinforcement scheme to alleviate the vulnerability of Deep Neural Networks (DNN) against adversarial attacks.
no code implementations • 14 Jul 2020 • Ruoxi Sun, Hanjun Dai, Li Li, Steven Kearnes, Bo Dai
Retrosynthesis -- the process of identifying a set of reactants to synthesize a target molecule -- is of vital importance to material design and drug discovery.
Ranked #1 on Single-step retrosynthesis on USPTO-50k
no code implementations • ECCV 2020 • Zhen Zhao, Miaojing Shi, Xiaoxiao Zhao, Li Li
To learn a reliable people counter from crowd images, head center annotations are normally required.
2 code implementations • ICLR 2021 • Subham Sekhar Sahoo, Subhashini Venugopalan, Li Li, Rishabh Singh, Patrick Riley
In this work, we propose a technique for combining gradient-based methods with symbolic techniques to scale such analyses and demonstrate its application for model explanation.
no code implementations • 28 Jun 2020 • Li Li, Theodoros Pantelidis, Joseph Y. J. Chow, Saif Eddin Jabari
To overcome this complexity, we employ an online minimum drift plus penalty (MDPP) approach for SAEV systems that (i) does not require a priori knowledge of customer arrival rates to the different parts of the system (i.e., it is practical from a real-world deployment perspective), (ii) ensures the stability of customer waiting times, (iii) ensures that the deviation of dispatch costs from a desirable dispatch cost can be controlled, and (iv) has a computational time-complexity that allows for real-time implementation.
no code implementations • 25 Jun 2020 • Yuzhu Guo, Kang Pan, Simeng Li, Zongchang Han, Kexin Wang, Li Li
Autoencoders have been widely used for dimensionality reduction and feature extraction.
no code implementations • 9 Jun 2020 • Zhangjie Peng, Zhenkun Zhang, Cunhua Pan, Li Li, A. Lee Swindlehurst
Low-cost passive intelligent reflecting surfaces (IRSs) have recently been envisioned as a revolutionary technology capable of reconfiguring the wireless propagation environment through carefully tuning reflection elements.
1 code implementation • 31 May 2020 • Hong Yu, Li Li, Hsin-hui Huang, Yang Wang, Yingtong Liu, Edison Ong, Anthony Huffman, Tao Zeng, Jingsong Zhang, Pengpai Li, Zhiping Liu, Xiangyan Zhang, Xianwei Ye, Samuel K. Handelman, Gerry Higgins, Gilbert S. Omenn, Brian Athey, Junguk Hur, Luonan Chen, Yongqun He
We hypothesized that ontology can be used as an integrative platform to classify and analyze HCI and disease outcomes.
Other Quantitative Biology
1 code implementation • 27 May 2020 • Daniel Liang, Li Li, Stefan Leichenauer
The quantum approximate optimization algorithm (QAOA) is widely seen as a possible usage of noisy intermediate-scale quantum (NISQ) devices.
Quantum Physics
no code implementations • 26 May 2020 • Cong Wang, Yanru Xiao, Xing Gao, Li Li, Jun Wang
We show the feasibility of training with mobile CPUs, where training 100 epochs takes less than 10 mins and can be boosted 3-5 times with feature transfer.
no code implementations • 22 Apr 2020 • Yijun Quan, Chang-Tsun Li, Yujue Zhou, Li Li
Device fingerprints like sensor pattern noise (SPN) are widely used for provenance analysis and image authentication.
no code implementations • 17 Apr 2020 • Senlin Shu, Fengmao Lv, Yan Yan, Li Li, Shuo He, Jun He
In this article, we propose to leverage the data augmentation technique to improve the performance of multi-label learning.
no code implementations • 6 Apr 2020 • Kai Chen, Jian Yao, Jingmin Tu, Yahui Liu, Yinxuan Li, Li Li
Recently, work on improving the naturalness of stitched images has gained more and more attention.
no code implementations • 27 Jan 2020 • Xi Liu, Li Li, Ping-Chun Hsieh, Muhe Xie, Yong Ge, Rui Chen
With the explosive growth of online products and content, recommendation techniques have been considered as an effective tool to overcome information overload, improve user experience, and boost business revenue.
no code implementations • 13 Jan 2020 • Harrison Ball, Michael J. Biercuk, Andre Carvalho, Jiayin Chen, Michael Hush, Leonardo A. De Castro, Li Li, Per J. Liebermann, Harry J. Slatyer, Claire Edmunds, Virginia Frey, Cornelius Hempel, Alistair Milne
Manipulating quantum computing hardware in the presence of imperfect devices and control systems is a central challenge in realizing useful quantum computers.
Quantum Physics
no code implementations • 24 Oct 2019 • Fei Hu, Wei Liu, Ajmal Saeed Mian, Li Li
In this paper, we propose the Topic-coherent Hierarchical Recurrent Encoder-Decoder model (THRED) to diversify the generated responses without deviating the contextual topics for multi-turn conversations.
no code implementations • 19 Oct 2019 • Xiaofei Xu, Ke Deng, Fei Hu, Li Li
Our method outperformed three other popular methods in terms of the number of words correctly identified to have changed in meaning.
no code implementations • 29 Jul 2019 • Wei Jia, Li Li, Zhu Li, Xiang Zhang, Shan Liu
The block-based coding structure in the hybrid video coding framework inevitably introduces compression artifacts such as blocking, ringing, etc.
1 code implementation • 9 Jul 2019 • Wentao Ouyang, Xiuwu Zhang, Shukui Ren, Li Li, Zhaojie Liu, Yanlong Du
Both offline and online experiments demonstrate the effectiveness of MA-DNN for practical CTR prediction services.
no code implementations • 20 Jun 2019 • Guan Wang, Jianming Hu, Zhiheng Li, Li Li
In this paper, we study how to learn an appropriate lane changing strategy for autonomous vehicles by using deep reinforcement learning.
1 code implementation • 10 Jun 2019 • Wentao Ouyang, Xiuwu Zhang, Li Li, Heng Zou, Xin Xing, Zhaojie Liu, Yanlong Du
The intuitions are that ads shown together may influence each other, clicked ads reflect a user's preferences, and unclicked ads may indicate what a user dislikes to certain extent.
Ranked #1 on Click-Through Rate Prediction on Avito
no code implementations • 18 Apr 2019 • Steven Kearnes, Li Li, Patrick Riley
We present RL-VAE, a graph-to-graph variational autoencoder that uses reinforcement learning to decode molecular graphs from latent embeddings.
no code implementations • 11 Mar 2019 • Xinyu Peng, Li Li, Fei-Yue Wang
Machine learning, especially deep neural networks, has been rapidly developed in fields including computer vision, speech recognition and reinforcement learning.
1 code implementation • 23 Jan 2019 • Li Li, Minjie Fan, Rishabh Singh, Patrick Riley
The second part, which we call Neural-Guided Monte Carlo Tree Search, uses the network during a search to find an expression that conforms to a set of data points and desired leading powers.
no code implementations • 16 Dec 2018 • Li Li, Hirokazu Kameoka, Shoji Makino
While MVAE is notable for its impressive source separation performance, its convergence-guaranteed optimization algorithm, and the fact that it allows us to estimate source-class labels simultaneously with source separation, it still has two major drawbacks, i.e., high computational complexity and unsatisfactory source classification accuracy.
1 code implementation • 20 Nov 2018 • Li Li, Tegawendé Bissyandé, Jacques Klein
Repackaging is a serious threat to the Android ecosystem as it deprives app developers of their benefits, contributes to spreading malware on users' devices, and increases the workload of market maintainers.
Software Engineering
no code implementations • 13 Nov 2018 • Qianyu Guo, Xiaofei Xie, Lei Ma, Qiang Hu, Ruitao Feng, Li Li, Yang Liu, Jianjun Zhao, Xiaohong Li
To date, there has been no comprehensive study of how current diverse DL frameworks and platforms influence the DL software development process.
6 code implementations • 19 Oct 2018 • Zhenpeng Zhou, Steven Kearnes, Li Li, Richard N. Zare, Patrick Riley
We present a framework, which we call Molecule Deep $Q$-Networks (MolDQN), for molecule optimization by combining domain knowledge of chemistry and state-of-the-art reinforcement learning techniques (double $Q$-learning and randomized value functions).
Ranked #1 on Molecular Graph Generation on ZINC (QED Top-3 metric)
Molecular Graph Generation
Multi-Objective Reinforcement Learning
+2
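A minimal sketch of the double Q-learning target used in this kind of value-based molecule optimization; the tiny MLPs and state dimensions are illustrative, not MolDQN's architecture:

```python
import torch
import torch.nn as nn

n_actions, gamma = 8, 0.99
q_online = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, n_actions))
q_target = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, n_actions))

def double_q_target(reward, next_state, done):
    """Double Q-learning: the online net picks the action, the target net evaluates it."""
    with torch.no_grad():
        best_a = q_online(next_state).argmax(dim=1, keepdim=True)
        next_q = q_target(next_state).gather(1, best_a).squeeze(1)
    return reward + gamma * (1.0 - done) * next_q

reward = torch.tensor([1.0])
next_state = torch.randn(1, 16)
done = torch.tensor([0.0])
print(double_q_target(reward, next_state, done))
```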
no code implementations • 29 Sep 2018 • Shogo Seki, Hirokazu Kameoka, Li Li, Tomoki Toda, Kazuya Takeda
This paper deals with a multichannel audio source separation problem under underdetermined conditions.
no code implementations • 18 Sep 2018 • Kai Chen, Jingmin Tu, Binbin Xiang, Li Li, Jian Yao
In this paper, geometric and photometric constraints are combined to improve the alignment quality, which is based on the observation that these two kinds of constraints are complementary.
no code implementations • 31 Aug 2018 • Fanghui Liu, Xiaolin Huang, Chen Gong, Jie Yang, Li Li
Learning this data-adaptive matrix in a formulation-free strategy enlarges the margin between classes and thus improves the model flexibility.
1 code implementation • 6 Aug 2018 • Yilun Lin, Xingyuan Dai, Li Li, Fei-Yue Wang
Urban Traffic Control (UTC) plays an essential role in Intelligent Transportation System (ITS) but remains difficult.
1 code implementation • 2 Aug 2018 • Hirokazu Kameoka, Li Li, Shota Inoue, Shoji Makino
This paper proposes a multichannel source separation technique called the multichannel variational autoencoder (MVAE) method, which uses a conditional VAE (CVAE) to model and estimate the power spectrograms of the sources in a mixture.
4 code implementations • 14 May 2018 • Lei Ma, Fuyuan Zhang, Jiyuan Sun, Minhui Xue, Bo Li, Felix Juefei-Xu, Chao Xie, Li Li, Yang Liu, Jianjun Zhao, Yadong Wang
To do this, by sharing the same spirit of mutation testing in traditional software, we first define a set of source-level mutation operators to inject faults to the source of DL (i.e., training data and training programs).
Software Engineering
no code implementations • 20 Mar 2018 • Lei Ma, Felix Juefei-Xu, Fuyuan Zhang, Jiyuan Sun, Minhui Xue, Bo Li, Chunyang Chen, Ting Su, Li Li, Yang Liu, Jianjun Zhao, Yadong Wang
Deep learning (DL) defines a new data-driven programming paradigm that constructs the internal system logic of a crafted neuron network through a set of training data.
3 code implementations • 22 Feb 2018 • Nathaniel Thomas, Tess Smidt, Steven Kearnes, Lusann Yang, Li Li, Kai Kohlhoff, Patrick Riley
We introduce tensor field neural networks, which are locally equivariant to 3D rotations, translations, and permutations of points at every layer.
no code implementations • 19 Jan 2018 • He-Liang Huang, Xi-Lin Wang, Peter P. Rohde, Yi-Han Luo, You-Wei Zhao, Chang Liu, Li Li, Nai-Le Liu, Chao-Yang Lu, Jian-Wei Pan
Topological data analysis offers a robust way to extract useful information from noisy, unstructured data by identifying its underlying structure.
no code implementations • 13 Jan 2018 • Sheng-Kai Liao, Wen-Qi Cai, Johannes Handsteiner, Bo Liu, Juan Yin, Liang Zhang, Dominik Rauch, Matthias Fink, Ji-Gang Ren, Wei-Yue Liu, Yang Li, Qi Shen, Yuan Cao, Feng-Zhi Li, Jian-Feng Wang, Yong-Mei Huang, Lei Deng, Tao Xi, Lu Ma, Tai Hu, Li Li, Nai-Le Liu, Franz Koidl, Peiyuan Wang, Yu-Ao Chen, Xiang-Bin Wang, Michael Steindorfer, Georg Kirchner, Chao-Yang Lu, Rong Shu, Rupert Ursin, Thomas Scheidl, Cheng-Zhi Peng, Jian-Yu Wang, Anton Zeilinger, Jian-Wei Pan
This included, on the one hand, the transmission of images in a one-time-pad configuration from China to Austria as well as from Austria to China.
Quantum Physics
1 code implementation • 10 Dec 2017 • Pedro Silva, Sepehr Akhavan-Masouleh, Li Li
While these models commonly use features extracted from the structure of PE files, we propose that icons from these files can also help better predict malware.
1 code implementation • 21 Sep 2017 • Michael Wojnowicz, Dinh Nguyen, Li Li, Xuan Zhao
Stochastic principal component analysis (SPCA) has become a popular dimensionality reduction strategy for large, high-dimensional datasets.
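A short sketch of the basic idea using scikit-learn's incremental PCA as a stand-in for a stochastic PCA variant; the synthetic data and batch size are placeholders:

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 200))          # large, high-dimensional placeholder data

ipca = IncrementalPCA(n_components=10, batch_size=500)
for start in range(0, X.shape[0], 500):     # stream mini-batches instead of loading all at once
    ipca.partial_fit(X[start:start + 500])

X_reduced = ipca.transform(X[:5])
print(X_reduced.shape)  # (5, 10)
```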
no code implementations • 11 Jul 2017 • Xingyuan Dai, Rui Fu, Yilun Lin, Li Li, Fei-Yue Wang
Detrending based methods decompose original flow series into trend and residual series, in which trend describes the fixed temporal pattern in traffic flow and residual series is used for prediction.
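A minimal sketch of the detrending step described (a moving-average trend plus a residual series); the window length and synthetic flow data are illustrative:

```python
import numpy as np
import pandas as pd

def detrend(flow, window=96):
    """Split a traffic-flow series into a moving-average trend and a residual;
    the residual is what a downstream model would then predict."""
    series = pd.Series(flow)
    trend = series.rolling(window, min_periods=1, center=True).mean()
    residual = series - trend
    return trend, residual

t = np.arange(7 * 288)  # one week at 5-minute resolution
flow = 100 + 40 * np.sin(2 * np.pi * t / 288) + np.random.default_rng(1).normal(0, 5, t.size)
trend, residual = detrend(flow)
print(trend.iloc[:3].round(1).tolist(), residual.std().round(2))
```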
no code implementations • 12 May 2017 • Yahui Liu, Jian Yao, Li Li, Xiaohu Lu, Jing Han
We develop a novel deep contour detection algorithm with a top-down fully convolutional encoder-decoder network.
no code implementations • 22 Feb 2017 • Yue Li, Dong Liu, Houqiang Li, Li Li, Feng Wu, Hong Zhang, Haitao Yang
A block can be down-sampled before being compressed by normal intra coding, and then up-sampled to its original resolution.
Multimedia
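A toy sketch of the down-/up-sampling around a block codec; the "normal intra coding" step is just a placeholder identity here, whereas real use would call the encoder:

```python
import numpy as np
import cv2

def downsample_code_upsample(block, scale=2):
    """Down-sample a block, 'compress' it (placeholder), then up-sample it back
    to the original resolution for comparison against direct coding."""
    h, w = block.shape
    small = cv2.resize(block, (w // scale, h // scale), interpolation=cv2.INTER_AREA)
    coded = small  # placeholder for normal intra coding of the down-sampled block
    return cv2.resize(coded, (w, h), interpolation=cv2.INTER_CUBIC)

block = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.uint8)
recon = downsample_code_upsample(block)
print(float(np.mean((block.astype(float) - recon.astype(float)) ** 2)))  # MSE vs. original
```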
no code implementations • 21 Feb 2017 • Li Li, Zhu Li, Madhukar Budagavi, Houqiang Li
This paper proposes a novel advanced motion model to handle the irregular motion for the cubic map projection of 360-degree video.
no code implementations • 3 Nov 2016 • Jeremy Every, Li Li, Youguang G. Guo, David G. Dorrell
Determining the optimal size and orientation of small-scale residential based PV arrays will become increasingly complex in the future smart grid environment with the introduction of smart meters and dynamic tariffs.
no code implementations • 9 Sep 2016 • Felix Brockherde, Leslie Vogt, Li Li, Mark E. Tuckerman, Kieron Burke, Klaus-Robert Müller
Last year, at least 30,000 scientific papers used the Kohn-Sham scheme of density functional theory to solve electronic structure problems in a wide variety of scientific fields, ranging from materials science to biochemistry to astrophysics.
no code implementations • 16 Aug 2016 • Fei Hu, Changjiu Pu, Haowei Gao, Mengzi Tang, Li Li
As a result, SAE can be used for image compression.
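A very small autoencoder sketch of the compress-then-reconstruct idea; the layer sizes are arbitrary and this is not the paper's SAE configuration:

```python
import torch
import torch.nn as nn

class TinyAutoencoder(nn.Module):
    def __init__(self, dim=784, code=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, code))
        self.decoder = nn.Sequential(nn.Linear(code, 128), nn.ReLU(), nn.Linear(128, dim))

    def forward(self, x):
        z = self.encoder(x)          # compressed code
        return self.decoder(z), z

model = TinyAutoencoder()
x = torch.rand(8, 784)               # e.g. flattened 28x28 images
recon, code = model(x)
print(code.shape, nn.functional.mse_loss(recon, x).item())
```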
no code implementations • 10 Jun 2016 • Daoyuan Li, Li Li, Dongsun Kim, Tegawendé F. Bissyandé, David Lo, Yves Le Traon
One single code change can significantly influence a wide range of software systems and their users.
Software Engineering
no code implementations • 5 Apr 2016 • Li Li, Houfeng Wang
To the best of our knowledge, we are the first to tackle the imbalance problem in multi-label classification with many labels.
no code implementations • 16 Jan 2015 • Kevin Vu, John Snyder, Li Li, Matthias Rupp, Brandon F. Chen, Tarek Khelif, Klaus-Robert Müller, Kieron Burke
Accurate approximations to density functionals have recently been obtained via machine learning (ML).
no code implementations • 4 Apr 2014 • Li Li, John C. Snyder, Isabelle M. Pelaschier, Jessica Huang, Uma-Naresh Niranjan, Paul Duncan, Matthias Rupp, Klaus-Robert Müller, Kieron Burke
Kernel ridge regression is used to approximate the kinetic energy of non-interacting fermions in a one-dimensional box as a functional of their density.
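A minimal sketch of kernel ridge regression mapping a discretized density to a scalar energy, in the spirit of the description; the synthetic densities and targets below are placeholders, not the paper's data or functional:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 100)
# Placeholder "densities" on a grid and a placeholder scalar target per density.
densities = np.array([np.exp(-((grid - c) ** 2) / 0.01) for c in rng.uniform(0.2, 0.8, 200)])
targets = densities.var(axis=1)      # stand-in for kinetic-energy functional values

model = KernelRidge(kernel="rbf", alpha=1e-6, gamma=1e-2)
model.fit(densities[:150], targets[:150])
print("test MSE:", np.mean((model.predict(densities[150:]) - targets[150:]) ** 2))
```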