no code implementations • 1 Apr 2024 • Renxiang Guan, Zihao Li, Chujia Song, Guo Yu, Xianju Li, Ruyi Feng
Specifically, we fuse the spectral and spatial features extracted by the 1D and 2D encoders; the 2D encoder includes an attention module that automatically extracts important information.
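The fusion described above can be sketched in a few lines. This is a minimal, hypothetical illustration (shapes, the channel-attention scoring, and concatenation as the fusion step are all assumptions, not the paper's exact architecture): attention reweights the 2D encoder's spatial channels before the spectral and spatial features are joined.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical shapes: a 1-D encoder yields spectral features, a 2-D encoder
# yields spatial feature maps; an attention vector reweights the spatial
# channels before the two streams are fused.
spectral = rng.standard_normal(64)          # from the 1-D (spectral) encoder
spatial = rng.standard_normal((64, 7, 7))   # from the 2-D (spatial) encoder

# Channel attention: score each spatial channel by its global average,
# then emphasize the informative channels.
scores = spatial.mean(axis=(1, 2))
attn = softmax(scores)
attended = (spatial * attn[:, None, None]).mean(axis=(1, 2))

fused = np.concatenate([spectral, attended])  # joint spectral-spatial feature
print(fused.shape)  # → (128,)
```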
1 code implementation • 10 Feb 2024 • Guo Yu
Hence, this study suggests employing the Kalman filter in the active noise control (ANC) system to enhance the efficacy of noise reduction for dynamic noise.
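A Kalman filter can drive the adaptive control filter in such a system by treating the noise measured at the error sensor as a noisy linear observation of the unknown filter weights. The sketch below is a toy simulation under assumed conditions (all signal models, tap counts, and noise levels are hypothetical, not the paper's configuration); it shows the residual noise power dropping as the Kalman-filtered weights converge.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (hypothetical): the primary noise at the error microphone
# is a filtered version of a measurable reference signal x plus sensor noise.
n_taps = 4
w_true = np.array([0.5, -0.3, 0.2, 0.1])   # unknown primary path
x = rng.standard_normal(2000)               # reference signal (noise source)

# Kalman filter state: the estimate of the control filter weights.
w = np.zeros(n_taps)
P = np.eye(n_taps) * 1.0    # state covariance
Q = np.eye(n_taps) * 1e-6   # process noise, lets weights track a dynamic path
R = 1e-2                    # measurement-noise variance

errors = []
for n in range(n_taps, len(x)):
    phi = x[n - n_taps:n][::-1]                       # recent reference samples
    d = w_true @ phi + 0.1 * rng.standard_normal()    # noise at the error mic
    # Kalman update: d is a noisy linear measurement phi' w of the weights.
    P = P + Q
    k = P @ phi / (phi @ P @ phi + R)                 # Kalman gain
    e = d - w @ phi                                   # residual after cancellation
    w = w + k * e
    P = P - np.outer(k, phi) @ P
    errors.append(e ** 2)

early = np.mean(errors[:100])   # residual noise power, before convergence
late = np.mean(errors[-100:])   # residual noise power, after convergence
print(round(early, 3), round(late, 4))
```

The covariance term `Q` is what distinguishes this from a plain recursive least-squares update: it keeps the filter responsive when the noise path drifts, which is the "dynamic noise" case the abstract targets.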
1 code implementation • 30 Jan 2024 • Lianbo Ma, Yuee Zhou, Jianlun Ma, Guo Yu, Qing Li
During gradient-descent learning, a one-step forward search is designed to find the trial gradient of the next step, which is used to adjust the gradient of the current step toward the direction of fast convergence.
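The one-step forward search can be illustrated on a toy objective. This is a hedged sketch (the objective, learning rate, and the blending weight `beta` are all hypothetical choices, not the paper's method): take a trial step, evaluate the gradient at that trial point, and mix it with the current gradient to steer the update.

```python
# Toy objective f(w) = (w - 3)^2, minimized at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)

w, lr, beta = 0.0, 0.1, 0.5   # beta weights the trial ("next-step") gradient
for _ in range(50):
    g_now = grad(w)
    w_trial = w - lr * g_now                          # one-step forward search
    g_trial = grad(w_trial)                           # trial gradient of the next step
    w -= lr * ((1 - beta) * g_now + beta * g_trial)   # adjusted current-step update

print(round(w, 3))  # → 3.0
```

On this convex quadratic the blended gradient gives a larger effective contraction per step than vanilla gradient descent with the same learning rate, which is the intuition behind "adjusting the gradient toward the direction of fast convergence."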
no code implementations • 9 Jan 2024 • Wanting Zhang, Wei Du, Guo Yu, Renchu He, Wenli Du, Yaochu Jin
On the basis of the proposed model, a dual-stage evolutionary algorithm driven by heuristic rules (denoted by DSEA/HR) is developed, where the dual-stage search mechanism consists of global search and local refinement.
no code implementations • 6 Oct 2023 • Guo Yu
The efficacy of active noise control technology in mitigating urban noise, particularly in relation to low-frequency components, has been well-established.
1 code implementation • 14 Nov 2022 • Yangyi Zhang, Sui Tang, Guo Yu
The Coronavirus Disease 2019 (COVID-19) has had a profound impact on global health and the economy, making it crucial to build accurate and interpretable data-driven predictive models of COVID-19 cases to improve policy making.
no code implementations • 23 Aug 2022 • Nan Li, Lianbo Ma, Guo Yu, Bing Xue, Mengjie Zhang, Yaochu Jin
Specifically, we first examine EDL from the perspectives of machine learning and evolutionary computation (EC), and formulate EDL as an optimization problem.
no code implementations • 22 Jul 2022 • Guo Yu, Lianbo Ma, Wei Du, Wenli Du, Yaochu Jin
Recent years have seen the rapid development of fairness-aware machine learning in mitigating unfairness or discrimination in decision-making in a wide range of applications.
no code implementations • 14 Sep 2021 • Lianbo Ma, Nan Li, Guo Yu, Xiaoyu Geng, Min Huang, Xingwei Wang
In the deployment of deep neural models, how to effectively and automatically find feasible deep models under diverse design objectives is fundamental.
no code implementations • 28 Feb 2021 • Guoyang Xie, Jinbao Wang, Guo Yu, Feng Zheng, Yaochu Jin
Our work focuses on how to improve the robustness of tiny neural networks without seriously degrading clean accuracy under mobile-level resources.
no code implementations • 12 Oct 2020 • Pengjie Wang, Guo Yu, Yanyu Jia, Michael Onyszczak, F. Alexandre Cevallos, Shiming Lei, Sebastian Klemenz, Kenji Watanabe, Takashi Taniguchi, Robert J. Cava, Leslie M. Schoop, Sanfeng Wu
Using a detection scheme that avoids edge contributions, we uncover strikingly large quantum oscillations in the monolayer insulator's magnetoresistance, with an onset field as small as ~0.5 tesla.
Mesoscale and Nanoscale Physics • Materials Science • Strongly Correlated Electrons
no code implementations • ECCV 2020 • Mingfei Gao, Zizhao Zhang, Guo Yu, Sercan O. Arik, Larry S. Davis, Tomas Pfister
Active learning (AL) combines data labeling and model training to minimize the labeling cost by prioritizing the selection of high value data that can best improve model performance.
no code implementations • 25 Sep 2019 • Mingfei Gao, Zizhao Zhang, Guo Yu, Sercan O. Arik, Larry S. Davis, Tomas Pfister
Active learning (AL) aims to integrate data labeling and model training in a unified way, and to minimize the labeling budget by prioritizing the selection of high value data that can best improve model performance.
no code implementations • 6 Dec 2017 • Guo Yu, Jacob Bien
In this paper, we propose the natural lasso estimator for the error variance, which maximizes a penalized likelihood objective.
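A stylized illustration of the idea: the error-variance estimate is essentially the value of the penalized objective at the lasso solution, i.e. the residual sum of squares plus the L1 penalty term. The sketch below uses an orthogonal design (X = I), where the lasso has a closed form via soft-thresholding; the penalty level and its scaling here are hypothetical conveniences, not the paper's tuning.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse high-dimensional toy model with X = I: y_j = beta_j + noise.
n = 400
beta = np.zeros(n)
beta[:5] = 2.0                      # a few strong signals
sigma2_true = 1.0
y = beta + rng.standard_normal(n) * np.sqrt(sigma2_true)

# Per-coordinate lasso min (1/2)(y_j - b)^2 + lam*|b| has the
# soft-thresholding solution below; lam is an illustrative choice.
lam = 1.0
beta_hat = np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Natural-lasso-style estimate: average of (twice) the penalized
# objective evaluated at the lasso solution.
sigma2_hat = np.mean((y - beta_hat) ** 2) + 2 * lam * np.mean(np.abs(beta_hat))
print(round(sigma2_hat, 3))
```

The point of the construction is that the penalty term compensates for the shrinkage bias in the residuals, so the objective value itself behaves like a variance estimate even when the regression coefficients are only sparsely recoverable.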
no code implementations • 25 Apr 2016 • Guo Yu, Jacob Bien
Penalized maximum likelihood estimation of this matrix yields a simple regression interpretation for local dependence in which variables are predicted by their neighbors.