no code implementations • 9 Jul 2017 • Pedram Ghamisi, Gabriele Cavallaro, Dan Wu, Jon Atli Benediktsson, Antonio Plaza
In this paper, an approach is proposed to fuse LiDAR and hyperspectral data, which considers both spectral and spatial information in a single framework.
no code implementations • The 41st International ACM SIGIR Conference 2018 • Yuexin Wu, Yiming Yang, Hiroshi Nishiura, Masaya Saitoh
Predicting new and urgent trends in epidemiological data is an important problem for public health, and has attracted increasing attention in the data mining and machine learning communities.
no code implementations • 16 Sep 2019 • Yi-Ta Chen, Yu-Chuan Chuang, An-Yeu Wu
In this paper, we propose an AdaBoost-assisted extreme learning machine for efficient online sequential classification (AOS-ELM).
no code implementations • 7 Jan 2020 • Pengzhou Wu, Kenji Fukumizu
We address the problem of distinguishing cause from effect in the bivariate setting.
no code implementations • 9 Dec 2020 • Weixin Wu, Sonal Thakkar, Will Hawkins, Puya Vahabi, Alberto Todeschini
An accurate and precise understanding of global irrigation usage is crucial for a variety of climate science efforts.
1 code implementation • 3 Nov 2021 • Win-Ken Beh, Yi-Hsuan Wu, An-Yeu Wu
In addition, we present a reproducible baseline system as a preliminary benchmark (the code of the baseline system on the MAUS dataset is available on GitHub: https://github.com/rickwu11/MAUS_dataset_baseline_system), whose testing accuracies are 71.6%, 66.7%, and 59.9% on ECG, fingertip PPG, and wristband PPG, respectively.
no code implementations • 16 Jul 2022 • Yu-Shan Tai, Cheng-Yang Chang, Chieh-Fang Teng, An-Yeu Wu
Recently, deep convolutional neural networks (CNNs) have achieved many eye-catching results.
1 code implementation • 25 Jul 2022 • Cheng-Yen Hsieh, Yu-Chuan Chuang, An-Yeu Wu
Based on the simulation results on CIFAR-10 and CIFAR-100, our method achieves a 16x compression ratio with negligible accuracy drops compared with the vanilla SL.
no code implementations • 22 May 2023 • Yu-Shan Tai, Ming-Guang Lin, An-Yeu Wu
Due to the non-normally distributed values after Softmax and GeLU, post-training quantization on ViTs results in severe accuracy degradation.
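A minimal sketch (not the paper's method, and all names here are illustrative) of why this degradation occurs: Softmax outputs are heavily skewed toward zero, so a single uniform quantization grid over the full range wastes most of its levels on values that rarely occur, and the error grows quickly at low bit widths.

```python
import numpy as np

rng = np.random.default_rng(0)
logits = rng.normal(size=(1000, 64))
# Softmax outputs: almost all mass piles up near 0.
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

def uniform_quantize(x, n_bits):
    """Naive per-tensor uniform post-training quantization."""
    levels = 2 ** n_bits - 1
    scale = x.max() / levels  # one scale for the whole tensor
    return np.round(x / scale) * scale

for bits in (8, 4, 2):
    err = np.abs(probs - uniform_quantize(probs, bits)).mean()
    print(f"{bits}-bit mean abs error: {err:.6f}")
```

The printed errors grow sharply as the bit width drops, illustrating why non-normal post-Softmax distributions are a poor fit for plain uniform PTQ.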
1 code implementation • 2023 • Shih-Lun Wu, Chris Donahue, Shinji Watanabe, Nicholas J. Bryan
While the image-domain Uni-ControlNet method already allows generation with any subset of controls, we devise a new strategy to allow creators to input controls that are only partially specified in time.
no code implementations • 26 Jan 2024 • Yu-Shan Tai, An-Yeu Wu
However, without considering the asymmetry in activations and relying on hand-crafted settings, these methods often struggle to maintain performance under low-bit quantization.
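A hedged sketch (assumed for illustration, not the paper's technique) of the asymmetry point: a GeLU-style activation is bounded below near -0.17 but has a long positive tail, so a symmetric quantizer spends half its levels on a negative range that is barely used, while an asymmetric (zero-point) scheme covers only the actual value range.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
# tanh approximation of GeLU: range is roughly [-0.17, +inf)
acts = 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def quantize(v, n_bits, symmetric):
    """Uniform quantization, either symmetric around 0 or asymmetric
    (with a zero point at the tensor minimum)."""
    levels = 2 ** n_bits - 1
    if symmetric:
        scale = np.abs(v).max() / (levels / 2)
        zero = 0.0
    else:
        scale = (v.max() - v.min()) / levels
        zero = v.min()
    return np.round((v - zero) / scale) * scale + zero

for sym in (True, False):
    err = np.abs(acts - quantize(acts, 4, sym)).mean()
    print(f"{'symmetric' if sym else 'asymmetric':>10} 4-bit error: {err:.5f}")
```

At 4 bits the asymmetric scheme's step size is roughly half that of the symmetric one on this distribution, so its mean error is noticeably lower, which is one reason hand-crafted symmetric settings struggle at low bit widths.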
no code implementations • 11 Apr 2024 • Jiing-Ping Wang, Ming-Guang Lin, An-Yeu Wu
With the rise of Transformer models in the NLP and CV domains, Multi-Head Attention has proven to be a game-changer.