no code implementations • 17 Apr 2024 • Weiyu Guo, Ziyue Qiao, Ying Sun, Hui Xiong
We propose a Short-Term Enhancement Module (STEM), which can be easily integrated with various models.
1 code implementation • 3 Apr 2024 • Yunfan Lu, Yijie Xu, Wenzong Ma, Weiyu Guo, Hui Xiong
To this end, we present a Swin-Transformer-based backbone and a pixel-focus loss function for demosaicing with missing pixel values in RAW-domain processing.
1 code implementation • 21 Oct 2022 • Weiyu Guo, Zhaoshuo Li, Yongkui Yang, Zheng Wang, Russell H. Taylor, Mathias Unberath, Alan Yuille, Yingwei Li
We construct our stereo depth estimation model, Context Enhanced Stereo Transformer (CSTR), by plugging CEP into the state-of-the-art stereo depth estimation method Stereo Transformer.
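As a rough illustration of the "plug-in" pattern this entry describes, a context-enhancement stage can be inserted into an existing pipeline without modifying the surrounding stages. This is a minimal sketch only; the stage names and structure are assumptions for illustration, not CSTR's actual architecture or API.

```python
# Hypothetical sketch of plugging an extra module into an existing
# model pipeline. All names here are illustrative assumptions.

class Stage:
    """One stage of a processing pipeline; records its name on the input."""
    def __init__(self, name):
        self.name = name

    def __call__(self, x):
        return x + [self.name]

def build_pipeline(plug_in=None, position=1):
    """Build the baseline pipeline; optionally insert an extra stage."""
    stages = [Stage("feature_extractor"), Stage("matching"), Stage("regression")]
    if plug_in is not None:
        # The plug-in is inserted between existing stages; the original
        # stages themselves are left untouched.
        stages.insert(position, plug_in)
    return stages

def run(stages, x):
    for stage in stages:
        x = stage(x)
    return x

baseline = run(build_pipeline(), [])
enhanced = run(build_pipeline(Stage("context_enhancement")), [])
```

The point of the pattern is that the baseline and enhanced models differ only by the inserted stage, so the enhancement can be evaluated as a drop-in change.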
no code implementations • 22 May 2021 • Weiyu Guo, Zhijiang Yang, Shu Wu, Fu Chen
Experimental results obtained on real-world enterprise datasets verify that the proposed approach outperforms conventional methods and provides insights into individual rating results and the reliability of model training.
no code implementations • 7 Jul 2020 • Weiyu Guo, Yidong Ouyang
We demonstrate the effectiveness of our regularization by (1) defending against adversarial perturbations; (2) reducing the generalization gap across different architectures; (3) improving generalization in transfer learning scenarios without fine-tuning.
no code implementations • 29 Sep 2019 • Weiyu Guo, Jiabin Ma, Liang Wang, Yongzhen Huang
As deep neural networks are increasingly used in applications suited for low-power devices, a fundamental dilemma becomes apparent: the trend is to grow models to absorb ever-increasing data, making them memory intensive; however, low-power devices are designed with very limited memory and cannot store large models.
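The memory dilemma above can be made concrete with a back-of-envelope calculation (the parameter count is an assumed example, not a figure from the paper): parameter storage alone at 32-bit precision can exceed the RAM budget of a small embedded device, which is what motivates compression techniques such as quantization.

```python
# Illustrative estimate of parameter memory at different precisions.
# The 25M-parameter figure is an assumption (roughly ResNet-50-sized),
# used only to show the scale of the problem.

def model_memory_mib(num_params, bytes_per_param):
    """Memory needed to store the parameters, in MiB."""
    return num_params * bytes_per_param / (1024 ** 2)

params = 25_000_000
fp32_mib = model_memory_mib(params, 4)  # 32-bit floats
int8_mib = model_memory_mib(params, 1)  # 8-bit quantized weights

print(f"fp32: {fp32_mib:.1f} MiB, int8: {int8_mib:.1f} MiB")
```

At fp32 the parameters alone take roughly 95 MiB, well beyond many microcontroller-class devices, while 8-bit quantization cuts this by 4x; activations and buffers add further overhead on top.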