Search Results for author: Zixu Zhuang

Found 5 papers, 2 papers with code

MUC: Mixture of Uncalibrated Cameras for Robust 3D Human Body Reconstruction

no code implementations • 8 Mar 2024 • Yitao Zhu, Sheng Wang, Mengjie Xu, Zixu Zhuang, Zhixin Wang, Kaidong Wang, Han Zhang, Qian Wang

Next, instead of simply averaging models across views, we train a network to determine the weights of individual views for their fusion, based on the parameters estimated for the joints and hands of the human body, as well as the camera positions.
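The snippet describes a learned, per-view weighting that replaces a plain average when fusing estimates from uncalibrated cameras. Below is a minimal sketch of that general pattern, assuming each view is summarized by a parameter vector that concatenates body/hand joint estimates with an estimated camera position; the network structure, feature layout, and dimensions are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): learned per-view weighting for fusing
# parameter estimates from several uncalibrated cameras.
import torch
import torch.nn as nn


class ViewWeightNet(nn.Module):
    """Predicts a confidence score per view and fuses the views accordingly."""

    def __init__(self, param_dim: int):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(param_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, view_params: torch.Tensor) -> torch.Tensor:
        # view_params: (num_views, param_dim) -- assumed to hold body/hand joint
        # parameters concatenated with an estimated camera position per view.
        logits = self.score(view_params)            # (num_views, 1)
        weights = torch.softmax(logits, dim=0)      # normalize across views
        fused = (weights * view_params).sum(dim=0)  # weighted fusion, (param_dim,)
        return fused


# Example: fuse estimates from 4 views, each described by a 75-D parameter vector.
params = torch.randn(4, 75)
fused_params = ViewWeightNet(param_dim=75)(params)
```

The softmax over views turns a plain average into a learned, confidence-weighted combination.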

ChatCAD+: Towards a Universal and Reliable Interactive CAD using LLMs

1 code implementation • 25 May 2023 • Zihao Zhao, Sheng Wang, Jinchen Gu, Yitao Zhu, Lanzhuju Mei, Zixu Zhuang, Zhiming Cui, Qian Wang, Dinggang Shen

The integration of Computer-Aided Diagnosis (CAD) with Large Language Models (LLMs) presents a promising frontier in clinical applications, notably in automating diagnostic processes akin to those performed by radiologists and providing consultations similar to a virtual family doctor.

In-Context Learning • Retrieval
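The task tags above (In-Context Learning, Retrieval) point to retrieval-augmented prompting. The following is only a heavily hedged sketch of that general pattern, not ChatCAD+'s described pipeline: the knowledge base, the word-overlap retrieval, and the prompt template are all illustrative assumptions.

```python
# Minimal sketch (an assumption, not ChatCAD+'s pipeline): retrieve the most
# relevant reference entries for a query and place them in the prompt as
# context before sending it to an LLM of choice.
def retrieve(query: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Rank reference texts by simple word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(query: str, knowledge_base: list[str]) -> str:
    """Assemble retrieved references and the question into a single prompt."""
    context = "\n".join(retrieve(query, knowledge_base))
    return f"Reference material:\n{context}\n\nQuestion: {query}\nAnswer:"


# Usage: the resulting prompt would then be passed to any LLM API.
kb = [
    "Pleural effusion appears as blunting of the costophrenic angle on chest X-ray.",
    "Cardiomegaly is suggested when the cardiothoracic ratio exceeds 0.5.",
]
print(build_prompt("What does blunting of the costophrenic angle indicate?", kb))
```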

Learning Better Contrastive View from Radiologist's Gaze

1 code implementation • 15 May 2023 • Sheng Wang, Zixu Zhuang, Xi Ouyang, Lichi Zhang, Zheren Li, Chong Ma, Tianming Liu, Dinggang Shen, Qian Wang

Then, we propose a novel augmentation method, i.e., FocusContrast, to learn from radiologists' gaze in diagnosis and generate contrastive views for medical images with guidance from radiologists' visual attention.

Contrastive Learning • Data Augmentation
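A minimal sketch of gaze-guided view generation for contrastive learning follows, assuming a precomputed radiologist gaze heatmap aligned with the image. The cutout-style augmentation, patch size, and attention threshold are illustrative assumptions, not the paper's released implementation.

```python
# Minimal sketch (an assumption, not the released FocusContrast code): use a
# radiologist gaze heatmap to protect attended regions while producing
# augmented views for contrastive learning.
import numpy as np


def gaze_guided_view(image: np.ndarray, gaze_map: np.ndarray,
                     rng: np.random.Generator) -> np.ndarray:
    """Apply a random cutout only where radiologist attention is low."""
    view = image.copy()
    h, w = image.shape
    for _ in range(10):
        y, x = rng.integers(0, h - 32), rng.integers(0, w - 32)
        if gaze_map[y:y + 32, x:x + 32].mean() < 0.2:  # low-attention region
            view[y:y + 32, x:x + 32] = 0.0
            break
    return view


# Usage: generate two views of the same image for a contrastive pair.
rng = np.random.default_rng(0)
img = rng.random((224, 224))
gaze = np.zeros((224, 224))
gaze[80:160, 80:160] = 1.0  # attended (e.g. lesion) area stays untouched
view_1 = gaze_guided_view(img, gaze, rng)
view_2 = gaze_guided_view(img, gaze, rng)
```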

Spatial Attention-based Implicit Neural Representation for Arbitrary Reduction of MRI Slice Spacing

no code implementations • 23 May 2022 • Xin Wang, Sheng Wang, Honglin Xiong, Kai Xuan, Zixu Zhuang, Mengjun Liu, Zhenrong Shen, Xiangyu Zhao, Lichi Zhang, Qian Wang

Magnetic resonance (MR) images collected in 2D clinical protocols typically have large inter-slice spacing, resulting in high in-plane resolution and reduced through-plane resolution.

Computational Efficiency • Super-Resolution
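As the title indicates, the method relies on an implicit neural representation that can be queried at arbitrary through-plane positions. The sketch below shows only that general idea, a plain coordinate MLP without the paper's spatial attention; the layer sizes, coordinate normalization, and query grid are assumptions for illustration.

```python
# Minimal sketch (an assumption based on the paper title, not the authors'
# model): an implicit neural representation mapping continuous (x, y, z)
# coordinates to intensities, so slices can be queried at arbitrary spacing.
import torch
import torch.nn as nn


class CoordinateMLP(nn.Module):
    """Maps normalized 3D coordinates to an MR intensity value."""

    def __init__(self, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return self.net(coords)  # (N, 3) -> (N, 1)


# Query a new slice at an arbitrary through-plane position z = 0.37.
model = CoordinateMLP()
ys, xs = torch.meshgrid(torch.linspace(0, 1, 64), torch.linspace(0, 1, 64), indexing="ij")
coords = torch.stack([xs.flatten(), ys.flatten(), torch.full((64 * 64,), 0.37)], dim=1)
slice_intensities = model(coords).reshape(64, 64)
```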
