Search Results for author: Ying Lu

Found 15 papers, 2 papers with code

Domain-invariant Representation Learning via Segment Anything Model for Blood Cell Classification

1 code implementation • 14 Aug 2024 • Yongcheng Li, Lingcong Cai, Ying Lu, Cheng Lin, Yupeng Zhang, Jingyan Jiang, Genan Dai, BoWen Zhang, Jingzhou Cao, Xiangzhong Zhang, Xiaomao Fan

To address this issue, we propose a novel framework of domain-invariant representation learning (DoRL) via the Segment Anything Model (SAM) for blood cell classification.

Representation Learning

Towards Cross-Domain Single Blood Cell Image Classification via Large-Scale LoRA-based Segment Anything Model

1 code implementation • 13 Aug 2024 • Yongcheng Li, Lingcong Cai, Ying Lu, Yupeng Zhang, Jingyan Jiang, Genan Dai, BoWen Zhang, Jingzhou Cao, Xiangzhong Zhang, Xiaomao Fan

Accurate classification of blood cells plays a vital role in hematological analysis as it aids physicians in diagnosing various medical conditions.

Image Classification

Opportunities and challenges in the application of large artificial intelligence models in radiology

no code implementations • 24 Mar 2024 • Liangrui Pan, Zhenyu Zhao, Ying Lu, Kewei Tang, Liyong Fu, Qingchun Liang, Shaoliang Peng

Influenced by ChatGPT, large artificial intelligence (AI) models have seen a global upsurge in research and development.

Video Generation

Developing A Fair Individualized Polysocial Risk Score (iPsRS) for Identifying Increased Social Risk of Hospitalizations in Patients with Type 2 Diabetes (T2D)

no code implementations • 5 Sep 2023 • Yu Huang, Jingchuan Guo, William T Donahoo, Zhengkang Fan, Ying Lu, Wei-Han Chen, Huilin Tang, Lori Bilello, Elizabeth A Shenkman, Jiang Bian

Background: Racial and ethnic minority groups and individuals facing social disadvantages, which often stem from their social determinants of health (SDoH), bear a disproportionate burden of type 2 diabetes (T2D) and its complications.

Fairness

Persistent Ballistic Entanglement Spreading with Optimal Control in Quantum Spin Chains

no code implementations • 21 Jul 2023 • Ying Lu, Pei Shi, Xiao-Han Wang, Jie Hu, Shi-Ju Ran

Entanglement propagation provides a key routine to understand quantum many-body dynamics in and out of equilibrium.

Quantum compiling with a variational instruction set for accurate and fast quantum computing

no code implementations • 29 Mar 2022 • Ying Lu, Peng-Fei Zhou, Shao-Ming Fei, Shi-Ju Ran

The quantum instruction set (QIS) is defined as the quantum gates that are physically realizable by controlling the qubits in quantum hardware.

Embedding Node Structural Role Identity Using Stress Majorization

no code implementations • 14 Sep 2021 • Lili Wang, Chenghan Huang, Weicheng Ma, Ying Lu, Soroush Vosoughi

In this paper, we present a novel and flexible framework using stress majorization to transform the high-dimensional role identities in networks directly (without approximation or indirect modeling) into a low-dimensional embedding space.

Node Classification
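
To make the stress-majorization idea above concrete, here is a minimal sketch. This is not the authors' code: the degree-based features stand in for the paper's role-identity vectors, and scikit-learn's SMACOF-based MDS is used as the stress-majorization solver.

```python
# Hedged sketch: embed node "role identity" vectors into 2-D via stress
# majorization (SMACOF, as implemented by scikit-learn's MDS).
# The degree statistics below are placeholder role features, not the paper's.
import networkx as nx
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

G = nx.karate_club_graph()

# Toy structural-role features per node: own degree, mean and max neighbor degree.
feats = []
for v in G.nodes():
    nbr_deg = [G.degree(u) for u in G.neighbors(v)] or [0]
    feats.append([G.degree(v), np.mean(nbr_deg), np.max(nbr_deg)])
feats = np.asarray(feats, dtype=float)

# Pairwise dissimilarities between role-identity vectors.
D = squareform(pdist(feats, metric="euclidean"))

# Stress majorization: find 2-D coordinates whose pairwise distances match D.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
embedding = mds.fit_transform(D)   # shape: (n_nodes, 2)
print(embedding[:5])
```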

Preparation of Many-body Ground States by Time Evolution with Variational Microscopic Magnetic Fields and Incomplete Interactions

no code implementations • 3 Jun 2021 • Ying Lu, Yue-Min Li, Peng-Fei Zhou, Shi-Ju Ran

State preparation is of fundamental importance in quantum physics; it can be realized either by constructing a quantum circuit as a unitary that transforms the initial state into the target, or by implementing a quantum control protocol that evolves to the target state under a designed Hamiltonian.

Embedding Node Structural Role Identity into Hyperbolic Space

no code implementations • 3 Nov 2020 • Lili Wang, Ying Lu, Chenghan Huang, Soroush Vosoughi

However, the work on network embedding in hyperbolic space has been focused on microscopic node embedding.

Network Embedding

D$^2$-City: A Large-Scale Dashcam Video Dataset of Diverse Traffic Scenarios

no code implementations • 3 Apr 2019 • Zhengping Che, Guangyu Li, Tracy Li, Bo Jiang, Xuefeng Shi, Xinsheng Zhang, Ying Lu, Guobin Wu, Yan Liu, Jieping Ye

Driving datasets accelerate the development of intelligent driving and related computer vision technologies, while substantial and detailed annotations serve as the fuel that boosts the efficacy of such datasets for improving learning-based models.

Diversity

Discriminative Label Consistent Domain Adaptation

no code implementations • 21 Feb 2018 • Lingkun Luo, Liming Chen, Ying Lu, Shiqiang Hu

Domain adaptation (DA) is a form of transfer learning that aims to learn an effective predictor on target data from source data despite the data distribution mismatch between source and target.

Domain Adaptation • Image Classification • +1

Brenier approach for optimal transportation between a quasi-discrete measure and a discrete measure

no code implementations • 17 Jan 2018 • Ying Lu, Liming Chen, Alexandre Saidi, Xianfeng GU

Correctly estimating the discrepancy between two data distributions has always been an important task in Machine Learning.

BIG-bench Machine Learning

Discriminative and Geometry Aware Unsupervised Domain Adaptation

no code implementations • 28 Dec 2017 • Lingkun Luo, Liming Chen, Shiqiang Hu, Ying Lu, Xiaofang Wang

Domain adaptation (DA) aims to generalize a learning model across training and testing data despite the mismatch of their data distributions.

Image Classification • Unsupervised Domain Adaptation

Optimal Transport for Deep Joint Transfer Learning

no code implementations • 9 Sep 2017 • Ying Lu, Liming Chen, Alexandre Saidi

By adding an Optimal Transport loss (OT loss) between source and target classifier predictions as a constraint on the source classifier, the proposed Joint Transfer Learning Network (JTLN) can effectively learn useful knowledge for target classification from source data.

General Classification • Image Classification • +1
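
The JTLN abstract above describes adding an optimal-transport loss between source and target classifier predictions as a constraint on the source classifier. Below is a minimal, hedged sketch of that general idea; the Sinkhorn solver, uniform class-to-class cost matrix, linear classifier, and random data are illustrative assumptions, not the paper's actual architecture or loss.

```python
# Hedged sketch: supervised source loss plus an entropy-regularized OT
# (Sinkhorn) distance between averaged source and target predictions.
# Cost matrix, weights, and model are illustrative assumptions only.
import torch
import torch.nn.functional as F

def sinkhorn_ot(p, q, cost, eps=0.1, n_iters=50):
    """Entropy-regularized OT distance between histograms p and q."""
    K = torch.exp(-cost / eps)                 # Gibbs kernel
    u = torch.ones_like(p)
    for _ in range(n_iters):                   # Sinkhorn scaling iterations
        v = q / (K.t() @ u + 1e-9)
        u = p / (K @ v + 1e-9)
    transport = torch.diag(u) @ K @ torch.diag(v)
    return torch.sum(transport * cost)

num_classes = 10
classifier = torch.nn.Linear(128, num_classes)

# Random stand-ins for source (labeled) and target (unlabeled) features.
src_feat, src_y = torch.randn(32, 128), torch.randint(0, num_classes, (32,))
tgt_feat = torch.randn(32, 128)

src_pred = F.softmax(classifier(src_feat), dim=1).mean(0)  # avg source prediction
tgt_pred = F.softmax(classifier(tgt_feat), dim=1).mean(0)  # avg target prediction

cost = 1.0 - torch.eye(num_classes)            # toy class-to-class cost matrix
loss = F.cross_entropy(classifier(src_feat), src_y) \
       + 0.1 * sinkhorn_ot(src_pred, tgt_pred, cost)
loss.backward()
```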
