Search Results for author: Chang-Yu Hsieh

Found 23 papers, 12 papers with code

AAVDiff: Experimental Validation of Enhanced Viability and Diversity in Recombinant Adeno-Associated Virus (AAV) Capsids through Diffusion Generation

no code implementations · 16 Apr 2024 · Lijun Liu, Jiali Yang, Jianfei Song, Xinglin Yang, Lele Niu, Zeqi Cai, Hui Shi, Tingjun Hou, Chang-Yu Hsieh, Weiran Shen, Yafeng Deng

Additionally, in the absence of AAV9 capsid data (apart from one wild-type sequence), we used the same model to directly generate a number of viable sequences with up to 9 mutations.

Specificity

Deep Geometry Handling and Fragment-wise Molecular 3D Graph Generation

no code implementations · 15 Mar 2024 · Odin Zhang, Yufei Huang, Shichen Cheng, Mengyao Yu, Xujun Zhang, Haitao Lin, Yundian Zeng, Mingyang Wang, Zhenxing Wu, Huifeng Zhao, Zaixi Zhang, Chenqing Hua, Yu Kang, Sunliang Cui, Peichen Pan, Chang-Yu Hsieh, Tingjun Hou

Most earlier 3D structure-based molecular generation approaches follow an atom-wise paradigm, incrementally adding atoms to a partially built molecular fragment within protein pockets.

Graph Generation

Generative AI for Controllable Protein Sequence Design: A Survey

no code implementations · 16 Feb 2024 · Yiheng Zhu, Zitai Kong, Jialu Wu, Weize Liu, Yuqiang Han, Mingze Yin, Hongxia Xu, Chang-Yu Hsieh, Tingjun Hou

To set the stage, we first outline the foundational tasks in protein sequence design in terms of the constraints involved and present key generative models and optimization algorithms.

Drug Discovery Protein Design

From molecules to scaffolds to functional groups: building context-dependent molecular representation via multi-channel learning

no code implementations · 5 Nov 2023 · Yue Wan, Jialu Wu, Tingjun Hou, Chang-Yu Hsieh, Xiaowei Jia

Self-supervised learning (SSL) has emerged as a popular solution, utilizing large-scale, unannotated molecular data to learn a foundational representation of chemical space that might be advantageous for downstream tasks.

Drug Discovery Molecular Property Prediction +3

MolHF: A Hierarchical Normalizing Flow for Molecular Graph Generation

1 code implementation · 15 May 2023 · Yiheng Zhu, Zhenqiu Ouyang, Ben Liao, Jialu Wu, Yixuan Wu, Chang-Yu Hsieh, Tingjun Hou, Jian Wu

However, limited attention has been paid to hierarchical generative models, which can exploit the inherent hierarchical structure (with rich semantic information) of molecular graphs and generate complex molecules of larger size that we show to be difficult for most existing models.

Graph Generation Molecular Graph Generation +1
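
MolHF belongs to the normalizing-flow family. As a minimal sketch (in NumPy, with toy linear conditioners; MolHF's actual hierarchical graph flow is considerably more involved), here is one affine coupling layer, the invertible building block such flows stack:

```python
import numpy as np

def coupling_forward(x, w, b):
    """Split x into halves; transform the second half conditioned on the first.
    Returns the transformed vector and the log-determinant of the Jacobian."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    log_s = np.tanh(x1 @ w)          # bounded log-scale for numerical stability
    t = x1 @ w + b                   # shift, also conditioned on x1
    y2 = x2 * np.exp(log_s) + t
    log_det = log_s.sum(axis=-1)     # Jacobian is triangular -> sum of log-scales
    return np.concatenate([x1, y2], axis=-1), log_det

def coupling_inverse(y, w, b):
    """Exact inverse: recompute scale/shift from the untouched half."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    log_s = np.tanh(y1 @ w)
    t = y1 @ w + b
    x2 = (y2 - t) * np.exp(-log_s)
    return np.concatenate([y1, x2], axis=-1)

rng = np.random.default_rng(0)
w = rng.normal(size=(2, 2))
b = rng.normal(size=2)
x = rng.normal(size=(4, 4))
y, log_det = coupling_forward(x, w, b)
x_rec = coupling_inverse(y, w, b)
print(np.allclose(x, x_rec))  # the transform is exactly invertible
```

Exact invertibility plus a tractable log-determinant is what lets a flow compute exact likelihoods; a hierarchical flow applies such layers at multiple resolutions of the molecular graph.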

An Equivariant Generative Framework for Molecular Graph-Structure Co-Design

no code implementations · 12 Apr 2023 · Zaixi Zhang, Qi Liu, Chee-Kong Lee, Chang-Yu Hsieh, Enhong Chen

Our extensive investigation reveals that the 2D topology and 3D geometry contain intrinsically complementary information in molecule design, and provide new insights into machine learning-based molecule representation and generation.

Drug Discovery Graph Generation +1

Sample-efficient Multi-objective Molecular Optimization with GFlowNets

1 code implementation · NeurIPS 2023 · Yiheng Zhu, Jialu Wu, Chaowen Hu, Jiahuan Yan, Chang-Yu Hsieh, Tingjun Hou, Jian Wu

Many crucial scientific problems involve designing novel molecules with desired properties, which can be formulated as a black-box optimization problem over the discrete chemical space.

Bayesian Optimization
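
The black-box formulation can be made concrete with a toy example: the objectives below and the random-search baseline are illustrative stand-ins (the paper's GFlowNet sampler is far more sample-efficient), but they show what "optimization over a discrete space with only function evaluations" means:

```python
import numpy as np

rng = np.random.default_rng(1)

def objectives(x):
    """Two competing toy 'properties' of a binary string x (hypothetical)."""
    f1 = x.sum()                       # e.g. count of a desirable substructure
    f2 = -np.abs(np.diff(x)).sum()     # e.g. penalty for irregularity
    return f1, f2

def scalarize(x, w=0.5):
    """Weighted-sum scalarization of the two objectives."""
    f1, f2 = objectives(x)
    return w * f1 + (1 - w) * f2

best_x, best_score = None, -np.inf
for _ in range(200):                   # black-box: evaluations only, no gradients
    x = rng.integers(0, 2, size=16)    # a random point in the discrete space
    s = scalarize(x)
    if s > best_score:
        best_x, best_score = x, s
print(best_score)
```

A GFlowNet replaces the uniform proposal with a learned sequential sampler whose sampling probability is proportional to a reward, which is what makes it sample-efficient on such problems.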

Protein-Ligand Complex Generator & Drug Screening via Tiered Tensor Transform

1 code implementation · 3 Jan 2023 · Jonathan P. Mailoa, Zhaofeng Ye, Jiezhong Qiu, Chang-Yu Hsieh, Shengyu Zhang

Generating binding poses of a small-molecule candidate (ligand) in its target protein pocket is important for computer-aided drug discovery.

Drug Discovery

Retroformer: Pushing the Limits of Interpretable End-to-end Retrosynthesis Transformer

1 code implementation · 29 Jan 2022 · Yue Wan, Benben Liao, Chang-Yu Hsieh, Shengyu Zhang

In this paper, we propose Retroformer, a novel Transformer-based architecture for retrosynthesis prediction without relying on any cheminformatics tools for molecule editing.

Retrosynthesis

Fast Extraction of Word Embedding from Q-contexts

no code implementations · 15 Sep 2021 · Junsheng Kong, Weizhao Li, Zeyi Liu, Ben Liao, Jiezhong Qiu, Chang-Yu Hsieh, Yi Cai, Shengyu Zhang

In this work, we show that with merely a small fraction of contexts (Q-contexts) which are typical in the whole corpus (and their mutual information with words), one can construct high-quality word embeddings with negligible error.
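
The classical idea this builds on is that word embeddings can be recovered by factorizing a word-context mutual-information (PMI) matrix. In this sketch the Q-context selection step is replaced by simply using all contexts of a tiny toy corpus:

```python
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# Co-occurrence counts within a +/-1 word window.
C = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):
    for j in (i - 1, i + 1):
        if 0 <= j < len(corpus):
            C[idx[w], idx[corpus[j]]] += 1

# Positive pointwise mutual information, then truncated SVD.
total = C.sum()
pw = C.sum(axis=1, keepdims=True) / total   # word marginals
pc = C.sum(axis=0, keepdims=True) / total   # context marginals
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((C / total) / (pw * pc))
ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

U, S, _ = np.linalg.svd(ppmi)
emb = U[:, :2] * S[:2]                      # 2-dimensional word vectors
print(emb.shape)
```

The paper's contribution is that a small, well-chosen set of typical contexts suffices for the factorization, rather than the full context matrix used here.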

Modeling Protein Using Large-scale Pretrain Language Model

1 code implementation · 17 Aug 2021 · Yijia Xiao, Jiezhong Qiu, Ziang Li, Chang-Yu Hsieh, Jie Tang

The emergence of deep learning models has made it possible to model data patterns in large quantities of data.

Drug Discovery Language Modelling

Neural Predictor based Quantum Architecture Search

no code implementations · 11 Mar 2021 · Shi-Xin Zhang, Chang-Yu Hsieh, Shengyu Zhang, Hong Yao

For instance, a key component of VQAs is the design of task-dependent parameterized quantum circuits (PQCs) as in the case of designing a good neural architecture in deep learning.

Neural Architecture Search Quantum Physics

TrimNet: learning molecular representation from triplet messages for biomedicine

1 code implementation · 4 Nov 2020 · Pengyong Li, Yuquan Li, Chang-Yu Hsieh, Shengyu Zhang, Xianggen Liu, Huanxiang Liu, Sen Song, Xiaojun Yao

These advantages have established TrimNet as a powerful and useful computational tool in solving the challenging problem of molecular representation learning.

Drug Discovery Molecular Property Prediction +3
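
A toy message-passing step conveys the flavor of the (sender atom, bond, receiver atom) triplets that TrimNet's messages are built from; the function names, the aggregation rule, and the toy graph below are all illustrative, and TrimNet's actual triplet attention differs:

```python
import numpy as np

def triplet_message_pass(h, e, edges, W):
    """h: node features (N, d); e: edge features, row k for edge k;
    edges: list of (src, dst) pairs; W: weights over a concatenated triplet."""
    new_h = h.copy()
    for k, (s, d) in enumerate(edges):
        # Each message is a function of the full (atom, bond, atom) triplet.
        triplet = np.concatenate([h[s], e[k], h[d]])
        new_h[d] += np.tanh(W @ triplet)     # aggregate messages at the receiver
    return new_h

rng = np.random.default_rng(0)
N, d, de = 4, 3, 2
h = rng.normal(size=(N, d))                  # node (atom) features
e = rng.normal(size=(3, de))                 # edge (bond) features
edges = [(0, 1), (1, 2), (2, 3)]             # a path graph, e.g. a chain of atoms
W = rng.normal(size=(d, 2 * d + de))
print(triplet_message_pass(h, e, edges, W).shape)
```

Incorporating the bond features into every message, rather than only node features, is what lets such models distinguish, say, single from double bonds between the same atom pair.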

Differentiable Quantum Architecture Search

1 code implementation · 16 Oct 2020 · Shi-Xin Zhang, Chang-Yu Hsieh, Shengyu Zhang, Hong Yao

Here, we propose a general framework of differentiable quantum architecture search (DQAS), which enables automated design of quantum circuits in an end-to-end differentiable fashion.

Quantum Physics
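
The core trick behind differentiable architecture search can be sketched on a one-gate "circuit": relax the categorical choice among candidate gates to a softmax probability vector and minimize the expected cost by gradient descent. Everything below (the candidate set, the cost, finite-difference gradients) is a simplified illustration, not DQAS itself:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
ops = [I2, X, H]                       # candidate gates at one circuit slot

target = np.array([0.0, 1.0])          # we want |0> mapped to |1>

def expected_cost(logits):
    """Expected infidelity under the relaxed (probabilistic) gate choice."""
    p = np.exp(logits - logits.max())
    p /= p.sum()
    state0 = np.array([1.0, 0.0])
    return sum(pi * (1.0 - (op @ state0 @ target) ** 2)
               for pi, op in zip(p, ops))

logits = np.zeros(3)
for _ in range(300):                   # finite-difference gradient descent
    grad = np.array([
        (expected_cost(logits + 1e-5 * e) - expected_cost(logits - 1e-5 * e)) / 2e-5
        for e in np.eye(3)])
    logits -= 1.0 * grad
print(int(np.argmax(logits)))  # the X gate (index 1) wins
```

Because the architecture distribution is differentiable, the discrete search over circuits reduces to continuous optimization, which is the end-to-end property the abstract refers to.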

A quantum extension of SVM-perf for training nonlinear SVMs in almost linear time

no code implementations · 18 Jun 2020 · Jonathan Allcock, Chang-Yu Hsieh

We propose a quantum algorithm for training nonlinear support vector machines (SVM) for feature space learning where classical input data is encoded in the amplitudes of quantum states.

Utilizing Edge Features in Graph Neural Networks via Variational Information Maximization

no code implementations · 13 Jun 2019 · Pengfei Chen, Weiwen Liu, Chang-Yu Hsieh, Guangyong Chen, Shengyu Zhang

The IGNN model is based on an elegant and fundamental idea in information theory as explained in the main text, and it could be easily generalized beyond the contexts of molecular graphs considered in this work.

Drug Discovery Quantum Chemistry Regression

Rethinking the Usage of Batch Normalization and Dropout in the Training of Deep Neural Networks

1 code implementation · 15 May 2019 · Guangyong Chen, Pengfei Chen, Yujun Shi, Chang-Yu Hsieh, Benben Liao, Shengyu Zhang

Our work is based on the idea that whitening the inputs of neural networks speeds up convergence.
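
The whitening step that batch normalization approximates is simple to state: standardize each feature over the batch, then rescale. A minimal NumPy sketch (the paper's actual proposal builds on this basic step):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Standardize each feature over the batch, then rescale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)   # eps guards against zero variance
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 8))   # a shifted, scaled batch
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
print(y.mean(axis=0).round(6), y.std(axis=0).round(3))
```

With `gamma = 1` and `beta = 0` the output has per-feature mean approximately 0 and standard deviation approximately 1, i.e. the inputs to the next layer are (diagonally) whitened.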

Quantum algorithms for feedforward neural networks

no code implementations · 7 Dec 2018 · Jonathan Allcock, Chang-Yu Hsieh, Iordanis Kerenidis, Shengyu Zhang

The running times of our algorithms can be quadratically faster in the size of the network than their standard classical counterparts since they depend linearly on the number of neurons in the network, as opposed to the number of connections between neurons as in the classical case.

BIG-bench Machine Learning Quantum Machine Learning +1
