Search Results for author: Jae Hyeon Lee

Found 4 papers, 0 papers with code

Blind Biological Sequence Denoising with Self-Supervised Set Learning

no code implementations · 4 Sep 2023 · Nathan Ng, Ji Won Park, Jae Hyeon Lee, Ryan Lewis Kelly, Stephen Ra, Kyunghyun Cho

This set embedding represents the "average" of the subreads and can be decoded into a prediction of the clean sequence.

Denoising
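
The snippet above describes pooling the embeddings of noisy subreads into a single set embedding and decoding it into the clean sequence. Below is a minimal PyTorch sketch of that set-averaging idea; the module names, architecture, and dimensions are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: encode each noisy subread, pool the per-subread
# embeddings into one set embedding, and decode that embedding into a
# prediction of the clean sequence. Names and sizes are illustrative.

VOCAB, DIM, MAX_LEN = 6, 64, 32  # e.g. 4 bases plus special tokens

class SetDenoiser(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.encoder = nn.GRU(DIM, DIM, batch_first=True)
        self.decoder = nn.GRU(DIM, DIM, batch_first=True)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, subreads):                   # (n_subreads, seq_len) token ids
        _, h = self.encoder(self.embed(subreads))  # h: (1, n_subreads, DIM)
        set_emb = h.mean(dim=1)                    # the "average" of the subreads
        dec_in = set_emb.unsqueeze(1).repeat(1, MAX_LEN, 1)
        out, _ = self.decoder(dec_in)              # decode set embedding to a sequence
        return self.head(out)                      # (1, MAX_LEN, VOCAB) logits

model = SetDenoiser()
noisy_subreads = torch.randint(0, VOCAB, (5, MAX_LEN))  # five noisy reads of one sequence
clean_logits = model(noisy_subreads)
print(clean_logits.shape)  # torch.Size([1, 32, 6])
```

Mean pooling makes the set embedding invariant to the order of the subreads, which is what lets it behave as an "average" of the noisy observations.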

MoleCLUEs: Molecular Conformers Maximally In-Distribution for Predictive Models

no code implementations · 20 Jun 2023 · Michael Maser, Natasa Tagasovska, Jae Hyeon Lee, Andrew Watkins

As we train our predictive models jointly with a conformer decoder, the new latent embeddings can be mapped to their corresponding inputs, which we call MoleCLUEs, or (molecular) counterfactual latent uncertainty explanations (Antorán et al., 2020).

Counterfactual
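
Per the snippet, MoleCLUEs arise from optimizing latent embeddings so the predictor is more confident, then mapping them back through the jointly trained decoder. The sketch below shows a generic CLUE-style latent optimization loop; the uncertainty proxy, module names, and dimensions are assumptions for illustration, not the paper's models or objective.

```python
import torch
import torch.nn as nn

# Generic CLUE-style latent optimization under stated assumptions: the
# uncertainty proxy (a predicted log-variance), modules, and sizes are
# hypothetical stand-ins.

DIM_IN, DIM_Z = 16, 8
encoder = nn.Linear(DIM_IN, DIM_Z)
decoder = nn.Linear(DIM_Z, DIM_IN)      # jointly trained decoder back to inputs
predictor = nn.Linear(DIM_Z, 2)         # toy heteroscedastic head: mean, log-variance

def uncertainty(z):
    return predictor(z)[..., 1].mean()  # proxy: predicted log-variance

x = torch.randn(1, DIM_IN)                    # features of the original conformer
z = encoder(x).detach().requires_grad_(True)  # start from the input's embedding
opt = torch.optim.Adam([z], lr=1e-2)

for _ in range(100):                          # descend on uncertainty in latent space
    opt.zero_grad()
    uncertainty(z).backward()
    opt.step()

x_clue = decoder(z)  # decoded counterfactual: a nearby, lower-uncertainty input
```

Optimizing in latent space rather than input space keeps the decoded counterfactual close to the data manifold, which is the "maximally in-distribution" aspect named in the title.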

Multi-segment preserving sampling for deep manifold sampler

no code implementations · 9 May 2022 · Daniel Berenberg, Jae Hyeon Lee, Simon Kelow, Ji Won Park, Andrew Watkins, Vladimir Gligorijević, Richard Bonneau, Stephen Ra, Kyunghyun Cho

We introduce an alternative approach to this guided sampling procedure, multi-segment preserving sampling, that enables the direct inclusion of domain-specific knowledge by designating preserved and non-preserved segments along the input sequence, thereby restricting variation to only select regions.

Language Modelling
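
Multi-segment preserving sampling, as described above, pins designated segments of the input sequence while letting the sampler vary the rest. A hypothetical sketch of that masking constraint follows; the uniform random logits stand in for the deep manifold sampler's proposal distribution and are not the authors' model.

```python
import torch

# Hypothetical sketch of the segment-preserving constraint: tokens inside
# designated (start, end) spans are pinned to the input, while new tokens
# are sampled everywhere else.

VOCAB, SEQ_LEN = 21, 30  # e.g. 20 amino acids plus a special token

def preserving_sample(seq, preserved_segments, logits):
    """Resample seq from logits everywhere outside the preserved spans."""
    keep = torch.zeros(SEQ_LEN, dtype=torch.bool)
    for start, end in preserved_segments:
        keep[start:end] = True  # True = keep the original token
    proposal = torch.distributions.Categorical(logits=logits).sample()
    return torch.where(keep, seq, proposal)

seq = torch.randint(0, VOCAB, (SEQ_LEN,))
logits = torch.randn(SEQ_LEN, VOCAB)  # stand-in for model proposals
new_seq = preserving_sample(seq, [(0, 5), (12, 18)], logits)
assert torch.equal(new_seq[0:5], seq[0:5])  # preserved segment is unchanged
```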
