Search Results for author: Carl Edwards

Found 11 papers, 7 papers with code

Text2Mol: Cross-Modal Molecule Retrieval with Natural Language Queries

1 code implementation EMNLP 2021 Carl Edwards, ChengXiang Zhai, Heng Ji

Moreover, this can be viewed as an especially challenging cross-lingual retrieval problem by treating molecules as a language with a highly distinctive grammar.

Cross-Modal Retrieval Natural Language Queries +1
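
For reference, a minimal sketch of the generic cross-modal retrieval step this entry describes: a text query and candidate molecules are embedded into a shared space and ranked by cosine similarity. The random vectors below stand in for trained text and molecule encoders; this is an illustration of the retrieval idea, not Text2Mol's model.

```python
# Hypothetical illustration of cross-modal retrieval: rank molecules by the
# cosine similarity between a text-query embedding and molecule embeddings
# in a shared space. Random vectors stand in for trained encoders here.
import numpy as np

rng = np.random.default_rng(0)
dim = 256

# Stand-ins for encoder outputs (assumption: one vector per molecule/query).
molecule_ids = ["CCO", "c1ccccc1", "CC(=O)O"]            # SMILES strings
molecule_embs = rng.standard_normal((len(molecule_ids), dim))
query_emb = rng.standard_normal(dim)                      # e.g. "an alcohol ..."

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

scores = [cosine(query_emb, m) for m in molecule_embs]
ranking = sorted(zip(molecule_ids, scores), key=lambda x: -x[1])
for smiles, score in ranking:
    print(f"{smiles}\t{score:.3f}")
```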

$\textit{L+M-24}$: Building a Dataset for Language + Molecules @ ACL 2024

1 code implementation 22 Feb 2024 Carl Edwards, Qingyun Wang, Lawrence Zhao, Heng Ji

Language-molecule models have emerged as an exciting direction for molecular discovery and understanding.

Entity Linking Property Prediction

Defining a New NLP Playground

no code implementations 31 Oct 2023 Sha Li, Chi Han, Pengfei Yu, Carl Edwards, Manling Li, Xingyao Wang, Yi R. Fung, Charles Yu, Joel R. Tetreault, Eduard H. Hovy, Heng Ji

The recent explosion of performance of large language models (LLMs) has changed the field of Natural Language Processing (NLP) more abruptly and seismically than any other shift in the field's 80-year history.

Monte Carlo Thought Search: Large Language Model Querying for Complex Scientific Reasoning in Catalyst Design

1 code implementation 22 Oct 2023 Henry W. Sprueill, Carl Edwards, Mariefel V. Olarte, Udishnu Sanyal, Heng Ji, Sutanay Choudhury

Discovering novel catalysts requires complex reasoning involving multiple chemical properties and resultant trade-offs, leading to a combinatorial growth in the search space.

Instruction Following Language Modelling +1
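
The title refers to a Monte Carlo style search over candidate reasoning steps. The sketch below shows the general pattern under loose assumptions: a (stubbed) language model proposes next "thoughts", random rollouts estimate their value, and the best branch is kept. The propose and score functions are hypothetical placeholders, not the paper's prompts, reward model, or tree policy.

```python
# Generic Monte Carlo style search over LLM-proposed reasoning steps.
# All domain-specific pieces are stubbed out for illustration only.
import random

def propose_thoughts(state, k=3):
    # Placeholder for an LLM call that suggests k candidate next steps.
    return [f"{state} -> step{random.randint(0, 9)}" for _ in range(k)]

def rollout_value(state, depth=3):
    # Placeholder reward: a real system would score the final catalyst
    # candidate with domain heuristics or another LLM query.
    for _ in range(depth):
        state = random.choice(propose_thoughts(state))
    return random.random()

def monte_carlo_search(root, iters=20, rollouts=5):
    best_state, best_value = root, float("-inf")
    frontier = [root]
    for _ in range(iters):
        state = random.choice(frontier)
        for child in propose_thoughts(state):
            value = sum(rollout_value(child) for _ in range(rollouts)) / rollouts
            frontier.append(child)
            if value > best_value:
                best_state, best_value = child, value
    return best_state, best_value

print(monte_carlo_search("design a catalyst for CO2 reduction"))
```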

SynerGPT: In-Context Learning for Personalized Drug Synergy Prediction and Drug Design

no code implementations 19 Jun 2023 Carl Edwards, Aakanksha Naik, Tushar Khot, Martin Burke, Heng Ji, Tom Hope

We are given a small "personalized dataset" of 10-20 drug synergy relationships in the context of specific cancer cell targets.

In-Context Learning Language Modelling
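
As a rough illustration of the in-context setup described above, the sketch below serializes a few known drug-synergy examples into a prompt and leaves the model call out. The prompt format, drug names, and labels are illustrative assumptions, not SynerGPT's actual protocol.

```python
# Hypothetical sketch of in-context prediction from a small "personalized
# dataset": known (drug A, drug B, cell line, synergy) examples are written
# into a prompt, and a language model would be asked about a new pair.
examples = [
    ("drug_A", "drug_B", "cell_line_X", "synergistic"),
    ("drug_C", "drug_D", "cell_line_X", "antagonistic"),
    # ... 10-20 context examples in practice
]

def build_prompt(context, query_pair, cell_line):
    lines = [f"{a} + {b} on {c}: {label}" for a, b, c, label in context]
    lines.append(f"{query_pair[0]} + {query_pair[1]} on {cell_line}:")
    return "\n".join(lines)

prompt = build_prompt(examples, ("drug_A", "drug_D"), "cell_line_X")
print(prompt)
# A real pipeline would send `prompt` to a language model and parse the
# completion into a synergy label or score.
```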

Translation between Molecules and Natural Language

1 code implementation 25 Apr 2022 Carl Edwards, Tuan Lai, Kevin Ros, Garrett Honke, Kyunghyun Cho, Heng Ji

We present $\textbf{MolT5}$, a self-supervised learning framework for pretraining models on a vast amount of unlabeled natural language text and molecule strings.

Molecule Captioning Self-Supervised Learning +2
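
Since MolT5 is a T5-style sequence-to-sequence model, generating a caption for a molecule string can in principle be done with the Hugging Face transformers API as sketched below. The checkpoint name is an assumption about where a fine-tuned MolT5 model is hosted; substitute the actual released checkpoint.

```python
# Minimal sketch of molecule-to-text generation with a T5-style model.
# The model id below is an assumed checkpoint name, not guaranteed.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "laituan245/molt5-base-smiles2caption"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

smiles = "CC(=O)Oc1ccccc1C(=O)O"  # aspirin
inputs = tokenizer(smiles, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=5, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```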

Semi-supervised New Event Type Induction and Description via Contrastive Loss-Enforced Batch Attention

no code implementations 12 Feb 2022 Carl Edwards, Heng Ji

In contrast, we present a novel approach to semi-supervised new event type induction using a masked contrastive loss, which learns similarities between event mentions by enforcing an attention mechanism over the data minibatch.

Event Extraction
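
The sketch below illustrates the general idea of a masked contrastive loss with attention over a minibatch: pairwise similarities among mention embeddings are normalized over the batch, pairs sharing a known event type act as positives, and self-pairs are masked out. It is a generic formulation under those assumptions, not the paper's exact objective.

```python
# Generic masked contrastive loss over a minibatch of mention embeddings.
import torch
import torch.nn.functional as F

def masked_contrastive_loss(embeddings, labels, temperature=0.1):
    """Pull mentions of the same (known) event type together within a batch."""
    sim = embeddings @ embeddings.t() / temperature           # pairwise scores
    self_mask = torch.eye(len(labels), dtype=torch.bool)
    sim = sim.masked_fill(self_mask, float("-inf"))           # mask self-pairs
    log_prob = F.log_softmax(sim, dim=1)                      # attention over the batch

    same_type = labels.unsqueeze(0) == labels.unsqueeze(1)
    labeled = labels != -1
    positives = same_type & labeled.unsqueeze(0) & labeled.unsqueeze(1) & ~self_mask

    pos_log_prob = log_prob.masked_fill(~positives, 0.0)      # keep positive pairs only
    pos_counts = positives.sum(dim=1).clamp(min=1)
    per_anchor = -pos_log_prob.sum(dim=1) / pos_counts
    return per_anchor[positives.any(dim=1)].mean()

embs = F.normalize(torch.randn(8, 64), dim=1)                 # stand-in mention embeddings
labels = torch.tensor([0, 0, 1, 1, 2, 2, -1, -1])             # -1 marks unlabeled mentions
print(masked_contrastive_loss(embs, labels))
```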

Phase-Only Beam Broadening of Contiguous Uniform Subarrayed Arrays Utilizing Three Metaheuristic Global Optimization Techniques

no code implementations 14 Sep 2020 Barry Daniel, Carl Edwards, Adam Anderson

While many methods have been published that address beam broadening of traditional (nonsubarrayed) arrays, there is a knowledge gap in the published literature with respect to efficient and effective beam broadening of contiguous uniform subarrayed arrays.
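
For intuition, the sketch below sets up phase-only beam broadening on a plain (non-subarrayed) uniform linear array: element amplitudes stay fixed, and an off-the-shelf metaheuristic (SciPy's differential evolution) searches for per-element phases that keep the pattern high across a widened sector and low outside it. The geometry, sector, and cost function are illustrative assumptions, not the paper's subarrayed setup or its three optimization techniques.

```python
# Phase-only beam broadening of a uniform linear array via a stock metaheuristic.
import numpy as np
from scipy.optimize import differential_evolution

n_elem = 16
d = 0.5                                        # element spacing in wavelengths
theta = np.radians(np.linspace(-90, 90, 721))
steering = 2 * np.pi * d * np.arange(n_elem)[:, None] * np.sin(theta)[None, :]

def pattern_db(phases):
    # Array factor with unit amplitudes and per-element phases only.
    af = np.exp(1j * (steering + phases[:, None])).sum(axis=0) / n_elem
    return 20 * np.log10(np.abs(af) + 1e-12)

sector = np.abs(np.degrees(theta)) <= 20       # desired broadened main-beam sector

def cost(phases):
    p = pattern_db(phases)
    # Want the pattern high inside the sector and low outside it.
    return p[~sector].max() - p[sector].min()

result = differential_evolution(cost, [(-np.pi, np.pi)] * n_elem,
                                maxiter=50, seed=0, polish=False)
print("best cost (dB):", result.fun)
```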
