Search Results for author: Siddharth Patwardhan

Found 12 papers, 1 paper with code

Annotating Electronic Medical Records for Question Answering

no code implementations 17 May 2018 Preethi Raghavan, Siddharth Patwardhan, Jennifer J. Liang, Murthy V. Devarakonda

Over the course of 11 months, 11 medical students followed our annotation methodology, resulting in a question answering dataset of 5696 questions over 71 patient records, of which 1747 questions have corresponding answers generated by the medical students.

Question Answering

The Role of Context Types and Dimensionality in Learning Word Embeddings

no code implementations NAACL 2016 Oren Melamud, David McClosky, Siddharth Patwardhan, Mohit Bansal

We provide the first extensive evaluation of how using different types of context to learn skip-gram word embeddings affects performance on a wide range of intrinsic and extrinsic NLP tasks.

Learning Word Embeddings

Quantifying Dismantlement in Disconnected Networks

1 code implementation 16 Jun 2019 Siddharth Patwardhan

We propose a novel measure to quantify dismantlement of a fragmented network.

Physics and Society Social and Information Networks

Model Stability with Continuous Data Updates

no code implementations 14 Jan 2022 Huiting Liu, Avinesh P. V. S., Siddharth Patwardhan, Peter Grasch, Sachin Agarwal

For this study, we propose a methodology for the assessment of model stability (which we refer to as jitter) under various experimental conditions.

Text Classification

Can Open Domain Question Answering Systems Answer Visual Knowledge Questions?

no code implementations 9 Feb 2022 Jiawen Zhang, Abhijit Mishra, Avinesh P. V. S, Siddharth Patwardhan, Sachin Agarwal

In this work, we propose a potentially data-efficient approach that reuses existing systems for (a) image analysis, (b) question rewriting, and (c) text-based question answering to answer such visual questions.

Open-Domain Question Answering Question Rewriting +1

EELBERT: Tiny Models through Dynamic Embeddings

no code implementations 31 Oct 2023 Gabrielle Cohn, Rishika Agarwal, Deepanshu Gupta, Siddharth Patwardhan

We introduce EELBERT, an approach for compression of transformer-based models (e.g., BERT), with minimal impact on the accuracy of downstream tasks.
