Search Results for author: Zach Wood-Doughty

Found 13 papers, 7 papers with code

How Does Twitter User Behavior Vary Across Demographic Groups?

no code implementations WS 2017 Zach Wood-Doughty, Michael Smith, David Broniatowski, Mark Dredze

Demographically-tagged social media messages are a common source of data for computational social science.

Predicting Twitter User Demographics from Names Alone

1 code implementation WS 2018 Zach Wood-Doughty, Nicholas Andrews, Rebecca Marvin, Mark Dredze

Social media analysis frequently requires tools that can automatically infer demographics to contextualize trends.

Johns Hopkins or johnny-hopkins: Classifying Individuals versus Organizations on Twitter

1 code implementation WS 2018 Zach Wood-Doughty, Praateek Mahajan, Mark Dredze

Previous work (McCorriston et al., 2015) presented a method for determining if an account was an individual or organization based on account profile and a collection of tweets.

General Classification

Challenges of Using Text Classifiers for Causal Inference

1 code implementation EMNLP 2018 Zach Wood-Doughty, Ilya Shpitser, Mark Dredze

Causal understanding is essential for many kinds of decision-making, but causal inference from observational data has typically only been applied to structured, low-dimensional datasets.

Causal Inference, Decision Making
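The core difficulty this paper points to, treating a text classifier's noisy output as if it were the true confounder, can be seen in a toy simulation. The sketch below is hypothetical and not the paper's method; the variable names, error rate, and effect sizes are invented.

```python
import numpy as np

# Hypothetical simulation: a binary confounder C drives both treatment A and outcome Y.
rng = np.random.default_rng(0)
n = 200_000
C = rng.binomial(1, 0.5, n)                    # true confounder (e.g., a demographic attribute)
A = rng.binomial(1, 0.3 + 0.4 * C)             # treatment depends on C
Y = rng.binomial(1, 0.2 + 0.2 * A + 0.3 * C)   # outcome depends on A and C; true ATE = 0.20

# A text classifier only gives us a noisy proxy C_hat (here: 85% accuracy, made up).
flip = rng.binomial(1, 0.15, n)
C_hat = np.where(flip == 1, 1 - C, C)

def backdoor_ate(conf):
    """Back-door adjustment for E[Y(1)] - E[Y(0)] given a binary confounder."""
    ate = 0.0
    for c in (0, 1):
        mask = conf == c
        ate += mask.mean() * (Y[mask & (A == 1)].mean() - Y[mask & (A == 0)].mean())
    return ate

print("ATE adjusting for true C:     ", round(backdoor_ate(C), 3))      # close to 0.20
print("ATE adjusting for noisy C_hat:", round(backdoor_ate(C_hat), 3))  # biased upward
```

Adjusting for the noisy proxy leaves residual confounding, so the estimate drifts away from the true effect of 0.20 toward the unadjusted comparison.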

Convolutions Are All You Need (For Classifying Character Sequences)

no code implementations WS 2018 Zach Wood-Doughty, Nicholas Andrews, Mark Dredze

While recurrent neural networks (RNNs) are widely used for text classification, they demonstrate poor performance and slow convergence when trained on long sequences.

Document Classification, General Classification, +3
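A minimal character-level CNN classifier in PyTorch, in the spirit of the argument that convolutions handle long character sequences efficiently. The architecture, filter widths, and layer sizes are illustrative guesses, not the authors' model.

```python
import torch
import torch.nn as nn

class CharCNN(nn.Module):
    """Toy character-level CNN classifier (illustrative, not the paper's architecture)."""
    def __init__(self, vocab_size=128, embed_dim=32, num_filters=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Parallel convolutions over character windows of width 3, 4, and 5.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=k) for k in (3, 4, 5)
        )
        self.fc = nn.Linear(num_filters * 3, num_classes)

    def forward(self, char_ids):                  # char_ids: (batch, seq_len)
        x = self.embed(char_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        # Max-pool each feature map over the whole sequence, then classify.
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))

# Usage: encode a string as byte IDs and score it.
model = CharCNN()
ids = torch.tensor([[min(ord(c), 127) for c in "johnny-hopkins"]])
print(model(ids).shape)  # torch.Size([1, 2])
```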

Using Noisy Self-Reports to Predict Twitter User Demographics

1 code implementation NAACL (SocialNLP) 2021 Zach Wood-Doughty, Paiheng Xu, Xiao Liu, Mark Dredze

We present a method to identify self-reports of race and ethnicity from Twitter profile descriptions.
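For illustration only, the general idea of flagging self-reports in profile descriptions can be sketched with a few patterns; the paper's actual approach learns from noisy self-report labels, and the regexes and label names below are invented.

```python
import re

# Invented example patterns; a real system would be learned, not hand-written.
SELF_REPORT_PATTERNS = [
    re.compile(r"\bproud (black|latina|latino|asian) (woman|man|person)\b", re.I),
    re.compile(r"\bi('| a)?m (black|white|asian|latinx?)\b", re.I),
]

def label_profile(description: str) -> str:
    """Return 'self-report' if any pattern matches the profile text (illustrative only)."""
    if any(p.search(description) for p in SELF_REPORT_PATTERNS):
        return "self-report"
    return "no-self-report"

print(label_profile("Proud Black woman. Runner. Coffee enthusiast."))  # self-report
print(label_profile("Tweets about sports and weather."))               # no-self-report
```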

Demographic Representation and Collective Storytelling in the Me Too Twitter Hashtag Activism Movement

no code implementations 13 Oct 2020 Aaron Mueller, Zach Wood-Doughty, Silvio Amir, Mark Dredze, Alicia L. Nobles

The #MeToo movement on Twitter has drawn attention to the pervasive nature of sexual harassment and violence.

Generating Synthetic Text Data to Evaluate Causal Inference Methods

no code implementations 10 Feb 2021 Zach Wood-Doughty, Ilya Shpitser, Mark Dredze

High-dimensional and unstructured data such as natural language complicates the evaluation of causal inference methods; such evaluations rely on synthetic datasets with known causal effects.

Causal Inference, Text Generation
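A toy sketch of the underlying setup: generate text whose wording depends on a latent confounder while treatment and outcome follow a known effect, so an estimator run on the text can be scored against ground truth. The vocabulary, effect size, and generation scheme below are made up and are not the paper's generative model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: confounder C shifts word choice; treatment A and outcome Y
# follow a data-generating process with a known average treatment effect.
TRUE_ATE = 0.15
happy_words = ["great", "happy", "love"]
sad_words = ["tired", "rough", "meh"]

def sample_document(c):
    """Generate a tiny synthetic 'tweet' whose wording depends on the confounder C."""
    vocab = happy_words if c == 1 else sad_words
    return " ".join(rng.choice(vocab, size=5))

records = []
for _ in range(10_000):
    c = rng.binomial(1, 0.5)
    a = rng.binomial(1, 0.3 + 0.4 * c)
    y = rng.binomial(1, 0.2 + TRUE_ATE * a + 0.3 * c)
    records.append((sample_document(c), a, y))

# Because TRUE_ATE is known by construction, any causal estimator applied to
# (text, a, y) can be evaluated against it.
print(records[0])
```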

Faithful and Plausible Explanations of Medical Code Predictions

1 code implementation 16 Apr 2021 Zach Wood-Doughty, Isabel Cachola, Mark Dredze

Machine learning models that offer excellent predictive performance often lack the interpretability necessary to support integrated human-machine decision-making.

Decision Making

The Proximal ID Algorithm

no code implementations 15 Aug 2021 Ilya Shpitser, Zach Wood-Doughty, Eric J. Tchetgen Tchetgen

Unobserved confounding is a fundamental obstacle to establishing valid causal conclusions from observational data.

Causal Inference
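For context, the standard proximal identification result from prior work (Miao et al.; Tchetgen Tchetgen et al.), which this line of work generalizes, can be stated as follows. The notation assumes treatment-side and outcome-side proxies Z and W of the unobserved confounder U, observed covariates X, and the usual completeness conditions; this is the textbook form, not the paper's more general ID algorithm.

```latex
% If an outcome bridge function h solves the integral equation below, then,
% under the proximal causal inference assumptions, the counterfactual mean
% is identified by averaging the bridge function over W and X.
\begin{align}
  \mathbb{E}[Y \mid Z, A, X] &= \mathbb{E}[h(W, A, X) \mid Z, A, X], \\
  \mathbb{E}[Y(a)] &= \mathbb{E}[h(W, a, X)].
\end{align}
```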

Segment Anything Model is a Good Teacher for Local Feature Learning

1 code implementation 29 Sep 2023 Jingqian Wu, Rongtao Xu, Zach Wood-Doughty, Changwei Wang, Shibiao Xu, Edmund Lam

To do so, we first construct an auxiliary task of Pixel Semantic Relational Distillation (PSRD), which distills feature relations, along with the category-agnostic semantic information learned by the SAM encoder, into a local feature learning network, improving local feature description through semantic discrimination.

Contrastive Learning, Visual Localization
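A rough PyTorch sketch of the relational-distillation idea described above: pairwise feature relations from a frozen teacher encoder (e.g., SAM's image encoder) supervise the relations produced by a student local-feature network. The shapes, names, and cosine-similarity loss are assumptions, not the paper's exact PSRD formulation.

```python
import torch
import torch.nn.functional as F

def pairwise_relations(features):
    """Cosine-similarity matrix between per-pixel feature vectors.
    features: (num_pixels, dim) -> (num_pixels, num_pixels)"""
    normed = F.normalize(features, dim=1)
    return normed @ normed.t()

def relational_distillation_loss(student_feats, teacher_feats):
    """Match the student's pairwise relations to the teacher's (illustrative loss)."""
    return F.mse_loss(pairwise_relations(student_feats),
                      pairwise_relations(teacher_feats))

# Hypothetical shapes: 256 sampled pixels, teacher dim 256, student dim 128.
teacher = torch.randn(256, 256)   # stand-in for frozen SAM encoder features
student = torch.randn(256, 128, requires_grad=True)
loss = relational_distillation_loss(student, teacher)
loss.backward()
print(loss.item())
```

Because only the pixel-by-pixel relation matrices are compared, the teacher and student feature dimensions need not match.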

Model Distillation for Faithful Explanations of Medical Code Predictions

no code implementations BioNLP (ACL) 2022 Zach Wood-Doughty, Isabel Cachola, Mark Dredze

We propose to use knowledge distillation, or training a student model that mimics the behavior of a trained teacher model, as a technique to generate faithful and plausible explanations.

Decision Making, Knowledge Distillation
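A minimal sketch of the knowledge-distillation objective the abstract describes, where a (typically simpler, more explainable) student is trained to mimic a teacher's predictions. The temperature, loss weighting, and toy inputs are illustrative assumptions, not the paper's training setup.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-label KL against the teacher with hard-label cross-entropy (illustrative)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage with random logits for a 3-label classification task (hypothetical numbers).
student_logits = torch.randn(4, 3, requires_grad=True)
teacher_logits = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 0])
print(distillation_loss(student_logits, teacher_logits, labels).item())
```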
