Search Results for author: James R. Foulds

Found 6 papers, 1 paper with code

Polling Latent Opinions: A Method for Computational Sociolinguistics Using Transformer Language Models

no code implementations • 15 Apr 2022 • Philip Feldman, Aaron Dant, James R. Foulds, Shimei Pan

Text analysis of social media for sentiment, topic analysis, and other analysis depends initially on the selection of keywords and phrases that will be used to create the research corpora.

Learning User Embeddings from Temporal Social Media Data: A Survey

no code implementations • 17 May 2021 • Fatema Hasan, Kevin S. Xu, James R. Foulds, Shimei Pan

User-generated data on social media contain rich information about who we are, what we like and how we make decisions.

Representation Learning

Analyzing COVID-19 Tweets with Transformer-based Language Models

no code implementations • 20 Apr 2021 • Philip Feldman, Sim Tiwari, Charissa S. L. Cheah, James R. Foulds, Shimei Pan

This paper describes a method for using Transformer-based Language Models (TLMs) to understand public opinion from social media posts.

Equitable Allocation of Healthcare Resources with Fair Cox Models

no code implementations • 14 Oct 2020 • Kamrun Naher Keya, Rashidul Islam, Shimei Pan, Ian Stockwell, James R. Foulds

Healthcare programs such as Medicaid provide crucial services to vulnerable populations, but due to limited resources, many of the individuals who need these services the most languish on waiting lists.

Fairness

Neural Embedding Allocation: Distributed Representations of Topic Models

no code implementations • 10 Sep 2019 • Kamrun Naher Keya, Yannis Papanikolaou, James R. Foulds

Word embedding models such as the skip-gram learn vector representations of words' semantic relationships, and document embedding models learn similar representations for documents.

Document Embedding • Topic Models
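
The snippet above mentions embedding models only in passing. As a rough, hedged illustration of the general idea of expressing a topic model's distributions through distributed representations, the sketch below parameterizes a topic-word distribution as a softmax over dot products of word and topic vectors; the variable names (rho, alpha) and the exact parameterization are illustrative assumptions, not the paper's NEA method.

```python
import numpy as np

# Hedged sketch: a skip-gram-style parameterization of a topic-word
# distribution, p(w | k) = softmax_w(rho_w . alpha_k). The vectors and
# parameterization here are assumptions for illustration only.

rng = np.random.default_rng(0)
vocab_size, num_topics, dim = 1000, 20, 50

rho = rng.normal(scale=0.1, size=(vocab_size, dim))    # word vectors
alpha = rng.normal(scale=0.1, size=(num_topics, dim))  # topic vectors

def topic_word_distribution(k):
    """Reconstruct p(w | topic k) from the embeddings via a softmax."""
    scores = rho @ alpha[k]        # dot product of every word with topic k
    scores -= scores.max()         # numerical stability
    probs = np.exp(scores)
    return probs / probs.sum()

p_w_given_k = topic_word_distribution(3)
print(p_w_given_k.shape, round(p_w_given_k.sum(), 6))  # (1000,) 1.0
```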

Dense Distributions from Sparse Samples: Improved Gibbs Sampling Parameter Estimators for LDA

1 code implementation • 8 May 2015 • Yannis Papanikolaou, James R. Foulds, Timothy N. Rubin, Grigorios Tsoumakas

We introduce a novel approach for estimating Latent Dirichlet Allocation (LDA) parameters from collapsed Gibbs samples (CGS), by leveraging the full conditional distributions over the latent variable assignments to efficiently average over multiple samples, for little more computational cost than drawing a single additional collapsed Gibbs sample.

Multi-Label Classification
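
The abstract describes averaging over the full conditional distributions of the latent topic assignments rather than relying on a single sample's hard counts. The sketch below, which assumes the standard collapsed Gibbs setup for LDA, illustrates that idea by accumulating each token's conditional topic probabilities as soft counts; it is a minimal illustration under those assumptions, not the paper's exact estimators.

```python
import numpy as np

# Minimal sketch: after running collapsed Gibbs sampling for LDA, estimate
# theta (doc-topic) and phi (topic-word) by accumulating the full conditional
# p(z_i = k | z_-i, w) for every token, instead of the hard assignment counts
# of a single sample. Count matrices follow the usual CGS bookkeeping.

def soft_count_estimators(docs, z, n_dk, n_kw, n_k, alpha, beta, V):
    """docs: list of word-id lists; z: matching topic-assignment lists;
    n_dk (D,K), n_kw (K,V), n_k (K,): counts from the final Gibbs state."""
    K = n_k.shape[0]
    D = len(docs)
    theta_soft = np.zeros((D, K))
    phi_soft = np.zeros((K, V))
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k_old = z[d][i]
            # Remove this token's current assignment (leave-one-out counts).
            n_dk[d, k_old] -= 1; n_kw[k_old, w] -= 1; n_k[k_old] -= 1
            # Full conditional over topics for this token (standard CGS form).
            p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
            p /= p.sum()
            # Accumulate soft counts instead of a single sampled indicator.
            theta_soft[d] += p
            phi_soft[:, w] += p
            # Restore the counts.
            n_dk[d, k_old] += 1; n_kw[k_old, w] += 1; n_k[k_old] += 1
    # Normalize with the usual Dirichlet smoothing.
    theta = (theta_soft + alpha) / (theta_soft.sum(axis=1, keepdims=True) + K * alpha)
    phi = (phi_soft + beta) / (phi_soft.sum(axis=1, keepdims=True) + V * beta)
    return theta, phi
```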
