Search Results for author: Mike A. Merrill

Found 4 papers, 0 papers with code

Language Models Still Struggle to Zero-shot Reason about Time Series

no code implementations • 17 Apr 2024 • Mike A. Merrill, Mingtian Tan, Vinayak Gupta, Tom Hartvigsen, Tim Althoff

It remains unknown whether non-trivial forecasting performance implies that language models can reason about time series.
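
For intuition, here is a minimal sketch of the zero-shot setup the abstract describes: a numeric series is serialized into text and paired with a reasoning question. The serialization format and the `query_llm` helper are illustrative assumptions, not the paper's protocol.

```python
# Minimal sketch of zero-shot time series reasoning with a language model.
# `query_llm` is a hypothetical stand-in for any chat-model API; the
# serialization format is an illustrative guess, not the paper's method.

def serialize_series(values, decimals=2):
    """Render a numeric series as comma-separated text for a prompt."""
    return ", ".join(f"{v:.{decimals}f}" for v in values)

def build_reasoning_prompt(values, question):
    """Pair the serialized series with a natural-language question."""
    return (
        "Here is a time series of sensor readings:\n"
        f"{serialize_series(values)}\n\n"
        f"Question: {question}\n"
        "Answer with a short explanation."
    )

series = [0.91, 0.94, 1.10, 1.42, 1.95, 2.60]
prompt = build_reasoning_prompt(series, "Is this series trending upward, and why?")
# answer = query_llm(prompt)  # hypothetical call to a chat model
print(prompt)
```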

Self-supervised Pretraining and Transfer Learning Enable Flu and COVID-19 Predictions in Small Mobile Sensing Datasets

no code implementations • 26 May 2022 • Mike A. Merrill, Tim Althoff

Here, we introduce a neural architecture for multivariate time series classification designed to address these unique domain challenges.

Dimensionality Reduction • Representation Learning • +4
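
As a rough illustration of self-supervised pretraining on sensor data, the sketch below masks random time steps and trains an encoder to reconstruct them; the resulting encoder could then be fine-tuned on a small labeled dataset. The masking scheme and model sizes are assumptions, not the paper's exact objective.

```python
# Sketch of a masked-reconstruction pretraining objective for sensor series:
# hide random time steps, train an encoder to fill them back in, then reuse
# the encoder for a small supervised task. All sizes here are assumptions.
import torch
import torch.nn as nn

class MaskedReconstructor(nn.Module):
    def __init__(self, n_channels=8, d_model=64):
        super().__init__()
        self.proj = nn.Linear(n_channels, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.decoder = nn.Linear(d_model, n_channels)

    def forward(self, x, mask):           # x: (batch, time, channels)
        x_in = x.masked_fill(mask.unsqueeze(-1), 0.0)  # zero out masked steps
        h = self.encoder(self.proj(x_in))
        return self.decoder(h)            # reconstruct all channels

x = torch.randn(4, 1440, 8)              # one minute-level day, 8 channels
mask = torch.rand(4, 1440) < 0.15        # mask ~15% of time steps
model = MaskedReconstructor()
recon = model(x, mask)
loss = nn.functional.mse_loss(recon[mask], x[mask])  # loss on masked steps only
loss.backward()
```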

Transformer-Based Behavioral Representation Learning Enables Transfer Learning for Mobile Sensing in Small Datasets

no code implementations • 9 Jul 2021 • Mike A. Merrill, Tim Althoff

This architecture combines benefits from CNN and Transformer architectures to (1) improve prediction performance by up to 0.33 ROC AUC by learning directly from raw minute-level sensor data without the need for handcrafted features, and (2) use pretraining to outperform simpler neural models and boosted decision trees with data from as few as a dozen participants.

Representation Learning • Time Series • +2
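
The sketch below illustrates the general CNN-plus-Transformer pattern the abstract describes: a 1-D convolutional stem embeds raw minute-level channels, a Transformer encoder models long-range temporal structure, and a linear head classifies. Layer sizes and names are assumptions, not the authors' exact architecture.

```python
# Illustrative CNN + Transformer classifier for multivariate sensor series.
# Hyperparameters and module names are assumptions chosen for clarity.
import torch
import torch.nn as nn

class CnnTransformerClassifier(nn.Module):
    def __init__(self, n_channels=8, d_model=64, n_classes=2):
        super().__init__()
        # CNN stem: local patterns from raw multivariate input (B, C, T)
        self.stem = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # Transformer encoder: long-range dependencies across time steps
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                # x: (batch, channels, time)
        h = self.stem(x)                 # (batch, d_model, time)
        h = h.transpose(1, 2)            # (batch, time, d_model) for the encoder
        h = self.encoder(h)
        return self.head(h.mean(dim=1))  # pool over time, then classify

model = CnnTransformerClassifier()
logits = model(torch.randn(4, 8, 1440))  # e.g. 4 days of minute-level data
print(logits.shape)                      # torch.Size([4, 2])
```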

CORAL: COde RepresentAtion Learning with Weakly-Supervised Transformers for Analyzing Data Analysis

no code implementations • 28 Aug 2020 • Ge Zhang, Mike A. Merrill, Yang Liu, Jeffrey Heer, Tim Althoff

Large-scale analysis of source code, and in particular scientific source code, holds the promise of better understanding the data science process, identifying analytical best practices, and providing insights to the builders of scientific toolkits.

Descriptive • Representation Learning
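
As a toy illustration of weak supervision over source code, the rules below assign noisy data-analysis stage labels to snippets; such noisy labels could then train a transformer classifier in the spirit of CORAL. The rules and stage names are illustrative assumptions, not CORAL's actual labeling functions.

```python
# Sketch of heuristic weak labeling for code snippets: pattern rules map a
# snippet to a data-analysis stage, or abstain. Rules and stage names are
# illustrative assumptions, not CORAL's labelers.
import re

WEAK_RULES = [
    (r"\b(read_csv|read_sql|load)\b", "import"),
    (r"\b(dropna|fillna|merge|groupby)\b", "wrangle"),
    (r"\b(plot|hist|scatter)\b", "explore"),
    (r"\b(fit|train|LogisticRegression)\b", "model"),
    (r"\b(score|accuracy|cross_val)\b", "evaluate"),
]

def weak_label(snippet):
    """Return the first matching stage label, or None to abstain."""
    for pattern, label in WEAK_RULES:
        if re.search(pattern, snippet):
            return label
    return None

print(weak_label("df = pd.read_csv('data.csv')"))  # import
print(weak_label("model.fit(X_train, y_train)"))   # model
```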
