no code implementations • 5 Apr 2024 • Harsh Kohli, Helian Feng, Nicholas Dronen, Calvin McCarter, Sina Moeini, Ali Kebarighotbi
In contemporary machine learning approaches to bilingual lexicon induction (BLI), a model learns a mapping between the embedding spaces of a language pair.
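A common instantiation of such a mapping is orthogonal Procrustes: learn a linear map that aligns the source embedding space to the target space. The abstract does not specify the mapping method, so this NumPy sketch is only an illustrative assumption:

```python
import numpy as np

def fit_orthogonal_map(X, Y):
    """Solve min_W ||XW - Y||_F over orthogonal W (Procrustes).

    X, Y: (n, d) arrays of aligned source/target word embeddings.
    Orthogonal Procrustes is a standard BLI baseline, used here purely
    for illustration; it is not necessarily the method of this paper.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt
```

Given a seed dictionary of aligned word pairs, the fitted `W` maps any source-language embedding into the target space, where nearest-neighbor search induces the lexicon.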
1 code implementation • 28 Oct 2023 • Calvin McCarter
We report the effects of replacing the scaled dot-product (inside the softmax) in attention with the negative log of the Euclidean distance.
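The substitution can be sketched in NumPy as follows. This is a minimal single-head, unbatched sketch; the `eps` floor inside the log is a numerical guard of my own, not part of the paper's formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(Q, K, V):
    # Standard scaled dot-product attention for comparison.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    return softmax(scores) @ V

def neg_log_distance_attention(Q, K, V, eps=1e-6):
    # Replace the dot-product score with -log ||q - k||_2.
    diff = Q[:, None, :] - K[None, :, :]       # (n_q, n_k, d)
    dist = np.sqrt((diff ** 2).sum(axis=-1))   # (n_q, n_k)
    scores = -np.log(dist + eps)               # eps avoids log(0); an assumption
    return softmax(scores) @ V
```

Since softmax(-log d) is proportional to 1/d, this variant weights values by inverse distance between queries and keys.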
1 code implementation • 18 Sep 2023 • Calvin McCarter
We demonstrate that, without hyperparameter tuning, the kernel density integral transformation can be used as a simple drop-in replacement for either method, offering protection from the weaknesses of each.
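The transformation maps each feature through the CDF of a kernel density estimate fit on the training data, landing between min-max scaling and a full quantile transform. A minimal sketch, assuming a Gaussian kernel with Silverman's rule-of-thumb bandwidth (the paper parameterizes bandwidth; these defaults are illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

def kde_integral_transform(x_train, x, bandwidth=None):
    """Map x through the CDF of a Gaussian KDE fit on x_train.

    Returns values in [0, 1]. Bandwidth defaults to Silverman's rule,
    an illustrative choice rather than the paper's parameterization.
    """
    x_train = np.asarray(x_train, dtype=float)
    x = np.asarray(x, dtype=float)
    if bandwidth is None:
        n = len(x_train)
        bandwidth = 1.06 * x_train.std() * n ** (-1 / 5)
    # The CDF of a mixture of Gaussians is the mean of component CDFs.
    return norm.cdf((x[:, None] - x_train[None, :]) / bandwidth).mean(axis=1)
```

As bandwidth shrinks, the output approaches the empirical quantile transform; as it grows, the map becomes nearly affine over the data range, resembling min-max scaling.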
1 code implementation • 12 Jul 2022 • Calvin McCarter, Nicholas Dronen
Fast approximations to matrix multiplication have the potential to dramatically reduce the cost of neural network inference.
no code implementations • 12 May 2022 • Ayon Basumallik, Darius Bunandar, Nicholas Dronen, Nicholas Harris, Ludmila Levkova, Calvin McCarter, Lakshmi Nair, David Walter, David Widemann
Analog mixed-signal (AMS) devices promise faster, more energy-efficient deep neural network (DNN) inference than their digital counterparts.
1 code implementation • 23 Mar 2022 • Calvin McCarter
Domain adaptation approaches which do account for such confounding are designed to adapt covariates to optimally predict a particular label whose shift is confounded with covariate shift.
no code implementations • 1 Feb 2021 • Robin Hanson, Daniel Martin, Calvin McCarter, Jonathan Paulson
We fit this three-parameter model of loud aliens to data: 1) birth power from the number of hard steps seen in Earth history, 2) birth constant by assuming a uniform distribution over our rank among loud alien birth dates, and 3) expansion speed from our not seeing alien volumes in our sky.
no code implementations • 15 Sep 2015 • Calvin McCarter, Seyoung Kim
While highly scalable optimization methods exist for sparse Gaussian graphical model estimation, state-of-the-art methods for conditional Gaussian graphical models are not efficient enough and, more importantly, fail due to memory constraints on very large problems.
no code implementations • NeurIPS 2014 • Calvin McCarter, Seyoung Kim
In this paper, we address the problem of learning the structure of Gaussian chain graph models in a high-dimensional space.