no code implementations • 15 Aug 2024 • Ali Pourranjbar, Georges Kaddoum, Verdier Assoume Mba, Sahil Garg, Satinder Singh
Unlike previous works that assume an ideal environment with precise knowledge of subcarrier count and cyclic prefix location, we consider blind modulation detection while accounting for realistic environmental parameters and imperfections.
no code implementations • 10 Apr 2024 • Sahil Garg, Anderson Schneider, Anant Raj, Kashif Rasul, Yuriy Nevmyvaka, Sneihil Gopal, Amit Dhurandhar, Guillermo Cecchi, Irina Rish
In addition to the data efficiency gained from direct sampling, we propose an algorithm that offers a significant reduction in sample complexity for estimating the divergence of the data distribution with respect to the marginal distribution.
1 code implementation • 9 Mar 2024 • Zijie Pan, Yushan Jiang, Sahil Garg, Anderson Schneider, Yuriy Nevmyvaka, Dongjin Song
To this end, we propose Semantic Space Informed Prompt learning with LLM (S²IP-LLM) to align the pre-trained semantic space with the time series embedding space and perform time series forecasting based on prompts learned from the joint space.
no code implementations • 20 Feb 2024 • Zijie Pan, Yushan Jiang, Dongjin Song, Sahil Garg, Kashif Rasul, Anderson Schneider, Yuriy Nevmyvaka
To address this issue, we propose a novel Structural Knowledge Informed Continual Learning (SKI-CL) framework that performs MTS forecasting within a continual learning paradigm: it leverages structural knowledge to steer the forecasting model toward identifying and adapting to different regimes, and selects representative MTS samples from each regime for memory replay.
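As a rough, self-contained illustration of the memory-replay ingredient (a minimal sketch under our own assumptions; SKI-CL's actual representative-sample selection is guided by structural knowledge and is more involved), a per-regime reservoir buffer could look like this:

```python
import random
from collections import defaultdict

class RegimeReplayBuffer:
    """Hypothetical per-regime replay memory: keep up to `capacity`
    MTS windows per regime via reservoir sampling, then mix stored
    samples into new training batches."""

    def __init__(self, capacity=64):
        self.capacity = capacity
        self.buffers = defaultdict(list)   # regime_id -> stored samples
        self.seen = defaultdict(int)       # regime_id -> samples observed so far

    def add(self, regime_id, sample):
        self.seen[regime_id] += 1
        buf = self.buffers[regime_id]
        if len(buf) < self.capacity:
            buf.append(sample)
        else:
            j = random.randrange(self.seen[regime_id])  # reservoir sampling (Algorithm R)
            if j < self.capacity:
                buf[j] = sample

    def replay_batch(self, k):
        pool = [s for buf in self.buffers.values() for s in buf]
        return random.sample(pool, min(k, len(pool)))
```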
no code implementations • 5 Feb 2024 • Yushan Jiang, Zijie Pan, Xikun Zhang, Sahil Garg, Anderson Schneider, Yuriy Nevmyvaka, Dongjin Song
Specifically, we first lay out the challenges of and motivations for applying language models to time series, along with brief preliminaries on LLMs.
1 code implementation • 12 Oct 2023 • Kashif Rasul, Arjun Ashok, Andrew Robert Williams, Hena Ghonia, Rishika Bhagwatkar, Arian Khorasani, Mohammad Javad Darvishi Bayazi, George Adamopoulos, Roland Riachi, Nadhir Hassen, Marin Biloš, Sahil Garg, Anderson Schneider, Nicolas Chapados, Alexandre Drouin, Valentina Zantedeschi, Yuriy Nevmyvaka, Irina Rish
Over the past years, foundation models have caused a paradigm shift in machine learning due to their unprecedented capabilities for zero-shot and few-shot generalization.
no code implementations • 21 Mar 2021 • Dongsheng Wang, Prayag Tiwari, Sahil Garg, Hongyin Zhu, Peter Bruza
In this paper, we propose a novel lightweight relation extraction approach based on structural-block-driven convolutional neural learning.
no code implementations • 15 Oct 2020 • Muhammad Waseem Akhtar, Syed Ali Hassan, Rizwan Ghaffar, Haejoon Jung, Sahil Garg, M. Shamim Hossain
The sixth-generation (6G) wireless communication network is expected to integrate terrestrial, aerial, and maritime communications into a robust network that is more reliable and faster, and that supports a massive number of devices with ultra-low-latency requirements.
no code implementations • 19 Jul 2020 • Yi Liu, Sahil Garg, Jiangtian Nie, Yang Zhang, Zehui Xiong, Jiawen Kang, M. Shamim Hossain
Third, to meet the timeliness requirements of industrial anomaly detection, we propose a gradient compression mechanism based on Top-k selection to improve communication efficiency.
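For intuition, here is a minimal NumPy sketch of generic Top-k gradient sparsification (our own simplification; the paper's mechanism, including any error feedback or encoding details, may differ):

```python
import numpy as np

def topk_compress(grad: np.ndarray, k: int):
    """Keep only the k largest-magnitude entries of a gradient tensor."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of the top-k magnitudes
    return idx, flat[idx], grad.shape              # sparse payload to transmit

def topk_decompress(idx, values, shape):
    """Rebuild a dense gradient from the transmitted sparse payload."""
    flat = np.zeros(int(np.prod(shape)), dtype=values.dtype)
    flat[idx] = values
    return flat.reshape(shape)
```

Only the indices and values of k entries cross the network, which is where the communication savings come from.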
no code implementations • 21 Apr 2020 • Harvineet Singh, Moumita Sinha, Atanu R. Sinha, Sahil Garg, Neha Banerjee
We posit that emails are more likely to be opened sooner when the send time is convenient for the recipient, whereas emails sent at other times can get ignored.
no code implementations • IJCNLP 2019 • Sahil Garg, Aram Galstyan, Greg Ver Steeg, Guillermo Cecchi
Recently, kernelized locality-sensitive hashcodes have been successfully employed as representations of natural language text, proving especially relevant to biomedical relation extraction tasks.
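As a toy illustration in the spirit of kernelized LSH (a crude approximation, not the authors' construction): represent each example by its kernel similarities to a small reference set, then take the signs of random projections of that similarity vector as hash bits:

```python
import numpy as np

def klsh_codes(K_ref: np.ndarray, n_bits: int, seed: int = 0) -> np.ndarray:
    """K_ref[i, j] = kernel similarity of example i to reference example j.
    Returns one n_bits binary code per example."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((K_ref.shape[1], n_bits))  # random hyperplanes
    return (K_ref @ W > 0).astype(np.uint8)
```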
no code implementations • 30 Jan 2019 • Poulmanogo Illy, Georges Kaddoum, Christian Miranda Moreira, Kuljeet Kaur, Sahil Garg
Many solutions proposed in the literature report high accuracy yet prove ineffective in real applications because the datasets used to train and evaluate the underlying models are not representative.
no code implementations • 29 Apr 2018 • Emilia M. Wysocka, Valery Dzutsati, Tirthankar Bandyopadhyay, Laura Condon, Sahil Garg
In the paper, we elaborate on the causes of the multidimensional analysis problem in the context of molecular signaling, and then introduce the chosen model, the simulation details, and the resulting time-series dynamics.
no code implementations • 27 Apr 2018 • Sahil Garg, Amarjeet Singh, Fabio Ramos
One of the primary aspects of sustainable development involves accurate understanding and modeling of environmental phenomena.
no code implementations • 27 Apr 2018 • Sahil Garg, Nora Ayanian
We propose an adaptive solution for the problem where stochastic real-world dynamics are modeled as a Gaussian Process (GP).
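For context, fitting a GP to noisy observations of a one-dimensional dynamic takes a few lines with scikit-learn (a generic GP regression example, not the paper's adaptive solution):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy data: noisy samples of an underlying smooth dynamic.
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 50)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)
mean, std = gp.predict(X, return_std=True)  # posterior mean and uncertainty
```

The posterior standard deviation is what makes GPs attractive here: it quantifies where the model is uncertain about the stochastic dynamics.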
no code implementations • 27 Apr 2018 • Sahil Garg
Therefore, for efficient yet accurate inference, we propose to build an induced latent representation of the dynamics using a novel algorithm, LISAL, which iteratively and adaptively maximizes the entropy or mutual information of the induced latent dynamics together with the marginal likelihood of the observed real dynamics.
no code implementations • 26 Apr 2018 • Sahil Garg, Amarjeet Singh, Fabio Ramos
The core idea in LISAL is to learn two models using Gaussian processes (GPs), the first of which is a nonstationary GP that directly models the phenomenon.
no code implementations • 26 Apr 2018 • Sahil Garg, Irina Rish, Guillermo Cecchi, Palash Goyal, Sarik Ghazarian, Shuyang Gao, Greg Ver Steeg, Aram Galstyan
We also derive a novel lower bound on mutual information, used as a model-selection criterion favoring representations with better alignment between the utterances of participants in a collaborative dialogue setting, as well as higher predictability of the generated responses.
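The paper derives its own bound; purely as a generic point of reference, a standard InfoNCE-style lower bound on mutual information can be computed from a matrix of critic scores:

```python
import numpy as np

def infonce_bound(scores: np.ndarray) -> float:
    """Generic InfoNCE lower bound on I(X; Y), not the bound derived in
    the paper. scores[i, j] scores the pairing of x_i with y_j; diagonal
    entries correspond to true pairs."""
    n = scores.shape[0]
    m = scores.max(axis=1, keepdims=True)  # subtract row max for stability
    log_softmax = scores - (m + np.log(np.exp(scores - m).sum(axis=1, keepdims=True)))
    return float(np.log(n) + np.diag(log_softmax).mean())  # <= I(X; Y)
```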
no code implementations • 11 Jan 2018 • Sahil Garg, Greg Ver Steeg, Aram Galstyan
Natural language processing often involves computations with semantic or syntactic graphs to facilitate sophisticated reasoning based on structural relationships.
1 code implementation • 10 Nov 2017 • Sahil Garg, Aram Galstyan, Greg Ver Steeg, Irina Rish, Guillermo Cecchi, Shuyang Gao
Here we propose to use random subspaces of KLSH codes for efficiently constructing an explicit representation of NLP structures suitable for general classification methods.
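A hypothetical sketch of the random-subspace idea (the names and packing scheme are ours): sample random subsets of hash bits and pack each subset's bit pattern into one integer-valued feature, yielding an explicit representation that standard classifiers can consume:

```python
import numpy as np

def random_subspace_features(codes, n_subspaces, bits_per_subspace, seed=0):
    """codes: (n_examples, n_bits) binary KLSH codes.
    Returns (n_examples, n_subspaces) integer features."""
    rng = np.random.default_rng(seed)
    n_bits = codes.shape[1]
    weights = 1 << np.arange(bits_per_subspace)   # binary -> integer packing
    cols = []
    for _ in range(n_subspaces):
        sub = rng.choice(n_bits, size=bits_per_subspace, replace=False)
        cols.append(codes[:, sub] @ weights)
    return np.stack(cols, axis=1)
```

Each column can then be treated as a categorical feature by, e.g., a random forest.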
1 code implementation • 22 Jan 2017 • Sahil Garg, Irina Rish, Guillermo Cecchi, Aurelie Lozano
In this paper, we focus on online representation learning in non-stationary environments which may require continuous adaptation of model architecture.
1 code implementation • 4 Dec 2015 • Sahil Garg, Aram Galstyan, Ulf Hermjakob, Daniel Marcu
We advance the state of the art in biomolecular interaction extraction with three contributions: (i) we show that deep Abstract Meaning Representations (AMRs) significantly improve the accuracy of a biomolecular interaction extraction system compared to a baseline that relies solely on surface- and syntax-based features; (ii) in contrast with previous approaches that infer relations on a sentence-by-sentence basis, we expand our framework to enable consistent predictions over sets of sentences (documents); (iii) we further modify and expand a graph kernel learning framework to enable concurrent exploitation of automatically induced AMR (semantic) and dependency-structure (syntactic) representations.