1 code implementation • EMNLP 2021 • Junyan Cheng, Iordanis Fostiropoulos, Barry Boehm, Mohammad Soleymani
We evaluate our model on three sentiment analysis datasets and achieve performance comparable or superior to existing methods, with a 90% reduction in the number of parameters.
no code implementations • 11 Dec 2023 • Panos Achlioptas, Alexandros Benetatos, Iordanis Fostiropoulos, Dimitris Skourtis
In this work, we systematically study the problem of personalized text-to-image generation, where the output image is expected to portray information about specific human subjects.
1 code implementation • CVPR 2023 • Iordanis Fostiropoulos, Jiaye Zhu, Laurent Itti
During the $\textit{consolidation}$ phase, we combine the learned knowledge on 'batches' of $\textit{expert models}$ using a $\textit{batched consolidation loss}$ in $\textit{memory}$ data that aggregates all buffers.
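The consolidation idea above can be sketched as distilling a batch of frozen expert models into a single consolidated model on replayed memory data. This is a minimal, hedged illustration; the names (`batched_consolidation_loss`) and the KL-based formulation are assumptions for exposition, not the paper's exact loss.

```python
import torch
import torch.nn.functional as F

def batched_consolidation_loss(student, experts, memory_batch, temperature=2.0):
    """Distill a batch of frozen expert models into one consolidated model
    by matching softened predictions on replayed memory data.
    (Hedged sketch: the paper's exact loss may differ.)"""
    s_logits = student(memory_batch)
    loss = 0.0
    for expert in experts:
        with torch.no_grad():                      # experts stay frozen
            t_logits = expert(memory_batch)
        loss = loss + F.kl_div(
            F.log_softmax(s_logits / temperature, dim=1),
            F.softmax(t_logits / temperature, dim=1),
            reduction="batchmean",
        )
    return loss / len(experts)

# usage with toy linear models on a shared memory buffer
student = torch.nn.Linear(8, 5)
experts = [torch.nn.Linear(8, 5) for _ in range(3)]
memory_batch = torch.randn(16, 8)
loss = batched_consolidation_loss(student, experts, memory_batch)
```

In practice the memory batch would be drawn from the aggregated replay buffers of all experts.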
1 code implementation • 24 May 2023 • Yunhao Ge, Yuecheng Li, Di Wu, Ao Xu, Adam M. Jones, Amanda Sofie Rios, Iordanis Fostiropoulos, Shixian Wen, Po-Hsuan Huang, Zachary William Murdock, Gozde Sahin, Shuo Ni, Kiran Lekkala, Sumedh Anand Sontakke, Laurent Itti
We propose a new Shared Knowledge Lifelong Learning (SKILL) challenge, which deploys a decentralized population of LL agents that each sequentially learn different tasks, with all agents operating independently and in parallel.
no code implementations • 21 May 2023 • Iordanis Fostiropoulos, Bowman Brown, Laurent Itti
Machine learning is facing a 'reproducibility crisis' where a significant number of works report failures when attempting to reproduce previously published results.
no code implementations • 26 Nov 2022 • Iordanis Fostiropoulos, Laurent Itti
Inspired by the recent success of prototypical and contrastive learning frameworks for both improving robustness and learning nuance invariant representations, we propose a training framework, $\textbf{Supervised Contrastive Prototype Learning}$ (SCPL).
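A common way to combine prototypical and contrastive learning, as described above, is to contrast each embedding against a set of learnable class prototypes: pull it toward its own class prototype and push it away from the others. The sketch below shows this generic prototype-contrastive loss; the function name and hyperparameters are illustrative assumptions, not the exact SCPL objective.

```python
import torch
import torch.nn.functional as F

def prototype_contrastive_loss(embeddings, labels, prototypes, temperature=0.1):
    """Contrast each embedding against class prototypes: attract to its own
    class prototype, repel from the rest (generic sketch, not exact SCPL)."""
    z = F.normalize(embeddings, dim=1)   # (N, D) unit-norm embeddings
    p = F.normalize(prototypes, dim=1)   # (C, D) unit-norm class prototypes
    logits = z @ p.t() / temperature     # (N, C) scaled cosine similarities
    return F.cross_entropy(logits, labels)

# usage: one learnable prototype per class, trained jointly with the encoder
N, D, C = 32, 128, 10
embeddings = torch.randn(N, D)
labels = torch.randint(0, C, (N,))
prototypes = torch.nn.Parameter(torch.randn(C, D))
loss = prototype_contrastive_loss(embeddings, labels, prototypes)
```

Because the logits are cosine similarities, the loss is invariant to embedding scale, which is part of what makes prototype-based objectives robust to nuisance variation.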
1 code implementation • CVPR 2022 • Iordanis Fostiropoulos, Barry Boehm
We use DQ in the context of a Hierarchical Auto-Encoder and train it end-to-end on an image feature representation.
1 code implementation • 1 Dec 2021 • Junyan Cheng, Iordanis Fostiropoulos, Barry Boehm
The fusion between a graph representation like Abstract Syntax Tree (AST) and a source code sequence makes the use of current approaches computationally intractable for large input sequence lengths.
1 code implementation • 17 Nov 2021 • Junyan Cheng, Iordanis Fostiropoulos, Barry Boehm
SCG is the result of the early fusion between a source code snippet and the AST representation.
1 code implementation • ICLR 2021 • Panagiotis Kyriakis, Iordanis Fostiropoulos, Paul Bogdan
Learning task-specific representations of persistence diagrams is an important problem in topological data analysis and machine learning.
no code implementations • 1 Jan 2021 • Junyan Cheng, Iordanis Fostiropoulos, Barry Boehm
Unlike natural language, source code understanding is influenced by the grammar relations between tokens, regardless of their identifier names.
1 code implementation • 11 Apr 2020 • Iordanis Fostiropoulos
Recent advancements in learning Discrete Representations, as opposed to continuous ones, have led to state-of-the-art results in tasks that involve Language, Audio and Vision.
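The standard mechanism for learning such discrete representations is vector quantization in the style of VQ-VAE: map each continuous encoder output to its nearest codebook entry, and use a straight-through estimator so gradients still reach the encoder. A minimal sketch (illustrative names, not this paper's implementation):

```python
import torch

def vector_quantize(z, codebook):
    """Snap each continuous vector to its nearest codebook entry,
    with a straight-through estimator so gradients flow back to z."""
    # z: (N, D) encoder outputs; codebook: (K, D) discrete code embeddings
    d = torch.cdist(z, codebook)        # (N, K) pairwise Euclidean distances
    idx = d.argmin(dim=1)               # nearest code index per vector
    zq = codebook[idx]                  # quantized vectors
    zq_st = z + (zq - z).detach()       # straight-through: identity gradient w.r.t. z
    return zq_st, idx

# usage: quantize a small batch of encoder outputs
codebook = torch.randn(16, 8)
z = torch.randn(4, 8)
zq, idx = vector_quantize(z, codebook)
```

The `detach` trick makes the forward pass return the quantized vectors while the backward pass treats the quantizer as the identity, which is what allows discrete bottlenecks to be trained end-to-end.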