no code implementations • 7 Oct 2023 • Andreas Voskou, Konstantinos P. Panousis, Harris Partaourides, Kyriakos Tolias, Sotirios Chatzis
A characteristic example is the Phoenix2014T benchmark dataset, which covers only weather forecasts in German Sign Language.
no code implementations • 24 Jan 2023 • Anastasios Petropoulos, Vassilis Siakoulis, Konstantinos P. Panousis, Loukas Papadoulas, Sotirios Chatzis
In this study, we propose a novel approach to nowcasting and forecasting the macroeconomic status of a country using deep learning techniques.
1 code implementation • 2 Aug 2022 • Konstantinos Kalais, Sotirios Chatzis
This work addresses meta-learning (ML) by considering deep networks with stochastic local winner-takes-all (LWTA) activations.
no code implementations • 28 May 2022 • Lei Cheng, Feng Yin, Sergios Theodoridis, Sotirios Chatzis, Tsung-Hui Chang
However, Bayesian methods are making a comeback, shedding new light on the design of deep neural networks, establishing firm links with Bayesian models, and inspiring new paths for unsupervised learning, such as Bayesian tensor decomposition.
no code implementations • 10 Jan 2022 • Konstantinos P. Panousis, Anastasios Antoniadis, Sotirios Chatzis
To this end, we combine information-theoretic arguments with stochastic competition-based activations, namely Stochastic Local Winner-Takes-All (LWTA) units.
1 code implementation • 5 Dec 2021 • Konstantinos P. Panousis, Sotirios Chatzis, Sergios Theodoridis
This work explores the potency of stochastic competition-based activations, namely Stochastic Local Winner-Takes-All (LWTA), against powerful (gradient-based) white-box and black-box adversarial attacks; we especially focus on Adversarial Training settings.
Ranked #2 on Adversarial Robustness on CIFAR-10
no code implementations • 15 Sep 2021 • Sergis Nicolaou, Lambros Mavrides, Georgina Tryfou, Kyriakos Tolias, Konstantinos Panousis, Sotirios Chatzis, Sergios Theodoridis
Speech is the most common way humans express their feelings, and sentiment analysis is the use of tools such as natural language processing and computational algorithms to identify the polarity of these feelings.
1 code implementation • ICCV 2021 • Andreas Voskou, Konstantinos P. Panousis, Dimitrios Kosmopoulos, Dimitris N. Metaxas, Sotirios Chatzis
In this paper, we attenuate this need, by introducing an end-to-end SLT model that does not entail explicit use of glosses; the model only needs text groundtruth.
no code implementations • 11 Feb 2021 • Harris Partaourides, Andreas Voskou, Dimitrios Kosmopoulos, Sotirios Chatzis, Dimitris N. Metaxas
Memory-efficient continuous Sign Language Translation is a significant challenge for the development of assistive technologies with real-time applicability for the deaf.
no code implementations • 4 Jan 2021 • Konstantinos P. Panousis, Sotirios Chatzis, Antonios Alexos, Sergios Theodoridis
The main operating principle of the introduced units lies in stochastic arguments: the network performs posterior sampling over competing units to select the winner.
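As a rough illustration of the mechanism described above (a minimal sketch, not the authors' implementation): units are grouped into non-overlapping blocks, a single "winner" per block is sampled from a softmax over the block's activations, and the losing units are zeroed out.

```python
import math
import random

def stochastic_lwta(x, block_size, rng):
    """Stochastic Local Winner-Takes-All (LWTA) activation sketch.

    Splits the input into blocks of `block_size` competing units,
    samples one winner per block from a softmax over the block's
    activations, and zeroes the losers.
    """
    out = []
    for i in range(0, len(x), block_size):
        block = x[i:i + block_size]
        # Softmax over the block (shifted by the max for numerical stability)
        m = max(block)
        exps = [math.exp(v - m) for v in block]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Posterior sampling over competing units: draw the winner index
        winner = rng.choices(range(len(block)), weights=probs, k=1)[0]
        # Winner keeps its activation, losers are set to zero
        out.extend(v if j == winner else 0.0 for j, v in enumerate(block))
    return out

rng = random.Random(0)
x = [0.5, -1.2, 2.0, 0.1, -0.3, 0.7, 1.5, -0.8]
y = stochastic_lwta(x, block_size=2, rng=rng)
```

Because the winner is sampled rather than chosen deterministically, repeated forward passes through the same input can activate different units, which is the source of the stochasticity these papers exploit.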
no code implementations • 23 Sep 2020 • Anastasios Petropoulos, Vassilis Siakoulis, Konstantinos P. Panousis, Loukas Papadoulas, Sotirios Chatzis
Current stress testing methodologies attempt to simulate the risks underlying a financial institution's balance sheet by using several satellite models.
no code implementations • 18 Jun 2020 • Antonios Alexos, Konstantinos P. Panousis, Sotirios Chatzis
This work attempts to address adversarial robustness of deep networks by means of novel learning arguments.
no code implementations • 13 Feb 2020 • Konstantinos P. Panousis, Sotirios Chatzis, Sergios Theodoridis
Hidden Markov Models (HMMs) constitute a powerful generative approach for modeling sequential data and time series in general.
1 code implementation • 24 Apr 2019 • Harris Partaourides, Kostantinos Papadamou, Nicolas Kourtellis, Ilias Leontiadis, Sotirios Chatzis
Modern deep learning approaches have achieved groundbreaking performance in modeling and classifying sequential data.
no code implementations • 17 Sep 2018 • Aristotelis Charalampous, Sotirios Chatzis
Specifically, our work broadens the notion of NA by attempting to account for the case where the NA model becomes inherently incapable of discerning between individual source elements; this is assumed to stem from higher-order temporal dynamics.
no code implementations • 4 Sep 2018 • Kyriakos Tolias, Sotirios Chatzis
Our work addresses learning subtler and more complex underlying temporal dynamics in language modeling tasks that deal with sparse sequential data.
no code implementations • 23 May 2018 • Kyriacos Tolias, Ioannis Kourouklides, Sotirios Chatzis
Neural attention (NA) has become a key component of sequence-to-sequence models that yield state-of-the-art performance in tasks as hard as abstractive document summarization (ADS) and video captioning (VC).
1 code implementation • 19 May 2018 • Konstantinos P. Panousis, Sotirios Chatzis, Sergios Theodoridis
To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that do not entail any form of (local) competition.
no code implementations • 10 Feb 2018 • Harris Partaourides, Sotirios Chatzis
We effect model training by means of approximate inference based on a t-divergence measure; this generalizes the Kullback-Leibler divergence in the context of the t-exponential family of distributions.
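For orientation, the t-exponential family mentioned above is built on the standard Tsallis-deformed logarithm and exponential; the definitions below are the conventional ones (the paper's exact t-divergence construction is not reproduced here):

```latex
\log_t(x) = \frac{x^{1-t} - 1}{1 - t}, \qquad
\exp_t(x) = \bigl[\, 1 + (1-t)\,x \,\bigr]_+^{\frac{1}{1-t}}, \qquad t \neq 1,
```

with both reducing to the ordinary \(\log\) and \(\exp\) as \(t \to 1\), so the associated divergence recovers the Kullback-Leibler divergence in that limit.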
no code implementations • 13 Jun 2017 • Sotirios Chatzis, Panayiotis Christodoulou, Andreas S. Andreou
In this work, we attempt to ameliorate the impact of data sparsity in the context of session-based recommendation.