Search Results for author: Sotirios Chatzis

Found 20 papers, 5 papers with code

Macroeconomic forecasting and sovereign risk assessment using deep learning techniques

no code implementations • 24 Jan 2023 • Anastasios Petropoulos, Vassilis Siakoulis, Konstantinos P. Panousis, Loukas Papadoulas, Sotirios Chatzis

In this study, we propose a novel approach to nowcasting and forecasting the macroeconomic status of a country using deep learning techniques.


Stochastic Deep Networks with Linear Competing Units for Model-Agnostic Meta-Learning

1 code implementation • 2 Aug 2022 • Konstantinos Kalais, Sotirios Chatzis

This work addresses meta-learning (ML) by considering deep networks with stochastic local winner-takes-all (LWTA) activations.

Active Learning • Few-Shot Image Classification +1
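
The stochastic LWTA mechanism this line of work builds on (linear units grouped into blocks, one "winner" sampled per block, losers zeroed) can be sketched as follows; the block size, the softmax-based competition, and the function name are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_lwta(x, block_size=2):
    """Sketch of a stochastic LWTA activation: units are grouped into
    competing blocks; within each block a winner is sampled from a
    softmax over the unit activations, and the losing units are zeroed."""
    x = x.reshape(-1, block_size)                     # group units into competing blocks
    probs = np.exp(x - x.max(axis=1, keepdims=True))  # numerically stable softmax
    probs /= probs.sum(axis=1, keepdims=True)
    winners = np.array([rng.choice(block_size, p=p) for p in probs])
    mask = np.zeros_like(x)
    mask[np.arange(len(x)), winners] = 1.0            # one-hot winner per block
    return (x * mask).reshape(-1)

out = stochastic_lwta(np.array([1.0, -0.5, 2.0, 2.1]))
```

Because the winner is sampled rather than chosen by a hard argmax, each forward pass yields a different sparse activation pattern, which is what makes the layer stochastic.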

Rethinking Bayesian Learning for Data Analysis: The Art of Prior and Inference in Sparsity-Aware Modeling

no code implementations • 28 May 2022 • Lei Cheng, Feng Yin, Sergios Theodoridis, Sotirios Chatzis, Tsung-Hui Chang

However, Bayesian methods are making a comeback, shedding new light on the design of deep neural networks; they also establish firm links with Bayesian models and inspire new paths for unsupervised learning, such as Bayesian tensor decomposition.

Gaussian Processes • Tensor Decomposition +1

Competing Mutual Information Constraints with Stochastic Competition-based Activations for Learning Diversified Representations

no code implementations • 10 Jan 2022 • Konstantinos P. Panousis, Anastasios Antoniadis, Sotirios Chatzis

To this end, we combine information-theoretic arguments with stochastic competition-based activations, namely Stochastic Local Winner-Takes-All (LWTA) units.

Image Classification • Representation Learning

Stochastic Local Winner-Takes-All Networks Enable Profound Adversarial Robustness

1 code implementation • 5 Dec 2021 • Konstantinos P. Panousis, Sotirios Chatzis, Sergios Theodoridis

This work explores the potency of stochastic competition-based activations, namely Stochastic Local Winner-Takes-All (LWTA), against powerful (gradient-based) white-box and black-box adversarial attacks; we especially focus on Adversarial Training settings.

Adversarial Attack • Adversarial Defense +2

Dialog speech sentiment classification for imbalanced datasets

no code implementations • 15 Sep 2021 • Sergis Nicolaou, Lambros Mavrides, Georgina Tryfou, Kyriakos Tolias, Konstantinos Panousis, Sotirios Chatzis, Sergios Theodoridis

Speech is the most common way humans express their feelings, and sentiment analysis is the use of tools such as natural language processing and computational algorithms to identify the polarity of these feelings.

Classification • Sentiment Analysis +1

Variational Bayesian Sequence-to-Sequence Networks for Memory-Efficient Sign Language Translation

no code implementations • 11 Feb 2021 • Harris Partaourides, Andreas Voskou, Dimitrios Kosmopoulos, Sotirios Chatzis, Dimitris N. Metaxas

Memory-efficient continuous Sign Language Translation is a significant challenge for the development of assistive technologies with real-time applicability for the Deaf.

Sign Language Translation • Translation

Local Competition and Stochasticity for Adversarial Robustness in Deep Learning

no code implementations • 4 Jan 2021 • Konstantinos P. Panousis, Sotirios Chatzis, Antonios Alexos, Sergios Theodoridis

The main operating principle of the introduced units rests on stochastic arguments: the network performs posterior sampling over competing units to select the winner.

Adversarial Attack • Adversarial Robustness

A Deep Learning Approach for Dynamic Balance Sheet Stress Testing

no code implementations • 23 Sep 2020 • Anastasios Petropoulos, Vassilis Siakoulis, Konstantinos P. Panousis, Loukas Papadoulas, Sotirios Chatzis

Current stress testing methodologies attempt to simulate the risks underlying a financial institution's balance sheet by using several satellite models.


A Self-Attentive Emotion Recognition Network

1 code implementation • 24 Apr 2019 • Harris Partaourides, Kostantinos Papadamou, Nicolas Kourtellis, Ilias Leontiadis, Sotirios Chatzis

Modern deep learning approaches have achieved groundbreaking performance in modeling and classifying sequential data.

Emotion Recognition

Quantum Statistics-Inspired Neural Attention

no code implementations • 17 Sep 2018 • Aristotelis Charalampous, Sotirios Chatzis

Specifically, our work broadens the notion of NA, by attempting to account for the case that the NA model becomes inherently incapable of discerning between individual source elements; this is assumed to be the case due to higher-order temporal dynamics.

Machine Translation

t-Exponential Memory Networks for Question-Answering Machines

no code implementations • 4 Sep 2018 • Kyriakos Tolias, Sotirios Chatzis

Our work addresses learning subtler and more complex underlying temporal dynamics in language modeling tasks that deal with sparse sequential data.

Language Modelling • Question Answering +2

Amortized Context Vector Inference for Sequence-to-Sequence Networks

no code implementations • 23 May 2018 • Kyriacos Tolias, Ioannis Kourouklides, Sotirios Chatzis

Neural attention (NA) has become a key component of sequence-to-sequence models that yield state-of-the-art performance in tasks as hard as abstractive document summarization (ADS) and video captioning (VC).

Document Summarization • Variational Inference +1
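
The context vector this paper's amortized inference targets is the standard NA construction: alignment scores over the source positions, a softmax, then a weighted sum of the encoder states. A minimal dot-product sketch (the generic mechanism the paper builds on, not its amortization scheme; names are mine):

```python
import numpy as np

def context_vector(query, keys, values):
    """Standard neural-attention context vector: dot-product alignment
    scores -> softmax attention weights -> weighted sum of values."""
    scores = keys @ query                   # alignment score per source position, shape (T,)
    weights = np.exp(scores - scores.max()) # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                 # convex combination of the value vectors

T, d = 5, 4
rng = np.random.default_rng(1)
c = context_vector(rng.normal(size=d), rng.normal(size=(T, d)), rng.normal(size=(T, d)))
```

Since the weights form a probability distribution over source positions, the context vector always lies in the convex hull of the value vectors.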

Nonparametric Bayesian Deep Networks with Local Competition

1 code implementation • 19 May 2018 • Konstantinos P. Panousis, Sotirios Chatzis, Sergios Theodoridis

To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that do not entail any form of (local) competition.

Bayesian Inference

Deep learning with t-exponential Bayesian kitchen sinks

no code implementations • 10 Feb 2018 • Harris Partaourides, Sotirios Chatzis

We effect model training by means of approximate inference based on a t-divergence measure; this generalizes the Kullback-Leibler divergence in the context of the t-exponential family of distributions.
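The t-divergence mentioned here is defined through the deformed logarithm and exponential of the t-exponential family, which reduce to the ordinary log and exp as t → 1 (recovering the Kullback-Leibler case). A minimal numeric sketch of those two deformed functions (function names are mine, not the paper's):

```python
import numpy as np

def log_t(x, t):
    """t-logarithm: log_t(x) = (x^(1-t) - 1) / (1 - t); tends to log(x) as t -> 1."""
    return np.log(x) if t == 1.0 else (x**(1.0 - t) - 1.0) / (1.0 - t)

def exp_t(x, t):
    """t-exponential: exp_t(x) = [1 + (1 - t) x]_+^(1 / (1 - t)); tends to exp(x) as t -> 1."""
    return np.exp(x) if t == 1.0 else np.maximum(1.0 + (1.0 - t) * x, 0.0)**(1.0 / (1.0 - t))
```

For t ≠ 1, exp_t has heavier (t > 1) or lighter (t < 1) tails than exp, which is what gives t-exponential models their robustness to outliers; exp_t and log_t remain inverses of each other wherever both are defined.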

