1 code implementation • 26 Jul 2023 • Maxime Leiber, Yosra Marnissi, Axel Barrau, Mohammed El Badaoui
In this paper, we propose a differentiable version of the short-time Fourier transform (STFT) that allows gradient-based optimization of the hop length or the temporal positions of the frames by making these parameters continuous.
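One way to make frame positions continuous is to weight the whole signal with a smooth window centered at a real-valued (non-integer) position before taking the DFT; the transform then varies smoothly with the center positions. The sketch below illustrates this idea with a Gaussian window. It is a hypothetical illustration of the concept, not the authors' implementation; the function name and parameters are my own.

```python
import numpy as np

def soft_frame_stft(x, centers, win_std=8.0):
    """STFT whose frame centers are real-valued rather than integer
    multiples of a hop length.  Each frame weights the full signal with
    a Gaussian window at a continuous position, so the output depends
    smoothly on `centers` (amenable to gradient-based optimization).
    Illustrative sketch only, not the authors' code."""
    t = np.arange(len(x))
    frames = []
    for c in centers:                                # c may be e.g. 96.5
        w = np.exp(-0.5 * ((t - c) / win_std) ** 2)  # continuous-position window
        frames.append(np.fft.rfft(x * w))
    return np.stack(frames)

x = np.sin(2 * np.pi * 0.05 * np.arange(256))
S = soft_frame_stft(x, centers=np.array([32.0, 96.5, 161.25]))
print(S.shape)  # (3, 129)
```

Because each entry of `S` is a smooth function of the corresponding center, the frame positions could in principle be tuned by gradient descent on any differentiable loss over the spectrogram.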
1 code implementation • 26 Jul 2023 • Maxime Leiber, Yosra Marnissi, Axel Barrau, Mohammed El Badaoui
The resulting differentiable adaptive STFT possesses commendable properties, such as the ability to adapt, within a single time-frequency representation, to both transient and stationary components, while remaining easy to optimize by gradient descent.
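The adaptivity described above can be pictured as giving each frame its own continuous window width, so short windows resolve transients while long windows resolve stationary tones in the same representation. The following sketch assumes Gaussian windows and hand-picked widths purely for illustration; it is not the authors' implementation.

```python
import numpy as np

def adaptive_gauss_stft(x, centers, widths):
    """STFT with a per-frame continuous window width: a single
    time-frequency representation can use a short window around a
    transient and longer windows on stationary parts.
    Illustrative sketch only, not the authors' code."""
    t = np.arange(len(x))
    frames = [np.fft.rfft(x * np.exp(-0.5 * ((t - c) / s) ** 2))
              for c, s in zip(centers, widths)]
    return np.stack(frames)

x = np.sin(2 * np.pi * 0.1 * np.arange(256))
x[70] += 5.0                                  # add a transient click
S = adaptive_gauss_stft(x,
                        centers=[0, 64, 128, 192],
                        widths=[24.0, 4.0, 24.0, 24.0])  # short window near the click
print(S.shape)  # (4, 129)
```

In the papers, such per-frame widths are not hand-picked but learned by gradient descent, which is what the differentiable formulation enables.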
1 code implementation • 23 Aug 2022 • Maxime Leiber, Axel Barrau, Yosra Marnissi, Dany Abboud
In this paper, we revisit the use of spectrograms in neural networks by making the window length a continuous parameter optimizable by gradient descent, instead of an empirically tuned, integer-valued hyperparameter.
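To see how a continuous window length can be tuned by gradient descent, consider descending a differentiable sparsity surrogate of the spectrum (here, spectral entropy) with respect to a Gaussian window width. The objective and the finite-difference gradient below are my own stand-ins for the paper's loss and autodiff machinery, used only to illustrate the mechanism.

```python
import numpy as np

def frame_entropy(x, std):
    """Spectral entropy of one Gaussian-windowed frame; a lower value
    means a more concentrated spectrum.  The window width `std` is a
    continuous parameter (illustrative surrogate loss, not the paper's)."""
    t = np.arange(len(x))
    w = np.exp(-0.5 * ((t - len(x) / 2) / std) ** 2)
    p = np.abs(np.fft.rfft(x * w)) ** 2
    p = p / p.sum()
    return -(p * np.log(p + 1e-12)).sum()

# Gradient descent on the continuous window width (finite differences
# stand in for automatic differentiation here).
x = np.sin(2 * np.pi * 0.1 * np.arange(128))   # stationary tone
std, lr, eps = 4.0, 0.5, 1e-3
for _ in range(200):
    g = (frame_entropy(x, std + eps) - frame_entropy(x, std - eps)) / (2 * eps)
    std -= lr * g
```

For a stationary sinusoid the entropy decreases as the window lengthens, so the descent widens the window, matching the intuition that stationary content calls for long analysis windows.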
no code implementations • 24 Oct 2016 • Yosra Marnissi, Yuling Zheng, Emilie Chouzenoux, Jean-Christophe Pesquet
We demonstrate the potential of the proposed approach through comparisons with state-of-the-art techniques that are specifically tailored to signal recovery in the presence of mixed Poisson-Gaussian noise.
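The mixed Poisson-Gaussian observation model referenced here combines signal-dependent shot noise with additive read-out noise. The snippet below simulates such observations and applies the generalized Anscombe transform, a classical variance-stabilization baseline (the clean signal and noise levels are arbitrary choices for illustration; this is not the paper's recovery method).

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                                            # read-out noise std (assumed)
x = 50.0 * np.abs(np.sin(np.linspace(0, 4 * np.pi, 200)))  # clean intensities
# Mixed Poisson-Gaussian observation: shot noise plus additive Gaussian noise.
y = rng.poisson(x).astype(float) + rng.normal(0.0, sigma, size=x.shape)
# Generalized Anscombe transform: approximately stabilizes the variance,
# so Gaussian-noise methods can be applied to the transformed data.
z = 2.0 * np.sqrt(np.maximum(y + 0.375 + sigma ** 2, 0.0))
print(y.shape, z.shape)
```

Methods tailored to this noise model, such as the one compared against in the paper, work directly with the exact mixed likelihood rather than relying on such a stabilizing approximation.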