no code implementations • 20 Jul 2023 • Alina Landowska, Marek Robak, Maciej Skorski
This study aims to investigate the futures projected by futurists on Twitter and explore the impact of language cues on anticipatory thinking among social media users.
no code implementations • 21 Mar 2023 • Maciej Skorski, Alessandro Temperoni
This paper revisits the performance of Rademacher random projections, establishing novel statistical guarantees that are numerically sharp and non-oblivious with respect to the input data.
no code implementations • 22 Feb 2022 • Maciej Skorski, Alessandro Temperoni, Martin Theobald
In this work, we improve upon the guarantees for sparse random embeddings, as they were recently provided and analyzed by Freksen et al. (NIPS'18) and Jagadeesan (NIPS'19).
no code implementations • 14 Apr 2021 • Maciej Skorski
This work characterizes the maximal mean-squared error of the Good-Turing estimator, for any sample \emph{and} alphabet size.
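The paper analyzes the mean-squared error of the classical Good-Turing estimator; the estimator itself is standard and short. As a minimal sketch (not the paper's analysis), the Good-Turing estimate of the missing mass is the fraction of the sample made up of symbols seen exactly once:

```python
import numpy as np
from collections import Counter

def good_turing_missing_mass(sample):
    """Good-Turing estimate of the missing mass: the number of symbols
    observed exactly once, divided by the sample size."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

# Illustrative draws from a heavy-tailed distribution over a large alphabet
rng = np.random.default_rng(0)
sample = rng.zipf(2.0, size=1000)
est = good_turing_missing_mass(sample)
```

For the sample `[1, 1, 2, 3]` the estimate is 2/4 = 0.5, since two symbols occur exactly once.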
1 code implementation • 6 Apr 2021 • Maciej Skorski
The seminal result of Johnson and Lindenstrauss on random embeddings has been intensively studied in applied and theoretical computer science.
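As a reminder of the object under study (a textbook sketch, not the paper's construction), a Johnson-Lindenstrauss embedding projects high-dimensional points through a scaled random matrix while approximately preserving their norms:

```python
import numpy as np

def jl_embed(X, k, rng):
    """Classical JL construction: project rows of X into k dimensions
    with a Gaussian random matrix scaled by 1/sqrt(k)."""
    d = X.shape[1]
    R = rng.standard_normal((d, k)) / np.sqrt(k)
    return X @ R

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 1000))
Y = jl_embed(X, 256, rng)
# length distortion: ratios of squared norms concentrate around 1
ratios = np.sum(Y**2, axis=1) / np.sum(X**2, axis=1)
```

With the target dimension `k = 256`, the squared-norm ratios fluctuate around 1 with standard deviation roughly `sqrt(2/k)`.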
1 code implementation • 6 Jan 2021 • Maciej Skorski
This work obtains sharp closed-form exponential concentration inequalities of Bernstein type for the ubiquitous beta distribution, improving upon sub-gaussian and sub-gamma bounds previously studied in this context.
Probability • Statistics Theory • Applications • 60E05 • G.3
no code implementations • 31 Dec 2020 • Maciej Skorski
This work constructs Johnson-Lindenstrauss embeddings with the best accuracy, as measured by variance, mean-squared error and exponential concentration of the length distortion.
no code implementations • 23 Dec 2020 • Maciej Skorski
The paper establishes the new state of the art in the accuracy analysis of Hutchinson's trace estimator.
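The estimator whose accuracy the paper analyzes is itself simple; a minimal sketch (the standard construction, not the paper's analysis) estimates the trace of a matrix as an average of Rademacher quadratic forms:

```python
import numpy as np

def hutchinson_trace(A, m, rng):
    """Hutchinson's estimator: average of z^T A z over m Rademacher
    probe vectors z (i.i.d. +/-1 entries)."""
    n = A.shape[0]
    Z = rng.choice([-1.0, 1.0], size=(n, m))
    # per-probe quadratic forms z^T A z, then their mean
    return np.mean(np.einsum('im,im->m', Z, A @ Z))

rng = np.random.default_rng(0)
G = rng.standard_normal((50, 50))
A = G @ G.T                      # symmetric PSD test matrix
est = hutchinson_trace(A, 2000, rng)
```

Each probe is an unbiased estimate of `tr(A)`, so the average concentrates around the true trace as the number of probes grows.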
no code implementations • 20 Aug 2020 • Maciej Skorski
The paper re-analyzes a version of the celebrated Johnson-Lindenstrauss Lemma, in which matrices are subjected to constraints that naturally emerge from neuroscience applications: a) sparsity and b) sign-consistency.
no code implementations • 19 May 2020 • Maciej Skorski
We revisit the problem of \emph{missing mass concentration}, developing a new method of estimating the concentration of heterogeneous sums, in the spirit of the celebrated Rosenthal inequality.
no code implementations • 20 Apr 2020 • Maciej Skorski, Alessandro Temperoni, Martin Theobald
The proper initialization of weights is crucial for the effective training and fast convergence of deep neural networks (DNNs).
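To illustrate the variance-preservation idea behind such initialization schemes (a generic sketch of He/Kaiming initialization for ReLU layers, not the scheme proposed in the paper), scaling weights by `sqrt(2 / fan_in)` keeps activation magnitudes roughly constant across depth:

```python
import numpy as np

def he_init(fan_in, fan_out, rng):
    """He (Kaiming) initialization: weight variance 2/fan_in keeps the
    second moment of activations roughly constant through ReLU layers."""
    return rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)

rng = np.random.default_rng(0)
x = rng.standard_normal((128, 512))      # batch of inputs
for _ in range(10):                       # 10-layer ReLU stack
    W = he_init(x.shape[1], 512, rng)
    x = np.maximum(x @ W, 0.0)
```

With a naive unit-variance initialization the activations would blow up or vanish exponentially in depth; here their root-mean-square stays near 1 after 10 layers.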
no code implementations • 10 Apr 2020 • Maciej Skorski
This short paper discusses an efficient implementation of \emph{sampled softmax loss} for TensorFlow.
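The paper targets an efficient TensorFlow implementation; as a framework-free illustration of the underlying idea only, here is a minimal NumPy sketch of sampled softmax (omitting the log-probability correction for the sampling distribution that production implementations apply):

```python
import numpy as np

def sampled_softmax_loss(h, W, target, n_sampled, rng):
    """Sketch of sampled softmax: instead of normalizing over all
    classes, score the target class plus a few random negatives."""
    n_classes = W.shape[0]
    neg = rng.choice(n_classes, size=n_sampled, replace=False)
    cand = np.concatenate(([target], neg))
    logits = W[cand] @ h                      # scores of candidates only
    return -logits[0] + np.log(np.sum(np.exp(logits)))

rng = np.random.default_rng(0)
W = rng.standard_normal((10000, 64))   # class embedding matrix
h = rng.standard_normal(64)            # hidden representation
loss = sampled_softmax_loss(h, W, target=42, n_sampled=100, rng=rng)
```

The payoff is cost proportional to the number of sampled negatives rather than to the full vocabulary size.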
no code implementations • 28 Feb 2019 • Maciej Skorski
Bayes factors have, in many cases, been shown to bridge classic p-value-based significance testing and Bayesian analysis of posterior odds.
no code implementations • 2 Jan 2019 • Maciej Skorski
The contribution of this paper is twofold: (a) we demonstrate that, when the bandwidth is an arbitrary invertible matrix tending to zero, it is necessary to keep a certain balance between the \emph{kernel decay} and the \emph{magnitudes of the bandwidth eigenvalues}; in fact, without sufficient decay the estimates may not even be bounded; (b) we give a rigorous derivation of bounds with explicit constants for the bias, under possibly minimal assumptions.
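For concreteness, the object of study is the kernel density estimator with a matrix bandwidth. A minimal sketch (the standard multivariate KDE with a Gaussian kernel; the bandwidth matrix below is illustrative, not from the paper):

```python
import numpy as np

def kde_matrix_bandwidth(x, data, H):
    """Multivariate KDE with Gaussian kernel and an invertible bandwidth
    matrix H: average of |det H|^{-1} K(H^{-1}(x - X_i)) over the data."""
    Hinv = np.linalg.inv(H)
    u = (data - x) @ Hinv.T                    # standardized differences
    d = data.shape[1]
    k = np.exp(-0.5 * np.sum(u**2, axis=1)) / (2 * np.pi) ** (d / 2)
    return k.mean() / abs(np.linalg.det(H))

rng = np.random.default_rng(0)
data = rng.standard_normal((5000, 2))          # standard 2-D Gaussian sample
H = np.array([[0.3, 0.1],
              [0.0, 0.3]])                     # non-diagonal bandwidth matrix
est = kde_matrix_bandwidth(np.zeros(2), data, H)
```

For this sample, the estimate at the origin should be close to the true standard bivariate normal density there, `1/(2*pi) ≈ 0.159`, up to the smoothing bias controlled by `H`.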
no code implementations • 13 Aug 2018 • Maciej Skorski
Root cause analysis for anomalies is challenging because of the trade-off between accuracy and the explanatory friendliness required for industrial applications.