no code implementations • 6 Jun 2021 • Dvir Ben Or, Michael Kolomenkin, Gil Shabat
Dynamic difficulty adjustment (DDA) is the process of automatically changing a game's difficulty to optimize the user experience.
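A minimal sketch of the general DDA idea (not this paper's method; the function name, target rate, and step size are illustrative assumptions): nudge a difficulty parameter toward a target player success rate.

```python
def adjust_difficulty(difficulty, success_rate, target=0.5, step=0.1):
    """Raise difficulty when the player succeeds more often than the target
    rate, lower it when they succeed less often (hypothetical rule)."""
    if success_rate > target:
        difficulty += step
    elif success_rate < target:
        difficulty -= step
    return max(0.0, min(1.0, difficulty))  # clamp to [0, 1]

# Example: a player winning 90% of rounds gets a harder game.
d = adjust_difficulty(0.5, success_rate=0.9)  # → 0.6
```

Real DDA systems replace this threshold rule with a model of player skill or experience, which is what the paper addresses.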
no code implementations • 28 Dec 2020 • Dvir Ben Or, Michael Kolomenkin, Gil Shabat
This note presents a simple way to add a count (or quantile) constraint to a regression neural network: given $n$ samples in the training set, it guarantees that the predictions for $m < n$ of those samples will be larger than their actual values (labels).
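One common way to obtain such behavior (a sketch of the general idea, not necessarily the note's construction) is to train with the pinball (quantile) loss at level $q = m/n$: at the optimum, roughly a fraction $q$ of the labels fall below the prediction. A toy example fitting a single constant:

```python
import numpy as np

def fit_quantile_constant(y, q, lr=0.01, steps=5000):
    """Fit a constant c by subgradient descent on the mean pinball loss.
    At the optimum c approximates the q-th quantile of y, so roughly a
    fraction q of the n labels lie below the prediction."""
    c = float(np.mean(y))
    for _ in range(steps):
        # subgradient of the mean pinball loss w.r.t. c
        g = np.mean(np.where(y > c, -q, 1.0 - q))
        c -= lr * g
    return c

rng = np.random.default_rng(0)
y = rng.normal(size=1000)
c = fit_quantile_constant(y, q=0.9)  # want ~900 of 1000 labels below c
frac_below = np.mean(y < c)
```

The note's contribution is a construction that makes the count guarantee exact rather than approximate.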
no code implementations • 25 May 2019 • Gil Shabat, Era Choshen, Dvir Ben Or, Nadav Carmel
This paper presents a method for building a preconditioner for kernel ridge regression that is both effective, substantially reducing the condition number, and efficient to apply in terms of computational cost and memory consumption.
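For context, kernel ridge regression requires solving the SPD system $(K + \lambda I)\alpha = y$, typically via preconditioned conjugate gradient. The sketch below uses a simple spectral preconditioner built from the top-$k$ eigenpairs of $K$ (a common style of construction, shown here with a full eigendecomposition for clarity; the paper's preconditioner is different and avoids that cost):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between row sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def spectral_preconditioner(K, lam, k):
    """Approximate inverse of (K + lam*I): exact on the span of the top-k
    eigenvectors of K, scaled by 1/lam on the complement. Applying it
    costs O(n*k) per vector once the eigenpairs are available."""
    w, V = np.linalg.eigh(K)
    w, V = w[-k:], V[:, -k:]          # top-k eigenpairs
    def apply(r):
        coef = V.T @ r
        return r / lam + V @ (coef / (w + lam) - coef / lam)
    return apply

def pcg(matvec, b, M, tol=1e-8, maxit=500):
    """Preconditioned conjugate gradient for an SPD operator `matvec`."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    z = M(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = matvec(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = M(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = rng.normal(size=200)
lam = 1e-2
K = rbf_kernel(X, X)
alpha = pcg(lambda v: K @ v + lam * v, y,
            spectral_preconditioner(K, lam, k=20))
```

Deflating the largest eigenvalues helps because the RBF kernel's spectrum decays quickly, so the preconditioned system's condition number is governed by the much smaller remaining eigenvalues; efficient methods approximate these eigenpairs cheaply instead of computing a full `eigh`.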