Search Results for author: Semyon Malamud

Found 6 papers, 2 papers with code

Large (and Deep) Factor Models

no code implementations • 20 Jan 2024 • Bryan Kelly, Boris Kuznetsov, Semyon Malamud, Teng Andrea Xu

We open up the black box behind Deep Learning for portfolio optimization and prove that a sufficiently wide and arbitrarily deep neural network (DNN) trained to maximize the Sharpe ratio of the Stochastic Discount Factor (SDF) is equivalent to a large factor model (LFM): a linear factor pricing model that uses many non-linear characteristics.
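The equivalence can be illustrated with a toy sketch. Here a wide random ReLU layer stands in for the (wide-limit) network's feature map, and the max-Sharpe SDF reduces to a ridge-regularized Markowitz problem over the resulting factor returns. All data and dimensions are hypothetical, and the single random layer is only a stand-in for the paper's deep architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: T periods, N assets, d raw characteristics.
T, N, d, width = 200, 30, 5, 64
X = rng.standard_normal((T, N, d))          # asset characteristics
R = rng.standard_normal((T, N)) * 0.05      # asset excess returns

# Wide random ReLU layer as a stand-in for the wide-limit DNN feature map:
# it turns d raw characteristics into `width` non-linear characteristics.
W = rng.standard_normal((d, width)) / np.sqrt(d)
H = np.maximum(X @ W, 0.0)                  # (T, N, width)

# Factor returns: characteristic-weighted portfolios, one per feature.
F = np.einsum('tnk,tn->tk', H, R) / N       # (T, width)

# Max-Sharpe SDF weights = ridge-regularized Markowitz solution
# on the factor returns -- a purely linear problem.
mu = F.mean(axis=0)
Sigma = np.cov(F, rowvar=False) + 1e-3 * np.eye(width)  # ridge shrinkage
lam = np.linalg.solve(Sigma, mu)

sdf_ret = F @ lam
sharpe = sdf_ret.mean() / sdf_ret.std()
print(f"in-sample Sharpe of the linear factor model: {sharpe:.3f}")
```

The point of the sketch is structural: once the network's features are fixed, "train the DNN to maximize the Sharpe ratio" collapses into a linear factor-pricing problem.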

Portfolio Optimization

A Simple Algorithm For Scaling Up Kernel Methods

1 code implementation • 26 Jan 2023 • Teng Andrea Xu, Bryan Kelly, Semyon Malamud

The recent discovery of the equivalence between infinitely wide neural networks (NNs) in the lazy training regime and Neural Tangent Kernels (NTKs) (Jacot et al., 2018) has revived interest in kernel methods.
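One standard route to scaling kernel methods, shown here purely for context, is to replace the exact kernel with random Fourier features and solve a linear ridge regression in feature space. This is a generic sketch (Rahimi–Rechts-style features for an RBF kernel), not the paper's own algorithm, and all data and hyperparameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data
n, d = 500, 3
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Random Fourier features approximating an RBF kernel with bandwidth gamma.
D, gamma = 256, 1.0
W = rng.standard_normal((d, D)) * np.sqrt(2 * gamma)
b = rng.uniform(0, 2 * np.pi, D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)   # (n, D) explicit feature map

# Linear ridge regression in feature space:
# O(n * D^2) work instead of the O(n^3) exact kernel solve.
ridge = 1e-3
beta = np.linalg.solve(Z.T @ Z + ridge * np.eye(D), Z.T @ y)
pred = Z @ beta
print("train MSE:", np.mean((pred - y) ** 2))
```

The NTK correspondence makes exactly this kind of trade attractive: if a wide network behaves like a kernel machine, then cheap kernel approximations become a substitute for training the network itself.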

regression

Benign Autoencoders

1 code implementation • 2 Oct 2022 • Semyon Malamud, Teng Andrea Xu, Antoine Didisheim

Recent progress in Generative Artificial Intelligence (AI) relies on efficient data representations, often featuring encoder-decoder architectures.
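The simplest encoder-decoder pair is linear, where the optimal representation is the PCA solution. The sketch below is a generic illustration of that baseline on synthetic low-rank data, not the paper's "benign autoencoder" construction:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data with k-dimensional latent structure plus noise.
n, d, k = 300, 10, 2
Z = rng.standard_normal((n, k))
X = Z @ rng.standard_normal((k, d)) + 0.05 * rng.standard_normal((n, d))
Xc = X - X.mean(axis=0)

# Optimal linear encoder-decoder pair (the PCA solution):
# encode into k dimensions, decode back, measure reconstruction error.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
encode = Vt[:k].T          # (d, k) encoder weights
decode = Vt[:k]            # (k, d) decoder weights
X_hat = (Xc @ encode) @ decode

err = np.mean((Xc - X_hat) ** 2)
print(f"reconstruction MSE with k={k} latent dims: {err:.4f}")
```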

Decoder • Dimensionality Reduction

Deep Regression Ensembles

no code implementations • 10 Mar 2022 • Antoine Didisheim, Bryan Kelly, Semyon Malamud

Each layer of DRE has two components, randomly drawn input weights and output weights trained myopically (as if the final output layer) using linear ridge regression.
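That layer recipe is concrete enough to sketch. The toy below stacks layers whose input weights are random and whose output weights come from a myopic ridge fit; the data, widths, and the choice to feed each layer's random features forward are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(2)

def dre_layer(H, y, width, ridge):
    """One DRE-style layer: random input weights, then output
    weights fit myopically by linear ridge regression, as if this
    layer were the final output layer."""
    W = rng.standard_normal((H.shape[1], width)) / np.sqrt(H.shape[1])
    A = np.maximum(H @ W, 0.0)              # random non-linear features
    beta = np.linalg.solve(A.T @ A + ridge * np.eye(width), A.T @ y)
    return A, beta

# Toy regression data
n, d = 400, 4
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + X[:, 1] ** 2

H, preds = X, []
for _ in range(3):                          # three stacked layers
    H, beta = dre_layer(H, y, width=64, ridge=1e-2)
    preds.append(H @ beta)                  # each layer's own myopic fit

mse = [float(np.mean((p - y) ** 2)) for p in preds]
print("per-layer train MSE:", mse)
```

No gradient ever flows through the stack: each layer's only trained parameters are solved in closed form, which is what makes the construction cheap.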

regression

Persuasion by Dimension Reduction

no code implementations • 17 Oct 2021 • Semyon Malamud, Andreas Schrimpf

When the sender's marginal utility is linear, revealing the full magnitude of good information is always optimal.

Dimensionality Reduction

Optimal Transport of Information

no code implementations • 22 Feb 2021 • Semyon Malamud, Anna Cieslak, Andreas Schrimpf

We study the general problem of Bayesian persuasion (optimal information design) with continuous actions and continuous state space in arbitrary dimensions.
