Search Results for author: Michael Chertkov

Found 45 papers, 3 papers with code

Space-Time Bridge-Diffusion

no code implementations13 Feb 2024 Hamidreza Behjoo, Michael Chertkov

In this study, we introduce a novel method for generating new synthetic samples that are independent and identically distributed (i.i.d.)

System-Wide Emergency Policy for Transitioning from Main to Secondary Fuel

no code implementations15 Nov 2023 Laurent Pagnier, Igal Goldshtein, Criston Hyett, Robert Ferrando, Jean Alisse, Lilah Saban, Michael Chertkov

Inspired by the challenges of running Israel's power system -- with its increasing integration of renewables, significant load uncertainty, and primary reliance on natural gas -- we investigate an emergency scenario where there's a need to transition temporarily to a pricier secondary fuel until the emergency resolves.

U-Turn Diffusion

no code implementations14 Aug 2023 Hamidreza Behjoo, Michael Chertkov

We present a comprehensive examination of score-based diffusion models of generative AI for synthesizing images.
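
As background (standard score-based diffusion notation, not specific to the U-Turn construction of this paper), such models pair a forward noising SDE with a learned reverse-time generative SDE,

$$dx_t = f(x_t,t)\,dt + g(t)\,dW_t, \qquad dx_t = \big[f(x_t,t) - g(t)^2\,\nabla_x \log p_t(x_t)\big]\,dt + g(t)\,d\bar{W}_t,$$

where the score $\nabla_x \log p_t$ is approximated by a neural network.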

Universality and Control of Fat Tails

no code implementations16 Mar 2023 Michael Chertkov

Motivated by applications in hydrodynamics and in networks of thermostatically controlled loads in buildings, we study control of linear dynamical systems driven by additive and multiplicative noise in general position.
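
A minimal scalar illustration of the mechanism (our notation, not necessarily the paper's setup): for the linear SDE

$$dx = -a\,x\,dt + \sigma_m\,x\,dW^{(1)}_t + \sigma_a\,dW^{(2)}_t,$$

the stationary density develops a power-law (fat) tail whose exponent is controlled by the ratio $a/\sigma_m^2$, so adjusting the drift and the multiplicative-noise strength controls the tail.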

Exact Fractional Inference via Re-Parametrization & Interpolation between Tree-Re-Weighted- and Belief Propagation- Algorithms

no code implementations25 Jan 2023 Hamidreza Behjoo, Michael Chertkov

Inference efforts -- required to compute the partition function, $Z$, of an Ising model over a graph of $N$ "spins" -- are most likely exponential in $N$.
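
For reference, the partition function in question is the standard Ising sum over all $2^N$ spin configurations (notation here is illustrative),

$$Z = \sum_{\sigma\in\{-1,+1\}^N} \exp\Big(\sum_{(i,j)\in E} J_{ij}\,\sigma_i\sigma_j + \sum_{i} h_i\,\sigma_i\Big),$$

which is why brute-force summation scales exponentially in $N$.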

Machine Learning for Electricity Market Clearing

no code implementations23 May 2022 Laurent Pagnier, Robert Ferrando, Yury Dvorkin, Michael Chertkov

This paper seeks to design a machine learning twin of the optimal power flow (OPF) optimization, which is used in market-clearing procedures by wholesale electricity markets.
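
For orientation, a simplified (DC) form of the OPF that such a machine-learning twin would emulate is, schematically (not the paper's exact formulation),

$$\min_{p}\ \sum_i c_i(p_i)\quad \text{s.t.}\quad \sum_i p_i = \sum_i d_i,\qquad |f_\ell(p)| \le \bar{f}_\ell \ \ \forall \ell,$$

with generation dispatch $p$, demands $d$, and line-flow limits $\bar{f}_\ell$.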

Towards Model Reduction for Power System Transients with Physics-Informed PDE

no code implementations26 Oct 2021 Laurent Pagnier, Michael Chertkov, Julian Fritzsch, Philippe Jacquod

This manuscript reports the first step towards building a robust and efficient model reduction methodology to capture transient dynamics in a transmission level electric power system.

Physics-informed machine learning

Which Neural Network to Choose for Post-Fault Localization, Dynamic State Estimation and Optimal Measurement Placement in Power Systems?

no code implementations7 Apr 2021 Andrei Afonin, Michael Chertkov

We consider a power transmission system monitored with Phasor Measurement Units (PMUs) placed at significant, but not all, nodes of the system.

Fault localization

Embedding Power Flow into Machine Learning for Parameter and State Estimation

no code implementations26 Mar 2021 Laurent Pagnier, Michael Chertkov

Modern state and parameter estimation in power systems consists of two stages: the outer problem of minimizing the mismatch between network observation and prediction over the network parameters, and the inner problem of predicting the system state for given values of the parameters.
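
Schematically (our notation), the two stages form a nested, bilevel problem,

$$\hat{\theta} = \arg\min_{\theta}\ \big\|\,y^{\rm obs} - h\big(s^{\star}(\theta);\theta\big)\big\|^2, \qquad s^{\star}(\theta) = \arg\min_{s}\ \mathcal{L}(s;\theta),$$

with the network parameters $\theta$ in the outer problem and the system state $s$ in the inner one.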

Message Passing Descent for Efficient Machine Learning

no code implementations16 Feb 2021 Francesco Concetti, Michael Chertkov

We suggest the Message Passing Descent algorithm, which relies on a piece-wise-polynomial representation of the model's Data-Fitting (DF) function.

Physics-Informed Graphical Neural Network for Parameter & State Estimations in Power Systems

no code implementations12 Feb 2021 Laurent Pagnier, Michael Chertkov

Parameter Estimation (PE) and State Estimation (SE) are among the most widespread tasks in systems engineering.

Neural Particle Image Velocimetry

no code implementations28 Jan 2021 Nikolay Stulov, Michael Chertkov

In this work, we introduce a convolutional neural network adapted to the problem, namely the Volumetric Correspondence Network (VCN), which was recently proposed for end-to-end optical flow estimation in computer vision.

Optical Flow Estimation

Super-relaxation of space-time-quantized ensemble of energy loads to curtail their synchronization after demand response perturbation

no code implementations3 Aug 2020 Ilia Luchnikov, David Métivier, Henni Ouerdane, Michael Chertkov

However, this also results in parasitic synchronization of individual devices within the ensemble, leading to long post-demand-response oscillations in the integrated energy consumption of the ensemble.

Quantization

A Hierarchical Approach to Multi-Energy Demand Response: From Electricity to Multi-Energy Applications

no code implementations5 May 2020 Ali Hassan, Samrat Acharya, Michael Chertkov, Deepjyoti Deka, Yury Dvorkin

Due to the proliferation of energy-efficiency measures and the availability of renewable energy resources, traditional energy infrastructure systems (electricity, heat, gas) can no longer be operated in a centralized manner under the assumption that consumer behavior is inflexible, i.e., cannot be adjusted in return for an adequate incentive.

Data-Driven Learning and Load Ensemble Control

no code implementations20 Apr 2020 Ali Hassan, Deepjyoti Deka, Michael Chertkov, Yury Dvorkin

Demand response (DR) programs aim to engage distributed small-scale flexible loads, such as thermostatically controllable loads (TCLs), to provide various grid support services.

Reinforcement Learning (RL)

Wavelet-Powered Neural Networks for Turbulence

no code implementations ICLR Workshop DeepDiffEq 2019 Arvind T. Mohan, Daniel Livescu, Michael Chertkov

One of the fundamental driving phenomena for applications in engineering, earth sciences and climate is fluid turbulence.

Embedding Hard Physical Constraints in Neural Network Coarse-Graining of 3D Turbulence

no code implementations31 Jan 2020 Arvind T. Mohan, Nicholas Lubbers, Daniel Livescu, Michael Chertkov

In recent years, deep learning approaches have shown much promise in modeling complex systems in the physical sciences.

Computational Physics

Tractable Minor-free Generalization of Planar Zero-field Ising Models

no code implementations22 Oct 2019 Valerii Likhosherstov, Yury Maximov, Michael Chertkov

We present a new family of zero-field Ising models over $N$ binary variables/spins obtained by consecutive "gluing" of planar and $O(1)$-sized components and subsets of at most three vertices into a tree.

A New Family of Tractable Ising Models

no code implementations14 Jun 2019 Valerii Likhosherstov, Yury Maximov, Michael Chertkov

To illustrate the utility of the new family of tractable graphical models, we first build an $O(N^{3/2})$ algorithm for inference and sampling of the K5-minor-free zero-field Ising models - an extension of the planar zero-field Ising models - which is neither genus- nor treewidth-bounded.

Data Structures and Algorithms; Statistical Mechanics; Data Analysis, Statistics and Probability; Computation

Compressed Convolutional LSTM: An Efficient Deep Learning framework to Model High Fidelity 3D Turbulence

no code implementations28 Feb 2019 Arvind Mohan, Don Daniel, Michael Chertkov, Daniel Livescu

High-fidelity modeling of turbulent flows is one of the major challenges in computational physics, with diverse applications in engineering, earth sciences and astrophysics, among many others.

Fluid Dynamics; Chaotic Dynamics; Computational Physics

Learning a Generator Model from Terminal Bus Data

no code implementations3 Jan 2019 Nikolay Stulov, Dejan J Sobajic, Yury Maximov, Deepjyoti Deka, Michael Chertkov

In this work we investigate approaches to reconstruct generator models from measurements available at the generator terminal bus using machine learning (ML) techniques.

Inference and Sampling of $K_{33}$-free Ising Models

2 code implementations22 Dec 2018 Valerii Likhosherstov, Yury Maximov, Michael Chertkov

We call an Ising model tractable when it is possible to compute its partition function value (statistical inference) in polynomial time.

Gauges, Loops, and Polynomials for Partition Functions of Graphical Models

no code implementations12 Nov 2018 Michael Chertkov, Vladimir Chernyak, Yury Maximov

We show that the Gauge Function has a natural polynomial representation in terms of gauges/variables associated with edges of the multi-graph.

From Deep to Physics-Informed Learning of Turbulence: Diagnostics

no code implementations16 Oct 2018 Ryan King, Oliver Hennigh, Arvind Mohan, Michael Chertkov

We describe tests validating progress made toward acceleration and automation of hydrodynamic codes in the regime of developed turbulence by three Deep Learning (DL) Neural Network (NN) schemes trained on Direct Numerical Simulations of turbulence.

Real-time Faulted Line Localization and PMU Placement in Power Systems through Convolutional Neural Networks

no code implementations11 Oct 2018 Wenting Li, Deepjyoti Deka, Michael Chertkov, Meng Wang

Diverse fault types, fast re-closures, and complicated transient states after a fault event make real-time fault location in power grids challenging.

Gauged Mini-Bucket Elimination for Approximate Inference

no code implementations5 Jan 2018 Sungsoo Ahn, Michael Chertkov, Jinwoo Shin, Adrian Weller

Recently, so-called gauge transformations were used to improve variational lower bounds on $Z$.

Online Learning of Power Transmission Dynamics

no code implementations27 Oct 2017 Andrey Y. Lokhov, Marc Vuffray, Dmitry Shemetov, Deepjyoti Deka, Michael Chertkov

We consider the problem of reconstructing the dynamic state matrix of transmission power grids from time-stamped PMU measurements in the regime of ambient fluctuations.
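
One common way to pose such a reconstruction (a generic sketch, not necessarily the algorithm of this paper): model the ambient fluctuations as linear dynamics $\dot{x} = A\,x + \xi(t)$ and fit the state matrix by regression on the sampled PMU time series,

$$\hat{A} = \arg\min_{A}\ \sum_{t} \big\| x_{t+\Delta} - x_t - \Delta\,A\,x_t \big\|^2 .$$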

Belief Propagation Min-Sum Algorithm for Generalized Min-Cost Network Flow

no code implementations20 Oct 2017 Andrii Riazanov, Yury Maximov, Michael Chertkov

Belief Propagation algorithms are instruments used broadly to solve graphical model optimization and statistical inference problems.

Model Optimization

Topology Estimation in Bulk Power Grids: Guarantees on Exact Recovery

no code implementations5 Jul 2017 Deepjyoti Deka, Saurav Talukdar, Michael Chertkov, Murti Salapaka

For grids that include cycles of length three, we provide sufficient conditions that ensure existence of algorithms for exact reconstruction.

Gauging Variational Inference

no code implementations NeurIPS 2017 Sungsoo Ahn, Michael Chertkov, Jinwoo Shin

Computing the partition function is the most important statistical inference task arising in applications of Graphical Models (GM).

Variational Inference

Optimal structure and parameter learning of Ising models

1 code implementation15 Dec 2016 Andrey Y. Lokhov, Marc Vuffray, Sidhant Misra, Michael Chertkov

Reconstruction of structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning.

Synthesis of MCMC and Belief Propagation

no code implementations NeurIPS 2016 Sung-Soo Ahn, Michael Chertkov, Jinwoo Shin

In this paper, we introduce MCMC algorithms correcting the approximation error of BP, i.e., we provide a way to compensate for BP errors via a consecutive BP-aware MCMC.

Graphical Models for Optimal Power Flow

no code implementations21 Jun 2016 Krishnamurthy Dvijotham, Pascal Van Hentenryck, Michael Chertkov, Sidhant Misra, Marc Vuffray

In this paper, we formulate the optimal power flow problem over tree networks as an inference problem over a tree-structured graphical model where the nodal variables are low-dimensional vectors.

MCMC assisted by Belief Propagation

no code implementations29 May 2016 Sungsoo Ahn, Michael Chertkov, Jinwoo Shin

Furthermore, we also design an efficient rejection-free MCMC scheme for approximating the full series.

Interaction Screening: Efficient and Sample-Optimal Learning of Ising Models

no code implementations NeurIPS 2016 Marc Vuffray, Sidhant Misra, Andrey Y. Lokhov, Michael Chertkov

We prove that, with appropriate regularization, the estimator recovers the underlying graph using a number of samples that is logarithmic in the system size $p$ and exponential in the maximum coupling intensity and maximum node degree.
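
In formula form, the stated sample complexity reads (schematically, with $c$ an unspecified constant, $\beta$ the maximum coupling intensity, and $d$ the maximum node degree)

$$n = O\!\big(e^{\,c\,\beta d}\,\log p\big).$$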

Minimum Weight Perfect Matching via Blossom Belief Propagation

no code implementations NeurIPS 2015 Sungsoo Ahn, Sejun Park, Michael Chertkov, Jinwoo Shin

Max-product Belief Propagation (BP) is a popular message-passing algorithm for computing a Maximum-A-Posteriori (MAP) assignment over a distribution represented by a Graphical Model (GM).
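
For reference, the max-product message update on a pairwise GM takes the standard form

$$m_{i\to j}(x_j) \propto \max_{x_i}\ \psi_i(x_i)\,\psi_{ij}(x_i,x_j)\prod_{k\in\partial i\setminus j} m_{k\to i}(x_i), \qquad \hat{x}_i = \arg\max_{x_i}\ \psi_i(x_i)\prod_{k\in\partial i} m_{k\to i}(x_i).$$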

Combinatorial Optimization

Learning Planar Ising Models

no code implementations3 Feb 2015 Jason K. Johnson, Diane Oyen, Michael Chertkov, Praneeth Netrapalli

Inference and learning of graphical models are both well-studied problems in statistics and machine learning that have found many applications in science and engineering.

Approximate inference on planar graphs using Loop Calculus and Belief Propagation

no code implementations9 Aug 2014 Vicenc Gomez, Hilbert Kappen, Michael Chertkov

We introduce novel results for approximate inference on planar graphical models using the loop calculus framework.

Loop Calculus and Bootstrap-Belief Propagation for Perfect Matchings on Arbitrary Graphs

no code implementations5 Jun 2013 Michael Chertkov, Andrew Gelfand, Jinwoo Shin

This manuscript discusses computation of the Partition Function (PF) and the Minimum Weight Perfect Matching (MWPM) on arbitrary, non-bipartite graphs.

Belief Propagation for Linear Programming

no code implementations17 May 2013 Andrew Gelfand, Jinwoo Shin, Michael Chertkov

For this class of problems, MAP inference can be stated as an integer LP with an LP relaxation that coincides with minimization of the Bethe Free Energy (BFE) at "zero temperature".
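
Concretely (standard pairwise notation, not specific to this paper), the integer LP is

$$\max_{\mu}\ \sum_{i,x_i}\theta_i(x_i)\,\mu_i(x_i) + \sum_{(i,j),x_i,x_j}\theta_{ij}(x_i,x_j)\,\mu_{ij}(x_i,x_j)\quad \text{s.t.}\quad \sum_{x_j}\mu_{ij}(x_i,x_j)=\mu_i(x_i),\ \ \sum_{x_i}\mu_i(x_i)=1,\ \ \mu\in\{0,1\},$$

and relaxing $\mu\in\{0,1\}$ to $\mu\in[0,1]$ yields the LP over the local polytope.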
