no code implementations • 22 Oct 2024 • Qi Xiao, Lidong Song, Jongha Woo, Rongxing Hu, Bei Xu, Kai Ye, Ning Lu
By stealthily manipulating measurements of a critical asset prior to the target time period, the attacker can subtly guide the engineering system toward a predetermined operational state without detection.
no code implementations • 5 Oct 2024 • Jong Ha Woo, Qi Xiao, Victor Daldegan Paduani, Ning Lu
Initially, the method estimates equivalent irradiance from PV power, voltage, and current data, eliminating the need for direct irradiance sensors.
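As a rough illustration of the idea only (not the authors' estimator), the sketch below backs out an equivalent irradiance from measured PV power and voltage under the simplifying assumption that PV current scales linearly with irradiance; the datasheet constants are hypothetical.

```python
# Minimal sketch (not the paper's method): estimating an "equivalent irradiance"
# from PV electrical measurements, assuming PV current scales roughly linearly
# with irradiance at a fixed operating voltage. All constants are illustrative.
G_STC = 1000.0      # W/m^2, irradiance at standard test conditions
I_STC = 8.5         # A, module current at STC (hypothetical datasheet value)

def equivalent_irradiance(p_meas: float, v_meas: float) -> float:
    """Back out an equivalent irradiance from measured PV power and voltage."""
    if v_meas <= 0:
        return 0.0
    i_meas = p_meas / v_meas          # measured DC current
    return G_STC * i_meas / I_STC     # linear current-irradiance assumption

print(equivalent_irradiance(p_meas=2100.0, v_meas=350.0))  # -> ~705.9 W/m^2
```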
no code implementations • 14 Aug 2024 • Ning Lu, Qian Xie, Hao Zhang, Wenyi Fang, Yang Zheng, Zheng Hu, Jiantao Ma
In this work, we introduce a novel reliability metric called \emph{Training Overhead Ratio} (TOR) to evaluate the reliability of fault-tolerant LLM training systems.
no code implementations • 3 Jul 2024 • Jiahao Wu, Ning Lu, Zeiyu Dai, Wenqi Fan, Shengcai Liu, Qing Li, Ke Tang
Effective backdoor attacks on graph condensation aim to (1) maintain the quality and utility of condensed graphs despite trigger injections and (2) ensure trigger effectiveness through the condensation process, yielding a high attack success rate.
no code implementations • 2 Jun 2024 • Yi Hu, Hyeonjin Kim, Kai Ye, Ning Lu
This paper presents a novel method for utilizing fine-tuned Large Language Models (LLMs) to minimize data requirements in load profile analysis, demonstrated through the restoration of missing data in power system load profiles.
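To illustrate the framing only (the authors' prompt design and fine-tuned model are not reproduced here), the sketch below serializes a load profile with missing entries into a text prompt; `query_llm` is a hypothetical placeholder for the fine-tuned model.

```python
# A minimal sketch (not the authors' pipeline) of framing missing-value
# restoration as a text completion task for a fine-tuned LLM: the hourly
# load profile is serialized to text with a <MISSING> placeholder, and the
# model is asked to fill in the gaps.
load_kw = [3.2, 3.1, None, None, 4.8, 5.6, 6.1, None, 5.0]

tokens = [f"{x:.1f}" if x is not None else "<MISSING>" for x in load_kw]
prompt = (
    "Restore the missing hourly load values (kW), keeping the profile "
    "physically plausible:\n" + " ".join(tokens)
)

def query_llm(text: str) -> str:
    # Placeholder for a call to a fine-tuned LLM; returns the completed profile.
    raise NotImplementedError

print(prompt)
```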
no code implementations • 12 Apr 2024 • Hyeonjin Kim, Yi Hu, Kai Ye, Ning Lu
This paper introduces ViT4LPA, an innovative Vision Transformer (ViT) based approach for Load Profile Analysis (LPA).
no code implementations • 26 Nov 2023 • Valliappan Muthukaruppan, Rongxing Hu, Ashwin Shirsat, Mesut Baran, Ning Lu, Wenyuan Tang, David Lubkeman
This paper highlights the benefit of coordinating resources on multiple active distribution feeders during severe, long-duration outages through multi-microgrid formation.
no code implementations • 19 Nov 2023 • Qi Xiao, Jongha Woo, Lidong Song, Bei Xu, David Lubkeman, Ning Lu, Abdul Shafae Mohammed, Johan Enslin, Cara De Coste Chacko, Kat Sico, Steven G. Whisenant
The widespread deployment of inverter-based resources (IBRs) renders distribution systems susceptible to transmission-level faults.
no code implementations • 26 Oct 2023 • Yi Hu, Kai Ye, Hyeonjin Kim, Ning Lu
To adopt a standard Transformer model structure for profile inpainting, we segment the load and temperature profiles into line segments, treating each segment as a word and the entire profile as a sentence.
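A minimal sketch of this segment-as-word tokenization, with an illustrative segment length:

```python
# Minimal sketch of the segment-as-word idea: a daily load profile is cut into
# fixed-length line segments, and each segment becomes one "word" (token) of
# the "sentence" fed to a Transformer. The segment length is an illustrative choice.
import numpy as np

def profile_to_segments(profile: np.ndarray, seg_len: int = 4) -> np.ndarray:
    """Split a 1-D profile into consecutive segments of length `seg_len`."""
    n_seg = len(profile) // seg_len
    return profile[: n_seg * seg_len].reshape(n_seg, seg_len)

profile = np.sin(np.linspace(0, 2 * np.pi, 96)) + 2.0   # 96-point daily profile
tokens = profile_to_segments(profile, seg_len=4)        # 24 "words" of 4 points
print(tokens.shape)  # (24, 4)
```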
no code implementations • 25 Oct 2023 • Xiucheng Wang, Nan Cheng, Longfei Ma, Zhisheng Yin, Tom. Luan, Ning Lu
Two cascaded neural networks (NNs) are used to jointly optimize the number of virtually generated UAVs, the DT construction cost, and the performance of multi-UAV networks.
no code implementations • 12 Sep 2023 • Han Pyo Lee, Keith DSouza, Ke Chen, Ning Lu, Mesut Baran
However, the effectiveness of this scheme is not well documented, and there is limited literature on alternative control and placement schemes that can maximize the effective use of a DVC.
no code implementations • 3 Sep 2023 • Bei Xu, Victor Paduani, Qi Xiao, Lidong Song, David Lubkeman, Ning Lu
Furthermore, compared with sectionalizer-based UFLS, using smart meters or controllable loads for UFLS allows for more accurate, progressive per-phase load shedding.
no code implementations • 29 Aug 2023 • Ruijin Liu, Ning Lu, Dapeng Chen, Cheng Li, Zejian Yuan, Wei Peng
We present PBFormer, an efficient yet powerful scene text detector that unifies the transformer with a novel text shape representation Polynomial Band (PB).
1 code implementation • 17 Aug 2023 • Ziyin Zhang, Ning Lu, Minghui Liao, Yongshuai Huang, Cheng Li, Min Wang, Wei Peng
It incorporates a framewise regularization term in the CTC loss to emphasize individual supervision, and leverages maximum a posteriori estimation of the latent alignment to resolve the inconsistency problem that arises in distillation between CTC-based models.
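A hedged sketch of such a framewise-regularized objective, combining the standard CTC loss with a per-frame KL term against the teacher's posteriors; the exact regularizer and weighting in the paper may differ.

```python
# Hedged sketch: CTC loss for sequence-level supervision plus a per-frame KL
# divergence against the teacher's posteriors for frame-level supervision.
import torch
import torch.nn.functional as F

def ctc_with_framewise_kd(student_logits, teacher_logits, targets,
                          input_lens, target_lens, alpha=0.5):
    # student_logits / teacher_logits: (T, N, C)
    log_probs = F.log_softmax(student_logits, dim=-1)
    ctc = F.ctc_loss(log_probs, targets, input_lens, target_lens, blank=0)
    frame_kd = F.kl_div(log_probs,
                        F.softmax(teacher_logits, dim=-1),
                        reduction="batchmean")
    return ctc + alpha * frame_kd

T, N, C = 50, 2, 30
loss = ctc_with_framewise_kd(torch.randn(T, N, C), torch.randn(T, N, C),
                             targets=torch.randint(1, C, (N, 10)),
                             input_lens=torch.full((N,), T, dtype=torch.long),
                             target_lens=torch.full((N,), 10, dtype=torch.long))
print(loss.item())
```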
1 code implementation • 18 May 2023 • Ning Lu, Shengcai Liu, Rui He, Qi Wang, Yew-Soon Ong, Ke Tang
Large language models (LLMs) have shown remarkable performance in various tasks and have been extensively utilized by the public.
no code implementations • 24 Apr 2023 • Wenwen Yu, MingYu Liu, Mingrui Chen, Ning Lu, Yinlong Wen, Yuliang Liu, Dimosthenis Karatzas, Xiang Bai
To promote research in this area, we organized ICDAR 2023 competition on reading the seal title (ReST), which included two tasks: seal title text detection (Task 1) and end-to-end seal title recognition (Task 2).
no code implementations • CVPR 2023 • Yongshuai Huang, Ning Lu, Dapeng Chen, Yibo Li, Zecheng Xie, Shenggao Zhu, Liangcai Gao, Wei Peng
The ablation study also validates that the proposed coordinate sequence decoder and the visual-alignment loss are the keys to the success of our method.
no code implementations • 10 Mar 2023 • Xiucheng Wang, Nan Cheng, Longfei Ma, Ruijin Sun, Rong Chai, Ning Lu
In this paper, to deal with the heterogeneity in federated learning (FL) systems, a knowledge distillation (KD) driven training framework for FL is proposed, where each user can select its neural network model on demand and distill knowledge from a big teacher model using its own private dataset.
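A minimal sketch of one client-side distillation step under this setup, blending hard-label cross-entropy with soft-label KL to the shared teacher; the temperature and weighting are illustrative assumptions.

```python
# Minimal sketch: an FL client trains its own small model on private data while
# distilling from a shared big teacher model.
import torch
import torch.nn.functional as F

def client_kd_step(student, teacher, x, y, optimizer, tau=2.0, alpha=0.7):
    student.train()
    with torch.no_grad():
        t_logits = teacher(x)                       # teacher soft labels
    s_logits = student(x)
    ce = F.cross_entropy(s_logits, y)               # hard-label loss
    kd = F.kl_div(F.log_softmax(s_logits / tau, dim=-1),
                  F.softmax(t_logits / tau, dim=-1),
                  reduction="batchmean") * tau * tau
    loss = alpha * kd + (1 - alpha) * ce
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```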
no code implementations • 6 Feb 2023 • Ning Lu, Shengcai Liu, Zhirui Zhang, Qi Wang, Haifeng Liu, Ke Tang
Our comprehensive experiments reveal that in approximately 90\% of cases, word-level attacks lead to the generation of examples where the frequency of $n$-grams decreases, a tendency we term as the $n$-gram Frequency Descend ($n$-FD).
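A toy sketch of checking this n-FD tendency on a single sentence pair, comparing average corpus bigram frequencies before and after a word-level substitution; the corpus and sentences are illustrative.

```python
# Toy sketch: word-level adversarial substitutions tend to lower the average
# corpus frequency of the sentence's n-grams (n-gram Frequency Descend).
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

corpus = "the movie was great and the acting was great".split()
freq = Counter(ngrams(corpus, 2))

def avg_freq(sentence, n=2):
    grams = ngrams(sentence.split(), n)
    return sum(freq[g] for g in grams) / max(len(grams), 1)

print(avg_freq("the movie was great"))      # original sentence
print(avg_freq("the film was marvelous"))   # word-level rewrite: lower frequency
```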
no code implementations • 19 Jan 2023 • Rongxing Hu, Ashwin Shirsat, Valliappan Muthukaruppan, Si Zhang, Yiyan Li, Lidong Song, Bei Xu, Victor Paduani, Ning Lu, Mesut Baran, Wenyuan Tang
This paper presents a novel 2-stage microgrid unit commitment (Microgrid-UC) algorithm considering cold-load pickup (CLPU) effects, three-phase load balancing requirements, and feasible reconfiguration options.
no code implementations • 16 Dec 2022 • Rongxing Hu, Kai Ye, Hyeonjin Kim, Hanpyo Lee, Ning Lu, Di Wu, PJ Rehm
This paper presents a coordinative demand charge mitigation (DCM) strategy for reducing electricity consumption during system peak periods.
no code implementations • 9 Dec 2022 • Kai Ye, Hyeonjin Kim, Yi Hu, Ning Lu, Di Wu, PJ Rehm
This paper presents a modified sequence-to-point (S2P) algorithm for disaggregating the heating, ventilation, and air conditioning (HVAC) load from the total building electricity consumption.
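A hedged sketch of the sequence-to-point mapping, where a sliding window of total load is regressed to the HVAC power at the window midpoint; the layer sizes are illustrative and not the paper's architecture.

```python
# Hedged sketch of the sequence-to-point idea: a window of total building load
# is mapped to the HVAC power at the window midpoint by a small 1-D CNN.
import torch
import torch.nn as nn

class S2P(nn.Module):
    def __init__(self, window: int = 99):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * window, 64), nn.ReLU(),
            nn.Linear(64, 1),            # HVAC power at the window midpoint
        )

    def forward(self, x):                # x: (batch, 1, window) total load
        return self.net(x)

model = S2P()
print(model(torch.randn(8, 1, 99)).shape)  # torch.Size([8, 1])
```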
no code implementations • 7 Dec 2022 • Victor Paduani, Qi Xiao, Bei Xu, David Lubkeman, Ning Lu
The controller's objective is to control the PV and BESS to follow power setpoints sent to the hybrid system while maintaining desired power reserves and meeting system operational constraints.
no code implementations • 29 Nov 2022 • Yiyan Li, Lidong Song, Yi Hu, Hanpyo Lee, Di Wu, PJ Rehm, Ning Lu
We propose a Generator structure consisting of a coarse network and a fine-tuning network.
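A hedged sketch of such a two-stage generator, where a coarse network produces a rough profile and a fine-tuning network refines it (rendered here as a residual correction, which is an assumption); layer sizes are illustrative.

```python
# Minimal sketch of a two-stage generator: a coarse network produces a rough
# profile and a fine-tuning network refines it.
import torch
import torch.nn as nn

class CoarseFineGenerator(nn.Module):
    def __init__(self, z_dim=32, profile_len=96):
        super().__init__()
        self.coarse = nn.Sequential(
            nn.Linear(z_dim, 128), nn.ReLU(), nn.Linear(128, profile_len))
        self.fine = nn.Sequential(
            nn.Linear(profile_len, 128), nn.ReLU(), nn.Linear(128, profile_len))

    def forward(self, z):
        rough = self.coarse(z)
        return rough + self.fine(rough)    # residual refinement of the rough profile

gen = CoarseFineGenerator()
print(gen(torch.randn(4, 32)).shape)       # torch.Size([4, 96])
```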
no code implementations • 7 Nov 2022 • Han Pyo Lee, Yiyan Li, Lidong Song, Di Wu, Ning Lu
In contrast to many existing methods, we treat CVR baseline estimation as a missing data retrieval problem.
1 code implementation • 2 Nov 2022 • Ziyou Ren, Nan Cheng, Ruijin Sun, Xiucheng Wang, Ning Lu, Wenchao Xu
Multiple-input multiple-output and orthogonal frequency-division multiplexing (MIMO-OFDM) are the key technologies in 4G and subsequent wireless communication systems.
no code implementations • 3 Oct 2022 • Yi Hu, Yiyan Li, Lidong Song, Han Pyo Lee, PJ Rehm, Matthew Makdad, Edmond Miller, Ning Lu
This paper presents a deep-learning framework, Multi-load Generative Adversarial Network (MultiLoad-GAN), for generating a group of synthetic load profiles (SLPs) simultaneously.
no code implementations • 1 Oct 2022 • Han Pyo Lee, PJ Rehm, Matthew Makdad, Edmond Miller, Ning Lu
To ensure the credibility of the identification results, utility engineers conduct field verification for all 13 feeders.
no code implementations • 19 Sep 2022 • Hyeonjin Kim, Kai Ye, Han Pyo Lee, Rongxing Hu, Ning Lu, Di Wu, PJ Rehm
The residual load profiles are processed using ICA for HVAC load extraction.
no code implementations • 23 Aug 2022 • Valliappan Muthukaruppan, Ashwin Shirsat, Rongxing Hu, Victor Paduani, Bei Xu, Yiyan Li, Mesut Baran, Ning Lu, David Lubkeman, Wenyuan Tang
Managing such a feeder-level microgrid, however, poses many challenges, such as the limited resources that can be deployed on the feeder quickly and the limited real-time monitoring and control available on the distribution system.
no code implementations • 2 Aug 2022 • Longfei Ma, Nan Cheng, Xiucheng Wang, Ruijin Sun, Ning Lu
On-demand service provisioning is a critical yet challenging issue in 6G wireless communication networks, since emerging services have significantly diverse requirements and the network resources become increasingly heterogeneous and dynamic.
no code implementations • 10 Feb 2022 • Ashwin Shirsat, Valliappan Muthukaruppan, Rongxing Hu, Victor Paduani, Bei Xu, Lidong Song, Yiyan Li, Ning Lu, Mesut Baran, David Lubkeman, Wenyuan Tang
Distribution system integrated community microgrids (CMGs) can partake in restoring loads during extended duration outages.
1 code implementation • 23 Dec 2021 • Fu Peng, Shengcai Liu, Ning Lu, Ke Tang
This work considers a challenging Deep Neural Network (DNN) quantization task that seeks to train quantized DNNs without involving any full-precision operations.
no code implementations • 23 Nov 2021 • Si Zhang, Mingzhi Zhang, Rongxing Hu, David Lubkeman, Yunan Liu, Ning Lu
In Stage 1 (individual training), while holding all the other agents inactive, we separately train each agent to obtain its own optimal VVC actions in the action space: {consume, generate, do-nothing}.
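A minimal sketch of Stage-1 training for a single agent with this three-action space, using tabular Q-learning over discretized voltage states; the environment interface, state discretization, and hyperparameters are assumptions.

```python
# Minimal sketch: one agent's Q-table over discretized local voltage states and
# the discrete VVC action space {consume, generate, do-nothing}.
import numpy as np

ACTIONS = ["consume", "generate", "do-nothing"]
N_STATES = 21                      # discretized local voltage bins
q_table = np.zeros((N_STATES, len(ACTIONS)))

def choose_action(s, eps=0.1):
    if np.random.rand() < eps:
        return np.random.randint(len(ACTIONS))   # explore
    return int(q_table[s].argmax())              # exploit

def q_update(s, a, reward, s_next, alpha=0.1, gamma=0.95):
    td_target = reward + gamma * q_table[s_next].max()
    q_table[s, a] += alpha * (td_target - q_table[s, a])
```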
no code implementations • 20 Nov 2021 • Han Pyo Lee, Mingzhi Zhang, Mesut Baran, Ning Lu, PJ Rehm, Edmond Miller, Matthew Makdad
To improve the identification accuracy, a data segmentation method is proposed to exclude data segments collected when the voltage correlation between smart meters on the same phase is weakened.
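A hedged sketch of such correlation-based segmentation, keeping only windows where the Pearson correlation between two meters' voltages stays above a threshold; the window length and threshold are illustrative.

```python
# Hedged sketch: exclude data segments with weak voltage correlation between
# two smart meters, keeping only strongly correlated windows.
import numpy as np

def strong_correlation_segments(v1, v2, win=96, threshold=0.8):
    """Return start indices of windows whose Pearson correlation >= threshold."""
    keep = []
    for start in range(0, len(v1) - win + 1, win):
        a, b = v1[start:start + win], v2[start:start + win]
        if np.corrcoef(a, b)[0, 1] >= threshold:
            keep.append(start)
    return keep

rng = np.random.default_rng(0)
base = rng.normal(1.0, 0.01, 96 * 10)                 # shared voltage variation
meter1 = base + rng.normal(0, 0.001, base.size)
meter2 = base + rng.normal(0, 0.001, base.size)
print(len(strong_correlation_segments(meter1, meter2)))  # most windows kept
```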
no code implementations • 19 Nov 2021 • Victor Paduani, Bei Xu, David Lubkeman, Ning Lu
This paper presents the development and benchmarking of a novel real-time electromagnetic-transient and transient-stability (EMT-TS) modeling architecture for distribution feeder restoration studies.
no code implementations • 18 Nov 2021 • Bei Xu, Victor Paduani, Hui Yu, David Lubkeman, Ning Lu
Compared with the conventional rotating reference frame ($dq$) based control scheme, the proposed scheme shows better dynamic performance.
no code implementations • 16 Nov 2021 • Yiyan Li, Lidong Song, Si Zhang, Laura Kraus, Taylor Adcox, Roger Willardson, Abhishek Komandur, Ning Lu
The hybrid framework consists of two forecasting models: a physics-based trend forecasting (TF) model and a data-driven cloud-event forecasting (CF) model.
1 code implementation • 2 Nov 2021 • Shengcai Liu, Ning Lu, Wenjing Hong, Chao Qian, Ke Tang
The field of adversarial textual attack has significantly grown over the last few years, where the commonly considered objective is to craft adversarial examples (AEs) that can successfully fool the target model.
no code implementations • 14 Oct 2021 • Mingzhi Zhang, Xiangqi Zhu, Ning Lu
At the DER-level, a two-dimensional flexibility region can be formed based on the real and reactive power regulating limits of each DER considering forecast uncertainty.
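A minimal sketch of forming such a flexibility region from an apparent-power limit and a forecast-uncertainty margin on real power; all values are illustrative assumptions.

```python
# Minimal sketch: a DER's (P, Q) flexibility region under an apparent-power
# limit, with the real-power range shrunk by a forecast uncertainty margin.
import numpy as np

def pq_flexibility_region(s_max, p_forecast, p_uncertainty, n=50):
    """Sample (P, Q) limits: |S| <= s_max with P restricted by forecast error."""
    p_min = max(0.0, p_forecast - p_uncertainty)
    p_max = min(s_max, p_forecast + p_uncertainty)
    p = np.linspace(p_min, p_max, n)
    q_max = np.sqrt(np.maximum(s_max**2 - p**2, 0.0))
    return p, -q_max, q_max          # real power, lower/upper reactive limits

p, q_lo, q_hi = pq_flexibility_region(s_max=100.0, p_forecast=80.0, p_uncertainty=10.0)
print(p[0], q_hi[0])                  # 70.0, ~71.4 kvar available at P = 70 kW
```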
no code implementations • 6 Sep 2021 • Shengcai Liu, Ning Lu, Cheng Chen, Ke Tang
Over the past few years, various word-level textual attack approaches have been proposed to reveal the vulnerability of deep neural networks used in natural language processing.
no code implementations • 18 Jul 2021 • Lidong Song, Yiyan Li, Ning Lu
When training the ProfileSR-GAN generator network, to make the generated profiles more realistic, we introduce two new shape-related losses in addition to the conventionally used content loss: adversarial loss and feature-matching loss.
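A hedged sketch of a generator objective combining the three loss terms named above; the weights and the discriminator interface (returning features and a logit) are assumptions, not the paper's values.

```python
# Hedged sketch: generator loss = content loss + adversarial loss + feature-matching loss.
import torch
import torch.nn.functional as F

def generator_loss(sr, hr, disc, w_adv=1e-3, w_fm=1e-2):
    """sr/hr: super-resolved and ground-truth high-resolution load profiles."""
    content = F.mse_loss(sr, hr)
    d_fake_feats, d_fake_out = disc(sr)          # assumed: disc returns (features, logit)
    d_real_feats, _ = disc(hr)
    adversarial = F.binary_cross_entropy_with_logits(
        d_fake_out, torch.ones_like(d_fake_out))
    feature_matching = F.l1_loss(d_fake_feats, d_real_feats.detach())
    return content + w_adv * adversarial + w_fm * feature_matching
```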
no code implementations • 11 May 2021 • Victor Paduani, Hui Yu, Bei Xu, Ning Lu
By using MPPE to decouple the impact of irradiance changes on the measured PV output power, we develop a fast convergence technique for tracking power-reference changes within three FPPT iterations.
no code implementations • 4 May 2021 • Asmaa Alrushoud, Ning Lu
IBRs are used in the first stage to regulate voltage changes continuously, and VRs are used in the second stage to correct large voltage deviations.
no code implementations • 31 Dec 2020 • Asmaa Alrushoud, Catie McEntee, Ning Lu
Because the zones are only weakly coupled, the voltage of each zone can be controlled independently.
no code implementations • 19 Nov 2020 • Ashwin Shirsat, Valliappan Muthukaruppan, Rongxing Hu, Ning Lu, Mesut Baran, David Lubkeman, Wenyuan Tang
The intermediate near real-time scheduling stage updates the DA schedule closer to the dispatch time, followed by the RT dispatch stage.
no code implementations • 16 Nov 2020 • Boyao Li, Tao Lu, Jiayi Li, Ning Lu, Yinghao Cai, Shuo Wang
Exploration in environments with sparse feedback remains a challenging research problem in reinforcement learning (RL).
no code implementations • 25 Sep 2020 • Yiyan Li, Si Zhang, Rongxing Hu, Ning Lu
This paper presents a meta-learning based, automatic distribution system load forecasting model selection framework.
1 code implementation • 3 Sep 2020 • Weijia Wu, Ning Lu, Enze Xie
To address the severe domain distribution mismatch, we propose a synthetic-to-real domain adaptation method for scene text detection, which transfers knowledge from synthetic data (source domain) to real data (target domain).
2 code implementations • 16 Apr 2020 • Wenwen Yu, Ning Lu, Xianbiao Qi, Ping Gong, Rong Xiao
Computer vision with state-of-the-art deep learning models has recently achieved huge success in the field of Optical Character Recognition (OCR), including text detection and recognition tasks.
no code implementations • 3 Apr 2020 • Ming Liang, Yao Meng, Jiyu Wang, David Lubkeman, Ning Lu
This paper presents a novel, automated, generative adversarial networks (GAN) based synthetic feeder generation mechanism, abbreviated as FeederGAN.
7 code implementations • 7 Oct 2019 • Ning Lu, Wenwen Yu, Xianbiao Qi, Yihao Chen, Ping Gong, Rong Xiao, Xiang Bai
Attention-based scene text recognizers have gained huge success, as they leverage a more compact intermediate representation to learn 1D or 2D attention with an RNN-based encoder-decoder architecture.