1 code implementation • 15 Apr 2024 • Xianghua Zeng, Hao Peng, Dingli Su, Angsheng Li
An innovative two-layer skill-based learning mechanism is introduced to compute the common path entropy of each state transition as its identified probability, thereby obviating the requirement for expert knowledge.
1 code implementation • 18 Dec 2023 • Guangjie Zeng, Hao Peng, Angsheng Li, Zhiwei Liu, Runze Yang, Chunyang Liu, Lifang He
In this work, we present Semi-supervised clustering via Structural Entropy (SSE), a novel method that can incorporate different types of constraints from diverse sources to perform both partitioning and hierarchical clustering.
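The pairwise constraints SSE ingests can be illustrated with a minimal sketch. This is a toy constrained agglomerative pass, not the paper's structural-entropy method: must-link pairs are pre-merged and cannot-link pairs block merges (the function name and helpers are hypothetical).

```python
from itertools import combinations

def constrained_clusters(points, must_link, cannot_link, k):
    """Toy constrained agglomerative clustering (illustrative only).

    points: list of (x, y) tuples; must_link / cannot_link: sets of
    index pairs; k: desired number of clusters.
    """
    # Start with singletons, then honour must-link constraints by pre-merging.
    clusters = [{i} for i in range(len(points))]
    for a, b in must_link:
        ca = next(c for c in clusters if a in c)
        cb = next(c for c in clusters if b in c)
        if ca is not cb:
            clusters.remove(cb)
            ca |= cb

    def dist(c1, c2):  # single-linkage squared Euclidean distance
        return min((points[i][0] - points[j][0]) ** 2 +
                   (points[i][1] - points[j][1]) ** 2
                   for i in c1 for j in c2)

    def violates(c1, c2):  # merging would join a cannot-link pair
        return any((i, j) in cannot_link or (j, i) in cannot_link
                   for i in c1 for j in c2)

    while len(clusters) > k:
        pairs = [(c1, c2) for c1, c2 in combinations(clusters, 2)
                 if not violates(c1, c2)]
        if not pairs:
            break  # constraints block all remaining merges
        c1, c2 = min(pairs, key=lambda p: dist(*p))
        clusters.remove(c2)
        c1 |= c2
    return clusters
```

With points [(0, 0), (0, 1), (5, 5), (5, 6)], must-link {(0, 1)} and cannot-link {(0, 2)}, the pass returns the two clusters {0, 1} and {2, 3}; SSE's contribution is performing this kind of constrained grouping via structural entropy, for both flat and hierarchical clustering.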
1 code implementation • 13 Dec 2023 • Xianghua Zeng, Hao Peng, Angsheng Li
The importance of effective detection is underscored by the fact that socialbots imitate human behavior to propagate misinformation, leading to an ongoing competition between socialbots and detectors.
1 code implementation • 5 Sep 2023 • Guangjie Zeng, Hao Peng, Angsheng Li, Zhiwei Liu, Chunyang Liu, Philip S. Yu, Lifang He
In this work, we propose a novel unsupervised Skin Lesion sEgmentation framework based on structural entropy and isolation forest outlier Detection, namely SLED.
1 code implementation • 24 Apr 2023 • Xianghua Zeng, Hao Peng, Angsheng Li, Chunyang Liu, Lifang He, Philip S. Yu
State abstraction optimizes decision-making by ignoring irrelevant environmental information in reinforcement learning with rich observations.
1 code implementation • 3 Apr 2023 • Xianghua Zeng, Hao Peng, Angsheng Li
Role-based learning is a promising approach to improving the performance of Multi-Agent Reinforcement Learning (MARL).
no code implementations • 27 Jan 2020 • Angsheng Li
It observes the data points of the real world, builds connections among the observed data, and constructs a data space, where the principle is to choose the connections between data points so that the decoding information of the data space is maximized. It then finds the encoding tree of the data space that minimizes the dynamical uncertainty of the data space; this encoding tree is referred to as a decoder, since it has already eliminated the maximum amount of uncertainty embedded in the data space. The semantics of the decoder, an encoding tree, are interpreted to form a knowledge tree, and the remarkable common features, both semantic and syntactic, of the modules decoded by the decoder are extracted to construct trees of abstractions, providing the foundations for intuitive reasoning in learning when new data are observed.
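The "decoding information" in this framework is measured by structural entropy. As a concrete point of reference, the one-dimensional structural entropy of a graph, the base quantity from this line of work, can be computed in a few lines (illustrative toy code, not the paper's implementation):

```python
import math

def one_dim_structural_entropy(edges):
    """One-dimensional structural entropy
    H1(G) = -sum_v (d_v / 2m) * log2(d_v / 2m),
    where d_v is the degree of vertex v and 2m is the volume of G."""
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    vol = sum(degree.values())  # equals 2m
    return -sum((d / vol) * math.log2(d / vol) for d in degree.values())

# A 4-cycle: every vertex has degree 2, so each term is (1/4)*log2(4)
# and H1 = log2(4) = 2.0 bits.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
```

Minimizing the higher-dimensional analogue of this quantity over encoding trees is what yields the "decoder" described above.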
1 code implementation • NeurIPS 2019 • Yiwei Liu, Jiamou Liu, Zijian Zhang, Liehuang Zhu, Angsheng Li
This paper focuses on the privacy risks of disclosing the community structure in an online social network.
1 code implementation • 30 Apr 2018 • George Barmpalias, Neng Huang, Andrew Lewis-Pye, Angsheng Li, Xuechen Li, YiCheng Pan, Tim Roughgarden
We introduce the \emph{idemetric} property, which formalises the idea that most nodes in a graph have similar distances between them, and which turns out to be quite standard amongst small-world network models.
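Read informally, the idemetric idea can be probed numerically: compute all-pairs shortest-path lengths by BFS and check how concentrated they are around the median. This is a heuristic illustration with hypothetical function names, not the paper's formal definition:

```python
from collections import deque
from itertools import combinations
from statistics import median

def bfs_dists(adj, src):
    """Shortest-path lengths from src in an unweighted graph
    given as an adjacency dict {node: [neighbours]}."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def distance_concentration(adj, slack=1):
    """Fraction of node pairs whose distance lies within `slack`
    of the median pairwise distance (a rough concentration proxy)."""
    dists = [bfs_dists(adj, u)[v] for u, v in combinations(adj, 2)]
    m = median(dists)
    return sum(abs(d - m) <= slack for d in dists) / len(dists)

# On a 5-cycle every pair is at distance 1 or 2, so all pairwise
# distances fall within 1 of the median and the fraction is 1.0.
five_cycle = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
```

A concentration near 1.0 on large graphs is the informal signature the idemetric property formalises for small-world network models.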
Social and Information Networks • Discrete Mathematics