no code implementations • 4 Jan 2014 • Bryon Aragam, Qing Zhou
We develop a penalized likelihood estimation framework to estimate the structure of Gaussian Bayesian networks from observational data.
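The key structural fact behind such estimators is that, for a fixed DAG, the Gaussian log-likelihood decomposes into one least-squares regression per node on its parents, so a penalized score can be accumulated node by node. Below is a minimal illustrative sketch of this kind of score (with a simple per-edge penalty `lam`), not the authors' actual estimator:

```python
import numpy as np

def penalized_bn_score(X, parents, lam):
    """Negative penalized Gaussian log-likelihood of a DAG (up to constants).

    X        : (n, p) data matrix with centered columns.
    parents  : list of lists; parents[j] gives the parent indices of node j.
    lam      : per-edge penalty weight (an illustrative l0-style penalty).

    The Gaussian likelihood factorizes node-wise, so the total score is a
    sum of per-node residual terms plus a penalty on the number of edges.
    """
    n, p = X.shape
    score = 0.0
    for j in range(p):
        pa = parents[j]
        if pa:
            Z = X[:, pa]
            beta, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
            resid = X[:, j] - Z @ beta
        else:
            resid = X[:, j]
        rss = resid @ resid
        # n/2 * log(sigma_hat^2) from the profiled Gaussian likelihood,
        # plus a penalty proportional to the size of the parent set.
        score += 0.5 * n * np.log(rss / n) + lam * len(pa)
    return score
```

With a BIC-like choice `lam = log(n)`, lower scores favor parent sets that genuinely reduce residual variance.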
no code implementations • 17 Jan 2014 • Qing Zhou
Regularized linear regression under the $\ell_1$ penalty, such as the Lasso, has been shown to be effective in variable selection and sparse modeling.
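As a quick reminder of how the $\ell_1$ penalty induces sparsity, the standard coordinate-descent algorithm for the Lasso applies a soft-thresholding step to each coefficient, which sets weak coefficients exactly to zero. This is a generic textbook sketch, not the method of the paper above:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/(2n)) * ||y - X b||^2 + lam * ||b||_1 by coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature x_j'x_j / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove all fitted effects except coordinate j.
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            # Closed-form univariate update with soft-thresholding.
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta
```

Because the update thresholds `rho` before rescaling, coordinates whose marginal correlation with the residual falls below `lam` are zeroed out exactly, which is what makes the Lasso a variable-selection method.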
1 code implementation • 10 Mar 2014 • Jiaying Gu, Fei Fu, Qing Zhou
Bayesian networks, with structure given by a directed acyclic graph (DAG), are a popular class of graphical models.
no code implementations • 24 Apr 2014 • Yuliya Marchetti, Qing Zhou
The rapid accumulation of large amounts of complex data has created a need for more sophisticated statistical methodologies to discover interesting patterns and to extract information from these data more effectively.
no code implementations • 4 Dec 2014 • Yuliya Marchetti, Qing Zhou
We develop an iterative subsampling approach to improve the computational efficiency of our previous work on solution path clustering (SPC).
1 code implementation • 29 Nov 2015 • Bryon Aragam, Arash A. Amini, Qing Zhou
We study a family of regularized score-based estimators for learning the structure of a directed acyclic graph (DAG) for a multivariate normal distribution from high-dimensional data with $p\gg n$.
2 code implementations • 11 Mar 2017 • Bryon Aragam, Jiaying Gu, Qing Zhou
To meet this challenge, we have developed a new R package called sparsebn for learning the structure of large, sparse graphical models with a focus on Bayesian networks.
1 code implementation • 3 Nov 2017 • Arash A. Amini, Bryon Aragam, Qing Zhou
We study the computational complexity of these structures and show that, under a sparsity assumption, they can be computed in polynomial time, even without assuming that the distribution is perfect with respect to a graph.
no code implementations • 24 Apr 2019 • Jiaying Gu, Qing Zhou
Structure learning of Bayesian networks has always been a challenging problem.
1 code implementation • 28 Apr 2019 • Qiaoling Ye, Arash A. Amini, Qing Zhou
We propose a novel structure learning method, annealing on regularized Cholesky score (ARCS), to search over topological sorts, or permutations of nodes, for a high-scoring Bayesian network.
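The search scheme underlying this family of methods can be illustrated with a generic simulated-annealing loop over permutations: propose a transposition, score the candidate ordering, and accept worse moves with Metropolis probability that shrinks as the temperature cools. The sketch below shows only the annealing-over-permutations skeleton with a user-supplied score function; it is not the ARCS regularized Cholesky score itself:

```python
import numpy as np

def anneal_permutations(score, p, n_steps=3000, t0=1.0, cooling=0.995, seed=0):
    """Simulated annealing over node permutations (topological sorts).

    score : callable mapping a permutation (np.ndarray) to a real number,
            lower is better. Here it is a placeholder for any ordering-based
            structure score.
    """
    rng = np.random.default_rng(seed)
    perm = rng.permutation(p)
    cur_score = score(perm)
    best, best_score = perm.copy(), cur_score
    temp = t0
    for _ in range(n_steps):
        # Propose swapping two positions in the current ordering.
        i, j = rng.choice(p, size=2, replace=False)
        cand = perm.copy()
        cand[i], cand[j] = cand[j], cand[i]
        delta = score(cand) - cur_score
        # Metropolis rule: always accept improvements, sometimes accept
        # worse moves, with probability decaying as the temperature cools.
        if delta < 0 or rng.random() < np.exp(-delta / temp):
            perm, cur_score = cand, cur_score + delta
            if cur_score < best_score:
                best, best_score = perm.copy(), cur_score
        temp *= cooling
    return best, best_score
```

Searching over orderings rather than over DAGs directly sidesteps the acyclicity constraint, since every permutation induces a valid topological sort.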
no code implementations • 26 May 2019 • Hangjian Li, Oscar Hernan Madrid Padilla, Qing Zhou
Structure learning of directed acyclic graphs (DAGs), or Bayesian networks, has been studied extensively under the assumption that data are independent.
no code implementations • 3 Sep 2019 • Arash A. Amini, Bryon Aragam, Qing Zhou
Knowing when a graphical model is perfect with respect to a distribution is essential for relating separation in the graph to conditional independence in the distribution, and this is particularly important when performing inference from data.
no code implementations • NeurIPS 2019 • Bryon Aragam, Arash Amini, Qing Zhou
We prove that $\Omega(s\log p)$ samples suffice to learn a sparse Gaussian directed acyclic graph (DAG) from data, where $s$ is the maximum Markov blanket size.
no code implementations • 20 Apr 2020 • Bingling Wang, Qing Zhou
Discovery of causal relationships from observational data is an important problem in many areas.
no code implementations • JEPTALNRECITAL 2020 • Qing Zhou, Didier Demolin
In the first experiment, naturally produced monosyllabic stimuli, composed of 9 onsets ([ø (zero), p, t, tʰ, tɕ, ɕ, tʂ, tʂʰ, m]) and 2 rimes ([i, ɑu]), were identified by 19 French beginner-level learners of Mandarin and 18 native Mandarin listeners.
no code implementations • 22 Mar 2021 • Jireh Huang, Qing Zhou
We develop a novel hybrid method for Bayesian network structure learning called partitioned hybrid greedy search (pHGS), composed of three distinct yet compatible new algorithms: Partitioned PC (pPC) accelerates skeleton learning via a divide-and-conquer strategy, $p$-value adjacency thresholding (PATH) effectively accomplishes parameter tuning with a single execution, and hybrid greedy initialization (HGI) maximally utilizes constraint-based information to obtain a high-scoring and well-performing initial graph for greedy search.
no code implementations • 23 Jan 2022 • Qiaoling Ye, Arash A. Amini, Qing Zhou
We consider the task of learning causal structures from data stored on multiple machines, and propose a novel structure learning method called distributed annealing on regularized likelihood score (DARLS) to solve this problem.
no code implementations • 3 Feb 2022 • Gabriel Ruiz, Oscar Hernan Madrid Padilla, Qing Zhou
We demonstrate a novel application of this general approach to estimate the topological ordering of a DAG.
no code implementations • 12 Jun 2022 • Arash A. Amini, Bryon Aragam, Qing Zhou
We introduce and study the neighbourhood lattice decomposition of a distribution, which is a compact, non-graphical representation of conditional independence that is valid in the absence of a faithful graphical representation.
no code implementations • 28 Jul 2022 • Ping Wei, Sheng Li, Xinpeng Zhang, Ge Luo, Zhenxing Qian, Qing Zhou
A new steganographic approach called generative steganography (GS) has emerged recently, in which stego images (images containing secret data) are generated from secret data directly without cover media.
no code implementations • 5 May 2023 • Ping Wei, Qing Zhou, Zichi Wang, Zhenxing Qian, Xinpeng Zhang, Sheng Li
However, existing GAN-based GS methods cannot completely recover the hidden secret data because the networks are not invertible, while flow-based methods produce poor image quality due to the stringent reversibility constraints imposed on each module.