
no code implementations • 28 Jul 2022 • Ping Wei, Sheng Li, Xinpeng Zhang, Ge Luo, Zhenxing Qian, Qing Zhou

A new steganographic approach called generative steganography (GS) has emerged recently, in which stego images (images containing secret data) are generated from secret data directly without cover media.

no code implementations • 12 Jun 2022 • Arash A. Amini, Bryon Aragam, Qing Zhou

We introduce and study the neighbourhood lattice decomposition of a distribution, which is a compact, non-graphical representation of conditional independence that is valid in the absence of a faithful graphical representation.

no code implementations • 3 Feb 2022 • Gabriel Ruiz, Oscar Hernan Madrid Padilla, Qing Zhou

We demonstrate a novel application of this general approach to estimate the topological ordering of a DAG.

no code implementations • 23 Jan 2022 • Qiaoling Ye, Arash A. Amini, Qing Zhou

We consider the task of learning causal structures from data stored on multiple machines, and propose a novel structure learning method called distributed annealing on regularized likelihood score (DARLS) to solve this problem.

no code implementations • 22 Mar 2021 • Jireh Huang, Qing Zhou

We develop a novel hybrid method for Bayesian network structure learning called partitioned hybrid greedy search (pHGS), composed of three distinct yet compatible new algorithms: Partitioned PC (pPC) accelerates skeleton learning via a divide-and-conquer strategy, $p$-value adjacency thresholding (PATH) effectively accomplishes parameter tuning with a single execution, and hybrid greedy initialization (HGI) maximally utilizes constraint-based information to obtain a high-scoring and well-performing initial graph for greedy search.

no code implementations • JEPTALNRECITAL 2020 • Qing Zhou, Didier Demolin

In the first experiment, naturally produced monosyllabic stimuli, composed of 9 onsets ([ø (null), p, t, tʰ, tɕ, ɕ, tʂ, tʂʰ, m]) and 2 rimes ([i, ɑu]), were identified by 19 beginner-level French learners of Mandarin and 18 native Mandarin listeners.

no code implementations • 20 Apr 2020 • Bingling Wang, Qing Zhou

Discovery of causal relationships from observational data is an important problem in many areas.

no code implementations • NeurIPS 2019 • Bryon Aragam, Arash Amini, Qing Zhou

We prove that $\Omega(s\log p)$ samples suffice to learn a sparse Gaussian directed acyclic graph (DAG) from data, where $s$ is the maximum Markov blanket size.

no code implementations • 3 Sep 2019 • Arash A. Amini, Bryon Aragam, Qing Zhou

Knowing when a graphical model is perfect with respect to a distribution is essential for relating separation in the graph to conditional independence in the distribution, and this is particularly important when performing inference from data.

no code implementations • 26 May 2019 • Hangjian Li, Oscar Hernan Madrid Padilla, Qing Zhou

Structural learning of directed acyclic graphs (DAGs) or Bayesian networks has been studied extensively under the assumption that data are independent.

1 code implementation • 28 Apr 2019 • Qiaoling Ye, Arash A. Amini, Qing Zhou

We propose a novel structure learning method, annealing on regularized Cholesky score (ARCS), to search over topological sorts, or permutations of nodes, for a high-scoring Bayesian network.
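The idea of scoring a permutation of nodes can be illustrated with a toy sketch: given a candidate topological order, each node is regressed on its predecessors with an $\ell_1$ penalty, and the penalized fits are summed into a score. This is only an illustrative stand-in, not the paper's actual ARCS objective or annealing search; the function name, penalty form, and `alpha` value are placeholders.

```python
# Toy sketch (not the ARCS method itself): score a candidate topological
# order of a Gaussian DAG by regressing each node on its predecessors
# with an l1 penalty and summing penalized residual losses. Lower is better.
import numpy as np
from sklearn.linear_model import Lasso

def order_score(X, order, alpha=0.1):
    n, _ = X.shape
    score = 0.0
    for i, node in enumerate(order):
        parents = order[:i]                  # predecessors in the candidate order
        y = X[:, node]
        if not parents:
            score += np.sum(y ** 2) / n      # no parents: marginal variance term
            continue
        fit = Lasso(alpha=alpha).fit(X[:, parents], y)
        resid = y - fit.predict(X[:, parents])
        score += np.sum(resid ** 2) / n + alpha * np.sum(np.abs(fit.coef_))
    return score

# Example: data from a chain 0 -> 1 -> 2; the true order should score lower.
rng = np.random.default_rng(1)
x0 = rng.standard_normal(500)
x1 = 2.0 * x0 + 0.1 * rng.standard_normal(500)
x2 = -1.5 * x1 + 0.1 * rng.standard_normal(500)
X = np.column_stack([x0, x1, x2])
print(order_score(X, [0, 1, 2]) < order_score(X, [2, 1, 0]))
```

A search procedure such as simulated annealing would then explore the space of permutations, keeping moves that lower this kind of score.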

no code implementations • 24 Apr 2019 • Jiaying Gu, Qing Zhou

Structure learning of Bayesian networks has always been a challenging problem.

1 code implementation • 3 Nov 2017 • Arash A. Amini, Bryon Aragam, Qing Zhou

We study the computational complexity of computing these structures and show that, under a sparsity assumption, they can be computed in polynomial time, even without assuming perfectness with respect to a graph.

3 code implementations • 11 Mar 2017 • Bryon Aragam, Jiaying Gu, Qing Zhou

To meet this challenge, we have developed a new R package called sparsebn for learning the structure of large, sparse graphical models with a focus on Bayesian networks.

2 code implementations • 29 Nov 2015 • Bryon Aragam, Arash A. Amini, Qing Zhou

We study a family of regularized score-based estimators for learning the structure of a directed acyclic graph (DAG) for a multivariate normal distribution from high-dimensional data with $p\gg n$.

no code implementations • 4 Dec 2014 • Yuliya Marchetti, Qing Zhou

We develop an iterative subsampling approach to improve the computational efficiency of our previous work on solution path clustering (SPC).

no code implementations • 24 Apr 2014 • Yuliya Marchetti, Qing Zhou

The rapid accumulation of large amounts of complex data has created a need for more sophisticated statistical methodologies to discover interesting patterns and better extract information from such data.

2 code implementations • 10 Mar 2014 • Jiaying Gu, Fei Fu, Qing Zhou

Bayesian networks, with structure given by a directed acyclic graph (DAG), are a popular class of graphical models.

no code implementations • 17 Jan 2014 • Qing Zhou

Regularized linear regression under the $\ell_1$ penalty, such as the Lasso, has been shown to be effective in variable selection and sparse modeling.
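As a concrete reminder of the variable-selection behavior the entry refers to, here is a minimal Lasso example using scikit-learn; the data, coefficients, and `alpha` value are arbitrary choices for illustration, not taken from the paper.

```python
# Minimal illustration of l1-penalized regression (the Lasso): with a
# suitable penalty, coefficients of irrelevant variables are driven
# exactly to zero, performing variable selection.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]                # only the first 3 variables matter
y = X @ beta + 0.1 * rng.standard_normal(n)

model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)      # indices with nonzero coefficients
print(selected)
```

In practice the penalty level `alpha` is chosen by cross-validation (e.g. `LassoCV`), trading off sparsity against fit.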

no code implementations • 4 Jan 2014 • Bryon Aragam, Qing Zhou

We develop a penalized likelihood estimation framework to estimate the structure of Gaussian Bayesian networks from observational data.

Papers With Code is a free resource with all data licensed under CC-BY-SA.