no code implementations • 18 Jul 2023 • Nathaniel Josephs, Arash A. Amini, Marina Paez, Lizhen Lin
We introduce the nested stochastic block model (NSBM) to cluster a collection of networks while simultaneously detecting communities within each network.
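For intuition, here is a minimal sketch of the generative structure behind a nested stochastic block model: each network is assigned to a cluster, and each cluster carries its own SBM governing the communities and edges within its networks. The parameter names (`cluster_probs`, `community_probs`, `block_mats`) are illustrative assumptions; the NSBM places priors on these quantities and infers them jointly, which this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_nested_sbm(n_networks, n_nodes, cluster_probs, community_probs, block_mats):
    """Generate networks from a nested SBM: each network gets a cluster label,
    and its nodes get community labels from that cluster's own SBM."""
    networks = []
    for _ in range(n_networks):
        c = rng.choice(len(cluster_probs), p=cluster_probs)          # network-level cluster
        z = rng.choice(len(community_probs[c]), size=n_nodes,
                       p=community_probs[c])                         # node-level communities
        P = block_mats[c][np.ix_(z, z)]                              # edge probabilities
        A = np.triu(rng.random((n_nodes, n_nodes)) < P, 1).astype(int)
        networks.append((c, z, A + A.T))                             # symmetric, no self-loops
    return networks

# Example: two clusters of networks, each with its own two-community SBM
nets = sample_nested_sbm(
    n_networks=5, n_nodes=30,
    cluster_probs=[0.5, 0.5],
    community_probs=[[0.5, 0.5], [0.7, 0.3]],
    block_mats=[np.array([[0.6, 0.05], [0.05, 0.6]]),
                np.array([[0.3, 0.2], [0.2, 0.3]])],
)
```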
no code implementations • 29 May 2023 • Ilsang Ohn, Lizhen Lin, Yongdai Kim
In this paper, we propose a new Bayesian inference method for a high-dimensional sparse factor model that allows both the factor dimensionality and the sparse structure of the loading matrix to be inferred.
no code implementations • 21 Apr 2023 • Steven Winter, Trevor Campbell, Lizhen Lin, Sanvesh Srivastava, David B. Dunson
Bayesian models are a powerful tool for studying complex data, allowing the analyst to encode rich hierarchical dependencies and leverage prior information.
no code implementations • 5 Mar 2023 • Forough Fazeli-Asl, Michael Minyi Zhang, Lizhen Lin
Bayesian methods for GOF can be appealing due to their ability to incorporate expert knowledge through prior distributions.
no code implementations • 16 Feb 2023 • Yihao Fang, Ilsang Ohn, Vijay Gupta, Lizhen Lin
We propose extrinsic and intrinsic deep neural network architectures as general frameworks for deep learning on manifolds.
no code implementations • 21 Dec 2022 • Yihao Fang, Mu Niu, Pokman Cheung, Lizhen Lin
We propose an extrinsic Bayesian optimization (eBO) framework for general optimization problems on manifolds.
1 code implementation • 4 Mar 2022 • Luyi Shen, Arash Amini, Nathaniel Josephs, Lizhen Lin
The increasing prevalence of network data in a vast variety of fields and the need to extract useful information out of them have spurred fast developments in related models and algorithms.
1 code implementation • 25 Nov 2021 • Bernardo Aquino, Arash Rahnama, Peter Seiler, Lizhen Lin, Vijay Gupta
Adversarial examples can easily degrade the classification performance in neural networks.
no code implementations • 7 Sep 2021 • Ilsang Ohn, Lizhen Lin
In this paper, we explore adaptive inference based on variational Bayes.
no code implementations • 4 Sep 2021 • Ziqing Hu, Yihao Fang, Lizhen Lin
In this work, we propose to train a graph neural network via resampling from a graphon estimate obtained from the underlying network data.
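A minimal sketch of the resampling idea, under assumptions not stated in the snippet: the graphon is estimated by a crude degree-sorted histogram (`histogram_graphon`), and synthetic graphs drawn from it would serve as resampled training data for the GNN. The estimator, block count, and omitted training loop are placeholders, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def histogram_graphon(A, n_blocks):
    """Crude blockwise-constant graphon estimate: sort nodes by degree,
    split them into equal-size blocks, and average edge densities."""
    order = np.argsort(A.sum(axis=1))
    blocks = np.array_split(order, n_blocks)
    W = np.zeros((n_blocks, n_blocks))
    for i, bi in enumerate(blocks):
        for j, bj in enumerate(blocks):
            W[i, j] = A[np.ix_(bi, bj)].mean()
    return W

def resample_graph(W, n_nodes):
    """Draw one synthetic adjacency matrix from the estimated graphon."""
    u = rng.integers(W.shape[0], size=n_nodes)          # latent block per node
    P = W[np.ix_(u, u)]
    A = np.triu(rng.random((n_nodes, n_nodes)) < P, 1).astype(int)
    return A + A.T

# A_obs = observed adjacency matrix of the underlying network (assumed given)
# W_hat = histogram_graphon(A_obs, n_blocks=10)
# synthetic = [resample_graph(W_hat, A_obs.shape[0]) for _ in range(20)]
# The GNN would then be trained on these resampled graphs (loop not shown).
```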
no code implementations • 9 May 2021 • Minwoo Chae, Dongha Kim, Yongdai Kim, Lizhen Lin
In the considered model, a usual likelihood approach can fail to estimate the target distribution consistently due to the singularity.
no code implementations • 18 Oct 2020 • Lizhen Lin, Bayan Saparbayeva, Michael Minyi Zhang, David B. Dunson
One of the key challenges for optimization on manifolds is the difficulty of verifying the complexity of the objective function, e.g., whether the objective function is convex or non-convex, and the degree of non-convexity.
no code implementations • 29 Sep 2020 • Dong Quan Ngoc Nguyen, Lin Xing, Lizhen Lin
We introduce, for the first time, a metric geometry approach to studying edge weight prediction in WDNs.
1 code implementation • 29 Sep 2020 • Dong Quan Ngoc Nguyen, Lin Xing, Lizhen Lin
Also based on the topological space structure of hypergraph data introduced in our paper, we introduce a modified nearest neighbors method, which is a generalization of the classical nearest neighbors method from machine learning.
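To make the baseline concrete, the sketch below shows only the classical k-nearest-neighbors vote that the paper generalizes; the hypergraph-induced distance is represented by a placeholder callable (`distance`), with a Euclidean stand-in for illustration.

```python
import numpy as np
from collections import Counter

def knn_predict(distance, train_items, train_labels, query, k=5):
    """Classical k-NN majority vote; `distance` is any dissimilarity, e.g.
    one induced by the topological structure placed on hypergraph data."""
    d = np.array([distance(query, x) for x in train_items])
    nearest = np.argsort(d)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Euclidean placeholder; the modified method would swap in a hypergraph metric.
euclidean = lambda a, b: np.linalg.norm(np.asarray(a) - np.asarray(b))
label = knn_predict(euclidean, [[0, 0], [1, 1], [5, 5]], ["a", "a", "b"], [0.5, 0.5], k=1)
```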
no code implementations • 28 Sep 2020 • Yihao Hu, Tong Zhao, Zhiliang Xu, Lizhen Lin
Inspired by the traditional finite difference and finite element methods and emerging advancements in machine learning, we propose a sequence-to-sequence learning (Seq2Seq) framework called Neural-PDE, which allows one to automatically learn the governing rules of any time-dependent PDE system from existing data by using a bidirectional LSTM encoder, and predict the solutions in the next $n$ time steps.
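A toy sketch of the sequence-to-sequence idea: a bidirectional LSTM encoder over past discretized PDE states feeds an autoregressive LSTM decoder that rolls out the next $n$ steps. Layer sizes, the decoder design, and the class name `Seq2SeqPDE` are assumptions, not the Neural-PDE architecture as published.

```python
import torch
import torch.nn as nn

class Seq2SeqPDE(nn.Module):
    """Toy Seq2Seq surrogate: encode m past snapshots of a discretized PDE
    state with a bidirectional LSTM, then roll out the next n snapshots."""
    def __init__(self, state_dim, hidden_dim=64, n_future=5):
        super().__init__()
        self.n_future = n_future
        self.encoder = nn.LSTM(state_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        self.decoder = nn.LSTM(state_dim, 2 * hidden_dim, batch_first=True)
        self.readout = nn.Linear(2 * hidden_dim, state_dim)

    def forward(self, past):                      # past: (batch, m, state_dim)
        _, (h, c) = self.encoder(past)
        # Merge the forward/backward directions into the decoder's state
        h = torch.cat([h[0], h[1]], dim=-1).unsqueeze(0)
        c = torch.cat([c[0], c[1]], dim=-1).unsqueeze(0)
        x, preds = past[:, -1:, :], []
        for _ in range(self.n_future):            # autoregressive rollout
            out, (h, c) = self.decoder(x, (h, c))
            x = self.readout(out)
            preds.append(x)
        return torch.cat(preds, dim=1)            # (batch, n_future, state_dim)

model = Seq2SeqPDE(state_dim=128)
u_past = torch.randn(4, 10, 128)                  # 4 trajectories, 10 past steps
u_next = model(u_past)                            # predicted next 5 steps
```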
1 code implementation • 8 Sep 2020 • Yihao Hu, Tong Zhao, Shixin Xu, Zhiliang Xu, Lizhen Lin
Partial differential equations (PDEs) play a crucial role in studying a vast number of problems in science and engineering.
1 code implementation • 21 Aug 2020 • Mohammad Rasool Izadi, Yihao Fang, Robert Stevenson, Lizhen Lin
In this work, we propose to employ information-geometric tools to optimize a graph neural network architecture such as the graph convolutional networks.
Ranked #1 on Node Classification on Cora
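As a rough illustration of bringing information-geometric quantities into the optimization of a graph convolution, the sketch below preconditions the gradient of a one-layer graph convolution with a diagonal empirical-Fisher estimate. This crude natural-gradient-like stand-in is not the paper's method; all names and hyperparameters here are assumptions.

```python
import torch
import torch.nn.functional as F

# One-layer graph convolution, logits = A_hat @ X @ W, trained with a
# diagonal empirical-Fisher preconditioner on the gradient step.
n, d, k = 6, 4, 2
A = (torch.rand(n, n) > 0.7).float() + torch.eye(n)
A = ((A + A.T) > 0).float()                           # symmetric, with self-loops
deg = A.sum(1)
A_hat = A / torch.sqrt(deg[:, None] * deg[None, :])   # symmetric normalization
X, y = torch.randn(n, d), torch.randint(k, (n,))
W = torch.randn(d, k, requires_grad=True)

fisher_diag = torch.zeros_like(W)
lr, damping = 0.05, 0.1
for step in range(200):
    loss = F.cross_entropy(A_hat @ X @ W, y)
    loss.backward()
    with torch.no_grad():
        # Running estimate of the diagonal Fisher from squared gradients,
        # followed by a preconditioned (natural-gradient-like) update.
        fisher_diag = 0.9 * fisher_diag + 0.1 * W.grad ** 2
        W -= lr * W.grad / (fisher_diag + damping)
        W.grad.zero_()
```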
2 code implementations • 9 Apr 2020 • Nathaniel Josephs, Lizhen Lin, Steven Rosenberg, Eric D. Kolaczyk
While the study of a single network is well-established, technological advances now allow for the collection of multiple networks with relative ease.
1 code implementation • 30 Mar 2019 • Arash A. Amini, Marina S. Paez, Lizhen Lin
Moreover, our model automatically infers the necessary number of communities at each layer (as validated by real data examples).
2 code implementations • 21 Mar 2019 • Arash A. Amini, Marina Paez, Lizhen Lin, Zahra S. Razaee
We propose an exact slice sampler for the hierarchical Dirichlet process (HDP) and its associated mixture models (Teh et al., 2006).
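The paper's contribution is an exact slice sampler for the HDP itself; as background, here is the elementary univariate slice-sampling move (stepping out plus shrinkage) on which such samplers build. The function name and the Gaussian example are illustrative only, not the HDP sampler.

```python
import numpy as np

rng = np.random.default_rng(2)

def slice_sample(logpdf, x0, n_samples, width=1.0):
    """Univariate slice sampler with stepping-out and shrinkage."""
    x, samples = x0, []
    for _ in range(n_samples):
        # Auxiliary level: u ~ Uniform(0, p(x)), kept on the log scale
        log_u = logpdf(x) + np.log(rng.random())
        lo = x - width * rng.random()
        hi = lo + width
        while logpdf(lo) > log_u:                  # step out to the left
            lo -= width
        while logpdf(hi) > log_u:                  # step out to the right
            hi += width
        while True:                                # shrink until acceptance
            x_new = rng.uniform(lo, hi)
            if logpdf(x_new) > log_u:
                x = x_new
                break
            if x_new < x:
                lo = x_new
            else:
                hi = x_new
        samples.append(x)
    return np.array(samples)

# Example: draws from a standard normal via its unnormalized log-density
draws = slice_sample(lambda x: -0.5 * x ** 2, x0=0.0, n_samples=1000)
```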
no code implementations • NeurIPS 2018 • Bayan Saparbayeva, Michael Minyi Zhang, Lizhen Lin
Our work aims to fill a critical gap in the literature by generalizing parallel inference algorithms to optimization on manifolds.
no code implementations • 5 Oct 2018 • Dianbin Bao, Kisung You, Lizhen Lin
In this paper, we focus on proposing a distance between network objects.
no code implementations • 3 Jan 2018 • Mu Niu, Pokman Cheung, Lizhen Lin, Zhenwen Dai, Neil Lawrence, David Dunson
in-GPs respect the potentially complex boundary or interior conditions as well as the intrinsic geometry of the spaces.
no code implementations • 19 Oct 2016 • Michael Minyi Zhang, Henry Lam, Lizhen Lin
Effective and accurate model selection is an important problem in modern data analysis.
1 code implementation • NeurIPS 2017 • Soumendu Sundar Mukherjee, Purnamrita Sarkar, Lizhen Lin
Community detection, which focuses on clustering nodes or detecting communities in (mostly) a single network, is a problem of considerable practical interest and has received a great deal of attention in the research community.
no code implementations • 11 Mar 2014 • Stanislav Minsker, Sanvesh Srivastava, Lizhen Lin, David B. Dunson
We propose a novel approach to Bayesian analysis that is provably robust to outliers in the data and often has computational advantages over standard methods.