no code implementations • EACL (LTEDI) 2021 • Olawale Onabola, Zhuang Ma, Xie Yang, Benjamin Akera, Ibraheem Abdulrahman, Jia Xue, Dianbo Liu, Yoshua Bengio
In this work, we present hBERT, where we modify certain layers of the pretrained BERT model with the new Hopfield Layer.
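The "Hopfield Layer" referenced here is, in modern formulations, an attention-like pattern-retrieval step. A minimal sketch of one continuous Hopfield retrieval update is below; the function and variable names are illustrative, not from the paper:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hopfield_retrieve(query, stored, beta=1.0):
    """One update step of a modern (continuous) Hopfield network:
    retrieve a convex combination of stored patterns given a query state.
    Higher beta sharpens retrieval toward the nearest pattern."""
    attn = softmax(beta * stored @ query)  # similarity to each stored pattern
    return stored.T @ attn                 # weighted mix of stored patterns

# Two stored patterns; the query is close to the first one.
stored = np.array([[1.0, 0.0], [0.0, 1.0]])
q = np.array([0.9, 0.1])
out = hopfield_retrieve(q, stored, beta=5.0)  # out is pulled toward [1, 0]
```

With a large beta the update converges quickly to the closest stored pattern, which is the associative-memory behavior such a layer contributes inside a transformer stack.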
no code implementations • 21 May 2022 • Dianbo Liu, Vedant Shah, Oussama Boussif, Cristian Meo, Anirudh Goyal, Tianmin Shu, Michael Mozer, Nicolas Heess, Yoshua Bengio
In Multi-Agent Reinforcement Learning (MARL), specialized channels are often introduced that allow agents to communicate directly with one another.
1 code implementation • 19 May 2022 • Mike He Zhu, Léna Néhale Ezzine, Dianbo Liu, Yoshua Bengio
Federated learning is a distributed machine learning approach in which a shared server model learns by aggregating parameter updates computed locally on training data held in spatially distributed client silos.
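The aggregation step described above can be sketched as a data-size-weighted average of client parameters (FedAvg-style). This is a minimal illustration, not the paper's specific algorithm; all names and numbers are made up:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Aggregate locally computed parameter vectors into a shared
    server model, weighting each client silo by its data volume."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Three hypothetical client silos with different amounts of local data.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 30, 60]
server_model = federated_average(clients, sizes)  # → array([4.0, 5.0])
```

Weighting by data volume keeps clients with more examples from being drowned out by small silos, while the raw training data never leaves the clients.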
no code implementations • 2 Feb 2022 • Dianbo Liu, Alex Lamb, Xu Ji, Pascal Notsawo, Mike Mozer, Yoshua Bengio, Kenji Kawaguchi
Vector Quantization (VQ) is a method for discretizing latent representations and has become a major part of the deep learning toolkit.
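At its core, VQ replaces each continuous latent vector with its nearest entry from a learned codebook. A minimal nearest-neighbor sketch (training details such as the straight-through gradient estimator are omitted; the codebook values here are illustrative):

```python
import numpy as np

def vector_quantize(z, codebook):
    """Map each latent vector in z (N x D) to its nearest codebook
    entry (K x D) under squared Euclidean distance."""
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (N, K) distances
    idx = d.argmin(axis=1)          # discrete code for each latent
    return idx, codebook[idx]       # codes and quantized latents

codebook = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 1.0]])
z = np.array([[0.9, 1.1], [0.1, -0.2]])
idx, zq = vector_quantize(z, codebook)  # idx → [1, 0]
```

The discrete indices are what make VQ useful: downstream components can treat the latent space as a finite vocabulary of codes.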
no code implementations • 10 Dec 2021 • Tianyi Zhang, Shirui Zhang, Ziwei Chen, Dianbo Liu
Federated machine learning is a versatile and flexible tool for utilizing distributed data from different sources, especially now that communication technology is developing rapidly and an unprecedented amount of data can be collected on mobile devices.
no code implementations • NeurIPS 2021 • Dianbo Liu, Alex Lamb, Kenji Kawaguchi, Anirudh Goyal, Chen Sun, Michael Curtis Mozer, Yoshua Bengio
Deep learning has advanced from fully connected architectures to structured models organized into components, e.g., the transformer composed of positional elements, modular architectures divided into slots, and graph neural nets made up of nodes.
1 code implementation • ICCV 2021 • Yuwei Cheng, Jiannan Zhu, Mengxin Jiang, Jie Fu, Changsong Pang, Peidong Wang, Kris Sankaran, Olawale Onabola, Yimin Liu, Dianbo Liu, Yoshua Bengio
To promote the practical application for autonomous floating wastes cleaning, we present FloW, the first dataset for floating waste detection in inland water areas.
no code implementations • 1 Dec 2020 • Leyu Dai, He Zhu, Dianbo Liu
Patient similarity analysis is important in health care applications.
no code implementations • 23 Nov 2020 • He Zhu, Dianbo Liu
Disinformation uses fake messages to confuse people in order to protect the real information.
1 code implementation • 8 Apr 2020 • Dianbo Liu, Leonardo Clemente, Canelle Poirier, Xiyu Ding, Matteo Chinazzi, Jessica T Davis, Alessandro Vespignani, Mauricio Santillana
We present a timely and novel methodology that combines disease estimates from mechanistic models with digital traces, via interpretable machine-learning methodologies, to reliably forecast COVID-19 activity in Chinese provinces in real-time.
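One simple, interpretable way to combine mechanistic-model estimates with digital traces, as the entry describes, is a linear model fit on past observations. This is a toy sketch under assumed data, not the paper's methodology; all numbers are fabricated for illustration:

```python
import numpy as np

# Hypothetical history: mechanistic-model forecasts, one digital-trace
# signal (e.g. search-query volume), and observed case counts.
mechanistic = np.array([100.0, 120.0, 150.0, 170.0])
digital_trace = np.array([10.0, 11.0, 16.0, 18.0])
observed = np.array([105.0, 118.0, 155.0, 172.0])

# Fit an interpretable linear combination of the two signals (+ intercept).
X = np.column_stack([mechanistic, digital_trace, np.ones_like(observed)])
coef, *_ = np.linalg.lstsq(X, observed, rcond=None)

# Forecast the next period from new mechanistic and digital-trace inputs.
forecast = coef @ np.array([180.0, 20.0, 1.0])
```

The fitted coefficients stay human-readable: each one says how much weight the combined forecast places on the mechanistic estimate versus the digital signal.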
no code implementations • 20 Feb 2020 • Dianbo Liu, Tim Miller
Large-scale contextual representation models, such as BERT, have significantly advanced natural language processing (NLP) in recent years.
no code implementations • 25 Dec 2019 • Jianfei Cui, He Zhu, Hao Deng, Ziwei Chen, Dianbo Liu
Electronic medical records are sometimes access-restricted and difficult to centralize for machine learning, so models can only be trained in a distributed manner involving many institutions.
no code implementations • 23 Oct 2019 • Rulin Shao, Hongyu He, Hui Liu, Dianbo Liu
Specifically, we design, implement and evaluate a channel-based update algorithm for the central server in a distributed system, which selects the channels corresponding to the most active features in each training loop and uploads only those as the learned information from local datasets.
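The channel-selection step described above can be sketched as picking the channels whose local updates are largest and uploading only those. This is a simplified illustration with an assumed activity measure (mean absolute update), not the paper's exact criterion:

```python
import numpy as np

def select_active_channels(local_update, k):
    """Pick the k channels with the largest mean absolute update,
    so only the 'most active' features are uploaded to the server."""
    activity = np.abs(local_update).mean(axis=1)  # per-channel activity score
    top = np.argsort(activity)[-k:]               # indices of top-k channels
    return top, local_update[top]

# Hypothetical per-channel parameter updates (4 channels x 3 params each).
update = np.array([[0.0, 0.1, 0.0],
                   [1.0, 0.9, 1.1],
                   [0.2, 0.1, 0.0],
                   [0.5, 0.4, 0.6]])
channels, payload = select_active_channels(update, k=2)
```

Uploading only the selected channels reduces communication cost and limits how much of the locally learned information leaves each client.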
no code implementations • ICLR 2020 • Dianbo Liu, Kathe Fox, Griffin Weber, Tim Miller
We proposed and evaluated a confederated learning approach to train machine learning models that stratify the risk of several diseases when data are horizontally separated by individual, vertically separated by data type, and separated by identity without patient ID matching.
no code implementations • 4 Oct 2019 • Rulin Shao, Hui Liu, Dianbo Liu
Artificial neural networks have achieved unprecedented success in a wide variety of domains such as classifying, predicting and recognizing objects.
no code implementations • WS 2019 • Dianbo Liu, Dmitriy Dligach, Timothy Miller
A large percentage of medical information is in unstructured text format in electronic medical record systems.
no code implementations • 22 Mar 2019 • Li Huang, Dianbo Liu
Electronic medical records (EMRs) support the development of machine learning algorithms for predicting disease incidence, patient response to treatment, and other healthcare events.
no code implementations • 23 Dec 2018 • Dianbo Liu, Nestor Sepulveda, Ming Zheng
In this project we explored methods to increase computational efficiency of ML algorithms, in particular Artificial Neural Nets (NN), while not compromising the accuracy of the predicted results.
no code implementations • 30 Nov 2018 • Li Huang, Yifeng Yin, Zeng Fu, Shifa Zhang, Hao Deng, Dianbo Liu
One challenge in applying federated machine learning is the possibly different distributions of data from diverse sources.
no code implementations • 28 Nov 2018 • Dianbo Liu, Timothy Miller, Raheel Sayeed, Kenneth D. Mandl
Electronic health record (EHR) data is collected by individual institutions and often stored across locations in silos.
no code implementations • 9 Aug 2017 • Dianbo Liu, Fengjiao Peng, Andrew Shea, Ognjen Rudovic, Rosalind Picard
Previous research on automatic pain estimation from facial expressions has focused primarily on "one-size-fits-all" metrics (such as PSPI).