Search Results for author: Daniel T. Chang

Found 22 papers, 0 papers with code

Concept-Oriented Deep Learning with Large Language Models

no code implementations29 Jun 2023 Daniel T. Chang

We discuss these in this paper, as well as major uses of LLMs for CODL, including concept extraction from text, concept graph extraction from text, and concept learning.

Text Generation
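
As a rough illustration of the concept-extraction-from-text use mentioned in the snippet, here is a minimal prompt-based sketch. The `call_llm` function is a hypothetical stand-in for whatever completion client is used; it is not a real API.

```python
# Hypothetical sketch of LLM-based concept extraction from text.
# `call_llm` is a placeholder, NOT a real library call.
import json

PROMPT = (
    "Extract the key concepts from the following text. "
    "Return a JSON list of concept names.\n\nText: {text}\nConcepts:"
)

def call_llm(prompt: str) -> str:
    """Hypothetical LLM client; replace with a real completion call."""
    raise NotImplementedError

def extract_concepts(text: str) -> list[str]:
    response = call_llm(PROMPT.format(text=text))
    return json.loads(response)  # expects e.g. '["graph", "autoencoder"]'
```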

Variational Quantum Classifiers for Natural-Language Text

no code implementations4 Mar 2023 Daniel T. Chang

As part of the recent research effort on quantum natural language processing (QNLP), variational quantum sentence classifiers (VQSCs) have been implemented and supported in lambeq / DisCoPy, based on the DisCoCat model of sentence meaning.

Coreference Resolution Sentence +2
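
The full lambeq / DisCoPy pipeline compiles DisCoCat sentence diagrams into multi-qubit circuits, which is beyond a snippet; the numpy sketch below shows only the core variational-classifier mechanic such circuits rely on, with illustrative encoding and training choices.

```python
# Minimal numpy sketch of the variational-classifier idea behind VQSCs:
# encode a feature as a rotation angle, apply a trainable rotation, and
# read a class probability from the Z expectation value.
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def classify(x, theta):
    """Encode scalar feature x, apply trainable theta, return P(class=1)."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])  # |0> -> encoded state
    z_expect = state[0] ** 2 - state[1] ** 2          # <Z> in [-1, 1]
    return (1.0 - z_expect) / 2.0                     # map to [0, 1]

# Toy training by finite-difference gradient descent on squared error.
xs, ys = np.array([0.1, 2.8]), np.array([0.0, 1.0])
theta, lr, eps = 0.0, 0.5, 1e-4
for _ in range(200):
    grad = sum(
        (classify(x, theta + eps) - y) ** 2 - (classify(x, theta - eps) - y) ** 2
        for x, y in zip(xs, ys)
    ) / (2 * eps)
    theta -= lr * grad
```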

Variational Quantum Kernels with Task-Specific Quantum Metric Learning

no code implementations8 Nov 2022 Daniel T. Chang

For machine learning, the notion of similarity assumes that points close in the feature space should be close in the machine learning task space.

Feature Selection Metric Learning +2
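
A fidelity-style quantum kernel makes this notion of similarity concrete: points are compared by the overlap of their encoded quantum states. The sketch below simulates this in numpy, with a single trainable scale `w` standing in for the task-specific metric parameters (an illustrative simplification, not the paper's construction).

```python
# Sketch of a fidelity quantum kernel k(x, z) = |<phi(x)|phi(z)>|^2 with a
# trainable feature-map scale `w` standing in for task-specific metric
# parameters. Statevectors are simulated directly in numpy.
import numpy as np

def feature_state(x, w):
    """Map feature vector x to a product state, one qubit per feature."""
    state = np.array([1.0])
    for xi in x:
        qubit = np.array([np.cos(w * xi / 2), np.sin(w * xi / 2)])
        state = np.kron(state, qubit)
    return state

def quantum_kernel(X, w):
    """Gram matrix of squared state overlaps (fidelities)."""
    states = [feature_state(x, w) for x in X]
    return np.array([[abs(si @ sj) ** 2 for sj in states] for si in states])

X = np.array([[0.1, 0.4], [0.2, 0.5], [2.0, 1.5]])
K = quantum_kernel(X, w=1.0)  # tune w so same-class points get high overlap
```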

Parameterized Quantum Circuits with Quantum Kernels for Machine Learning: A Hybrid Quantum-Classical Approach

no code implementations28 Sep 2022 Daniel T. Chang

In this paper we discuss some important aspects of PQCs with quantum kernels including PQCs, quantum kernels, quantum kernels with quantum advantage, and the trainability of quantum kernels.

Dimensionality Reduction Quantum Machine Learning
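
The hybrid quantum-classical pattern can be sketched in a few lines: a (here simulated) quantum kernel supplies the Gram matrix, and a classical SVM does the learning on top of it via scikit-learn's precomputed-kernel interface. The angle encoding below is illustrative.

```python
# Minimal sketch of the hybrid pattern: simulated quantum kernel + classical SVM.
import numpy as np
from sklearn.svm import SVC

def fidelity_kernel(A, B):
    """Simulated quantum kernel: squared overlap of angle-encoded states."""
    def state(x):
        s = np.array([1.0])
        for xi in x:
            s = np.kron(s, np.array([np.cos(xi / 2), np.sin(xi / 2)]))
        return s
    SA, SB = [state(a) for a in A], [state(b) for b in B]
    return np.array([[abs(sa @ sb) ** 2 for sb in SB] for sa in SA])

X = np.array([[0.1, 0.2], [0.3, 0.1], [2.5, 2.2], [2.8, 2.6]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="precomputed").fit(fidelity_kernel(X, X), y)
pred = clf.predict(fidelity_kernel(np.array([[0.2, 0.15]]), X))
```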

Distance-Geometric Graph Attention Network (DG-GAT) for 3D Molecular Geometry

no code implementations16 Jul 2022 Daniel T. Chang

Deep learning for molecular science has so far mainly focused on 2D molecular graphs.

Graph Attention
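
The distance-geometric idea is that attention over neighbors should see 3D geometry, not just 2D topology. The numpy sketch below folds the interatomic distance into each attention score; the exact scoring function is an illustrative assumption, not the paper's architecture.

```python
# Sketch of distance-geometric attention: neighbor messages are aggregated
# with scores that depend on node features AND the 3D interatomic distance.
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def dg_attention_layer(h, pos, neighbors, W, a, gamma=1.0):
    """One attention head; h: node features, pos: 3D coordinates."""
    out = np.zeros_like(h @ W)
    for i, nbrs in neighbors.items():
        scores = []
        for j in nbrs:
            d = np.linalg.norm(pos[i] - pos[j])       # geometric input
            feat = np.concatenate([h[i] @ W, h[j] @ W, [np.exp(-gamma * d)]])
            scores.append(a @ feat)                    # distance-aware score
        alpha = softmax(np.array(scores))
        out[i] = sum(w * (h[j] @ W) for w, j in zip(alpha, nbrs))
    return out

h, pos = np.random.rand(3, 4), np.random.rand(3, 3)
W, a = np.random.rand(4, 8), np.random.rand(17)       # 8 + 8 + 1 score inputs
out = dg_attention_layer(h, pos, {0: [1, 2], 1: [0], 2: [0]}, W, a)
```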

Embodied-Symbolic Contrastive Graph Self-Supervised Learning for Molecular Graphs

no code implementations13 May 2022 Daniel T. Chang

We discuss the use of dual embodied-symbolic concept representations for molecular graph representation learning, specifically with exemplar-based contrastive self-supervised learning (SSL).

Graph Representation Learning Self-Supervised Learning

Dual Embodied-Symbolic Concept Representations for Deep Learning

no code implementations1 Mar 2022 Daniel T. Chang

As such, we further advocate the use of dual embodied-symbolic concept representations for deep learning.

Few-Shot Class-Incremental Learning Graph Generation +7

Exemplar-Based Contrastive Self-Supervised Learning with Few-Shot Class Incremental Learning

no code implementations5 Feb 2022 Daniel T. Chang

We discuss extending CSSL (1) to be based mainly on exemplars and only secondarily on data augmentation, and (2) to apply to both unlabeled data (a large amount is available in general) and labeled data (a few exemplars can be obtained with valuable supervised knowledge).

Data Augmentation Few-Shot Class-Incremental Learning +3
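
A minimal PyTorch sketch of the exemplar-based idea: each embedding is pulled toward an exemplar of its class and pushed away from the other exemplars, i.e., an InfoNCE-style loss whose positives are exemplars rather than only augmented views. The loss form here is an illustrative assumption.

```python
# Exemplar-based contrastive loss sketch: InfoNCE with class exemplars as
# positives instead of relying only on augmented views.
import torch
import torch.nn.functional as F

def exemplar_infonce(z, exemplars, labels, tau=0.1):
    """z: (N, D) embeddings; exemplars: (C, D), one per class; labels: (N,)."""
    z = F.normalize(z, dim=1)
    ex = F.normalize(exemplars, dim=1)
    logits = z @ ex.t() / tau          # (N, C) similarities to all exemplars
    return F.cross_entropy(logits, labels)

z = torch.randn(8, 16, requires_grad=True)
exemplars = torch.randn(4, 16)         # 4 classes, one exemplar each
labels = torch.randint(0, 4, (8,))
loss = exemplar_infonce(z, exemplars, labels)
loss.backward()
```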

Concept Representation Learning with Contrastive Self-Supervised Learning

no code implementations10 Dec 2021 Daniel T. Chang

Concept-oriented deep learning (CODL) is a general approach to meet the future challenges for deep learning: (1) learning with little or no external supervision, (2) coping with test examples that come from a different distribution than the training examples, and (3) integrating deep learning with symbolic AI.

Class Incremental Learning Incremental Learning +3

Hybrid Bayesian Neural Networks with Functional Probabilistic Layers

no code implementations14 Jul 2021 Daniel T. Chang

To support this, we propose hybrid Bayesian neural networks with functional probabilistic layers that encode function (and activation) uncertainty.

Bayesian Inference Gaussian Processes +2
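
The paper advocates functional (function-space, GP-style) probabilistic layers; as a simpler point of reference, the sketch below shows the standard weight-space variational layer that such hybrids combine with, using the reparameterization trick. It is a building-block illustration, not the paper's functional layer.

```python
# Sketch of a variational (weight-space) Bayesian linear layer. NOTE: the
# paper's functional probabilistic layers encode function-space uncertainty;
# this weight-space layer is only the simpler, standard counterpart.
import torch
import torch.nn as nn

class BayesLinear(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(d_out, d_in))
        self.log_sigma = nn.Parameter(torch.full((d_out, d_in), -3.0))

    def forward(self, x):
        # Sample weights: w = mu + sigma * eps (reparameterization trick).
        eps = torch.randn_like(self.mu)
        w = self.mu + self.log_sigma.exp() * eps
        return x @ w.t()

layer = BayesLinear(4, 2)
x = torch.randn(5, 4)
samples = torch.stack([layer(x) for _ in range(10)])  # predictive spread
mean, std = samples.mean(0), samples.std(0)           # epistemic uncertainty
```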

Bayesian Neural Networks: Essentials

no code implementations22 Jun 2021 Daniel T. Chang

Since these probabilistic layers are designed to be drop-in replacements for their deterministic counterparts, Bayesian neural networks provide a direct and natural way to extend conventional deep neural networks to support probabilistic deep learning.

Bayesian Inference Probabilistic Deep Learning

Probabilistic Deep Learning with Probabilistic Neural Networks and Deep Probabilistic Models

no code implementations31 May 2021 Daniel T. Chang

Probabilistic deep learning is deep learning that accounts for uncertainty, both model uncertainty and data uncertainty.

Gaussian Processes Probabilistic Deep Learning
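
The two uncertainty types can be sketched concretely: a head that predicts both a mean and a variance captures data (aleatoric) uncertainty via the Gaussian negative log-likelihood, while averaging stochastic forward passes (here, dropout left active, as in MC dropout) captures model (epistemic) uncertainty. The tiny network below is illustrative.

```python
# Sketch: data uncertainty via a mean/variance head, model uncertainty via
# Monte Carlo forward passes with dropout kept active.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Dropout(0.2), nn.Linear(16, 2))

def gaussian_nll(out, y):
    mu, log_var = out[:, :1], out[:, 1:]
    return 0.5 * (log_var + (y - mu) ** 2 / log_var.exp()).mean()

x, y = torch.randn(8, 3), torch.randn(8, 1)
loss = gaussian_nll(net(x), y)              # trains the data-uncertainty head

net.train()                                 # keep dropout on at "test" time
mc = torch.stack([net(x)[:, :1] for _ in range(20)])
epistemic_std = mc.std(0)                   # spread across stochastic passes
```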

Distance-Geometric Graph Convolutional Network (DG-GCN) for Three-Dimensional (3D) Graphs

no code implementations6 Jul 2020 Daniel T. Chang

To facilitate the incorporation of geometry in deep learning on 3D graphs, we propose a message-passing graph convolutional network based on the distance-geometric graph representation: DG-GCN (distance-geometric graph convolutional network).

Translation
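
A message-passing layer of this kind can be sketched by weighting each neighbor message with a continuous function of the interatomic distance, so 3D geometry enters the convolution directly. The Gaussian edge filter below is an illustrative choice, not the paper's exact filter.

```python
# Sketch of a distance-geometric graph convolution: neighbor features are
# aggregated with weights that decay with interatomic distance.
import numpy as np

def dg_gcn_layer(h, pos, edges, W, gamma=1.0):
    """h: (N, F) features; pos: (N, 3) coords; edges: list of (i, j) pairs."""
    out = np.zeros((h.shape[0], W.shape[1]))
    for i, j in edges:
        d = np.linalg.norm(pos[i] - pos[j])
        out[i] += np.exp(-gamma * d ** 2) * (h[j] @ W)  # distance-weighted msg
    return np.tanh(out)

h, pos = np.random.rand(4, 5), np.random.rand(4, 3)
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2)]
h_next = dg_gcn_layer(h, pos, edges, W=np.random.rand(5, 8))
```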

Geometric Graph Representations and Geometric Graph Convolutions for Deep Learning on Three-Dimensional (3D) Graphs

no code implementations2 Jun 2020 Daniel T. Chang

The geometry of three-dimensional (3D) graphs, consisting of nodes and edges, plays a crucial role in many important applications.

Hyperparameter Optimization

Bayesian Hyperparameter Optimization with BoTorch, GPyTorch and Ax

no code implementations11 Dec 2019 Daniel T. Chang

Finding the best configuration for these hyperparameters in such a high-dimensional space, with time-consuming and expensive model training / validation, is a non-trivial challenge.

Bayesian Optimization Gaussian Processes +2
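
The paper builds this loop with BoTorch, GPyTorch, and Ax; as a library-agnostic sketch, the same fit-surrogate / maximize-acquisition cycle can be written with a scikit-learn GP and expected improvement. The toy objective and log-scale grid are assumptions for illustration.

```python
# Library-agnostic Bayesian-optimization sketch: GP surrogate over observed
# (hyperparameter, score) pairs, next trial chosen by expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(lr):                      # stand-in for model train / validate
    return -(np.log10(lr) + 2.0) ** 2   # toy score, peaked at lr = 1e-2

grid = np.logspace(-5, 0, 200).reshape(-1, 1)
X, y = [[1e-4], [1e-1]], [objective(1e-4), objective(1e-1)]

for _ in range(10):
    gp = GaussianProcessRegressor(normalize_y=True).fit(np.log10(X), y)
    mu, sigma = gp.predict(np.log10(grid), return_std=True)
    best = max(y)
    z = (mu - best) / (sigma + 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    nxt = float(grid[np.argmax(ei)][0])
    X.append([nxt]); y.append(objective(nxt))
```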

Deep Learning for Molecular Graphs with Tiered Graph Autoencoders and Graph Prediction

no code implementations24 Oct 2019 Daniel T. Chang

In this paper, we discuss the use of tiered graph autoencoders together with graph prediction for molecular graphs.

General Classification Graph Classification
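
The tiered idea can be sketched as pooling in stages: node embeddings are pooled into group (e.g., functional-group) embeddings, which are pooled into a graph embedding, giving each molecular graph latents at three tiers. Mean pooling and the linear prediction head below are illustrative assumptions.

```python
# Sketch of tiered latent representations: node tier -> group tier -> graph
# tier, with a linear head standing in for the graph-prediction component.
import numpy as np

node_h = np.random.rand(6, 8)                  # tier 1: 6 atoms, 8-dim each
groups = {0: [0, 1, 2], 1: [3, 4, 5]}          # atom-to-group assignment
group_h = np.stack([node_h[idx].mean(0) for idx in groups.values()])  # tier 2
graph_h = group_h.mean(0)                      # tier 3: whole-molecule latent

W_pred = np.random.rand(8, 1)
property_pred = graph_h @ W_pred               # graph-level prediction head
```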

Tiered Graph Autoencoders with PyTorch Geometric for Molecular Graphs

no code implementations22 Aug 2019 Daniel T. Chang

As a result of using tiered graph autoencoders for deep learning, each molecular graph possesses tiered latent representations.

Transfer Learning

Tiered Latent Representations and Latent Spaces for Molecular Graphs

no code implementations21 Mar 2019 Daniel T. Chang

Flat latent representations (node embeddings or graph embeddings) fail to represent groups or to support their use.

Probabilistic Generative Deep Learning for Molecular Design

no code implementations11 Feb 2019 Daniel T. Chang

Probabilistic generative deep learning for molecular design involves the discovery and design of new molecules and analysis of their structure, properties and activities by probabilistic generative models using the deep learning approach.
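
The core probabilistic-generative pattern is a latent-variable model such as a VAE: encode to a latent Gaussian, sample with the reparameterization trick, decode, and train on reconstruction plus KL. The toy feature-vector VAE below is only a sketch; real molecular design would use SMILES- or graph-based encoders and decoders.

```python
# Minimal PyTorch VAE sketch of the probabilistic-generative pattern.
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    def __init__(self, d=16, z=4):
        super().__init__()
        self.enc = nn.Linear(d, 2 * z)   # outputs mean and log-variance
        self.dec = nn.Linear(z, d)

    def forward(self, x):
        mu, log_var = self.enc(x).chunk(2, dim=1)
        z = mu + (0.5 * log_var).exp() * torch.randn_like(mu)  # reparam.
        recon = self.dec(z)
        kl = -0.5 * (1 + log_var - mu ** 2 - log_var.exp()).sum(1).mean()
        return ((recon - x) ** 2).sum(1).mean() + kl

vae = TinyVAE()
loss = vae(torch.randn(32, 16))          # reconstruction + KL
loss.backward()
```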

Latent Variable Modeling for Generative Concept Representations and Deep Generative Models

no code implementations26 Dec 2018 Daniel T. Chang

Latent representations are the essence of deep generative models and determine their usefulness and power.

Attribute

Concept-Oriented Deep Learning: Generative Concept Representations

no code implementations15 Nov 2018 Daniel T. Chang

Generative concept representations have three major advantages over discriminative ones: they can represent uncertainty, they support integration of learning and reasoning, and they are good for unsupervised and semi-supervised learning.

Concept-Oriented Deep Learning

no code implementations5 Jun 2018 Daniel T. Chang

Concepts are the foundation of human deep learning, understanding, and knowledge integration and transfer.

Continual Learning Representation Learning
