Search Results for author: Ho-Kin Tang

Found 7 papers, 4 papers with code

Parameter Competition Balancing for Model Merging

1 code implementation • 3 Oct 2024 • Guodong Du, Junlin Lee, Jing Li, Runhua Jiang, Yifei Guo, Shuyang Yu, Hanting Liu, Sim Kuan Goh, Ho-Kin Tang, Daojing He, Min Zhang

Recently developed model merging techniques enable the direct integration of multiple models, each fine-tuned for distinct tasks, into a single model.

Tasks: Domain Generalization, Model
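The basic operation that model-merging methods build on can be sketched as weighted parameter averaging. This is illustrative only: the function name and plain-list parameter representation below are assumptions, and the paper's PCB method goes further by balancing parameter-level competition rather than averaging uniformly.

```python
def average_merge(state_dicts, weights=None):
    """Merge fine-tuned checkpoints by weighted parameter averaging.

    `state_dicts` maps parameter names to flat lists of floats.
    Baseline sketch only: PCB additionally rebalances competing
    parameters across tasks; plain averaging is shown here.
    """
    n = len(state_dicts)
    if weights is None:
        weights = [1.0 / n] * n  # uniform merge by default
    merged = {}
    for name in state_dicts[0]:
        vecs = [sd[name] for sd in state_dicts]
        merged[name] = [
            sum(w * v[i] for w, v in zip(weights, vecs))
            for i in range(len(vecs[0]))
        ]
    return merged
```

With two checkpoints and uniform weights, each merged parameter is simply the element-wise mean of the corresponding fine-tuned parameters.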

Meta-heuristic Optimizer Inspired by the Philosophy of Yi Jing

no code implementations • 10 Aug 2024 • Yisheng Yang, Sim Kuan Goh, Qing Cai, Shen Yuong Wong, Ho-Kin Tang

Specifically, we enhance the Yin-Yang pair in YYPO with a proposed Yi-point, which uses Cauchy flight to update the solution, implementing both the harmony and reversal concepts of Yi Jing.

Tasks: Philosophy
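A Cauchy-flight update of the kind the Yi-point relies on can be sketched as a heavy-tailed perturbation: most steps are small refinements, but the Cauchy distribution's heavy tails occasionally produce long jumps that help escape local optima. The function name and step-scale parameter below are assumptions, not the paper's exact formulation.

```python
import math
import random


def cauchy_flight_step(x, scale=0.1, rng=random):
    """Perturb a candidate solution with Cauchy-distributed steps.

    tan(pi * (u - 0.5)) for uniform u in (0, 1) is the standard
    inverse-CDF sample of a standard Cauchy variate; `scale` (an
    illustrative assumption) controls the typical step size.
    """
    return [
        xi + scale * math.tan(math.pi * (rng.random() - 0.5))
        for xi in x
    ]
```

The heavy tails mean the step variance is undefined, which is exactly why Cauchy (unlike Gaussian) flights mix fine local search with rare global jumps.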

Impacts of Darwinian Evolution on Pre-trained Deep Neural Networks

no code implementations • 10 Aug 2024 • Guodong Du, Runhua Jiang, Senqiao Yang, Haoyang Li, Wei Chen, Keren Li, Sim Kuan Goh, Ho-Kin Tang

The empirical results show that the proposed framework has positive impacts on the network, with reduced over-fitting and an order of magnitude lower time complexity compared to backpropagation (BP).

Evolutionary Neural Architecture Search for 3D Point Cloud Analysis

no code implementations • 10 Aug 2024 • Yisheng Yang, Guodong Du, Chean Khim Toa, Ho-Kin Tang, Sim Kuan Goh

This paper presents Success-History-based Self-adaptive Differential Evolution with a Joint Point Interaction Dimension Search (SHSADE-PIDS), an evolutionary NAS framework that encodes discrete deep neural network architectures into continuous spaces and searches those spaces for efficient point cloud neural architectures.

Tasks: Evolutionary Algorithms, Navigate
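The core trick of such continuous-encoding NAS is a decoder that maps a real-valued search vector back to a discrete architecture, so a continuous optimizer like DE can explore categorical choices. The sketch below is a minimal illustration of that mapping; the gene layout and option names are assumptions, not SHSADE-PIDS's exact scheme.

```python
def decode_architecture(x, choices):
    """Map a continuous vector in [0, 1)^d to a discrete architecture.

    Each gene x[i] selects one option from choices[i] by scaling into
    the option index range; values at exactly 1.0 are clamped to the
    last option. Illustrative sketch, not the paper's encoding.
    """
    return [
        opts[min(int(xi * len(opts)), len(opts) - 1)]
        for xi, opts in zip(x, choices)
    ]
```

Because nearby continuous vectors decode to the same or neighboring discrete choices, the optimizer's small mutations translate into local moves in architecture space.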

Knowledge Fusion By Evolving Weights of Language Models

1 code implementation • 18 Jun 2024 • Guodong Du, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang

Fine-tuning pre-trained language models, particularly large language models, demands extensive computing resources and can result in varying performance outcomes across different domains and datasets.

Tasks: Decoder, Evolutionary Algorithms

CADE: Cosine Annealing Differential Evolution for Spiking Neural Network

1 code implementation • 4 Jun 2024 • Runhua Jiang, Guodong Du, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang

This paper tackles these challenges by introducing Cosine Annealing Differential Evolution (CADE), designed to modulate the mutation factor (F) and crossover rate (CR) of differential evolution (DE) for the SNN model, i.e., Spiking Element Wise (SEW) ResNet.

Tasks: Transfer Learning
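A cosine-annealed schedule for DE's control parameters can be sketched as below: F and CR start high (broad exploration) and decay smoothly to low values (fine exploitation) following a half-cosine. The endpoint values and the shared schedule for F and CR are illustrative assumptions; CADE's exact bounds are defined in the paper.

```python
import math


def cade_schedule(t, t_max, f_max=0.9, f_min=0.1, cr_max=0.9, cr_min=0.1):
    """Cosine-annealed mutation factor F and crossover rate CR for DE.

    At generation t = 0 the schedule returns the maxima; at t = t_max
    it reaches the minima, following 0.5 * (1 + cos(pi * t / t_max)).
    Endpoint values here are assumptions, not CADE's published settings.
    """
    cos_term = 0.5 * (1.0 + math.cos(math.pi * t / t_max))
    f = f_min + (f_max - f_min) * cos_term
    cr = cr_min + (cr_max - cr_min) * cos_term
    return f, cr
```

Each DE generation would query this schedule before applying mutation and crossover, so step sizes shrink as the population converges.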

A Novel Non-population-based Meta-heuristic Optimizer Inspired by the Philosophy of Yi Jing

1 code implementation • 17 Apr 2021 • Ho-Kin Tang, Sim Kuan Goh

As a conceptual prototype, we examine YI on the IEEE CEC 2017 benchmark and compare its performance with a Lévy flight-based optimizer (CV1.0), the state-of-the-art dynamical Yin-Yang pair optimization in the YYPO family, and a few classical optimizers.

Tasks: Philosophy
