no code implementations • 23 Oct 2024 • Qinsi Wang, Saeed Vahidian, Hancheng Ye, Jianyang Gu, Jianyi Zhang, Yiran Chen
In this paper, we introduce CoreInfer, an MLP-free adaptive sparse activation inference method based on sentence-level prediction.
no code implementations • 2 Sep 2024 • Vyacheslav Kungurtsev, Yuanfang Peng, Jianyang Gu, Saeed Vahidian, Anthony Quinn, Fadwa Idlahcen, Yiran Chen
Dataset distillation (DD) is an increasingly important technique that constructs a synthetic dataset capturing the core information of the training data, so that models trained on the synthetic set achieve performance comparable to models trained on the full data.
1 code implementation • 24 Mar 2024 • Yao Lu, Jianyang Gu, Xuguang Chen, Saeed Vahidian, Qi Xuan
Given that there are no suitable biased datasets for DD, we first construct two biased datasets, CMNIST-DD and CCIFAR10-DD, to establish a foundation for subsequent analysis.
1 code implementation • 7 Feb 2024 • Saeed Vahidian, Mingyu Wang, Jianyang Gu, Vyacheslav Kungurtsev, Wei Jiang, Yiran Chen
However, matching the training dataset should be regarded as an auxiliary objective: the training set is itself only an approximate substitute for the population distribution, and the latter is the data of true interest.
1 code implementation • 3 Dec 2023 • Yuqi Jia, Saeed Vahidian, Jingwei Sun, Jianyi Zhang, Vyacheslav Kungurtsev, Neil Zhenqiang Gong, Yiran Chen
This process allows local devices to train smaller surrogate models while enabling the training of a larger global model on the server, effectively minimizing resource utilization.
1 code implementation • CVPR 2024 • Jianyang Gu, Saeed Vahidian, Vyacheslav Kungurtsev, Haonan Wang, Wei Jiang, Yang You, Yiran Chen
Observing that key factors for constructing an effective surrogate dataset are representativeness and diversity, we design additional minimax criteria in the generative training to enhance these facets for the generated images of diffusion models.
1 code implementation • 29 Aug 2023 • Umar Khalid, Hasan Iqbal, Saeed Vahidian, Jing Hua, Chen Chen
Machine learning plays a vital role in industrial HRI by enhancing the adaptability and autonomy of robots in complex environments.
1 code implementation • 9 May 2023 • Jianyi Zhang, Saeed Vahidian, Martin Kuo, Chunyuan Li, Ruiyi Zhang, Tong Yu, Yufan Zhou, Guoyin Wang, Yiran Chen
This repository offers a foundational framework for exploring federated fine-tuning of LLMs using heterogeneous instructions across diverse categories.
no code implementations • ICCV 2023 • Saeed Vahidian, Sreevatsank Kadaveru, Woonjoon Baek, Weijia Wang, Vyacheslav Kungurtsev, Chen Chen, Mubarak Shah, Bill Lin
Specifically, we aim to investigate how ordered learning principles can contribute to alleviating the heterogeneity effects in FL.
1 code implementation • 14 Oct 2022 • Jicang Cai, Saeed Vahidian, Weijia Wang, Mohsen Joneidi, Bill Lin
Inspired by the widely recognized finding in neuroscience that distinct parts of the brain are highly specialized for different types of tasks, we aim to improve the model performance of the current meta learning algorithms by selectively using only parts of the model conditioned on the input tasks.
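The selective use of model parts conditioned on the input task can be illustrated with a small sketch. This is a generic top-k expert-gating toy in NumPy, not the paper's exact mechanism; the expert pool, gating matrix, and sizes are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: a pool of small "expert" linear maps, of which only
# the top-scoring ones run for a given task. The gating rule below is a
# generic illustration, not the paper's actual selection scheme.
n_experts, d_in, d_out = 4, 8, 3
experts = [rng.normal(size=(d_in, d_out)) for _ in range(n_experts)]
gate = rng.normal(size=(d_in, n_experts))  # scores experts from a task embedding

def forward(task_embedding: np.ndarray, x: np.ndarray, top_k: int = 2):
    """Run only the top_k experts selected by the task embedding."""
    scores = task_embedding @ gate            # one score per expert
    active = np.argsort(scores)[-top_k:]      # indices of the chosen experts
    weights = np.exp(scores[active])
    weights /= weights.sum()                  # softmax over active experts only
    out = sum(w * (x @ experts[i]) for w, i in zip(weights, active))
    return out, active

task = rng.normal(size=d_in)
x = rng.normal(size=(5, d_in))
out, used = forward(task, x)
print(out.shape, sorted(used.tolist()))
```

Only `top_k` of the `n_experts` sub-modules are evaluated per task, which is the source of the claimed efficiency: compute scales with the number of active experts, not the pool size.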
1 code implementation • 30 Sep 2022 • Mahdi Morafah, Saeed Vahidian, Chen Chen, Mubarak Shah, Bill Lin
Though successful, federated learning presents new challenges for machine learning, especially when data heterogeneity, also known as the Non-IID data problem, arises.
1 code implementation • 21 Sep 2022 • Saeed Vahidian, Mahdi Morafah, Weijia Wang, Vyacheslav Kungurtsev, Chen Chen, Mubarak Shah, Bill Lin
This small set of principal vectors is provided to the server so that the server can directly identify distribution similarities among the clients to form clusters.
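The idea of comparing clients by their principal vectors can be sketched concretely: each client sends the top right singular vectors of its local data, and the server measures subspace similarity via the cosines of principal angles before clustering. The data sizes and the mean-cosine similarity score below are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def client_principal_vectors(data: np.ndarray, k: int) -> np.ndarray:
    """Top-k right singular vectors of a client's local data matrix."""
    # Rows of `data` are samples; Vt rows span the dominant data directions.
    _, _, vt = np.linalg.svd(data, full_matrices=False)
    return vt[:k]  # shape (k, n_features)

def subspace_similarity(va: np.ndarray, vb: np.ndarray) -> float:
    """Similarity of two subspaces: the singular values of Va @ Vb.T are
    the cosines of the principal angles between them."""
    cosines = np.linalg.svd(va @ vb.T, compute_uv=False)
    return float(np.mean(cosines))  # 1.0 for identical subspaces

rng = np.random.default_rng(0)
shared = rng.normal(size=(3, 10))
# Clients 0 and 1 draw from the same low-rank distribution; client 2 differs.
clients = [rng.normal(size=(50, 3)) @ shared for _ in range(2)]
clients.append(rng.normal(size=(50, 10)))

vecs = [client_principal_vectors(x, k=3) for x in clients]
sim_01 = subspace_similarity(vecs[0], vecs[1])
sim_02 = subspace_similarity(vecs[0], vecs[2])
print(sim_01 > sim_02)  # clients 0 and 1 share a distribution
```

The server only ever sees the k principal vectors per client, not the raw samples, and can feed the pairwise similarities into any standard clustering routine to form the client clusters.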
1 code implementation • 20 Aug 2022 • Mahdi Morafah, Saeed Vahidian, Weijia Wang, Bill Lin
Classical federated learning approaches suffer significant performance degradation when participants' data distributions are Non-IID.
1 code implementation • 2 May 2021 • Saeed Vahidian, Mahdi Morafah, Bill Lin
The traditional approach in FL tries to learn a single global model collaboratively with the help of many clients under the orchestration of a central server.
no code implementations • 1 Jan 2021 • Saeed Vahidian, Mohsen Joneidi, Ashkan Esmaeili, Siavash Khodadadeh, Sharare Zehtabian, Ladislau Boloni, Nazanin Rahnavard, Bill Lin, Mubarak Shah
The approach is based on the concept of self-rank, defined as the minimum number of samples needed to reconstruct all samples with an accuracy proportional to the rank-K approximation.
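The self-rank definition can be made concrete with a small sketch: grow a set of actual samples until projecting every sample onto their span matches the best rank-K error up to a constant. The greedy max-residual selection rule and the tolerance factor `c` here are illustrative stand-ins, not the paper's selection algorithm.

```python
import numpy as np

def rank_k_error(x: np.ndarray, k: int) -> float:
    """Frobenius error of the best rank-k approximation (Eckart-Young)."""
    s = np.linalg.svd(x, compute_uv=False)
    return float(np.sqrt(np.sum(s[k:] ** 2)))

def self_rank(x: np.ndarray, k: int, c: float = 1.1) -> int:
    """Smallest number of actual samples (rows) whose span reconstructs all
    rows within c times the rank-k error. Greedy residual selection is an
    illustrative stand-in for the paper's rule."""
    target = c * rank_k_error(x, k)
    selected: list[int] = []
    residual = x.copy()
    for _ in range(x.shape[0]):
        # Pick the row with the largest remaining residual norm.
        i = int(np.argmax(np.linalg.norm(residual, axis=1)))
        selected.append(i)
        # Project every row onto the span of the selected rows.
        basis, _ = np.linalg.qr(x[selected].T)  # orthonormal columns
        residual = x - (x @ basis) @ basis.T
        if np.linalg.norm(residual) <= target:
            break
    return len(selected)

rng = np.random.default_rng(1)
# 100 samples lying near a 4-dimensional subspace of R^20.
x = rng.normal(size=(100, 4)) @ rng.normal(size=(4, 20)) \
    + 0.01 * rng.normal(size=(100, 20))
sr = self_rank(x, k=4)
print(sr)  # far fewer than the 100 samples
```

Unlike a rank-K SVD basis, the selected set consists of real samples, which is what makes it usable for dataset selection.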
no code implementations • ICLR 2021 • Siavash Khodadadeh, Sharare Zehtabian, Saeed Vahidian, Weijia Wang, Bill Lin, Ladislau Bölöni
Unsupervised meta-learning approaches rely on synthetic meta-tasks that are created using techniques such as random selection, clustering and/or augmentation.
no code implementations • 3 Jun 2019 • Saeed Vahidian, Baharan Mirzasoleiman, Alexander Cloninger
In a number of situations, collecting a function value for every data point may be prohibitively expensive, and random sampling ignores any structure in the underlying data.
no code implementations • 8 Sep 2017 • Yasaman Ettefagh, Mohammad Hossein Moghaddam, Saeed Vahidian
In this research work, a novel framework is proposed as an efficient successor to traditional imaging methods for breast cancer detection, with the aim of decreasing computational complexity.
no code implementations • 31 Jul 2017 • Mohammad Hossein Moghaddam, Mohammad Javad Azizipour, Saeed Vahidian, Besma Smida
We use the sparsity of residual information in residual frames as the key point in devising our framework.
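The sparsity claim is easy to demonstrate: a residual frame (the difference between consecutive frames) is near-zero everywhere the scene is static. The synthetic frames and the nonzero-density measure below are illustrative; real codecs would also apply motion compensation, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two consecutive synthetic "frames": a static background plus a small
# bright blob that moves slightly between frames.
background = rng.normal(size=(64, 64))
frame1 = background.copy()
frame2 = background.copy()
frame1[10:14, 10:14] += 5.0   # blob at its old position
frame2[10:14, 12:16] += 5.0   # blob shifted two pixels right

residual = frame2 - frame1    # residual frame: nonzero only where motion occurred

def density(a: np.ndarray, tol: float = 1e-8) -> float:
    """Fraction of significantly nonzero entries."""
    return float(np.mean(np.abs(a) > tol))

print(density(frame2), density(residual))  # the residual is far sparser
```

This sparsity is exactly what compressive-sensing-style frameworks exploit: the dense raw frame is replaced by a residual with only a handful of significant coefficients.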