Search Results for author: Mengmeng Ma

Found 6 papers, 2 papers with code

Are Multimodal Transformers Robust to Missing Modality?

no code implementations • CVPR 2022 • Mengmeng Ma, Jian Ren, Long Zhao, Davide Testuggine, Xi Peng

Based on these findings, we propose a principled method to improve the robustness of Transformer models by automatically searching for an optimal fusion strategy with respect to the input data.
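The search procedure itself is not reproduced here, but the evaluation it optimizes for is easy to sketch: score each candidate fusion strategy on both complete inputs and inputs with a modality removed. A minimal sketch, assuming nothing about the paper's actual architecture or code (all names below are hypothetical):

```python
# Toy illustration (numpy), not the paper's implementation: fit a small
# linear probe per candidate fusion strategy, then compare how each
# degrades when the text modality is zeroed out at test time.
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 16
img = rng.normal(size=(n, d))
txt = img + 0.5 * rng.normal(size=(n, d))        # correlated second modality
y = (img.sum(axis=1) + txt.sum(axis=1) > 0).astype(float)

def fit(X, y):
    w, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares probe
    return w

def acc(X, y, w):
    return np.mean((X @ w > 0.5) == y)

strategies = {
    "early_concat": lambda a, b: np.concatenate([a, b], axis=1),
    "late_average": lambda a, b: 0.5 * (a + b),
}

for name, fuse in strategies.items():
    w = fit(fuse(img, txt), y)
    full = acc(fuse(img, txt), y, w)
    miss = acc(fuse(img, np.zeros_like(txt)), y, w)  # text modality missing
    print(f"{name}: full={full:.2f} missing-text={miss:.2f}")
```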

SMIL: Multimodal Learning with Severely Missing Modality

1 code implementation • 9 Mar 2021 • Mengmeng Ma, Jian Ren, Long Zhao, Sergey Tulyakov, Cathy Wu, Xi Peng

A common assumption in multimodal learning is the completeness of training data, i.e., full modalities are available in all training examples.

Meta-Learning
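To make the problem setting concrete, here is a minimal sketch of severely missing modality: only a small fraction of training examples carry both modalities, and the missing one is crudely imputed from the few complete pairs. SMIL itself uses Bayesian meta-learning, not this least-squares imputation; everything below is illustrative.

```python
# Hypothetical sketch of the *problem setting*, not SMIL's method:
# ~90% of examples lack the video modality, so we learn a linear map
# from audio to video on the few complete pairs and impute the rest.
import numpy as np

rng = np.random.default_rng(1)
n, d = 500, 8
audio = rng.normal(size=(n, d))
video = audio @ rng.normal(size=(d, d)) + 0.1 * rng.normal(size=(n, d))

complete = rng.random(n) < 0.1                   # only ~10% have both modalities
M, *_ = np.linalg.lstsq(audio[complete], video[complete], rcond=None)

video_hat = video.copy()
video_hat[~complete] = audio[~complete] @ M      # fill in the missing modality
print("imputation MSE:", np.mean((video_hat[~complete] - video[~complete]) ** 2))
```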

HSR: L1/2 Regularized Sparse Representation for Fast Face Recognition using Hierarchical Feature Selection

no code implementations • 23 Sep 2014 • Bo Han, Bo He, Tingting Sun, Mengmeng Ma, Amaury Lendasse

By employing hierarchical feature selection, we compress the scale and dimensionality of the global dictionary, which directly reduces the computational cost of the sparse representation that our approach is rooted in.

Face Recognition • Feature Selection • +1
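For a flavor of the sparse-representation step, the sketch below runs plain ISTA against a random dictionary. It deliberately substitutes the convex L1 soft-threshold for the paper's non-convex L1/2 regularizer, whose half-thresholding solver is more involved, so treat it as a stand-in rather than the paper's algorithm.

```python
# Illustrative sparse coding against a dictionary via plain ISTA,
# with an L1 penalty swapped in for the paper's L1/2 regularization.
import numpy as np

rng = np.random.default_rng(2)
D = rng.normal(size=(64, 256))
D /= np.linalg.norm(D, axis=0)                   # unit-norm dictionary atoms
x_true = np.zeros(256)
x_true[rng.choice(256, 5, replace=False)] = 1.0  # 5-sparse ground truth
y = D @ x_true

lam = 0.05
step = 1.0 / np.linalg.norm(D, 2) ** 2           # 1 / Lipschitz constant
x = np.zeros(256)
for _ in range(200):
    g = x + step * D.T @ (y - D @ x)             # gradient step on data term
    x = np.sign(g) * np.maximum(np.abs(g) - lam * step, 0.0)  # soft-threshold

print("nonzeros recovered:", np.count_nonzero(np.abs(x) > 1e-3))
```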

RMSE-ELM: Recursive Model based Selective Ensemble of Extreme Learning Machines for Robustness Improvement

no code implementations • 9 Aug 2014 • Bo Han, Bo He, Mengmeng Ma, Tingting Sun, Tianhong Yan, Amaury Lendasse

This makes it a promising framework for addressing the robustness issue of ELM on high-dimensional blended data.
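For context, the base learner here is simple to sketch: an ELM draws random hidden-layer weights and solves for the output weights in closed form, and a naive ensemble averages several such models. The paper's recursive selective ensembling is more elaborate; the toy below only shows the components.

```python
# Minimal ELM (random hidden layer + least-squares output weights)
# and a naive averaged ensemble of several ELMs.
import numpy as np

rng = np.random.default_rng(3)

def elm_fit(X, y, hidden=50):
    W = rng.normal(size=(X.shape[1], hidden))    # random input weights
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                       # hidden activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None) # closed-form output weights
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

X = rng.normal(size=(300, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)

models = [elm_fit(X, y) for _ in range(7)]       # small ensemble
pred = np.mean([elm_predict(X, m) for m in models], axis=0)
print("train MSE:", np.mean((pred - y) ** 2))
```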

LARSEN-ELM: Selective Ensemble of Extreme Learning Machines using LARS for Blended Data

no code implementations • 9 Aug 2014 • Bo Han, Bo He, Rui Nian, Mengmeng Ma, Shujing Zhang, Minghui Li, Amaury Lendasse

Extreme learning machine (ELM), a neural network algorithm, has shown good performance, such as fast training speed and a simple structure; however, weak robustness is an unavoidable defect of the original ELM on blended data.
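The selection idea can be sketched by stacking each member's predictions as feature columns and letting LARS keep a sparse weighted subset. Whether this matches the paper's exact procedure is an assumption, and the ELM helper below is a hypothetical toy.

```python
# Hedged sketch: LARS-based selection over a pool of toy ELM members.
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)

def elm_fit_predict(X, y, hidden=50, seed=0):
    r = np.random.default_rng(seed)
    W = r.normal(size=(X.shape[1], hidden))      # random hidden layer
    b = r.normal(size=hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None) # closed-form output weights
    return H @ beta                              # in-sample predictions

# Each column of P holds one ensemble member's predictions.
P = np.column_stack([elm_fit_predict(X, y, seed=s) for s in range(10)])
lars = Lars(n_nonzero_coefs=3).fit(P, y)         # LARS keeps a sparse subset
selected = np.flatnonzero(lars.coef_)
print("members kept by LARS:", selected, "weights:", lars.coef_[selected])
```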
