FaultFormer: Pretraining Transformers for Adaptable Bearing Fault Classification

4 Dec 2023 · Anthony Zhou, Amir Barati Farimani

The growth of global consumption has motivated important applications of deep learning to smart manufacturing and machine health monitoring. In particular, vibration data offers a rich and reliable source of insight into machine health and predictive maintenance. In this work, we present pretraining and fine-tuning frameworks, based on transformer models, for identifying bearing faults. We investigate different tokenization and data augmentation strategies to improve performance and reach state-of-the-art accuracies. Furthermore, we demonstrate masked self-supervised pretraining for vibration signals and its application to low-data regimes, task adaptation, and dataset adaptation. Pretraining improves performance on 10-way bearing classification from scarce, unseen training samples. Transformer models also benefit from pretraining when fine-tuning on fault classes outside of the pretraining distribution. Lastly, pretrained transformers are shown to generalize to a different dataset in a few-shot manner. This introduces a new paradigm in which models can be pretrained across different bearings, faults, and machinery, then quickly deployed to new, data-scarce applications to suit specific manufacturing needs.
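To make the masked self-supervised pretraining idea concrete, below is a minimal PyTorch sketch of a transformer encoder pretrained to reconstruct masked patches of a 1-D vibration signal. The patch-based tokenization, mask ratio, and model sizes here are illustrative assumptions, not the paper's exact configuration; after pretraining on unlabeled vibration windows, the encoder would be reused with a classification head for bearing fault labels.

```python
import torch
import torch.nn as nn

class MaskedVibrationPretrainer(nn.Module):
    """Transformer encoder pretrained to reconstruct masked patches of a vibration signal (illustrative sketch)."""
    def __init__(self, patch_len=64, d_model=128, n_heads=4, n_layers=4, mask_ratio=0.5):
        super().__init__()
        self.patch_len = patch_len
        self.mask_ratio = mask_ratio
        self.embed = nn.Linear(patch_len, d_model)            # tokenize: one signal patch -> one token
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))
        self.pos = nn.Parameter(torch.randn(1, 512, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)             # reconstruct raw patch values

    def forward(self, signal):
        # signal: (batch, length); split into non-overlapping patches (tokens)
        B, L = signal.shape
        patches = signal.view(B, L // self.patch_len, self.patch_len)
        tokens = self.embed(patches) + self.pos[:, : patches.size(1)]
        # randomly mask a fraction of tokens and replace them with a learned mask token
        mask = torch.rand(B, tokens.size(1), device=signal.device) < self.mask_ratio
        tokens = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(tokens), tokens)
        recon = self.head(self.encoder(tokens))
        # reconstruction loss computed only on the masked positions
        return ((recon - patches) ** 2)[mask].mean()

# Example: pretrain on a batch of unlabeled vibration windows, then reuse `encoder` for fault classification.
model = MaskedVibrationPretrainer()
loss = model(torch.randn(8, 1024))   # 8 windows of 1024 samples each
loss.backward()
```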
