Efficient Feature Learning Approach for Raw Industrial Vibration Data Using Two-Stage Learning Framework

Mohamed Ali Tnani, Paul Subarnaduti, Klaus Diepold

Research output: Contribution to journal › Article › peer-review

9 Scopus citations

Abstract

In recent decades, data-driven methods have gained great popularity in industry, supported by state-of-the-art advancements in machine learning. These methods require a large quantity of labeled data, which is difficult and costly to obtain. To address these challenges, researchers have turned their attention to unsupervised and few-shot learning methods, which have produced encouraging results, particularly in computer vision and natural language processing. Owing to the lack of pretrained models, time-series feature learning is still considered an open area of research. This paper presents an efficient two-stage feature learning approach for anomaly detection in machine processes, based on a prototypical few-shot learning technique that requires only a limited number of labeled samples. The work is evaluated on a real-world scenario using the publicly available CNC Machining dataset. The proposed method outperforms the conventional prototypical network, and the feature analysis shows high generalization ability, achieving an F1-score of 90.3%. The comparison with handcrafted features demonstrates the robustness of the deep features and their invariance to data shifts across machines and time periods, which makes the approach reliable for sensory industrial applications.
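For context, the prototypical-network element referenced in the abstract classifies unlabeled samples by their distance to class prototypes, i.e., the mean embeddings of a few labeled support samples per class. The sketch below is a minimal, generic illustration of that classification step, not the authors' two-stage implementation; the `VibrationEncoder`, window shapes, and class counts are illustrative assumptions.

```python
import torch
import torch.nn as nn

class VibrationEncoder(nn.Module):
    """Illustrative 1-D CNN mapping a raw vibration window to an embedding."""
    def __init__(self, in_channels=3, embed_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.fc = nn.Linear(64, embed_dim)

    def forward(self, x):                       # x: (batch, channels, time)
        z = self.net(x).squeeze(-1)             # (batch, 64)
        return self.fc(z)                       # (batch, embed_dim)

def prototypical_logits(encoder, support_x, support_y, query_x, n_classes):
    """Score query windows by distance to class prototypes.

    Prototypes are the mean embeddings of the few labeled support samples
    of each class; logits are negative squared Euclidean distances.
    """
    z_support = encoder(support_x)              # (n_support, d)
    z_query = encoder(query_x)                  # (n_query, d)
    prototypes = torch.stack(
        [z_support[support_y == c].mean(dim=0) for c in range(n_classes)]
    )                                           # (n_classes, d)
    dists = torch.cdist(z_query, prototypes)    # (n_query, n_classes)
    return -dists.pow(2)                        # higher = closer prototype

# Toy usage: 2 classes (normal / anomalous), 5 labeled support windows each.
encoder = VibrationEncoder()
support_x = torch.randn(10, 3, 2000)            # 10 windows, 3 sensor axes
support_y = torch.tensor([0] * 5 + [1] * 5)
query_x = torch.randn(4, 3, 2000)
logits = prototypical_logits(encoder, support_x, support_y, query_x, n_classes=2)
pred = logits.argmax(dim=1)                     # predicted class per query window
```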

Original language: English
Article number: 4813
Journal: Sensors (Switzerland)
Volume: 22
Issue number: 13
DOIs
State: Published - 1 Jul 2022

Keywords

  • CNC machining
  • feature learning
  • few-shot learning
  • machine learning
  • machine monitoring
  • two-stage learning
  • vibration data
