An ensemble of pre-trained transformer models for imbalanced multiclass malware classification
Date
2022
Journal Title
Journal ISSN
Volume Title
Publisher
Elsevier Advanced Technology
Abstract
Classification of malware families is crucial for a comprehensive understanding of how they can infect devices, computers, or systems. Hence, malware identification enables security researchers and incident responders to take precautions against malware and to accelerate mitigation. API call sequences made by malware are widely used features in machine and deep learning models for malware classification, as these sequences represent the behavior of malware. However, traditional machine and deep learning models remain incapable of capturing sequence relationships among API calls. Unlike traditional machine and deep learning models, transformer-based models process sequences as a whole and learn relationships among API calls through multi-head attention mechanisms and positional embeddings. Our experiments demonstrate that a Transformer model with a single transformer block layer surpasses the performance of the widely used baseline architecture, LSTM. Moreover, the pre-trained transformer models BERT and CANINE outperform it in classifying highly imbalanced malware families according to the F1-score and AUC evaluation metrics. Furthermore, our proposed bagging-based random transformer forest (RTF) model, an ensemble of BERT or CANINE, reaches state-of-the-art evaluation scores on three out of four datasets; in particular, it achieves a state-of-the-art F1-score of 0.6149 on one of the commonly used benchmark datasets.
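The RTF model described in the abstract combines bootstrap aggregation (bagging) with fine-tuned pre-trained transformers. The sketch below illustrates only the bagging-with-voting skeleton under stated assumptions: the base learner is a lightweight TF-IDF plus logistic regression stand-in rather than a fine-tuned BERT/CANINE classifier, the stratified bootstrap (used here so rare malware families stay represented) and the hard majority vote are illustrative choices not confirmed by the abstract, and the helper names stratified_bootstrap, train_bagged_ensemble, and predict_majority are hypothetical.

import numpy as np
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def stratified_bootstrap(labels, rng):
    # Sample indices with replacement within each class so that rare
    # malware families still appear in every bootstrap sample
    # (an illustrative choice, not taken from the paper).
    labels = np.asarray(labels)
    idx = []
    for cls in np.unique(labels):
        cls_idx = np.flatnonzero(labels == cls)
        idx.extend(rng.choice(cls_idx, size=len(cls_idx), replace=True))
    return np.array(idx)

def train_bagged_ensemble(sequences, labels, n_estimators=5, seed=0):
    # Train n_estimators independent base classifiers, each on its own
    # bootstrap sample; the paper's RTF instead fine-tunes pre-trained
    # BERT/CANINE models as the base learners.
    rng = np.random.default_rng(seed)
    sequences, labels = np.asarray(sequences), np.asarray(labels)
    models = []
    for _ in range(n_estimators):
        idx = stratified_bootstrap(labels, rng)
        model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        model.fit(sequences[idx], labels[idx])
        models.append(model)
    return models

def predict_majority(models, sequences):
    # Hard majority vote over the ensemble members' predicted families.
    votes = np.array([m.predict(sequences) for m in models])
    return [Counter(column).most_common(1)[0][0] for column in votes.T]

# Toy usage: API call sequences encoded as space-separated call names.
X = ["NtOpenFile NtReadFile NtClose",
     "RegOpenKey RegSetValue RegCloseKey",
     "NtOpenFile NtWriteFile NtClose",
     "RegOpenKey RegQueryValue RegCloseKey"]
y = ["trojan", "worm", "trojan", "worm"]
ensemble = train_bagged_ensemble(X, y, n_estimators=3)
print(predict_majority(ensemble, ["NtOpenFile NtReadFile NtClose"]))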
Description
Demirkiran, Ferhat/0000-0001-7335-9370; Unal, Ugur/0000-0001-6552-6044
Keywords
Transformer, Tokenization-free, API Calls, Imbalanced, Multiclass, BERT, CANINE, Ensemble, Malware classification
Turkish CoHE Thesis Center URL
Fields of Science
Citation
16
WoS Q
Q1
Scopus Q
Q1
Source
Volume
121