Explainable Tabular Transformer Model for Enhancing Security in Cryptocurrencies
Ruth Olusegun, Bo Yang
Abstract
The application of blockchain technology is increasingly attractive to many users and organizations due to its immutability. However, its platforms, especially cryptocurrencies, remain vulnerable to cyberattacks. Blockchain relies on its inherent security features, its consensus mechanism, and cryptographic hash functions; these alone, however, are inadequate to address the network's evolving fraud activities. AI-based fraud detection systems have increasingly been used to enhance security, but they often struggle with the complex feature dynamics of blockchain data, resulting in computationally intensive model training, high overhead, poor adaptability, and a lack of interpretability. This study presents efficient pre-trained tabular transformers that analyze transactions and make predictions within seconds. A hybrid feature selection framework was introduced to reduce the feature space without sacrificing key attributes. Four deep learning models were developed alongside the proposed system for performance evaluation: Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), a hybrid Convolutional Neural Network and Long Short-Term Memory (CLSTM), and Multilayer Perceptron (MLP). Our proposed model achieves 99.96% and 98.54% accuracy on Ethereum and Bitcoin transactions, respectively, outperforming conventional neural networks and state-of-the-art (SOTA) models. This study introduces a scalable AI solution that can be deployed within the blockchain environment for real-time analysis, and it also addresses efficiency and ethical concerns impacting the reliance on AI in cryptocurrencies.
