
KBJNet: Kinematic Bi-Joint Temporal Convolutional Network Attention for Anomaly Detection in Multivariate Time Series Data

Open Access | Mar 2024

Figures & Tables

Table I

Summary of terminology used.

TERMINOLOGY | DEFINITION
ARIMA | Autoregressive Integrated Moving Average
AUC | Area Under the ROC Curve
CAV | Connected and Autonomous Vehicle
COPOD | Copula-Based Outlier Detection
CPOD | Core Point-based Outlier Detection
DAGMM | Deep Autoencoding Gaussian Mixture Model
DTAAD | Dual TCN-Attention Networks for Anomaly Detection in Multivariate Time Series Data
ECG | Electrocardiography
EVT | Extreme Value Theory
FFN | Feedforward Neural Network
GAN | Generative Adversarial Network
GDN | Graph Deviation Networks
GHL | Gas-oil Plant Heating Loop
GPD | Generalized Pareto Distribution
GRU | Gated Recurrent Unit
GTA | Graph Learning with Transformer for Anomaly Detection
HBOS | Histogram-Based Outlier Score
IoT | Internet of Things
KBJNet | Kinematic Bi-Joint Temporal Convolutional Network Attention for Anomaly Detection
KDD | Knowledge Discovery and Data Mining
KNN | k-Nearest Neighbor
LDCOF | Local Density Cluster-based Outlier Factor
LOF | Local Outlier Factor
LSTM | Long Short-Term Memory Networks
LSTM-VAE | Long Short-Term Memory Networks and Variational Autoencoder
MAD-GAN | Multivariate Anomaly Detection for Time Series Data with Generative Adversarial Networks
MAML | Model-Agnostic Meta-Learning
MBA | MIT-BIH Supraventricular Arrhythmia Database
MSCRED | Multi-Scale Convolutional Recurrent Encoder-Decoder
MSDS | Multi-Source Distributed System
MSE | Mean Squared Error
MSL | Mars Science Laboratory
MTAD-GAT | Multivariate Time-Series Anomaly Detection via Graph Attention Networks
MTS | Multivariate Time Series
NAB | Numenta Anomaly Benchmark
NSIBF | Neural System Identification and Bayesian Filtering
PCA | Principal Component Analysis
POT | Peaks Over Threshold
ReLU | Rectified Linear Unit
RNN | Recurrent Neural Network
SMAP | Soil Moisture Active Passive
SMD | Server Machine Dataset
SoTA | State of the Art
SVD | Support Vector Data
SVDD | Support Vector Data Description
SVM | Support Vector Machine
SWaT | Secure Water Treatment
TCN | Temporal Convolutional Network
TranAD | Deep Transformer Networks for Anomaly Detection in Multivariate Time Series Data
TWSVM | Twin Support Vector Machine
USAD | Unsupervised Anomaly Detection
UTRAD | Anomaly Detection and Localization with U-Transformer
WADI | Water Distribution
Table II

Summary of the literature review on multivariate time series anomaly detection.

METHOD | APPROACH | MAIN ARCHITECTURE | SUPERVISED/UNSUPERVISED | HANDLES LIMITED DATA | INTERPRETABILITY
DAGMM (Zong et al. 2018) | Forecasting | AE | Unsupervised | × | ×
HitAnomaly (Huang et al. 2020) | Forecasting | Transformer | Supervised | × | ×
TCN-AE (Thill et al. 2021) | Reconstruction | AE | Unsupervised | × | ×
OmniAnomaly (Su et al. 2019) | Reconstruction | VAE | Unsupervised | × | ×
LSTM-VAE (Park et al. 2018) | Reconstruction | VAE | Semi-supervised | × | ×
GTA (Chen et al. 2022) | Reconstruction | GNN | Semi-supervised | × | ×
MSCRED (Zhang et al. 2019) | Reconstruction | AE | Unsupervised | × | ✓
MAD-GAN (Li et al. 2019) | Reconstruction | GAN | Unsupervised | × | ×
USAD (Audibert et al. 2020) | Reconstruction | AE | Unsupervised | × | ×
MTAD-GAT (Zhao et al. 2020) | Hybrid | GNN | Supervised | × | ✓
CAE-M (Zhang et al. 2021) | Hybrid | AE | Unsupervised | × | ×
GDN (Deng & Hooi 2021) | Forecasting | GNN | Unsupervised | × | ✓
TranAD (Tuli et al. 2022) | Reconstruction | Transformer | Unsupervised | ✓ | ✓
DTAAD (Yu et al. 2023) | Reconstruction | Transformer | Unsupervised | ✓ | ✓
KBJNet | Reconstruction | Transformer | Unsupervised | ✓ | ✓
Figure 1

Kinematic bi-joint network architecture for anomaly detection.

Figure 2

An illustration of multivariate time series data.

Figure 3

Dilated convolutions with dilation factors of 1, 2, and 4 and a kernel size of 3; x denotes the input and y the output.
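The dilated causal convolution that the Figure 3 caption describes can be sketched in a few lines of plain Python. This is an illustrative sketch only, not the paper's implementation: the all-ones kernels and constant input are placeholders, and only the dilation pattern (1, 2, 4) and kernel size (3) come from the figure.

```python
def dilated_causal_conv(x, kernel, dilation):
    """Causal 1D convolution: y[t] depends only on x[t], x[t-d], x[t-2d], ..."""
    y = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(kernel):
            idx = t - i * dilation
            acc += w * (x[idx] if idx >= 0 else 0.0)  # zero-pad before t = 0
        y.append(acc)
    return y

# Stack the three layers from the figure: dilations 1, 2, 4, kernel size 3.
# The kernels here are placeholder all-ones weights.
x = [1.0] * 8
h = dilated_causal_conv(x, [1.0, 1.0, 1.0], dilation=1)
h = dilated_causal_conv(h, [1.0, 1.0, 1.0], dilation=2)
y = dilated_causal_conv(h, [1.0, 1.0, 1.0], dilation=4)

# Receptive field of the stack: 1 + (k - 1) * (1 + 2 + 4) = 15 time steps,
# which is why exponentially increasing dilations cover long histories cheaply.
```

Because each layer doubles the dilation, the receptive field grows exponentially with depth while the per-layer cost stays linear in the sequence length.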

Algorithm 1

The KBJNet Training Algorithm.

Algorithm 2

The KBJNet Testing Algorithm.

Figure 4

Visualization of anomaly prediction.

Table III

Dataset characteristics.

DATASET | DIMENSIONS | TRAIN | VALIDATION | ANOMALY RATE (%)
MSDS | 10 (1) | 146430 | 146430 | 5.37
SMD | 38 (4) | 708420 | 708420 | 4.16
SWaT | 51 (1) | 496800 | 449919 | 11.98
MSL | 55 (3) | 58317 | 73729 | 10.72
SMAP | 25 (55) | 135183 | 427617 | 13.13
MBA | 2 (8) | 100000 | 100000 | 0.14
UCR | 1 (4) | 1600 | 5900 | 1.88
NAB | 1 (6) | 4033 | 4033 | 0.92
WADI | 123 (1) | 1048571 | 172801 | 5.99
Table V

Comparison of the KBJNet model with baseline methods using 20% of the training dataset.

METHOD | NAB (AUC / F1) | UCR (AUC / F1) | MBA (AUC / F1) | SMAP (AUC / F1) | MSL (AUC / F1) | SWaT (AUC / F1) | SMD (AUC / F1) | MSDS (AUC / F1) | WADI (AUC / F1)
MSCRED | 0.8298 / 0.7012 | 0.9636 / 0.4928 | 0.9498 / 0.9107 | 0.9810 / 0.8049 | 0.9796 / 0.8231 | 0.8384 / 0.7921 | 0.9767 / 0.8003 | 0.7715 / 0.8282 | 0.6028 / 0.0412
MAD-GAN | 0.8193 / 0.7108 | 0.9958 / 0.8215 | 0.9549 / 0.9191 | 0.9876 / 0.8467 | 0.9648 / 0.8189 | 0.8455 / 0.8011 | 0.8634 / 0.9317 | 0.5001 / 0.7389 | 0.5382 / 0.0936
USAD | 0.7268 / 0.6782 | 0.9968 / 0.8539 | 0.9698 / 0.9426 | 0.9884 / 0.8380 | 0.9650 / 0.8191 | 0.8439 / 0.8088 | 0.9855 / 0.9214 | 0.7614 / 0.8390 | 0.7012 / 0.0734
MTAD-GAT | 0.6957 / 0.7012 | 0.9975 / 0.8672 | 0.9689 / 0.9426 | 0.9815 / 0.8226 | 0.9783 / 0.8025 | 0.8460 / 0.8080 | 0.9799 / 0.6662 | 0.6123 / 0.8249 | 0.6268 / 0.0521
CAE-M | 0.7313 / 0.7127 | 0.9927 / 0.7526 | 0.9617 / 0.9003 | 0.9893 / 0.8313 | 0.9837 / 0.7304 | 0.8459 / 0.7842 | 0.9570 / 0.9319 | 0.6002 / 0.8390 | 0.6110 / 0.0782
GDN | 0.8300 / 0.7014 | 0.9938 / 0.8030 | 0.9672 / 0.9317 | 0.9888 / 0.8412 | 0.9415 / 0.8960 | 0.8391 / 0.8073 | 0.9812 / 0.7108 | 0.6820 / 0.8390 | 0.6122 / 0.0413
TranAD | 0.9216 / 0.8420 | 0.9983 / 0.9211 | 0.9946 / 0.9897 | 0.9884 / 0.8936 | 0.9856 / 0.9171 | 0.8461 / 0.8093 | 0.9847 / 0.8794 | 0.8112 / 0.8389 | 0.6852 / 0.0698
DTAAD | 0.9330 / 0.9057 | 0.9984 / 0.9220 | 0.9955 / 0.9912 | 0.9894 / 0.8996 | 0.9864 / 0.9212 | 0.8460 / 0.8087 | 0.9866 / 0.8941 | 0.8115 / 0.8390 | 0.7818 / 0.0977
KBJNet | 0.9999 / 0.9231 | 0.9999 / 0.9328 | 0.9932 / 0.9869 | 0.9894 / 0.9007 | 0.9907 / 0.9451 | 0.8460 / 0.8087 | 0.9986 / 0.9983 | 0.9829 / 0.9107 | 0.8453 / 0.1511
Figure 5

Results on the UCR dataset.

Table IV

Comparison of the KBJNet model with baseline methods using 80% of the training dataset.

METHOD | NAB (P / R / AUC / F1) | UCR (P / R / AUC / F1) | MBA (P / R / AUC / F1) | SMAP (P / R / AUC / F1) | SWaT (P / R / AUC / F1)
MSCRED | 0.8521 / 0.6700 / 0.8400 / 0.7501 | 0.5440 / 0.9717 / 0.9919 / 0.6975 | 0.9271 / 1.0000 / 0.9798 / 0.9622 | 0.8174 / 0.9215 / 0.9820 / 0.8663 | 0.9991 / 0.6769 / 0.8432 / 0.8071
MAD-GAN | 0.8665 / 0.7011 / 0.8477 / 0.7751 | 0.8537 / 0.9890 / 0.9983 / 0.9164 | 0.9395 / 1.0000 / 0.9835 / 0.9688 | 0.8156 / 0.9215 / 0.9890 / 0.8653 | 0.9592 / 0.6956 / 0.8462 / 0.8064
USAD | 0.8421 / 0.6667 / 0.8332 / 0.7443 | 0.8953 / 1.0000 / 0.9990 / 0.8953 | 0.8954 / 0.9990 / 0.9702 / 0.9444 | 0.7481 / 0.9628 / 0.9890 / 0.8419 | 0.9977 / 0.6879 / 0.8460 / 0.8143
MTAD-GAT | 0.8422 / 0.7273 / 0.8222 / 0.7803 | 0.7813 / 0.9973 / 0.9979 / 0.8762 | 0.9019 / 1.0000 / 0.9720 / 0.9483 | 0.7992 / 0.9992 / 0.9846 / 0.8882 | 0.9719 / 0.6958 / 0.8465 / 0.8110
CAE-M | 0.7919 / 0.8020 / 0.8020 / 0.7969 | 0.6982 / 1.0000 / 0.9958 / 0.8223 | 0.8443 / 0.9998 / 0.9662 / 0.9155 | 0.8194 / 0.9568 / 0.9902 / 0.8828 | 0.9698 / 0.6958 / 0.8465 / 0.8102
GDN | 0.8130 / 0.7873 / 0.8543 / 0.7999 | 0.6895 / 0.9989 / 0.9960 / 0.8159 | 0.8833 / 0.9893 / 0.9529 / 0.9333 | 0.7481 / 0.9892 / 0.9865 / 0.8519 | 0.9698 / 0.6958 / 0.8463 / 0.8102
TranAD | 0.8889 / 0.9892 / 0.9541 / 0.9364 | 0.9407 / 1.0000 / 0.9994 / 0.9694 | 0.9576 / 1.0000 / 0.9886 / 0.9783 | 0.8104 / 0.9998 / 0.9887 / 0.8953 | 0.9977 / 0.6879 / 0.8438 / 0.8143
DTAAD | 0.8889 / 0.9999 / 0.9996 / 0.9412 | 0.8880 / 1.0000 / 0.9988 / 0.9407 | 0.9608 / 1.0000 / 0.9896 / 0.9800 | 0.8220 / 0.9999 / 0.9911 / 0.9023 | 0.9697 / 0.6957 / 0.8462 / 0.8101
KBJNet | 0.8889 / 0.9999 / 0.9996 / 0.9412 | 0.9999 / 1.0000 / 0.9999 / 0.9999 | 0.9805 / 1.0000 / 0.9898 / 0.9805 | 0.8302 / 0.9999 / 0.9901 / 0.9072 | 0.9718 / 0.6957 / 0.8463 / 0.8109
METHOD | SMD (P / R / AUC / F1) | MSL (P / R / AUC / F1) | MSDS (P / R / AUC / F1) | WADI (P / R / AUC / F1)
MSCRED | 0.7275 / 0.9973 / 0.9920 / 0.8413 | 0.8911 / 0.9861 / 0.9806 / 0.9362 | 0.9998 / 0.7982 / 0.8942 / 0.8878 | 0.2512 / 0.7318 / 0.8411 / 0.3740
MAD-GAN | 0.9990 / 0.8439 / 0.9932 / 0.9149 | 0.8515 / 0.9929 / 0.9861 / 0.9168 | 0.9981 / 0.6106 / 0.8053 / 0.7578 | 0.2232 / 0.9123 / 0.8025 / 0.3587
USAD | 0.9061 / 0.9975 / 0.9934 / 0.9496 | 0.7949 / 0.9912 / 0.9795 / 0.8822 | 0.9913 / 0.7960 / 0.8980 / 0.8829 | 0.1874 / 0.8297 / 0.8724 / 0.3057
MTAD-GAT | 0.8211 / 0.9216 / 0.9922 / 0.8684 | 0.7918 / 0.9825 / 0.9890 / 0.8769 | 0.9920 / 0.7965 / 0.8983 / 0.8835 | 0.2819 / 0.8013 / 0.8822 / 0.4170
CAE-M | 0.9081 / 0.9670 / 0.9782 / 0.9368 | 0.7752 / 1.0000 / 0.9904 / 0.8734 | 0.9909 / 0.8440 / 0.9014 / 0.9115 | 0.2783 / 0.7917 / 0.8727 / 0.4118
GDN | 0.7171 / 0.9975 / 0.9925 / 0.8343 | 0.9309 / 0.9893 / 0.9815 / 0.9592 | 0.9990 / 0.8027 / 0.9106 / 0.8900 | 0.2913 / 0.7932 / 0.8778 / 0.4261
TranAD | 0.9051 / 0.9973 / 0.9933 / 0.9490 | 0.9037 / 0.9999 / 0.9915 / 0.9493 | 0.9998 / 0.8625 / 0.9012 / 0.8904 | 0.3959 / 0.8295 / 0.8998 / 0.5360
DTAAD | 0.8463 / 0.9974 / 0.9892 / 0.9147 | 0.9038 / 0.9999 / 0.9918 / 0.9495 | 0.9999 / 0.8026 / 0.9013 / 0.8905 | 0.9017 / 0.3910 / 0.6950 / 0.5455
KBJNet | 0.9985 / 0.9974 / 0.9987 / 0.9985 | 0.9038 / 0.9999 / 0.9916 / 0.9496 | 0.9592 / 0.9554 / 0.9248 / 0.9573 | 0.8465 / 0.8296 / 0.9130 / 0.8379
Figure 10

Ground truth and KBJNet predictions on the SMD dataset.

Figure 6

Training time on all datasets.

Figure 7

Inference time on all datasets.

Figure 8

Sensitivity to window size.

Table VI

Relationship between the number of training epochs and performance on the SMD dataset.

EPOCH | PRECISION | RECALL | AUC | F1-SCORE
1 | 0.9567 | 0.8440 | 0.9200 | 0.8968
2 | 0.8876 | 0.9974 | 0.9922 | 0.9393
3 | 0.8831 | 0.9974 | 0.9919 | 0.9368
4 | 0.8996 | 0.9974 | 0.9929 | 0.9460
5 | 0.9662 | 0.9974 | 0.9969 | 0.9815
6 | 0.9985 | 0.9974 | 0.9986 | 0.9979
7 | 0.9996 | 0.9974 | 0.9987 | 0.9985
8 | 0.9992 | 0.9974 | 0.9986 | 0.9983
9 | 0.9985 | 0.9974 | 0.9986 | 0.9979
10 | 0.9970 | 0.9974 | 0.9985 | 0.9972
Table VII

Sensitivity of KBJNet to the MAML meta step-size on the 20% datasets.

DATASET | 5 | 10 | 15 | 20
NAB | 0.9231 | 0.9057 | 0.9057 | 0.9231
UCR | 0.9328 | 0.9328 | 0.9328 | 0.9328
MBA | 0.9869 | 0.9871 | 0.9867 | 0.9871
SMAP | 0.9007 | 0.8926 | 0.8926 | 0.9338
MSL | 0.9451 | 0.8998 | 0.8998 | 0.8998
SWaT | 0.8087 | 0.8087 | 0.8094 | 0.8087
SMD | 0.9983 | 0.9970 | 0.9820 | 0.9983
MSDS | 0.9107 | 0.9107 | 0.9107 | 0.9107
WADI | 0.1511 | 0.1104 | 0.1208 | 0.1071
Figure 9

Sensitivity to kernel size.

Table VIII

Ablation study: AUC and F1 scores for KBJNet with 80% of the training datasets.

COMPONENT | NAB (AUC / F1) | UCR (AUC / F1) | MBA (AUC / F1)
KBJNet | 0.9996 / 0.9412 | 0.9999 / 0.9999 | 0.9898 / 0.9805
(-)Bi-Joint TCN | 0.9996 / 0.9411 | 0.9986 / 0.9327 | 0.9898 / 0.9787
(-)MAML | 0.9996 / 0.9412 | 0.9990 / 0.9527 | 0.9889 / 0.9787
(-)Transformer | 0.9325 / 0.9050 | 0.9980 / 0.9188 | 0.9926 / 0.9858
COMPONENT | SMAP (AUC / F1) | MSL (AUC / F1) | SWaT (AUC / F1)
KBJNet | 0.9901 / 0.9072 | 0.9916 / 0.9496 | 0.8463 / 0.8109
(-)Bi-Joint TCN | 0.9903 / 0.9083 | 0.9565 / 0.7848 | 0.8462 / 0.8101
(-)MAML | 0.9890 / 0.8974 | 0.9573 / 0.7878 | 0.8462 / 0.8101
(-)Transformer | 0.9853 / 0.8682 | 0.9700 / 0.8412 | 0.8459 / 0.8086
COMPONENT | SMD (AUC / F1) | MSDS (AUC / F1)
KBJNet | 0.9987 / 0.9985 | 0.9248 / 0.9573
(-)Bi-Joint TCN | 0.9911 / 0.8732 | 0.9809 / 0.8991
(-)MAML | 0.9923 / 0.8790 | 0.9784 / 0.8872
(-)Transformer | 0.9852 / 0.8582 | 0.9789 / 0.8937
Language: English
Submitted on: Jun 29, 2023
Accepted on: Oct 6, 2023
Published on: Mar 4, 2024
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2024 Muhammad Abdan Mulia, Muhammad Bintang Bahy, Muhammad Zain Fawwaz Nuruddin Siswantoro, Nur Rahmat Dwi Riyanto, Nella Rosa Sudianjaya, Ary Mazharuddin Shiddiqi, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.