Performance Metrics Evaluation
| Sl. No. | Evaluation Metric | Formula |
|---|---|---|
| 1 | Overall Accuracy | $\frac{TP + TN}{TP + TN + FP + FN}$ |
| 2 | Positive Predictive Value (Precision) | $\frac{TP}{TP + FP}$ |
| 3 | Sensitivity (Recall) | $\frac{TP}{TP + FN}$ |
| 4 | Harmonic Mean (F1-Score) | $\frac{2 \cdot \text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}$ |
| 5 | True Negative Rate (Specificity) | $\frac{TN}{TN + FP}$ |

where $TP$, $TN$, $FP$, and $FN$ denote true positives, true negatives, false positives, and false negatives, respectively.
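For concreteness, the five formulas above can be expressed as a minimal Python sketch computed directly from confusion-matrix counts; the function and variable names are illustrative, not taken from the original work.

```python
# Minimal sketch: the five evaluation metrics from the table above,
# computed from raw confusion-matrix counts. Names are illustrative.

def evaluate(tp: int, tn: int, fp: int, fn: int) -> dict:
    """Return all five metrics given confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)          # overall accuracy
    precision = tp / (tp + fp)                          # positive predictive value
    recall = tp / (tp + fn)                             # sensitivity
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R
    specificity = tn / (tn + fp)                        # true negative rate
    return {
        "accuracy": accuracy,
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "specificity": specificity,
    }

# Example with hypothetical counts (not from the paper's experiments):
print(evaluate(tp=90, tn=85, fp=10, fn=15))
```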
Performance Comparison Across Different Models
| Algorithm | Accuracy | Precision | Recall | F1-Score | Specificity |
|---|---|---|---|---|---|
| SVM | 0.77 | 0.78 | 0.78 | 0.79 | 0.78 |
| CNN | 0.79 | 0.80 | 0.79 | 0.80 | 0.80 |
| LSTM | 0.85 | 0.86 | 0.88 | 0.87 | 0.88 |
| GRU | 0.89 | 0.90 | 0.90 | 0.89 | 0.90 |
| Proposed | 0.99 | 0.98 | 0.98 | 0.97 | 0.98 |
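Scores of this kind are conventionally computed with standard tooling; the following is a hedged sketch using scikit-learn with placeholder `y_true`/`y_pred` arrays rather than the paper's data. Since scikit-learn exposes no direct specificity function, it is derived from the confusion matrix.

```python
# Sketch: computing one row of the comparison table for a single model.
# y_true and y_pred are hypothetical placeholders, not the paper's data.
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, confusion_matrix)

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]  # hypothetical ground-truth labels
y_pred = [1, 0, 1, 0, 0, 1, 0, 1, 1, 0]  # hypothetical model predictions

# Binary confusion matrix unpacks as (tn, fp, fn, tp).
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

row = {
    "Accuracy":    accuracy_score(y_true, y_pred),
    "Precision":   precision_score(y_true, y_pred),
    "Recall":      recall_score(y_true, y_pred),
    "F1-Score":    f1_score(y_true, y_pred),
    "Specificity": tn / (tn + fp),  # no direct sklearn function for this
}
print({name: round(score, 2) for name, score in row.items()})
```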