Statistical parameter analysis using 5-fold cross-validation
| Iteration | Accuracy | Macro AUC | Macro Recall | Macro F1-score | EER | NRMSE |
|---|---|---|---|---|---|---|
| 1 | 0.9480 | 0.9133 | 0.8853 | 0.9131 | 0.1491 | 0.0897 |
| 2 | 0.9101 | 0.8946 | 0.8728 | 0.8937 | 0.2325 | 0.0864 |
| 3 | 0.9328 | 0.8845 | 0.9269 | 0.8740 | 0.1497 | 0.0879 |
| 4 | 0.9501 | 0.9644 | 0.9559 | 0.9494 | 0.1478 | 0.0983 |
| 5 | 0.9306 | 0.9161 | 0.9087 | 0.9570 | 0.1581 | 0.0941 |
| Average | 0.9343 | 0.9146 | 0.9099 | 0.9174 | 0.1674 | 0.0913 |
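The "Average" row is the mean of the per-fold metrics over five disjoint validation folds. A minimal sketch of the fold splitting and averaging (pure Python; the fold logic and function names are illustrative, not the authors' implementation):

```python
def kfold_indices(n, k=5):
    """Split indices 0..n-1 into k contiguous, disjoint folds."""
    base, rem = divmod(n, k)
    folds, start = [], 0
    for i in range(k):
        size = base + (1 if i < rem else 0)  # spread the remainder
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def average_metric(metric_per_fold):
    """Average a per-fold metric, as in the 'Average' row."""
    return sum(metric_per_fold) / len(metric_per_fold)

folds = kfold_indices(103, k=5)                         # hypothetical n
accuracies = [0.9480, 0.9101, 0.9328, 0.9501, 0.9306]   # from the table
avg = average_metric(accuracies)                        # ≈ 0.9343
```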
SMBO (sequential model-based optimization)
```
Data: f, x, A, M
// data initialization
Q ← InitSamples(f, x);
// run up to N steps
for i ← |Q| to N do
    // model training
    p(y | x, Q) ← FitModel(M, Q);
    // select the most promising hyperparameter
    x_i ← argmax_{x ∈ X} A(x, p(y | x, Q));
    // evaluate the objective at x_i
    y_i ← f(x_i);
    // append the new observation
    Q ← Q ∪ {(x_i, y_i)};
end
```
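To make the loop concrete, here is a toy, self-contained SMBO run in Python. The surrogate (a distance-weighted mean of the observations) and the acquisition (predicted mean plus an exploration bonus) are deliberately simplistic stand-ins for FitModel and A, not the Gaussian-process machinery typically used:

```python
def smbo(f, candidates, n_init=3, n_steps=10):
    """Toy SMBO: maximise f over a finite candidate set."""
    # Q ← Init. Samples(f, x)
    Q = [(x, f(x)) for x in candidates[:n_init]]
    for _ in range(n_steps - n_init):
        def mean(x):  # surrogate p(y | x, Q): distance-weighted mean
            w = [1.0 / (1e-9 + abs(x - xi)) for xi, _ in Q]
            return sum(wi * yi for wi, (_, yi) in zip(w, Q)) / sum(w)
        def acq(x):   # acquisition A: mean plus exploration bonus
            d = min(abs(x - xi) for xi, _ in Q)
            return mean(x) + 0.5 * d
        xi = max(candidates, key=acq)   # x_i ← argmax_x A(x, ·)
        Q.append((xi, f(xi)))           # y_i ← f(x_i); Q ← Q ∪ {(x_i, y_i)}
    return max(Q, key=lambda p: p[1])   # best observation so far

# maximise a simple objective with optimum at x = 0.3
best_x, best_y = smbo(lambda x: -(x - 0.3) ** 2,
                      [i / 20 for i in range(21)])
```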
Comparison of the proposed model with state-of-the-art models
| No. | Authors | Accuracy (%) |
|---|---|---|
| 1 | Kumarmohanta et al. [41] | 85.20 |
| 2 | Soleymani et al. [42] | 88.12 |
| 3 | Alshardan et al. [6] | 90.01 |
| 4 | Proposed Model | 95.01 |
Optimization analysis of the parameters
| Iter | Eval result | Objective | Objective runtime | Best so far (observed) | Best so far (estimated) | Box constraint | Kernel scale |
|---|---|---|---|---|---|---|---|
| 1 | B | 0.096 | 13.135 | 0.096 | 0.096 | 0.337 | 0.002 |
| 2 | B | 0.079 | 1.576 | 0.079 | 0.082 | 871.77 | 0.010 |
| 3 | A | 0.331 | 91.062 | 0.079 | 0.092 | 431.78 | 0.029 |
| 4 | A | 0.833 | 2.125 | 0.079 | 0.079 | 0.002 | 26.961 |
| 5 | A | 0.079 | 3.016 | 0.079 | 0.079 | 0.074 | 0.002 |
| 6 | A | 0.902 | 0.735 | 0.079 | 0.079 | 0.041 | 998.72 |
| 7 | B | 0.069 | 12.351 | 0.069 | 0.069 | 0.001 | 0.001 |
| 8 | A | 0.077 | 2.731 | 0.069 | 0.069 | 0.065 | 0.004 |
| 9 | A | 0.223 | 46.893 | 0.069 | 0.069 | 152.81 | 0.001 |
| 10 | A | 0.167 | 0.636 | 0.069 | 0.079 | 0.001 | 0.004 |
| 11 | A | 0.079 | 5.504 | 0.069 | 0.078 | 964.37 | 0.001 |
| 12 | A | 0.079 | 2.760 | 0.069 | 0.069 | 984.19 | 0.003 |
| 13 | A | 0.079 | 2.050 | 0.069 | 0.069 | 7.319 | 0.004 |
| 14 | A | 0.098 | 20.025 | 0.069 | 0.069 | 0.059 | 0.001 |
| 15 | A | 0.763 | 1.779 | 0.069 | 0.069 | 0.001 | 0.039 |
| 16 | A | 0.894 | 242.51 | 0.069 | 0.069 | 8.479 | 984.37 |
| 17 | A | 0.079 | 2.163 | 0.069 | 0.068 | 0.455 | 0.005 |
| 18 | A | 0.398 | 0.693 | 0.069 | 0.069 | 0.001 | 0.748 |
| 19 | A | 0.079 | 7.152 | 0.069 | 0.068 | 90.169 | 0.001 |
| 20 | A | 0.098 | 6.362 | 0.069 | 0.069 | 0.001 | 0.002 |
| 21 | A | 0.079 | 2.122 | 0.069 | 0.069 | 984.74 | 0.007 |
| 22 | A | 0.196 | 55.945 | 0.069 | 0.069 | 863.67 | 0.005 |
| 23 | A | 0.073 | 5.081 | 0.069 | 0.069 | 0.001 | 0.001 |
| 24 | A | 0.831 | 195.47 | 0.069 | 0.069 | 965.59 | 8.748 |
| 25 | A | 0.094 | 1.953 | 0.069 | 0.069 | 0.001 | 0.002 |
| 26 | A | 0.079 | 4.070 | 0.069 | 0.069 | 840.91 | 0.002 |
| 27 | A | 0.079 | 7.255 | 0.069 | 0.069 | 0.062 | 0.001 |
| 28 | A | 0.079 | 2.759 | 0.069 | 0.069 | 0.195 | 0.003 |
| 29 | A | 0.240 | 0.550 | 0.069 | 0.069 | 0.001 | 0.135 |
| 30 | A | 0.388 | 149.79 | 0.069 | 0.069 | 920.34 | 0.236 |
Hyperparameter optimization process using the Bayesian approach
| BO process parameter | Value |
|---|---|
| Total function evaluations | 30 |
| Observed objective function value | 0.06875 |
| Estimated objective function value | 0.069326 |
| Function evaluation time | 12.3512 |
| Box constraint | 0.0013385 |
| Kernel scale | 0.0010021 |
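The two tuned values are the SVM's box constraint (the soft-margin penalty C) and the RBF kernel scale. As a reminder of how the kernel scale enters the model, the sketch below uses the common "divide the predictors by the scale" parameterisation; the helper name is mine, not from the source:

```python
import math

def rbf_kernel(u, v, kernel_scale):
    """RBF kernel with scale s: predictors are divided by s,
    so k(u, v) = exp(-||u - v||^2 / s^2)."""
    d2 = sum((a - b) ** 2 for a, b in zip(u, v))
    return math.exp(-d2 / kernel_scale ** 2)

# a tiny scale (≈0.001, as found by BO) makes the kernel extremely
# local: even nearby points look almost orthogonal to each other
near = rbf_kernel([0.0], [0.01], kernel_scale=0.0010021)
same = rbf_kernel([0.5], [0.5], kernel_scale=0.0010021)
```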
BSFF objective function
```
Input: F, y, λ_LBP, λ_SVM
Output: statistical parameter SP
// hyperparameters proposed by BO
λ_LBP ← trial from BO;
λ_SVM ← trial from BO;
// run for each trait: ear, iris, fingerprint
for j ← 1 to n do
    // LBP feature extraction
    FE_j  ← GetLBP(x_j, λ_LBP);  // ear
    FI_j  ← GetLBP(x_j, λ_LBP);  // iris
    FFP_j ← GetLBP(x_j, λ_LBP);  // fingerprint
    // concatenate the features into a fused feature vector
    F ← {FE_j, FI_j, FFP_j};
end
// split the fused features F into train and test sets
x_train, x_test, y_train, y_test ← TrainTestSplit(F, y);
// train an SVM model with the BO-proposed hyperparameters
M ← SVMFit(x_train, y_train, λ_SVM);
// use the trained model M to predict the test data
y_pred ← SVMPredict(M, x_test);
// evaluate the statistical parameter SP
SP ← GetSP(y_test, y_pred);
return SP;
```
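As a concrete toy illustration of the per-trait LBP extraction and feature-level fusion steps, the sketch below computes basic 3×3 LBP codes, turns each trait's codes into a normalised histogram, and concatenates the three histograms into one fused vector. The image contents and sizes are made up, and this is the plain LBP definition, not the authors' exact configuration:

```python
def lbp_codes(img):
    """Basic 3x3 LBP: threshold the 8 neighbours of each interior
    pixel against the centre and pack the bits into an 8-bit code."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = []
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            centre = img[r][c]
            code = 0
            for bit, (dr, dc) in enumerate(offsets):
                if img[r + dr][c + dc] >= centre:
                    code |= 1 << bit
            codes.append(code)
    return codes

def lbp_histogram(img, bins=256):
    """Normalised LBP-code histogram: one trait's feature vector."""
    codes = lbp_codes(img)
    hist = [0.0] * bins
    for code in codes:
        hist[code] += 1.0 / len(codes)
    return hist

# toy 4x4 'images' for the three traits (hypothetical data)
ear = [[1, 2, 3, 4], [4, 3, 2, 1], [1, 2, 3, 4], [4, 3, 2, 1]]
iris = [[9, 1, 9, 1], [1, 9, 1, 9], [9, 1, 9, 1], [1, 9, 1, 9]]
fingerprint = [[5] * 4 for _ in range(4)]

# fused feature vector F = {F_E, F_I, F_FP}
F = lbp_histogram(ear) + lbp_histogram(iris) + lbp_histogram(fingerprint)
```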