Open Access | September 2018

Figures & Tables

Figure 1

Block diagram of an ECG arrhythmia classification system.

Figure 2

Flow chart of the proposed work.

Figure 3

Normal ECG recording.

Figure 4

MLP network structure.

Figure 5

RBF network structure.

Figure 6

PNN network structure.

Selected attributes

| Selected feature | Max-Q | Std-a6 | Std-a5 | Std-a7 | Std-a4 | Std-a3 | Std-a2 | Std-a1 | Std-a8 | Moy-d8 |
|---|---|---|---|---|---|---|---|---|---|---|
| Correlation rank | 0.350 | 0.326 | 0.318 | 0.314 | 0.311 | 0.310 | 0.307 | 0.306 | 0.304 | 0.300 |
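The correlation ranking above can be reproduced with a short script. This is a minimal NumPy sketch, not the paper's code: it ranks feature columns by the absolute Pearson correlation with the class labels; the toy data and all names are illustrative.

```python
import numpy as np

def rank_features_by_correlation(X, y):
    """Rank columns of X by |Pearson correlation| with the labels y.

    X : (n_samples, n_features) feature matrix (e.g. wavelet statistics
        such as Std-a6 or Moy-d8)
    y : (n_samples,) class labels encoded as numbers
    Returns a list of (feature_index, correlation) sorted descending.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    corr = np.abs(Xc.T @ yc) / denom
    order = np.argsort(corr)[::-1]
    return [(int(i), float(corr[i])) for i in order]

# Toy data: feature 0 tracks the label, feature 1 is pure noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200).astype(float)
X = np.column_stack([y + 0.1 * rng.standard_normal(200),
                     rng.standard_normal(200)])
ranking = rank_features_by_correlation(X, y)
```

Keeping only the top-ranked features (here, those with correlation above 0.300) is what produces the ten selected attributes in the table.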

RBF performance using different spread values

| Basis function | Spread | ACC (%) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|---|
| Inverse multiquadric | 10 | 99.9 | 0.121 | 0.643 | 0.250e-30 |
| Inverse multiquadric | 1 | 99.9 | 0.128 | 0.668 | 0.250e-30 |
| Inverse multiquadric | 0.1 | 99.9 | 0.140 | 0.676 | 0.250e-30 |
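The accuracy in the table is insensitive to the spread for this basis function. As a minimal sketch (not the paper's implementation), a hidden RBF neuron with an inverse multiquadric activation can be written as:

```python
import numpy as np

def inverse_multiquadric(x, center, spread):
    """Inverse multiquadric RBF activation: 1 / sqrt(||x - c||^2 + spread^2)."""
    r2 = np.sum((np.asarray(x, dtype=float) - np.asarray(center, dtype=float)) ** 2)
    return 1.0 / np.sqrt(r2 + spread ** 2)

# At the center the activation peaks at 1/spread; it decays with distance.
c = np.zeros(3)
a_near = inverse_multiquadric(c, c, spread=1.0)       # peak value 1/spread
a_far = inverse_multiquadric(c + 5.0, c, spread=1.0)  # smaller, far from c
```

The spread only rescales how quickly the activation falls off, which is consistent with the nearly identical accuracy and MSE across the three rows.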

MLP, RBF, and PNN feed-forward neural networks

| Criteria type | Criterion | MLP | RBF | PNN |
|---|---|---|---|---|
| Structural | Architecture | An input layer, one or more hidden layers, and an output layer | An input layer, one hidden layer, and an output layer | An input layer, one hidden layer, a summation layer, and an output layer |
| Structural | Activation function | Non-linear (sigmoid, log-sigmoid, tan-sigmoid, ...) | A radial basis function that computes the Euclidean distance between the input vector and its weights | Based on the probability density function |
| Structural | Number of hidden neurons | No defined principle for choosing the number of neurons | No defined principle for choosing the number of neurons | Equal to the number of training instances |
| Structural | Output layer | The final layer applies the activation function before linearly combining the result | The final layer uses no activation function; it linearly combines the outputs of the previous layer | A competitive layer that picks the class with the maximum computed probability |
| Training | Process | Backpropagation training algorithms | Backpropagation or clustering algorithms | No weight training; classification follows the Bayesian decision rule |
| Parametric | Parameters | Momentum factor, learning rate, and parameters specific to the training algorithm | Number of centers and spread of the radial function | Spread value of the probability density function |
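The PNN column above (one kernel per training instance, a per-class summation layer, and a competitive output layer) can be illustrated with a small sketch. This is a generic Parzen-window PNN in NumPy, not the authors' code; the data and names are illustrative.

```python
import numpy as np

def pnn_classify(x, X_train, y_train, spread):
    """Minimal PNN: one Gaussian kernel per training instance (hidden
    layer), a per-class average (summation layer), and a competitive
    layer that picks the largest estimated class density."""
    x = np.asarray(x, dtype=float)
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)
        # Parzen estimate of the class density (unnormalised)
        scores.append(np.mean(np.exp(-d2 / (2.0 * spread ** 2))))
    return classes[int(np.argmax(scores))]

# Two toy clusters, one per class.
X_train = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [2.9, 3.2]])
y_train = np.array([0, 0, 1, 1])
label = pnn_classify([0.1, 0.1], X_train, y_train, spread=1.0)
```

Note that no weights are trained: all the computation happens at test time, which matches the short training time but comparatively long test time of the PNN in the performance tables.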

Performances of the PNN network according to the spread parameter

| Spread | ACC (%) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|
| 0.1 | 79.5 | 0.081 | 0.218 | 0.162 |
| 1 | 79.5 | 0.070 | 0.218 | 0.162 |
| 10 | 79.5 | 0.074 | 0.218 | 0.162 |

ANNs performances

| ANN | N°/HL | ACC (%) | Training time (s) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|---|---|
| MLP_opt | 10 | 86.4 | 0.294 | 0.096 | 0.390 | 0.064 |
| RBF_opt | 20 | 99.9 | 0.755 | 0.121 | 0.876 | 0.250e-30 |
| PNN_opt | 22 | 79.5 | 0.070 | 0.218 | 0.288 | 0.162 |

Comparative study with related works

| Work | Pre-processing | Test conditions | Dataset division | ACC (%) |
|---|---|---|---|---|
| (Abhinav-Vishwa et al., 2011) | R peak | MIT-BIH database of 48 signals of 30 min | 50% for training and 50% for testing | 96.8 |
| (Rai et al., 2013) | Morphological and DWT coefficients | 45 ECG signals of 1 min from the MIT-BIH database | 26 signals for training and 19 for testing | 97.8 |
| (Tomar et al., 2013) | Morphological and DWT coefficients, power spectral density, and energy of the periodogram | 62 ECG signals of 10 s from the MIT-BIH and Normal Sinus Rhythm (NSR) databases | Cross-validation split (70%, 30%) | 98.4 |
| (Savalia et al., 2017) | R peaks, heart beats/min, duration of the QRS complex | 66 ECG signals from the MIT-BIH arrhythmia and NSR databases | Cross-validation split (70%, 30%) | 82.5 |
| (Dalvi et al., 2016) | QRS complex, RR interval, and beat waveform morphology; PCA for feature selection | MIT-BIH database of 48 signals of 30 min | 18 ECG records for testing and 30 for training | 96.9 |
| Proposed work | Morphological and DWT coefficients | 44 ECG signals of 1 min from the MIT-BIH database | 22 signals for training and 22 for testing, following the AAMI recommendations | 99.9 |

MLP performance with different learning algorithms

| Learning algorithm type | Learning algorithm | ACC (%) | Training time (s) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|---|---|
| Jacobian derivatives | trainlm | 86.4 | 0.222 | 0.024 | 0.246 | 0.094 |
| Jacobian derivatives | trainbr | 74.5 | 0.229 | 0.031 | 0.260 | 0.124 |
| Gradient derivatives | trainscg | 86.4 | 0.553 | 0.355 | 0.908 | 0.077 |
| Gradient derivatives | traingda | 81.8 | 0.416 | 0.218 | 0.634 | 0.078 |
| Gradient derivatives | trainrp | 86.4 | 0.294 | 0.096 | 0.390 | 0.073 |
| Gradient derivatives | traingdx | 81.8 | 0.543 | 0.345 | 0.888 | 0.076 |
| Gradient derivatives | trainbfg | 86.4 | 0.330 | 0.132 | 0.462 | 0.087 |
| Gradient derivatives | traincgb | 81.8 | 0.316 | 0.118 | 0.434 | 0.070 |

MLP performance by learning rate

| Lr | ACC (%) | Response time (s) | MSE |
|---|---|---|---|
| 0.01 | 86.4 | 0.390 | 0.130 |
| 0.1 | 86.4 | 0.399 | 0.064 |
| 1 | 86.4 | 0.392 | 0.073 |
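The algorithms in the table above are MATLAB Neural Network Toolbox training functions (e.g. trainlm is Levenberg-Marquardt, traingda is gradient descent with an adaptive learning rate). The role of the learning rate Lr in the gradient-based family can be sketched as a plain update step; the quadratic toy loss below is illustrative, not the network's actual error surface.

```python
import numpy as np

def gd_step(w, grad_fn, lr):
    """One plain gradient-descent update: w <- w - lr * grad(w)."""
    return w - lr * grad_fn(w)

# Toy quadratic loss E(w) = ||w||^2 / 2 has gradient w, so repeated
# steps shrink w geometrically by a factor (1 - lr) per iteration.
w = np.array([1.0, -2.0])
for _ in range(100):
    w = gd_step(w, lambda v: v, lr=0.1)
```

A larger Lr takes bigger steps per iteration but risks overshooting; the table suggests the final accuracy here is robust to Lr, with only the MSE varying.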

RBF performance using different N°/HL (number of neurons in the hidden layer)

| N°/HL | ACC (%) | Training time (s) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|---|
| 5 | 54.5 | 0.380 | 0.015 | 0.395 | 0.204 |
| 10 | 59.1 | 0.436 | 0.013 | 0.449 | 0.150 |
| 15 | 81.8 | 0.501 | 0.014 | 0.515 | 0.080 |
| 20 | 99.9 | 0.640 | 0.014 | 0.654 | 1.266e-30 |
| 25 | 99.9 | 0.699 | 0.017 | 0.716 | 2.411e-30 |

PNN performance

| N°/HL | ACC (%) | Training time (s) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|---|
| 22 | 79.5 | 0.081 | 0.218 | 0.299 | 0.162 |

MLP performance using different numbers of hidden layers

| N_HL | H1 | H2 | ACC (%) | Training time (s) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|---|---|---|
| 1 | 50 | – | 72.7 | 0.221 | 0.113 | 0.334 | 0.026 |
| 1 | 100 | – | 86.4 | 0.222 | 0.094 | 0.316 | 0.024 |
| 1 | 150 | – | 81.8 | 0.380 | 0.099 | 0.479 | 0.021 |
| 1 | 200 | – | 81.8 | 0.390 | 0.073 | 0.463 | 0.016 |
| 1 | 250 | – | 81.8 | 0.406 | 0.071 | 0.477 | 0.013 |
| 2 | 10 | 5 | 81.8 | 0.315 | 0.238 | 0.553 | 0.019 |
| 2 | 10 | 10 | 81.8 | 0.319 | 0.253 | 0.572 | 0.021 |
| 2 | 10 | 15 | 72.7 | 0.383 | 0.281 | 0.664 | 0.018 |
| 2 | 10 | 20 | 72.7 | 0.423 | 0.317 | 0.740 | 0.017 |

RBF performance using different basis functions

| RBF function | ACC (%) | Training time (s) | Test time (s) | Response time (s) | MSE |
|---|---|---|---|---|---|
| Gaussian | 99.9 | 0.380 | 0.014 | 0.654 | 1.266e-30 |
| Polyharmonic | 99.9 | 0.436 | 0.051 | 0.639 | 1.221e-30 |
| Inverse multiquadric | 99.9 | 0.501 | 0.121 | 0.676 | 0.250e-30 |
| Multiquadric | 99.9 | 0.640 | 0.132 | 0.698 | 0.891e-30 |
| Biharmonic | 99.9 | 0.699 | 0.002 | 0.640 | 0.891e-30 |
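For reference, the basis function profiles compared above can be written as functions of the distance r = ||x - c||. This is a generic NumPy sketch with an assumed spread parameter s, not the paper's code.

```python
import numpy as np

# Common RBF profiles as functions of the distance r = ||x - c|| and a
# spread/scale parameter s, where the profile uses one. Illustrative only.
def gaussian(r, s):
    return np.exp(-(r / s) ** 2)

def multiquadric(r, s):
    return np.sqrt(r ** 2 + s ** 2)

def inverse_multiquadric(r, s):
    return 1.0 / np.sqrt(r ** 2 + s ** 2)

def polyharmonic(r, k=3):
    # r**k for odd k; even-k variants multiply by log(r), e.g. the
    # biharmonic / thin-plate case r**2 * log(r) in two dimensions.
    return r ** k
```

The Gaussian and inverse multiquadric peak at the center and decay with distance, while the multiquadric and polyharmonic grow; all four nonetheless reach the same accuracy in the table, differing mainly in timing and MSE.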
Language: English
Page range: 1 - 15
Published on: Sep 12, 2018
Published by: Professor Subhas Chandra Mukhopadhyay
In partnership with: Paradigm Publishing Services
Publication frequency: once per year

© 2018 Hela Lassoued, Raouf Ketata, Slim Yacoub, published by Professor Subhas Chandra Mukhopadhyay
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.