Abstract
This work investigates how Spike-Timing-Dependent Plasticity (STDP) of the synapses in randomly connected Spiking Neural Networks (SNNs) affects the distribution of firing rates across individual neurons. We observed that STDP, acting as a homeostatic plasticity rule, forces SNN activity to reflect the structure of the input. This effect is similar, but not identical, to the Intrinsic Plasticity (IP) tuning of recurrent neural networks in Reservoir Computing (RC). Both the IP and STDP rules allow the structure of the input data to be captured in the network state, which explains why STDP-trained SNNs perform well at extracting features from multidimensional data for classification.
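To make the mechanism named above concrete, the following is a minimal sketch of a standard pair-based STDP update rule. It is an illustration only, not the specific rule used in this work; the amplitudes `a_plus`, `a_minus` and the time constant `tau` are assumed illustrative values.

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Synaptic weight change for one pre/post spike pair.

    delta_t = t_post - t_pre in milliseconds.
    Parameters are illustrative, not taken from this work.
    """
    if delta_t > 0:
        # Pre fires before post (causal pairing): potentiation,
        # decaying exponentially with the spike-time difference.
        return a_plus * np.exp(-delta_t / tau)
    # Post fires before (or with) pre: depression.
    return -a_minus * np.exp(delta_t / tau)

# Causal pairings strengthen the synapse, anti-causal ones weaken it:
print(stdp_dw(10.0))   # positive change
print(stdp_dw(-10.0))  # negative change
```

Because potentiation and depression depend only on relative spike timing, repeated application of such a rule reshapes the weights, and hence the firing-rate distribution, to follow the temporal structure of the input.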
