
A Fast Neural Network Learning Algorithm with Approximate Singular Value Decomposition

Open Access
Sep 2019

Abstract

The learning of neural networks is becoming more and more important. Researchers have constructed dozens of learning algorithms, but it is still necessary to develop faster, more flexible, or more accurate ones. With fast learning we can examine more learning scenarios for a given problem, especially in the case of meta-learning. In this article we focus on the construction of a much faster learning algorithm and its modifications, especially for nonlinear versions of neural networks. The main idea of this algorithm lies in the use of a fast approximation of the Moore–Penrose pseudo-inverse matrix. The complexity of the original singular value decomposition algorithm is O(mn²). We consider algorithms with a complexity of O(mnl), where l < n and l is often significantly smaller than n. Such learning algorithms can be applied to the learning of radial basis function networks, extreme learning machines or deep ELMs, principal component analysis, or even missing data imputation.
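For illustration only, the idea behind the abstract can be sketched with a generic randomized low-rank SVD used to form an approximate pseudo-inverse; the function name `approx_pinv`, the target rank `l`, and the oversampling parameter below are illustrative assumptions, not the authors' exact procedure. The sketch shows how an O(mnl) routine can stand in for the O(mn²) full SVD when solving for the output weights of an RBF network or ELM.

```python
import numpy as np

def approx_pinv(A, l, oversample=10, seed=0):
    """Approximate Moore-Penrose pseudoinverse of an m x n matrix A
    via a rank-l randomized SVD (a generic sketch, not necessarily the
    paper's algorithm).  Cost is roughly O(m*n*l) rather than the
    O(m*n^2) of a full SVD when l << n."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(l + oversample, n)
    # Sketch the range of A with a Gaussian test matrix: O(m*n*k)
    Omega = rng.standard_normal((n, k))
    Y = A @ Omega
    # Orthonormal basis for the sketched range
    Q, _ = np.linalg.qr(Y)
    # Project A onto the small subspace and take its (cheap) SVD
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    # Truncate to rank l and invert the nonzero singular values
    U, s, Vt = U[:, :l], s[:l], Vt[:l, :]
    s_inv = np.where(s > 1e-12, 1.0 / s, 0.0)
    return Vt.T @ (s_inv[:, None] * U.T)   # n x m pseudoinverse

# Hypothetical usage: least-squares output weights W = pinv(H) @ T
# for a hidden-activation matrix H (e.g., of an RBF network or ELM).
H = np.random.rand(1000, 200)
T = np.random.rand(1000, 5)
W = approx_pinv(H, l=50) @ T
```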

DOI: https://doi.org/10.2478/amcs-2019-0043 | Journal eISSN: 2083-8492 | Journal ISSN: 1641-876X
Language: English
Page range: 581 - 594
Submitted on: Sep 6, 2018
Accepted on: Feb 22, 2019
Published on: Sep 28, 2019
Published by: University of Zielona Góra
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2019 Norbert Jankowski, Rafał Linowiecki, published by University of Zielona Góra
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.