
Multi-Aspect Incremental Tensor Decomposition Based on Distributed In-Memory Big Data Systems

Open Access | May 2020

Abstract

Purpose

We propose InParTen2, a multi-aspect parallel factor analysis (PARAFAC) decomposition algorithm for three-dimensional tensors, built on the Apache Spark framework. The proposed method reduces re-decomposition cost and can handle large tensors.
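To make the factor model concrete, the following is a hedged, single-machine sketch of PARAFAC/CP decomposition of a three-way tensor via alternating least squares. InParTen2 itself runs on Apache Spark; this NumPy version only illustrates the underlying model, and the names (cp_als, rank, n_iter) are illustrative, not the authors' API.

```python
import numpy as np

def cp_als(X, rank, n_iter=50, seed=0):
    """Return factor matrices A, B, C with X ~ sum_r A[:,r] outer B[:,r] outer C[:,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Each update solves a least-squares problem for one factor, using the
        # matricized-tensor-times-Khatri-Rao-product (MTTKRP) computed via einsum.
        A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Example: decompose a small random tensor and check the relative reconstruction error.
X = np.random.rand(10, 8, 6)
A, B, C = cp_als(X, rank=4)
approx = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - approx) / np.linalg.norm(X))
```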

Design/methodology/approach

Considering that tensor addition increases the size of a given tensor along all axes, the proposed method decomposes incoming tensors using existing decomposition results without generating sub-tensors. Additionally, InParTen2 avoids the calculation of Khatri–Rao products and minimizes shuffling by using the Apache Spark platform.
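The following is a hedged sketch of the incremental idea: reuse the existing factors instead of re-decomposing the whole tensor when new data arrives. For brevity it shows growth along the third mode only (new frontal slices), whereas InParTen2 handles tensors that grow along all axes and avoids materializing Khatri–Rao products on Spark; the function and variable names here are illustrative assumptions.

```python
import numpy as np

def extend_third_mode(A, B, C, X_new):
    """Append factor rows for the new slices X_new (shape I x J x K_new),
    keeping the existing factors A (I x R) and B (J x R) fixed."""
    gram = (A.T @ A) * (B.T @ B)                      # R x R Hadamard product of Gram matrices
    mttkrp = np.einsum('ijk,ir,jr->kr', X_new, A, B)  # MTTKRP restricted to the new slices
    C_new = mttkrp @ np.linalg.pinv(gram)             # least-squares factor rows for the new slices
    return np.vstack([C, C_new])                      # existing rows unchanged, new rows appended
```

Because only the new slices enter the update, the cost depends on the size of the added data rather than on the full tensor, which is the source of the re-decomposition savings described above.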

Findings

The performance of InParTen2 is evaluated by comparing its execution time and accuracy with those of existing distributed tensor decomposition methods on various datasets. The results confirm that InParTen2 can process large tensors, runs faster than existing tensor decomposition algorithms, and significantly reduces the re-decomposition cost.

Research limitations

There are several Hadoop-based distributed tensor decomposition algorithms as well as MATLAB-based decomposition methods. However, the former require much longer iteration times, so their execution times are not directly comparable with those of Spark-based algorithms, whereas the latter run on a single machine and are therefore limited in the size of data they can handle.

Practical implications

The proposed algorithm can reduce re-decomposition cost when tensors are added to a given tensor by decomposing them based on existing decomposition results without re-decomposing the entire tensor.

Originality/value

The proposed method can handle large tensors and remains fast within the memory constraints of the Apache Spark framework. Moreover, InParTen2 supports both static and incremental tensor decomposition.

DOI: https://doi.org/10.2478/jdis-2020-0010 | Journal eISSN: 2543-683X | Journal ISSN: 2096-157X
Language: English
Page range: 13 - 32
Submitted on: Oct 30, 2019
Accepted on: Mar 6, 2020
Published on: May 20, 2020
Published by: Chinese Academy of Sciences, National Science Library
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2020 Hye-Kyung Yang, Hwan-Seung Yong, published by Chinese Academy of Sciences, National Science Library
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.