Learning stability on graphs

Abstract

In artificial intelligence applications, the model training phase is critical and computationally demanding. In the graph neural network (GNN) research field, it is interesting to investigate how varying the topological and spectral structure of a graph affects the learning process and overall GNN performance. In this work, we theoretically investigate how the topology and the spectrum of a graph change when nodes and edges are added or removed. We propose the topological relevance function as a novel method to quantify the stability of graph-based neural networks when graph structures are perturbed, and we explore its relationship with the Graph Edit Distance and spectral similarity. Numerical results highlight stability issues in the learning process on graphs.
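As a minimal illustration of the kind of perturbation analysis described above (not the authors' method), the sketch below removes a single edge from a small random graph and compares the original and perturbed graphs via the distance between their Laplacian spectra and via the Graph Edit Distance. The helper names (`laplacian_spectrum`, `spectral_distance`) are illustrative, and NetworkX/NumPy are assumed to be available.

```python
import numpy as np
import networkx as nx


def laplacian_spectrum(G):
    """Sorted eigenvalues of the graph Laplacian of G."""
    return np.sort(nx.laplacian_spectrum(G))


def spectral_distance(G, H):
    """Euclidean distance between Laplacian spectra; the shorter spectrum
    is zero-padded so graphs of different order can be compared."""
    a, b = laplacian_spectrum(G), laplacian_spectrum(H)
    n = max(len(a), len(b))
    a = np.pad(a, (0, n - len(a)))
    b = np.pad(b, (0, n - len(b)))
    return float(np.linalg.norm(a - b))


# Build a small random graph and perturb it by removing one edge.
G = nx.erdos_renyi_graph(8, 0.4, seed=0)
H = G.copy()
u, v = next(iter(H.edges))
H.remove_edge(u, v)

print("spectral distance after edge removal:", spectral_distance(G, H))
# Exact Graph Edit Distance is expensive; it is only feasible here
# because the graphs are very small.
print("graph edit distance:", nx.graph_edit_distance(G, H))
```

The same pattern extends to node insertion or deletion; tracking how such distances correlate with changes in downstream GNN performance is one way to probe the stability questions raised in the abstract.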

Language: English
Page range: 91 - 101
Submitted on: Jul 10, 2020
Accepted on: Sep 30, 2024
Published on: Nov 21, 2024
Published by: Italian Society for Applied and Industrial Mathematics
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2024 Antonioreneé Barletta, Salvatore Cuomo, Gianluca Milano, published by Italian Society for Applied and Industrial Mathematics
This work is licensed under the Creative Commons Attribution 4.0 License.