AdaptGNN: A self-supervised graph neural network with test-time adaptation for robust multiuser detection in MC-CDMA systems
Abstract
Multiuser detection in multicarrier code-division multiple access (MC-CDMA) systems is a critical problem, particularly when user densities are high, channel conditions change dynamically, and labelled data is consequently scarce. This paper introduces AdaptGNN, a self-supervised graph neural network (GNN) receiver that models the MC-CDMA uplink as a heterogeneous graph of users and subcarriers, so that the interference topology is explicitly represented in the receiver. Self-supervised tasks, including masked subcarrier reconstruction, interference-edge prediction, and contrastive representation learning, allow interference-aware embeddings to be learned directly from the received waveforms without manually annotated labels. To improve operational robustness, a self-supervised test-time adaptation (TTA) scheme is integrated: during inference, it adapts a limited set of model parameters on unlabelled test examples and thereby mitigates the distribution shift caused by changes in user load and channel statistics. Monte Carlo simulations under Rayleigh fading demonstrate that AdaptGNN significantly reduces the bit-error rate (BER) and outperforms traditional multiuser detection methods, particularly in highly congested interference environments. In addition, the method lowers detection latency and is markedly more resistant to channel estimation errors than traditional detectors and graph-based models. These results indicate that AdaptGNN is well suited to serve as a scalable and efficient receiver in dense and dynamic wireless environments.
© 2026 Ridha Ilyas Bendjillali, Mohammed Rida Lahcene, Mohammed Sofiane Bendelhoum, Asma Ouardas, Miloud Kamline, Fadila Amel Miloudi, published by Slovak University of Technology in Bratislava
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.