
Improving KANICE Mini: A Study on Hyperparameter Tuning and Optimization

Open Access | Feb 2026

Abstract

This paper investigates and extends the capabilities of KANICE Mini, a hybrid neural architecture that integrates the Kolmogorov–Arnold Network framework with Interactive Convolution Elements. While the original implementation achieved 99.35% accuracy on the MNIST dataset, we improve the model’s performance through a refined training pipeline, enhanced regularization techniques, and structured hyperparameter optimization. Our optimized KANICE Mini achieves 99.56% accuracy on MNIST, surpassing the original result. Furthermore, we evaluate its generalization capability on more complex real-world data by applying it to the Invasive Ductal Carcinoma classification task, where it reaches 85.79% accuracy. These results demonstrate that, with careful tuning, KANICE Mini can rival significantly larger architectures in performance while preserving advantages in efficiency, modularity, and interpretability.

DOI: https://doi.org/10.2478/aei-2025-0014 | Journal eISSN: 1338-3957 | Journal ISSN: 1335-8243
Language: English
Page range: 11 - 16
Submitted on: Jul 15, 2025 | Accepted on: Aug 21, 2025 | Published on: Feb 25, 2026
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2026 Laura Pituková, Peter Sinčák, published by Technical University of Košice
This work is licensed under the Creative Commons Attribution 4.0 License.