Sensitivity-Oriented YOLOv11 for Robust Multi-Label Lesion Detection in Chest X-rays
Abstract
Chest X-ray lesion detection remains challenging due to severe class imbalance, subtle lesion appearance, and the risk of over-optimistic evaluation caused by improper data splitting. In this study, we propose a sensitivity-oriented detection framework based on YOLOv11 for robust chest X-ray screening under clinically realistic conditions. The proposed approach integrates patient-wise data partitioning, enhanced data augmentation, and prediction fusion to improve generalization while mitigating data leakage. Experiments are conducted on the VinDr-CXR dataset using a strict patient-level split to ensure full separation between training and validation sets. A series of internal fine-tuning scenarios is designed to analyze the trade-offs among precision, recall, and localization accuracy. Based on internal validation, the medium-scale YOLOv11-m configuration (denoted M3) is selected as the reference model, as it provides the most stable balance between sensitivity and localization performance. Under rigorous evaluation, M3 achieves a precision of 0.431, a recall of 0.416, an mAP@0.5 of 0.387, and an mAP@0.5:0.95 of 0.193. Compared with representative baselines, M3 demonstrates improved robustness under patient-wise evaluation, outperforming the transformer-based DETR by a large margin (mAP@0.5: 0.387 vs. 0.232) and achieving performance comparable to YOLOv7 while exhibiting substantially higher sensitivity to small and diffuse lesions. Further comparison with recent studies shows that the proposed method achieves a higher overall mAP@0.5 (0.387 vs. 0.362–0.378) while improving detection performance on clinically challenging abnormality classes. These results indicate that the proposed YOLOv11-based framework provides a reliable and clinically meaningful baseline for chest X-ray lesion screening and future methodological advancements.
© 2026 Thi-Da-Huong Truong, Ngoc Huynh Pham, Vo-Phuong-Tam Nguyen, Thi-Thanh-Thuy Le, Hai Thanh Nguyen, published by Riga Technical University
This work is licensed under the Creative Commons Attribution 4.0 License.