
Optimize to Open: An Exploratory-Experimental Approach to the Computational Optimization of Open Large Language Models for Educational Access

Open Access | Mar 2026

Abstract

The rapid integration of artificial intelligence into open education has intensified global demands for equitable access, sustainable infrastructure, and technological inclusion. This study evaluated the feasibility and educational impact of optimizing Open Large Language Models (OLLMs) for deployment in low-resource learning environments. Five open-source models (Falcon, Bloom, GPT-NeoX, T5, and Flan-T5) were optimized using an exploratory-experimental approach with unstructured pruning and Retrieval-Augmented Generation (RAG). The intervention was tested in three simulated educational infrastructures (public university, community digital center, and rural classroom) and analyzed using quantitative metrics on system efficiency and educational output. The findings revealed: (a) a reduction of up to 11% in response time across all models; (b) a decrease of approximately 20% in RAM and VRAM usage; (c) a 1.4% improvement in the educational relevance of responses; and (d) a 33% increase in query throughput, indicating greater scalability in open education contexts. These results offer practical and ethical guidance for educators, policymakers, and technology developers by showcasing how optimized OLLMs can become key enablers of worldwide open, inclusive, and sustainable learning ecosystems.
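The abstract describes optimizing the models with unstructured pruning, i.e., removing individual low-magnitude weights rather than whole neurons or layers. As an illustration only (not the authors' implementation, whose details are in the full paper), a minimal magnitude-based unstructured pruning step over a weight matrix can be sketched as:

```python
import numpy as np

def prune_unstructured(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the `sparsity` fraction of weights with the smallest
    absolute value (magnitude-based unstructured pruning sketch)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Toy example: prune half of a 2x2 matrix
w = np.array([[0.1, -0.9], [0.5, -0.2]])
pruned = prune_unstructured(w, sparsity=0.5)
```

In practice such pruning is applied per-layer to an LLM's linear weights, trading a small accuracy loss for the RAM/VRAM and latency reductions the study reports; frameworks such as PyTorch provide built-in utilities for this.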

DOI: https://doi.org/10.5334/jime.1051 | Journal eISSN: 1365-893X
Language: English
Submitted on: Apr 30, 2025 | Accepted on: Oct 24, 2025 | Published on: Mar 20, 2026
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2026 Iván Miguel García-López, José-Martín Molina-Espinosa, María-Soledad Ramírez-Montoya, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.