Achieving Efficient Prompt Engineering in Large Language Models Using a Hybrid and Multi-Objective Optimization Framework

Open Access | Jun 2025

Abstract

Prompt optimization is crucial for enhancing the performance of large language models. Traditional Bayesian Optimization (BO) methods face challenges such as limited local refinement, insufficient parameter tuning, and difficulty in handling multiple objectives simultaneously. This study introduces a hybrid multi-objective optimization framework that integrates BO for global exploration with a Genetic Algorithm (GA) that fine-tunes prompt hyperparameters through evolutionary operators. The Non-dominated Sorting Genetic Algorithm II (NSGA-II) is employed to identify Pareto-optimal solutions, balancing accuracy, efficiency, and interpretability. The framework is evaluated on the GLUE benchmark with BERT-based tokenization for structured input representation. Experimental results demonstrate that the proposed model achieves 95% accuracy, 85% efficiency, and 79% interpretability across three benchmark datasets, outperforming conventional BO-based methods. The findings confirm that the hybrid approach significantly enhances search efficiency, refinement, and multi-objective optimization, leading to more effective and robust prompt optimization.
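
To illustrate the kind of pipeline the abstract describes, the sketch below combines a global exploration phase, a GA refinement loop, and Pareto selection over (accuracy, efficiency, interpretability). It is a minimal illustration, not the authors' implementation: the hyperparameter space, the evaluate() stub, and all function names are assumptions, the exploration phase uses uniform sampling as a stand-in for a BO surrogate, and the Pareto step only extracts the first non-dominated front rather than running full NSGA-II ranking and crowding.

```python
# Minimal sketch (assumptions throughout): hybrid exploration + GA refinement
# with Pareto selection over three objectives, all to be maximized.
import random

random.seed(0)

# Hypothetical prompt hyperparameter space (illustrative only).
SPACE = {"temperature": (0.0, 1.0), "max_tokens": (16, 256), "num_shots": (0, 8)}

def sample():
    """Draw one candidate configuration uniformly from the search space."""
    return {
        "temperature": random.uniform(*SPACE["temperature"]),
        "max_tokens": random.randint(*SPACE["max_tokens"]),
        "num_shots": random.randint(*SPACE["num_shots"]),
    }

def evaluate(cfg):
    """Placeholder for prompting the model on a GLUE task and measuring
    (accuracy, efficiency, interpretability); a synthetic score stands in here."""
    acc = 1.0 - abs(cfg["temperature"] - 0.3) + 0.02 * cfg["num_shots"]
    eff = 1.0 - cfg["max_tokens"] / 256 - 0.05 * cfg["num_shots"]
    interp = 1.0 - cfg["temperature"]
    return (acc, eff, interp)

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all >=, at least one >)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(population):
    """Keep candidates not dominated by any other (the first NSGA-II front)."""
    scored = [(cfg, evaluate(cfg)) for cfg in population]
    return [c for c, s in scored
            if not any(dominates(s2, s) for _, s2 in scored)]

def crossover(p1, p2):
    """Uniform crossover: each hyperparameter inherited from either parent."""
    return {k: random.choice([p1[k], p2[k]]) for k in p1}

def mutate(cfg, rate=0.3):
    """Perturb a configuration with small probability per hyperparameter."""
    child = dict(cfg)
    if random.random() < rate:
        child["temperature"] = min(1.0, max(0.0, child["temperature"] + random.gauss(0, 0.1)))
    if random.random() < rate:
        child["num_shots"] = random.randint(*SPACE["num_shots"])
    return child

# Phase 1: global exploration (a BO surrogate would normally propose these
# candidates; uniform sampling is used here purely as a stand-in).
population = [sample() for _ in range(30)]

# Phase 2: GA refinement, carrying the Pareto front forward each generation.
for _ in range(20):
    parents = pareto_front(population) or population
    offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                 for _ in range(30)]
    population = parents + offspring

for cfg in pareto_front(population)[:5]:
    print(cfg, [round(v, 3) for v in evaluate(cfg)])
```

In a full NSGA-II setup, selection would additionally rank dominated fronts and use crowding distance to preserve diversity along the trade-off surface; the sketch keeps only the non-dominated filter to show where the accuracy/efficiency/interpretability trade-off enters the loop.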

DOI: https://doi.org/10.2478/cait-2025-0012 | Journal eISSN: 1314-4081 | Journal ISSN: 1311-9702
Language: English
Page range: 67 - 82
Submitted on: Mar 11, 2025
Accepted on: May 4, 2025
Published on: Jun 25, 2025
Published by: Bulgarian Academy of Sciences, Institute of Information and Communication Technologies
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2025 Sridevi Kottapalli Narayanaswamy, Rajanna Muniswamy, published by Bulgarian Academy of Sciences, Institute of Information and Communication Technologies
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.