Abstract
This paper examines the integration of emotional responses and embodied cognition into AI-driven generative spatial design. It defines and validates parameters that enable AI systems to dynamically adjust spatial environments in response to human emotions and behaviours. Drawing on embodied cognition theory, natural user interfaces, comparative software analysis, and machine learning, the authors identify methods for translating human interaction into generative design, laying the groundwork for responsive, emotionally adaptive environments.