AI-Driven Multi-Modal Framework: Expanding Comfort Zones for Autistic Children

Zeeshan Jawed Shah
Vimala Perumal

Abstract

Autistic children often rely on familiar routines, which can limit their exposure to new social, sensory, and creative experiences. This paper presents a multi-modal framework that integrates augmented reality (AR), virtual reality (VR), interactive storytelling, and interpretable AI-based adaptation to support the gradual expansion of autistic children's comfort zones. The paper does not report a completed intervention or effectiveness trial. Instead, it combines an integrative review of the autism-support and creative-technology literature with a preliminary qualitative needs assessment conducted in Saudi Arabia to derive design requirements, localise the framework, and clarify how its components are linked. The proposed architecture is organised around five connected elements: narrative orchestration, AR experiences in familiar spaces, VR-based social rehearsal, state-based AI scaffolding, and caregiver-supported real-world transfer. The manuscript further specifies two functional creative-media modules (a music sandbox and a digital art studio) and explains how challenge level, sensory load, and pacing can be adjusted through transparent adaptation rules. The paper proposes a design methodology and ethical framework, establishing a theoretical basis for future empirical validation of AI-driven transitions in ASD support.

Article Details

How to Cite
Jawed Shah, Z., & Perumal, V. (2026). AI-Driven Multi-Modal Framework: Expanding Comfort Zones for Autistic Children. International Journal of Creative Multimedia, 7(1), 23–47. https://doi.org/10.33093/ijcm.2026.7.1.2
Section
Articles