Abstract
This thesis investigates how visuospatial (visual and spatial) and auditory features shape human perception and attention in everyday interactions. By analyzing both low-level cues (e.g., motion kinematics) and high-level cues (e.g., gestures, gaze, and speech), it explores how observers infer intent, emotion, and action from them. The research introduces a systematic model to characterize these features and demonstrates their influence through perceptual experiments and curated datasets. The work bridges cognitive science and computational modeling, offering insights for designing human-centric technologies in fields such as robotics, autonomous systems, and interactive media.
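To make the notion of a low-level kinematic cue concrete, here is a minimal Python sketch that derives speed, acceleration, and jerk profiles from a sampled 2D movement trajectory via finite differences. The function name, the toy trajectory, and the 100 Hz sampling rate are illustrative assumptions, not the model developed in the thesis.

```python
import numpy as np

def kinematic_features(positions, dt):
    """Hypothetical helper: finite-difference kinematics
    (speed, acceleration, jerk magnitudes) from an (N, 2)
    array of sampled 2D positions at fixed timestep dt."""
    velocity = np.gradient(positions, dt, axis=0)
    acceleration = np.gradient(velocity, dt, axis=0)
    jerk = np.gradient(acceleration, dt, axis=0)
    return {
        "speed": np.linalg.norm(velocity, axis=1),
        "acceleration": np.linalg.norm(acceleration, axis=1),
        "jerk": np.linalg.norm(jerk, axis=1),
    }

# Toy example: a hand tracing a curved reach, sampled at 100 Hz.
t = np.linspace(0.0, 1.0, 100)
trajectory = np.column_stack([t, np.sin(np.pi * t)])
features = kinematic_features(trajectory, dt=0.01)
print(features["speed"][:5])
```

Profiles like these are the kind of low-level signal that can be compared against high-level cues (gestures, gaze, speech) in perceptual experiments.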
Type
Publication
Ph.D. Dissertation, University of Skövde