Human Spatial Orientation Perception During Simulated Lunar Landing Motions
Safe and precise piloted lunar landings require control inputs that depend on an accurate perception of vehicle orientation and motion. However, the unique environment and motions experienced during a lunar landing trajectory may lead to misperceptions of vehicle state. Eight subjects participated in a human-subject experiment in the NASA Ames Vertical Motion Simulator, in which they reported their perceptions of vehicle tilt angle and horizontal velocity during lunar-landing-like motions. Three sensory-cue conditions were studied: subjects were blindfolded and given no visual cues; subjects were provided a simulated dynamic view of the lunar terrain out a forward-looking window; or subjects were provided dynamic instrument displays showing current vehicle states. Subjects' reported perceptions differed substantially from the simulated motions in the blindfolded and out-the-window conditions, but matched the motions more closely when instrument displays were viewed. Subject perceptions were also compared with predictions from a numerical model of orientation perception; the two agreed qualitatively in the blindfolded case but were less well aligned in the out-the-window case. These misperceptions may affect astronaut control inputs and degrade vehicle performance and safety.
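
For context, the sketch below illustrates one well-known mechanism behind such misperceptions: the somatogravic effect, in which the otolith organs sense only the direction of the gravito-inertial force (GIF), so a sustained horizontal acceleration is misattributed to tilt. This is not the numerical model used in the study; the function name naive_perceived_tilt, the planar simplification, the sign convention, and the gravity constants are all illustrative assumptions.

import numpy as np

G_MOON = 1.62   # lunar surface gravity, m/s^2 (illustrative constant)
G_EARTH = 9.81  # Earth surface gravity, m/s^2

def naive_perceived_tilt(actual_tilt_deg, a_horiz, a_vert=0.0, g=G_MOON):
    """Illustrative somatogravic estimate of perceived tilt, in degrees.

    Without visual cues, the otoliths report only the direction of the
    gravito-inertial force (GIF). A sustained horizontal acceleration
    rotates the GIF away from true vertical, and a naive observer
    attributes that rotation to additional body tilt.
    """
    gif_offset_deg = np.degrees(np.arctan2(a_horiz, g + a_vert))  # GIF angle from true vertical
    return actual_tilt_deg + gif_offset_deg  # sign convention: positive = pitch-up

if __name__ == "__main__":
    # A level vehicle decelerating horizontally at 1 m/s^2:
    print(naive_perceived_tilt(0.0, 1.0, g=G_MOON))   # ~31.7 deg perceived tilt
    print(naive_perceived_tilt(0.0, 1.0, g=G_EARTH))  # ~5.8 deg perceived tilt

Under these assumptions, the same 1 m/s^2 horizontal deceleration that would suggest roughly 6 deg of tilt under Earth gravity corresponds to roughly 32 deg under lunar gravity, illustrating in simplified form why lunar-landing-like motions without visual cues can produce large tilt misperceptions.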