Editorial Type: RESEARCH ARTICLE
Online Publication Date: 01 Nov 2024

Augmented Reality Assessments to Support Human Spaceflight Performance Evaluation

Page Range: 831–840
DOI: 10.3357/AMHP.6393.2024

INTRODUCTION: As next-generation space exploration missions require increased autonomy from crews, real-time diagnostics of astronaut health and performance are essential for mission operations, especially for determining extravehicular activity readiness. An augmented reality (AR) system may be a viable tool allowing holographic visual cueing to replace physical objects used in traditional assessments.

METHODS: In this study, 20 healthy adults performed an Ingress and Egress Task and an Obstacle Weave Task with holographic and physical objects to determine the effect of AR on performance. Subjects performed each task three times within each modality.

RESULTS: Use of AR resulted in increased task completion times and greater head pitch angles across the two tasks. Head and torso angular velocities were reduced in magnitude in both tasks within AR, while decreased magnitudes of head and torso acceleration were observed for the Obstacle Weave Task. Subjects were more deliberate and careful in their task completion during the Ingress and Egress Task within AR, stepping higher and lowering their heads further.

DISCUSSION: Subjects successfully completed both tasks using AR, and meaningful assessments of their performance were obtained. The increased head pitch observed supported visualization of the holograms within the reduced AR field of view. The increased task time and reduced torso angular velocity were similar to strategies used by astronauts postflight while experiencing sensorimotor impairments. AR may be a useful instrumentation solution for assessing in-flight performance, providing embedded sensors and onboard computation; however, thresholds for assessing extravehicular activity readiness must be developed.

Weiss H, Stirling L. Augmented reality assessments to support human spaceflight performance evaluation. Aerosp Med Hum Perform. 2024; 95(11):831–840.

Spaceflight presents unique challenges to astronauts, particularly during extended missions beyond low Earth orbit. During microgravity exposure, the vestibular organs receive novel cues, which can lead to altered vestibular response and sensory reinterpretation.5 Microgravity-induced vestibular alterations can result in maladaptive behavior when returning to gravity-rich environments. Postflight analyses of crewmembers returning from the International Space Station (ISS) have demonstrated reduced postural control,28 impaired locomotion,21 and compromised fine motor skills,16 which may contribute to motion sickness,14 spatial disorientation, and diminished hand-eye coordination. During the initial stages of readaptation, vestibular challenges can significantly impact critical mission tasks, such as manual vehicle control,19 early extravehicular activity (EVA), and capsule ingress and egress.23 Future human spaceflight exploration, especially missions farther from Earth and longer in duration, poses even greater challenges for astronaut crews as they will not receive the same level of medical support upon landing as they do on Earth.27 Additionally, no readiness assessments currently exist for vestibular and sensorimotor performance during space operations. Assessing, diagnosing, and reducing risks associated with vestibular alterations requires assessments and technologies that observe mission resource constraints and are flexible for diagnostic capabilities and risk mitigation.

Presently, astronauts’ neurovestibular balance performance is evaluated postflight using Earth-based clinical measures: the Sensory Organization Test battery with computerized dynamic posturography,1,28 the Functional Mobility Test (FMT),8 and a suite of functional tasks called the Sensorimotor Standard Measures.6 The FMT is an obstacle course that imposes distinct challenges to the balance control system, as it requires the subject to step over, under, and around foam obstacles while changing direction and body postures.18,20 Following landing, postflight testing of long-duration crewmembers has shown an increase in total FMT time and an estimated recovery time of 15 d to achieve 95% of preflight baseline performance.20 Additional investigations have demonstrated that crewmembers have difficulty performing rotational movements around a cone upon returning to Earth following ISS missions, with reduced torso angular velocity.6 As crewmembers are increasingly required to operate more autonomously and engage in prolonged EVAs, addressing the vestibular and sensorimotor alterations associated with space travel becomes crucial in flight.

In ground-based clinical practice, various screening tests are used for vestibular disorders and elderly populations, including tandem walking balance,9 the Timed Up and Go task,7 and the Berg Balance Scale.7 These Earth-based assessments used in clinical care and postflight evaluations require various resources and equipment, laboratory space, and trained personnel that make them unsuitable for spaceflight operations.

The development of innovative diagnostic tools and countermeasure devices for spaceflight operations requires considering both volumetric and resource limitations. Volumetric capacity is constrained by the available space within the spacecraft module or habitat, impacting the utilization and storage of assessment tools. The spectrum of resource constraints encompasses the requirements imposed on the crew for setup, operation, and storage of these devices, the energy demands they place on the spacecraft’s systems, and mass limitations essential for launch considerations. Maximizing multifunctionality and adaptability in assessment tools for microgravity and reduced gravity environments is crucial for minimizing mass and storage constraints while continuously assessing the crew’s physical performance throughout the mission.

Augmented reality (AR) is a promising solution to spaceflight requirements by replacing traditional physical assessment tools with holographic visual cues, merging real-world and computer-generated content across sensory modalities. Extended reality technologies are becoming more prevalent for training and operational support in various terrestrial and in-flight microgravity domains, enabling crews to improve operational and behavioral skills.10 AR headsets possess several inherent advantages for space applications. These systems are easily deployable and allow users to perceive their physical surroundings, a crucial safety feature in confined spaces. Moreover, AR headsets offer the flexibility to integrate new software applications and serve multiple functions, including procedural guidance,4 within the context of space missions. Furthermore, AR headsets feature built-in computational processing, which can be seamlessly combined with embedded sensors such as inertial measurement units (IMUs) to measure and assess performance in real time. As a body-worn tool, AR headsets can administer assessments, provide visual cues and animations of task demonstrations, and record and analyze user performance, features distinguishing AR from other computer-based systems.

Recent research has explored AR and virtual reality systems to facilitate balance assessments and virtual training regimes to improve motor function across diverse populations. These applications include tasks for assessing standing balance and promoting rehabilitation through gamification of physiotherapy tasks.25 By incorporating visual cues and engaging holographic content that adapts to user movements through 4–12 wk of intervention, these applications have demonstrated improvements in balance as measured by the Timed Up and Go task and Berg Balance Scale in stroke patients and elderly populations. Nevertheless, a knowledge gap persists regarding the influence of holographic objects on individuals’ balance performance within discrete assessments and the implementation of dynamic operationally relevant assessments that require interactions with objects. Evaluating the feasibility of AR technology for in-flight vestibular assessments and determining performance standards for AR use in EVA readiness decision-making requires a comparative analysis of operational tasks within AR and those involving physical objects.

Informing EVA readiness requires assessments that closely resemble operational tasks and body postures without the inherent risks and complexity of suited operations. Planetary EVA tasks such as geology sampling, capsule egress, and dynamic directional control during ambulation require appropriate integration of vestibular cues to be safely and efficiently performed. The functional attributes of EVA tasks that should be considered in operational assessments include pitching of the head and torso, full-body yaw, kneeling, and ambulatory adaptability in response to surface attributes such as regolith deformation and obstacles.

In this study, the performance of healthy adults on two dynamic, operationally informed balance tasks is compared between an AR environment and an environment with physical objects. The selected assessments comprised an Ingress and Egress Task (IET) and an Obstacle Weave Task (OWT). Due to the limited field of view of the headset, we anticipated that there would be differences in: 1) task completion times, 2) step characteristics, and 3) head and torso kinematics as measured by accelerations, angular velocities, and orientations. This study carries significant implications for AR application designers, especially those focused on postural control and balance assessments. The research underscores the importance of considering how holographic content is presented and how the use of AR may affect user performance. Understanding whether AR’s use notably impacts users’ balance is essential for ensuring safety. Furthermore, our findings shed light on the potential usage of AR as an in-flight assessment tool for future exploration-class missions.

METHODS

Subjects

A total of 20 University of Michigan students participated in this experiment (11 women and 9 men, mean age 25.20 ± 8.7 yr SD, range: 18–60 yr). The subjects were informed of the procedures before participating and signed the informed consent form. The study was approved by the University of Michigan Institutional Review Board for Health Sciences and Behavioral Sciences, study number HUM00213807. All subjects reported normal (N = 14) or corrected-to-normal (N = 6) vision and reported no musculoskeletal, auditory, or vestibular disorders. Of the subjects, 15 reported previous limited experience (between 1 and 10 h) with extended reality devices (i.e., AR or virtual reality); the remaining subjects were considered first-time users.

Procedure

This study was part of a larger protocol evaluating a set of clinically informed dynamic assessments and operationally informed functional tasks. To observe potential in-flight volumetric restrictions, the tasks were constrained to a 2 × 2-m circular volume (Fig. 1). This paper presents the IET and the OWT. The subjects underwent a 90-min protocol, encompassing 15–20 min for headset calibration, AR hand gesture and system navigation training, and task familiarization, with the remaining duration for evaluative testing and survey responses. Each subject donned 11 IMUs and 36 motion capture markers at specific anatomical locations. Subjects were randomized into one of two groups, determining the order in which they performed the tasks (holographic AR objects vs. physical objects first). To ensure consistent weight distribution on the head across the modalities, an AR headset (Microsoft HoloLens 2, Microsoft, WA, United States) weighing approximately 1.25 lb was used across all conditions. With see-through holographic lenses, subjects could see their physical environment regardless of whether the device was active or inactive during the AR and physical trials.

Fig. 1. IET (left) and OWT (right). Internal portal dimensions were 1.70 m high by 1 m wide with a step height of 0.2 m. The obstacles were 0.15 m wide, placed 1 m apart, and 0.5 m from the edge of the interaction space. IET: Ingress and Egress Task; OWT: Obstacle Weave Task.

For accurate hologram positioning, higher display quality, and a comfortable viewing experience, subjects first underwent eye calibration with the AR headset, which computes eye positions automatically by using onboard eye-tracking capabilities.24 After the eye calibration, subjects were trained for the hand gestures necessary to interact with the system (e.g., touch selection with the index finger, a hand ray to manipulate distant holograms, and a pinch gesture to relocate and scale holograms26). Prior to the testing portion of the protocol, subjects were provided with task demonstrations and the ability to practice each assessment task within each modality. Subjects were required to demonstrate error-free task execution before progressing to the testing phase of the protocol, a requirement successfully met with two practice attempts in each interface. Errors were identified as unsuccessful task sequencing or collisions with physical and holographic objects. Instructions for the tasks were consistent across both conditions. The subjects performed the IET followed by the OWT, irrespective of their modality order. Three trials were conducted for each task and modality, totaling six trials per task. A post-task survey was administered after subjects completed three trials for each modality for a single task. The post-task survey incorporated questions from the National Aeronautics and Space Administration Task Load Index (NASA-TLX).13

An evaluation of dynamic balance and task performance was conducted for the two operationally informed assessments, the IET and the OWT. The assessments were completed in an AR environment with holographic content and again with physical objects of corresponding dimensions. The operational balance assessments were derived from the FMT task20 and were chosen due to their operational relevance and functional characteristics that have proven difficult for crewmembers after long-duration missions.6,21 The intention is that crewmembers will be better prepared for their operational EVA tasks with the additional complexity of spacesuits if they are able to perform these dynamic tasks prior to EVA without an increased concern of falls or functional immobility.

The IET simulates an astronaut entering or exiting their spacecraft or airlock upon landing on a planetary body, a potentially difficult task immediately following a gravity transition from extended exposure in microgravity. The dynamic task required subjects to ambulate through an airlock portal while minding their posture to maintain clearance between their head and the top of the airlock and between their feet and the step portion of the airlock (Fig. 1). To evaluate the relative stability of both legs as support legs during the dynamic motion, subjects were instructed to step through the airlock, turn 180°, and step through the airlock again with the opposite leg as the leading leg. During transitions through the airlock, subjects were permitted to maintain movement in the frontal plane or turn sideways, as egress strategies may vary when a crewmember wears a spacesuit.

The OWT simulates aspects of astronauts ambulating on a planetary surface, which will require dynamic directional changes in movement to avoid large geology samples, surface structures, or craters. Subjects were required to maneuver around two aligned circular obstacles placed on the floor 1 m apart (Fig. 1). The subjects were instructed to weave in a figure-eight pattern to allow for both rotations about the right and left side, i.e., clockwise and counterclockwise motions. The starting position was located behind one of the objects, positioned axially along the midplane of the obstacles. After crossing the midplane of the two obstacles at the center of the interaction space, subjects walked around the second obstacle before returning to the starting point. The OWT is most akin to the slalom segment of the FMT, where crewmembers must maneuver around multiple foam columns. Task performance of the OWT can also be compared to segments of the obstacle walk, another postflight test in which crewmembers navigate around a cone.6

Based on literature and clinical practice for dynamic balance tasks, IMU sensors and motion capture data were used to derive dependent measures. The use of IMUs enabled the assessment of postural sway and orientation metrics to inform subjects’ postures while performing the assessment tasks. The motion capture system was used to derive task-specific metrics requiring precise position estimation, such as collisions with holographic content. Despite the ability to measure postural sway with motion capture, IMUs are more practical as they do not require a laboratory setting. The study included both methods due to the wide availability and use of IMUs outside the laboratory. In addition, the AR headset device featured an embedded IMU sensor, which can be leveraged to enable future standalone assessments.

The primary dependent measure was task completion time, as is customary with the FMT task.20 The IMU-based measures extracted from the raw accelerometer and gyroscope signals were based on the L2-norm of the signals. From the L2-norm, the metrics included the root mean square (RMS) of the norm vector and the magnitude (i.e., the maximum value of the rectified norm vector). Using these dependent measures for evaluating standing postural balance with IMU sensors is common in the literature.11
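
As a hedged illustration only (not the authors’ exact processing pipeline), the following Python sketch computes these two signal-level metrics from a three-axis IMU signal:

import numpy as np

def norm_metrics(signal_xyz):
    # signal_xyz: (n_samples, 3) raw accelerometer or gyroscope time series
    norm = np.linalg.norm(signal_xyz, axis=1)  # L2-norm of each sample
    rms = np.sqrt(np.mean(norm ** 2))          # RMS of the norm vector
    magnitude = np.max(np.abs(norm))           # maximum of the rectified norm vector
    return rms, magnitude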

Orientation decomposition and sensor-to-segment alignment were performed on the IMU measurements to estimate the orientation of the head and torso in the sagittal, coronal, and transverse planes. A neutral standing posture was used as a zeroing method for computing alterations in pitch, roll, and yaw across the time series of each task. For the accelerations and angular velocity signals, an extended Kalman filter was used as a sensor fusion algorithm to estimate the orientation of the IMU. The remaining algorithmic process includes defining the local task frame with respect to the IMU sensors, defining the anatomical frames, projecting the relevant global vectors into the anatomical planes,22 and computing the orientation angles using the right-handed ISB coordinate system.29 The primary metric of interest for the IET was pitch, as measured along the sagittal plane. Roll was defined as the side-to-side lean in the coronal plane. Yaw was considered the side-to-side twist in the transverse plane. A positive pitch was defined when the subject leaned back (nose up), a positive roll occurred when the subject leaned to the right, and a positive yaw occurred when the subject twisted to the left.
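
As an illustrative sketch under stated assumptions (the EKF output is taken as a quaternion time series, and the 'ZYX' Euler sequence stands in for the segment-specific ISB-recommended sequence), angles relative to the neutral standing posture could be extracted as follows:

from scipy.spatial.transform import Rotation as R

def segment_angles(quats, neutral_quat):
    # quats: (n, 4) [x, y, z, w] segment orientations from the EKF sensor-fusion step
    # neutral_quat: orientation captured during quiet standing, used as the zero reference
    rel = R.from_quat(neutral_quat).inv() * R.from_quat(quats)
    # Sign conventions assumed to match the text:
    # pitch + = lean back (nose up), roll + = lean right, yaw + = twist left
    yaw, pitch, roll = rel.as_euler('ZYX', degrees=True).T
    return pitch, roll, yaw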

Task-specific analyses that aligned with task goals were conducted using motion capture data. Dependent measures were averaged within the trial and then across the three trials for each modality to yield a single measure of performance within AR and physical conditions. Although subjects could not feel physical contact with the holograms, the level of clearance achieved from the holograms was investigated during post-processing. Lower-limb metrics were extracted from markers located on the feet, i.e., the toe markers and medial and lateral malleoli. Upper-body metrics relating to the head were extracted from the motion capture marker located on the forehead portion of the headset.
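
A minimal sketch of this two-stage averaging, assuming each trial’s samples for a given measure have already been extracted:

import numpy as np

def modality_mean(trials):
    # trials: list of 1-D arrays, one per trial (three trials per modality)
    per_trial = [np.mean(t) for t in trials]  # average within each trial
    return np.mean(per_trial)                 # average across the three trials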

For the IET, foot stepping heights were captured during each aspect of the task, providing measures for the first leading leg to step through the airlock, the first following leg, the second leading leg after the 180° turn, and the second following leg closing out the task. The head clearance was estimated by calculating the distance between the head and the ground during the most prominent head downward pitch under the airlock structure. The head clearance height is interpreted with respect to the height of the airlock within the discussion. Finally, temporal decomposition was conducted to determine the total time to complete each portion of the task, i.e., the translation through the airlock on the first and second attempts and the 180° turn within the middle of the task.
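
A hedged sketch of the head-clearance estimate, assuming nose-down pitch is negative (per the convention above) and head height is the forehead-marker height above the floor:

import numpy as np

def head_clearance(head_height, head_pitch):
    # head_height: forehead-marker height above the floor (m) during the traverse
    # head_pitch: head pitch time series (deg); nose-down assumed negative
    i = np.argmin(head_pitch)  # instant of the most prominent downward head pitch
    return head_height[i]      # head-to-ground distance at that instant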

For the OWT, foot clearance (assessed by step height) and step height variability (assessed by standard deviation) were calculated and averaged across the feet for each trial. The enclosed walking area was measured using a conforming boundary function around the 2-D planar positions from the toe markers with a shrink factor of 0.9, providing a compact boundary that envelops the data. The enclosed walking area characterized the overall space used by the subjects, while similar analyses were conducted to capture the inner foot boundaries to understand the area used around each obstacle. A larger enclosed walking area indicates further distance from the task obstacles, as does a larger inner boundary area.
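
The conforming boundary with a shrink factor of 0.9 corresponds to a MATLAB-style boundary function; as a simplified stand-in (not the authors’ implementation), a convex hull approximates the enclosed walking area, though it will overestimate the area for concave paths such as the figure-eight:

from scipy.spatial import ConvexHull

def enclosed_area(toe_xy):
    # toe_xy: (n, 2) planar toe-marker positions across a trial
    hull = ConvexHull(toe_xy)
    return hull.volume  # for 2-D input, ConvexHull.volume is the enclosed area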

Statistical Analysis

Performance measures and survey responses were assessed using paired t-tests to compare means for the AR and physical modalities. The level of significance for these pairwise comparisons was 0.05. The effect size was calculated using Cohen’s method. Data from two subjects were excluded from the analysis of the OWT for measures that relied on motion capture markers from the feet (i.e., step height, step height variability, and enclosed walking area), as one toe marker was occluded for most of the task. The inner boundary area measures for the OWT were unaffected, as the inner foot marker was consistently present across subjects.
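
A minimal sketch of the pairwise comparison; the specific Cohen’s d variant is not stated, so the version below uses the standard deviation of the paired differences as one common convention:

import numpy as np
from scipy.stats import ttest_rel

def compare_modalities(ar_values, physical_values):
    # ar_values, physical_values: paired per-subject measures for the two modalities
    t_stat, p_value = ttest_rel(ar_values, physical_values)
    diff = np.asarray(ar_values) - np.asarray(physical_values)
    d = np.mean(diff) / np.std(diff, ddof=1)  # paired-samples Cohen's d (assumed variant)
    return t_stat, p_value, d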

RESULTS

Table I and Table II present the compiled results and associated P-values for the IET and the OWT. Fig. 2, Fig. 3, and Fig. 4 present subject-specific data, the median, and interquartile ranges.

Table I. Task-Specific Measures of Performance and Related P-Values and Cohen’s Effect Size (d).
Table II. IMU-Based Measures of Performance and Related P-Values and Cohen’s Effect Size (d).

Fig. 2. IET measures included total task time, head distance from the floor, second leading leg height, RMS of downward head pitch, and RMS of head and torso angular velocity. IET: Ingress and Egress Task; RMS: root mean square.

Fig. 3. OWT measures included total task time, average distance to the first obstacle, step height variability, RMS of the head pitch angle, and RMS of the head and torso angular velocity. OWT: Obstacle Weave Task; RMS: root mean square.

Fig. 4. NASA-TLX measures for the IET and OWT. Top: perceived physical demand (P < 0.001, d = 0.66), mental demand (P = 0.001, d = 0.71), and performance (P < 0.001, d = 1.29) for IET. Bottom: perceived physical demand (P = 0.099, d = 0.42), mental demand (P = 0.023, d = 0.72), and performance (P = 0.034, d = 0.63) for OWT. IET: Ingress and Egress Task; OWT: Obstacle Weave Task.

The IET yielded a significant difference in the total task time between AR and physical conditions, with increased task duration while using AR (Table I and Fig. 2). The IET segments (first and second attempts through the portal and the turn) exhibited increased duration for AR compared to the physical condition. While there was no significant difference between the step heights during the first time through the airlock, the step heights were higher during AR with the second leading leg and second following leg during the return traverse (Fig. 2). During both the first and second traverses, a significantly lower head height was measured from the floor for AR (Table I, Fig. 2), with a significantly greater RMS head pitch angle (Table II). AR resulted in significantly lower RMS of the head and torso angular velocity (Fig. 2). No significant differences were found between conditions for the RMS of the head acceleration, RMS of the torso acceleration, the magnitude of the acceleration and angular velocity signals for the head and torso, or the roll and yaw angles of the head and torso.

Subjects completed the OWT more slowly in the AR condition than in the physical condition (Table I). In the physical condition, subjects stepped higher and with greater variability when performing the task. The results yielded no significant differences between modalities for either the combined or the separate inner boundary areas.

For the IMU-derived measures (Table II), subjects in the AR condition had significantly lower RMS of head acceleration, head angular velocity, torso acceleration, and torso angular velocity, as well as lower magnitudes of the head acceleration and torso angular velocity (Fig. 3). No significant differences were observed for the magnitudes of the head angular velocity or torso acceleration.

The RMS of the head pitch and roll angles were significantly greater while using AR (Fig. 3). In contrast, the RMS of the head yaw angle was significantly reduced within AR. Subjects exhibited higher magnitudes for the RMS of the torso roll angle while performing the task within AR. The results yielded no significant difference in the RMS of the torso pitch or yaw angles.

Subjects reported higher perceived mental demands while using AR to perform the IET and OWT (Fig. 4). The perceived task performance was rated higher when the tasks were completed in the physical condition. While the perceived physical demand, effort [t(19) = −3.53, P = 0.002, d = 0.83], and frustration [t(19) = −2.80, P = 0.012, d = 0.39] were rated higher in AR for the IET, there were no differences in physical demand, effort, or frustration observed for the OWT.

DISCUSSION

In the present study, healthy adults performed two operationally informed functional assessment tasks within an AR environment and while using physical objects. The selected assessments comprised an IET and an OWT. Due to the limited field of view of the headset, it was hypothesized there would be differences in: 1) task completion time, 2) step characteristics, and 3) head and torso kinematics as measured by accelerations, angular velocities, and orientations.

The results affirm our first hypothesis that subject performance would reveal differences in task completion times. The use of the AR environment led to prolonged completion times for both assessments. According to postflight research, crewmembers have altered locomotor function after long-duration space travel (163–195 d on the ISS), requiring a median of 48% longer time to complete the entire FMT course 1 d postflight relative to preflight.20 Using analogous portions of the FMT, we found that the use of AR increased the IET time by 39% and OWT time by 12%. For the IET, all aspects of the functional movement, such as the initial step-through, 180° turn, and second step-through, were performed slower while using AR, contributing to the overall extended task duration. AR shifted the temporal dimensions of the task in the same direction as microgravity-induced locomotor changes, suggesting that AR use may increase task time further when a user is also experiencing vestibular disorientation. An increase in the time required to complete a discrete task successfully is not inherently dangerous or unfavorable. The increased time is likely to be beneficial to someone experiencing motion sickness. However, to inform decision-making regarding EVA operations, a new threshold of performance is required to incorporate the longer baseline completion times observed in AR. In this study, the effects of continued use of AR on completion time were not examined; however, the times observed may serve as preliminary thresholds for healthy users. The development of thresholds for users with vestibular alterations, such as astronauts, requires further research.

The second hypothesis regarding step characteristics was supported. Subjects exhibited greater caution during the IET in the AR environment by lifting their feet higher to avoid the bottom portion of the portal during the egress attempt following the 180° turn. While all subjects maintained proper clearance of their feet with respect to the step of the airlock, one subject narrowly cleared the holographic step (0.02 m) with their following leg, which may suggest that their perception of the object’s height was diminished and would have increased the fall or tripping risk had the object been real. For the OWT, no differences were found in the enclosed walking area or the inner boundary area measures, suggesting that subjects maintained consistent distances from the obstacles across both modalities. The consistency in how subjects positioned their feet across modalities could be attributed to the limited task space and the need to circumnavigate the obstacles. Subjects took lower steps during the OWT with respect to the ground and were less variable in their step heights within AR, suggesting a more cautious or deliberate approach for the OWT. The perception of task performance supports these findings, with higher perceived success within the physical condition and more mental demand within AR for both tasks. Additional investigations could explore alternative metrics, such as the average distance of the user’s feet to the center of the obstacles; however, it is critical to ensure an accurate representation of the foot locations in the holographic reference frame to compare obstacle locations for distance-based analyses in AR. Existing AR systems lack the capability to track foot movements, necessitating the incorporation of foot-tracking features into future AR systems to support step-characteristic analyses.

The third hypothesis was supported by differences in head and torso kinematics. With reduced peripheral vision of the holograms in AR, subjects increased head pitch for both the IET and OWT. In addition to increased head pitch for the OWT, subjects increased their head and torso roll angles to support consistent visualization of the obstacles while navigating in the figure-eight pattern. For the IET, subjects lowered their heads further below the top portion of the airlock and adopted a greater downward head pitch. In contrast, torso posture remained consistent across both modalities. All subjects could duck their heads low enough to clear the top of the holographic portal, although the perceptions of task performance were discordant with the direct measures analyzed. This discrepancy suggests that subjects believed they had not sufficiently lowered their heads, potentially due to a constrained perception of the entire portal structure. Within the FMT, although most crewmembers have exhibited an ability to avoid the obstacles completely, a subset failed to avoid contacting the foam obstacles in testing sessions 1 d following their return to Earth.20 While foam obstacles are generally safe, AR may provide a contactless and safe approach to assessing and training obstacle avoidance during periods of significant vestibular dysfunction both in flight and postflight on Earth.

The acceleration and angular velocity data also supported our third hypothesis regarding differences in head and torso motions, with decreased head and torso angular velocity for both tasks and decreased torso accelerations for the OWT compared to the physical conditions. A modulation of head and torso angular motion may have been used to stabilize holographic images within the users’ view. In the physical condition for the OWT, head and torso accelerations and angular velocities were higher, potentially indicating greater confidence in performing the task and perceptions of object locations relative to their walking trajectories. The results suggest that subjects were more deliberate in their movements while interacting with holographic objects, despite there being no risk of collisions or penalties. In operations, the reduction of motion sickness symptoms through reduced body movements in astronauts is likely to take precedence over holographic collisions. Therefore, metrics of successful task completion, such as object collisions, should be incorporated to track incidents of error, especially if the goal is to determine EVA readiness where task precision is desired for safety.

The constrained field of view of the headset, which required adjustments in head posture to keep objects in view, played a role in users’ selected strategies. The reduced head and torso angular velocities observed in this study are similar to the crew performance observed while turning around a cone, which was three times lower on landing day than preflight.6 AR elicited strategies similar to the reduced postflight angular rates attributed to motion sickness and vestibular impairments.6 Therefore, performing tasks in AR would not be expected to exacerbate these symptoms. In operational terms, when astronauts conduct extravehicular activities in spacesuits, they must contend with a reduced visual field of view and maintain a mental map of obstacles and objects for safe navigation. The aspect of AR promoting more deliberate motion could prove beneficial, in addition to other advantages of the technology that observes spaceflight resource constraints.

Although the decrease in accelerations and angular velocities resembled those of astronauts in deconditioned states, the adjustment in head pitch differed from crew performance, which could have either positive or negative implications. When astronauts transition between different gravity environments, increased head pitch can pose a challenge for reducing motion sickness symptoms. Postflight research demonstrates astronauts primarily reduce head pitch during locomotion, a strategy also used by patients with vestibular deficits.2 Ultimately, minimizing head movements serves as a rapid short-term behavioral countermeasure to remedy motion sickness12; however, crewmembers are encouraged to administer progressively larger self-paced head movements to facilitate readaptation to Earth’s gravity.27 The operational tasks in AR may require a larger head pitch due to the field of vision, which may encourage a controlled approach to promoting more head movements to assist in adaptation or generate increased symptoms of motion sickness. Crewmembers are advised to remain within their personal threshold of motion tolerance after landing to minimize motion sickness symptoms because performing head tilts too rapidly or with too much amplitude may aggravate symptoms.27 Ultimately, user performance under vestibular and sensorimotor disorientation will differ and future research should examine how AR will impact these situations.

While the study explored several motion-based and time-based measures for a sample of 20 healthy subjects, future studies should examine these findings across larger population samples and sensorimotor impaired populations such as recently returned astronauts. Although the NASA-TLX is widely implemented and highly regarded for its sensitivity to various levels of workload and operator acceptance,15 incorporating additional performance or physiological measures could facilitate the assessment of workload based on observable behaviors. The obstacles chosen for the OWT were more representative of the cone dimensions used within other postflight assessments, such as the walk-and-turn,6 rather than the dimensions of foam pillars used within the FMT.20 Considering the AR headset’s field of view constraints, shorter obstacles were used to allow users to see the entire object when directing their gaze. However, future investigations could consider the impact of obstacles that exceed the user’s rendered field of view within AR, thereby potentially requiring more head movements to view the object. While this study was part of a larger protocol that evaluated six different dynamic assessments and functional tasks, additional evaluations of other relevant operational tasks to assess EVA readiness are needed to understand the potential benefit of using AR as a standalone assessment tool within spaceflight operations.

The current study was able to successfully use head measures to assess performance, a capability that could be extracted from the embedded IMU sensor of the AR headset to enable self-administered assessments with real-time data feedback to inform readiness. Nevertheless, additional body-worn sensors would need to be integrated to characterize torso and lower body performance. Additional work may consider AR and wearable sensors integrated concurrently with deep-learning algorithms and artificial intelligence to support real-time analysis of performance, body postures, and fall detection.3,17,30 While no instances of collisions with task objects were observed in this study, potential future applications could introduce an audio alert if a user’s head contacts the top of the holographic airlock, particularly if this alert is deemed valuable for instructing crewmembers to adopt postures that optimize egress performance. The AR application could serve as a potential countermeasure to support the rehabilitation and readaptation of the crewmembers by encouraging gradually more extreme body postures that they will eventually need to perform in more difficult circumstances, such as while wearing a spacesuit.

This research evaluated the effect of holographically administered operationally informed functional tasks on subject performance in an IET and an OWT. The performance of 20 healthy adults was compared within AR and physical object environments. As measured by IMU sensors and motion capture software, subjects performed both tasks successfully; however, task completion strategies differed across modalities. Task completion time was longer, and the head and torso angular velocities were reduced for both assessment tasks when administered in AR. More extreme head pitch was observed across the tasks as well as increased average head roll orientation during the OWT. Subjects were more deliberate and careful with the task execution by stepping higher and lowering their heads further for the IET. The subjects were able to complete both tasks in AR and meaningful measures of postural control were obtained, indicating that although different baseline performance is obtained, AR may be a useful instrumentation solution with embedded sensors to evaluate a variety of populations for postural control. The differences observed support the need for implementation-specific baselines. Subjects adopted strategies similar to sensorimotor impaired crewmembers, with increased task time and reduced head and torso angular velocities. Contrastingly, subjects exhibited increased head pitch angles, which is typically reduced in astronauts postflight. From these analyses, AR may be a useful instrumentation solution for evaluating in-flight performance with embedded sensors and could be part of a countermeasure toolset.

Copyright: Reprint and copyright © by the Aerospace Medical Association, Alexandria, VA. 2024

Contributor Notes

Address correspondence to: Hannah Weiss, Industrial and Operations Engineering, University of Michigan Rackham Graduate School, 1205 Beal Ave., Ann Arbor, MI 48109, United States; hanweiss@umich.edu.
Received: 01 Nov 2023
Accepted: 01 Jun 2024