In virtual reality (VR) applications such as games, virtual training, and interactive neurorehabilitation, one can employ either the first-person or the third-person user perspective to perceive the virtual environment; however, applications rarely offer both perspectives for the same task. We used a targeted-reaching task in a large-scale virtual reality environment (N=30 healthy volunteers) to evaluate the effects of user perspective on head and upper-extremity movements and on user performance. We further evaluated how different cognitive challenges would modulate these effects. Finally, we obtained the user-reported engagement level under the different perspectives. We found that the first-person perspective resulted in larger head movements (3.52±1.3m) than the third-person perspective (2.41±0.7m). The first-person perspective also resulted in more upper-extremity movement (30.08±7.28m compared to 26.66±4.86m) and longer completion times (61.3±16.4s compared to 53±10.4s) for more challenging tasks such as the “flipped mode”, in which moving one arm causes the opposite virtual arm to move. We observed no significant effect of user perspective alone on the success rate. Subjects reported experiencing roughly the same level of engagement in both first-person and third-person perspectives (F(1,58)=0.9, P=.445). User perspective and its interaction with higher-cognitive-load tasks influence the extent of movement and user performance in a virtual theater environment, and may influence the choice of interface type (first or third person) in immersive training, depending on the user conditions and exercise requirements.
Juan Trelles Trabucco; Andrea Rottigni; Marco Cavallo; Daniel Bailey; James Patton; G. Elisabeta Marai. User perspective and higher cognitive task-loads influence movement and performance in immersive training environments. BMC Biomedical Engineering 2019, 1, 1-12.
Through the use of open data portals, cities, districts, and countries are increasingly making energy consumption data available. These data have the potential to inform both policymakers and local communities. At the same time, however, these datasets are large and complicated to analyze. We present the activity-centered design, from requirements to evaluation, of a web-based visual analysis tool for exploring energy consumption in Chicago. The resulting application integrates energy consumption data and census data, making it possible for both amateurs and experts to analyze disaggregated datasets at multiple levels of spatial aggregation and to compare temporal and spatial differences. An evaluation through case studies and qualitative feedback demonstrates that this visual analysis application successfully meets the goals of integrating large, disaggregated urban energy consumption datasets and of supporting analysis by both lay users and experts.
Juan Trelles Trabucco; Dongwoo Lee; Sybil Derrible; G. Elisabeta Marai. Visual Analysis of a Smart City’s Energy Consumption. Multimodal Technologies and Interaction 2019, 3, 30.