Audio cues enhance mirroring of arm motion when visual cues are scarce

Edward D. Lee, Edward Esposito, Itai Cohen

Developers: Edward D. Lee, Saerom Choi

Publication forthcoming in the Journal of the Royal Society Interface.

This code base is for our project on the synchronization of human motion using Noitom's Perception Neuron suit and the UE4 game engine in a virtual reality environment. It provides the backend for running the experiment and for the analysis.

The raw motion data are available here, organized by subject and by experiment version. Experiment version 3_3 corresponds to "Visual Only," 3_4 to "Audio," 3_5 to "Audio and Train," and 3_6 to "Train." The animations used for mirroring are in the Animations folder.
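
For convenience, the mapping from experiment version codes to condition names can be expressed as a small lookup. The sketch below is illustrative only; the helper name and any file layout it implies are assumptions, not part of the repository's documented interface.

```python
# Mapping from experiment version code to condition name,
# taken directly from the description above.
CONDITION_BY_VERSION = {
    '3_3': 'Visual Only',
    '3_4': 'Audio',
    '3_5': 'Audio and Train',
    '3_6': 'Train',
}

def condition_for(version_code):
    """Return the human-readable condition for an experiment version code."""
    return CONDITION_BY_VERSION[version_code]

# Example: condition_for('3_4') -> 'Audio'
```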

For any questions about the data, contact EDL at edl56@cornell.edu.
