We present a novel brain-computer interface integrated with a musical instrument that adapts implicitly to users' changing cognitive state during musical improvisation. We report three outcomes: 1) Using machine-learning classification, we demonstrate that cognitive workload can be measured in pianists. 2) We build a real-time, implicit system that uses this brain signal to adapt musically to users. 3) We demonstrate that users prefer this adaptive instrument to the control conditions.
As virtual reality makes its way into mainstream computing and the consumer market, it is important that these solutions support a balanced adoption rate for both men and women. Differences in the proprioceptive system and other areas of the brain produce sex-based differences in how men and women perceive 3D virtual space, which need to be accounted for in VR development.
What happens when a new form of technology introduces an unanticipated problem? This presentation reflects on the social acceptability of wearable and mobile technology in order to better understand what makes people accept a device and what makes them reject it. Specific attention is given to the stigma of assistive devices and how emotional design, informed by the fashion and apparel industries, can support wearable technology adoption.