EMOTE PROJECT 2017
Interdisciplinary collaboration between Media Art Nexus, the NTU School of Computer Science and Engineering (SCSE), and Psychology at the NTU School of Social Sciences (SSS).
Emote, real-time audio-reactive animation, Derivative TouchDesigner, EEG, duration 50:00
Emote is an ongoing series of real-time experimental animations. EEG was used to record and analyze brain signals to recognize emotional patterns, which were then used to generate 20 animated chapters in real time in response to non-lyrical music clips of known emotional classes (happy, sad, exciting, scary, etc.). The soundtracks were drawn primarily from Japanese television dramas, anime, and films, and were tested for the emotional responses they elicited. The same sound was then used to create the animated paintings. A study is being conducted to determine whether the visuals created in response to the music elicit the same emotions.
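The project does not detail its EEG analysis here; as a rough illustration only, emotion-recognition pipelines commonly begin with per-band spectral power features. The sketch below (an assumption for illustration, not the project's actual code) extracts theta/alpha/beta band power from a single-channel signal using NumPy:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def eeg_band_features(signal, fs=256):
    """Per-band power features -- a common input to emotion classifiers."""
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return {name: band_power(signal, fs, lo, hi) for name, (lo, hi) in bands.items()}

# Synthetic 10-second "recording" dominated by a 10 Hz (alpha-band) oscillation.
fs = 256
t = np.arange(0, 10, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
features = eeg_band_features(signal, fs)
```

A real pipeline would of course use multi-channel EEG, artifact rejection, and a trained classifier mapping such features to emotion labels.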
For an overview of all of the animated chapters, please visit the Emote Portfolio on Vimeo:
Emote Portfolio – 01_Emerald on Vimeo (vimeopro.com)
Two sets of visuals were completed for this project using Derivative TouchDesigner's audio-reactive toolsets. The final collection of videos is numbered 1–20 and grouped into four sets corresponding to the emotions Happy, Exciting, Frightening, and Melancholy. In testing, a graphics and music clip matching the recognized emotion is selected and played to the subject, and their emotional neurofeedback is recorded. The resulting data is then classified and analyzed. The future goal of these studies is to create public artworks that induce emotions supporting the subject's cognitive functions, such as learning, concentration, and relaxation.
Emote experimental animation, 50 min, 15 m × 2 m LED, Nanyang Technological University, Singapore
In this work, the language of music is a means of learning about emotions in the brain and serves as an external stimulus responding to the brain's emotional state.
Project Background
Emotion Study was part of earlier research done with the Wee Kim Wee School of Communication and Information, where we created an online questionnaire loaded with images designed to produce the feelings SAD, HAPPY, ANGRY, DISGUST, and MELANCHOLY. About 50 people took the test. The images were abstract paintings, photographs, and computer-generated imagery. We used the results of the test to select imagery that corresponds to people's evaluations and to our particular vision for the abstract film Emotion Study.
Chavez, Mark, and Yun-Ke Chang. "Cinematics and Narratives: Movie Authoring & Design Focused Interaction." Leonardo Electronic Almanac, published 2013-07-15. http://journals.gold.ac.uk/index.php/lea/article/view/87
Chang, Yun-Ke, Mark J. Chavez, Miguel A. Morales-Arroyo, and Jaime Jimenez-Guzman. "An Active Cinema Experience: A Study on User Expectation and Perceived Gratifications of a Real-time Animated Film System." 2012 Ninth International Conference on Information Technology – New Generations (2012): 674–679. https://www.semanticscholar.org/paper/An-Active-Cinema-Experience%3A-A-Study-on-User-and-of-Chang-Chavez/bbf305de00aa2fd98a5c8a293aeb079ee9365e4b