Extension of Performativity by a BCI
Keywords:
BCI, EEG-based prediction of emotions, performativity, interactive theatre, participatory art, computer generation of music

Abstract
In the project inter@FAŢA, the creative team developed several algorithms for generating music from EEG signals monitored in real time. Among these music-generating algorithms, alongside other models, was an EEG alpha-spectrum analysis program based on the valence difference between sensors AF3 and AF4. Post-performance analysis shows that these valence changes (changes in the potential difference between the left and right hemispheres) correlate with the most intense moments of the performance and, respectively, with the most marked alternations of performance styles. The music and background sound generated from the EEG thus highlight a deep, stable structure of the performance, measurable and reproducible across a number of performances. The spectator's emotional participation is thereby lifted out of its shroud of invisibility and becomes an element of visible action, accessible to other spectators. This paper examines the 2014 stage of the project; at present, the project is being developed further according to the conclusions described herein.
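The paper does not give the exact formula used by the project's software; the sketch below, with assumed sampling rate and helper names, illustrates one widely used form of the frontal alpha-asymmetry valence index over the AF3 (left) and AF4 (right) sensors: log alpha-band power on the right minus log alpha-band power on the left.

```python
import numpy as np
from numpy.fft import rfft, rfftfreq

FS = 128  # Hz; assumed sampling rate (consumer EEG headsets commonly use 128 Hz)

def alpha_power(signal, fs=FS):
    """Mean power in the alpha band (8-13 Hz) via a simple periodogram."""
    spectrum = np.abs(rfft(signal)) ** 2 / len(signal)
    freqs = rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 13)
    return spectrum[band].mean()

def valence_index(af3, af4, fs=FS):
    """Frontal alpha-asymmetry index: log alpha power at AF4 (right)
    minus log alpha power at AF3 (left). Since higher alpha is taken to
    indicate lower cortical activation, a positive index suggests
    relatively greater left-frontal activation, conventionally read
    as more positive valence."""
    return np.log(alpha_power(af4, fs)) - np.log(alpha_power(af3, fs))

# Synthetic demo: a 2-second epoch with stronger 10 Hz alpha on AF3 than AF4
t = np.arange(FS * 2) / FS
rng = np.random.default_rng(0)
af3 = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
af4 = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
print(valence_index(af3, af4))  # negative: more alpha on the left channel
```

In a performance setting such an index would be recomputed over a sliding window of the streamed EEG, and its sign and magnitude mapped onto parameters of the music-generation algorithms.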
License
Copyright (c) 2015 Studia Universitatis Babeș-Bolyai Dramatica
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.