JBE, vol. 26, no. 5, pp.533-542, September, 2021
A Study on the Method of Creating Realistic Content in Audience-participating Performances using Artificial Intelligence Sentiment Analysis Technology
Jihee Kim, Jinhee Oh, Myeungjin Kim, and Yangkyu Lim
Corresponding author e-mail: email@example.com
In this study, we propose a process for re-creating Jindo Buk Chum, one of the traditional Korean arts, as digital art using artificial intelligence technologies. The audience's emotional data, quantified through AI language-analysis technology, intervenes in the projection-mapping performance in the form of various visual objects and influences the performance without altering its overall narrative. Whereas most interactive art expresses communication between the performer and the video, this work is a new type of responsive performance in which the audience communicates directly with the piece, centered on AI sentiment-analysis technology. The concept originates in 'Chuimsae', a practice unique to Korean traditional art in which the audience directly or indirectly intervenes in and influences the performance. Emotional information contained in the performer's 'prologue' is combined with the audience's emotional information and converted into the images and particles used in the performance, so that the audience participates indirectly and changes the performance.
Keywords: Jindo Buk Chum, audience-participating, sentiment analysis, projection mapping
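The pipeline described in the abstract, quantifying audience emotion from text and converting it into visual particle parameters, can be sketched as follows. This is a minimal illustrative sketch only: the lexicon entries, scoring scheme, and particle-parameter ranges are assumptions for demonstration, not the authors' actual sentiment model or rendering system.

```python
# Hypothetical sketch: lexicon-based sentiment scoring of audience text,
# mapped to particle parameters for a projection-mapping visual.
# The lexicon and mappings below are illustrative assumptions.

# Tiny illustrative Korean emotion lexicon (word -> valence in [-1, 1]).
EMOTION_LEXICON = {
    "기쁨": 1.0,   # joy
    "평온": 0.5,   # calm
    "슬픔": -1.0,  # sadness
    "분노": -0.8,  # anger
}

def sentiment_score(tokens):
    """Average valence of the emotion words found in the tokens."""
    hits = [EMOTION_LEXICON[t] for t in tokens if t in EMOTION_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def particle_params(score):
    """Map a valence score in [-1, 1] to assumed particle parameters:
    more positive emotion -> more particles; stronger emotion -> faster."""
    count = int(100 + 400 * (score + 1) / 2)  # 100..500 particles
    speed = 0.5 + 1.5 * abs(score)            # 0.5..2.0 speed units
    return {"count": count, "speed": speed}

tokens = ["기쁨", "평온"]          # e.g. morphemes from an analyzer
score = sentiment_score(tokens)   # 0.75
params = particle_params(score)   # {'count': 450, 'speed': 1.625}
```

In the actual performance, token extraction would come from a Korean morphological analyzer such as KoNLPy rather than a pre-tokenized list.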