Digital Flesh – Scar in Data


Digital Flesh – Scar in Data by Sojung Bahng and Mike Lukaszuk, Multimedia Performance, 2023


Digital Flesh – Scar in Data is a set of multimedia art projects that combines live cinema and experimental sound with interactive installation and AR. The project investigates how representations of our bodies and shared physical sensory information can be de/re-materialised through digital audio-visual media. We incorporate machine intelligence techniques such as computer vision into our live performance, alongside other digital art forms such as AR and interactive installation.



The improvisational performance combines live video and live-coded sound. We question the mediated imagery of the Korean female body and explore both indirect and direct encounters with transnational Korean culture.


During the performance, Sojung Bahng's body images, previously recorded or scanned in real time using a webcam, are answered by experimental sound played by Mike Lukaszuk, who incorporates sampled recordings of traditional Korean music and popular songs. The body images are detected and analysed as digital data and mapped to the musical parameters Lukaszuk creates.



Bahng wears and uses a traditional Korean costume (Hanbok) and props purchased from Amazon in Canada. She raises questions about cultural authenticity, appropriation and the complexities and confusions of the diasporic experience. Additionally, she incorporates elements of Korean shamanistic performance, using a Christmas bell and a fake Halloween knife.


Lukaszuk, who encounters "Korean-ish cultures" in everyday life through his Korean family members, represents his own complex and subtle experiences with Korean cultures through a live experimental sound performance.



Live Performance, Art & Media Lab, Isabel Bader Centre for the Performing Arts, Kingston, Canada, May 2023




Audio-visual performance in collaboration with machine intelligence (demo)


We are currently experimenting with incorporating computer vision and machine intelligence into our live performance. A webcam detects and analyzes body movements, shapes, and distinct objects, which are then mapped to sound parameters. The multi-layered, generative data extracted from our bodies serve as control settings for transforming recordings of bodily sound. Lukaszuk created the interactive system, and Bahng re-mediated it to collaborate with machine intelligence for audiovisual performance. This collaboration draws on the tensions between the capacity of computers to interpret the world by identifying body representations as objects and how we as humans perceive and represent our non-artificial embodied experience. We utilize the imperfection of computer automation and machine processes as a tool to emphasize the indescribable and transcendental nature of the body.
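The body-to-sound mapping described above can be sketched in broad strokes. The snippet below is an illustrative toy, not the performance system itself: `frame_motion`, `map_to_params`, and the parameter ranges are hypothetical stand-ins for the kind of vision-data-to-sound mapping the piece performs.

```python
# Illustrative sketch only: a toy version of mapping webcam motion data
# to sound-control parameters. Names and ranges are hypothetical.

def frame_motion(prev_frame, curr_frame):
    """Mean absolute brightness difference between two grayscale frames,
    each given as a flat list of 0-255 pixel values."""
    diffs = [abs(a - b) for a, b in zip(prev_frame, curr_frame)]
    return sum(diffs) / len(diffs)

def map_to_params(motion, max_motion=255.0):
    """Scale a motion value into musical control parameters."""
    norm = min(motion / max_motion, 1.0)  # clamp to [0, 1]
    return {
        "filter_cutoff_hz": 200.0 + norm * (8000.0 - 200.0),  # 200 Hz .. 8 kHz
        "amplitude": 0.2 + norm * 0.8,                        # 0.2 .. 1.0
    }

# A still body yields low-energy settings; a fast gesture opens the filter.
still = map_to_params(frame_motion([10, 10, 10], [10, 10, 10]))
gesture = map_to_params(frame_motion([0, 0, 0], [200, 180, 220]))
```

In the actual performance the analysis is richer (shape and object detection, not just motion energy), but the principle is the same: continuous streams of body data are normalised and routed to synthesis parameters.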




Live Performance (using computer vision), Seoul Artists’ Platform_New&Young (SAPY), Seoul, Korea, July 2023




Prints created with assistance from Hoopla Press & Gallery


Bahng extracted static images from the live video and created the body collage using a drypoint printmaking process. She drew her body collages on a plate with a sharp, pointed needle-like instrument and re-materialised (re-physicalised) the digitalised body images through physical printmaking activity.



MOM (몸) By Sojung Bahng, Drypoint, Printmaking, 2023




Augmented Digital Flesh by Sojung Bahng and Mike Lukaszuk, AR, 2023


Using the printmaking images, Bahng created an Augmented Reality (AR) piece to augment the re-materialized body with a digital collage video. The AR work demonstrates the iterative exchange between material and immaterial bodily representation and embodiment.



Multimedia AR Installation, Seoul Artists’ Platform_New&Young (SAPY), Seoul, Korea, July 2023





Digital Flesh Suffocation by Mike Lukaszuk and Sojung Bahng


Suffocation is an improvisational piece stemming from the authors’ involvement in a mixed-media ensemble and a desire to explore spatiality through an approach blending our respective practices. For the video content, a 360-degree camera was placed inside a small container holding an assortment of tiny objects, such as pins, tissues, pills, and leaves, into which liquids were then poured. These everyday objects are defamiliarized and de/re-materialised through the illusion of scale, provoking a paradoxical physical sensation between unpleasant bodily sentiments and familiar feelings. The audio content moves through processes of defamiliarization and reimagination of conventional instrumental sounds within a digital context: commonplace instruments such as pianos and kalimbas are sampled, alongside computer-generated emulations of these sources that can be bent and distorted by manipulating sound-synthesis parameters.



In addition to a quadraphonic (4.0-channel) audio playback system, highly directional parametric speakers were positioned in the gallery space to synchronize with the presentation of height and depth in the video content.


The setup for the project uses pitch, roll, yaw, accelerometer, and keypad-button OSC data transmitted from composer/programmer Kevin Schlei’s smartphone app GyrOSC to drive a Max/MSP patch. The patch acts as a link between electroacoustic sampling and sound synthesis in the digital audio workstation Reaper on one side, and video playback and manipulation in an Isadora patch on the other. Max/MSP sends MIDI control-change and note messages that are mapped to spatial parameters of the projected video, such as scaling and rotation, while simultaneously triggering recorded audio samples, musical pitches for virtual synthesizer plugins, and audio panning positions. This simultaneous translation of OSC to MIDI data, mapped to Isadora for video and Reaper for audio, is what allows the smartphone to act as a digital instrument for improvisation.
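The core of the OSC-to-MIDI translation is a scaling step from continuous sensor values to 7-bit MIDI data. The sketch below is a minimal illustration, not Lukaszuk's Max/MSP patch: it assumes orientation values arrive in radians, and the CC numbers and axis-to-target mapping are invented for the example.

```python
import math

# Minimal sketch of the OSC-to-MIDI scaling step (not the actual patch).
# The CC numbers and control targets below are illustrative assumptions.

def osc_to_cc(value, lo, hi):
    """Scale an OSC float in [lo, hi] to a 7-bit MIDI CC value in [0, 127]."""
    norm = max(0.0, min(1.0, (value - lo) / (hi - lo)))  # clamp to [0, 1]
    return round(norm * 127)

# Hypothetical routing: each orientation axis drives one control target.
MAPPING = {
    "pitch": (1, "video rotation"),  # CC 1
    "roll":  (2, "audio pan"),       # CC 2
    "yaw":   (3, "video scaling"),   # CC 3
}

def handle_orientation(axis, radians):
    """Turn one incoming orientation value into a (CC number, CC value) pair."""
    cc_number, target = MAPPING[axis]
    cc_value = osc_to_cc(radians, -math.pi, math.pi)
    return cc_number, cc_value, target  # would be emitted as a MIDI CC message
```

In the real setup, these messages are generated in Max/MSP and routed on to Isadora (video) and Reaper (audio); in Python, libraries such as python-osc and mido could supply the network and MIDI I/O that this sketch omits.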



Example of OSC-to-MIDI communication for sound spatialization in Max/MSP (developed by Mike Lukaszuk)



Multimedia Installation, Art & Media Lab, Isabel Bader Centre for the Performing Arts, Kingston, Canada, November 2022


Suffocation was presented at ISEA2023: 28th International Symposium on Electronic Art in Paris, France.
