
Music & Audio Technology/Machine Learning

AI-based Real-Time Control of Electric Guitar Effects through Facial Expression and Gesture Detection


Demo Video


π˜•π˜°π˜Έ 𝘺𝘰𝘢 𝘀𝘒𝘯 𝘀𝘰𝘯𝘡𝘳𝘰𝘭 𝘺𝘰𝘢𝘳 𝘨𝘢π˜ͺ𝘡𝘒𝘳 𝘦𝘧𝘧𝘦𝘀𝘡𝘴 𝘸π˜ͺ𝘡𝘩 𝘺𝘰𝘢𝘳 "𝘨𝘢π˜ͺ𝘡𝘒𝘳 𝘧𝘒𝘀𝘦𝘴"!

I discovered that facial expression and gesture detection can provide more expressive, intuitive (and fun) control of electric guitar effects. I believe this is because facial expressions and gestures have a stronger emotional connection to musical performance than pressing foot pedals does.

Facial expression has been a relatively unexplored input for sound control, since it is difficult to capture with traditional sensors. Thanks to convenient AI computer vision tools, however, you can now easily detect facial expressions using just a laptop webcam or a mobile phone camera. I firmly believe that facial expressions can provide a strong, emotionally grounded mapping for a more expressive playing style.


How does it work?

FaceOSC => Wekinator => ChucK


Face detection: FaceOSC (https://github.com/kylemcdonald/ofxFaceTracker/releases)

Neural network (face-tracking data -> guitar effect parameters): Wekinator (http://www.wekinator.org/)

Audio signal processing: ChucK (https://chuck.stanford.edu/)
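As a sketch of the receiving end of this chain: Wekinator by default sends its model outputs as floats to OSC address /wek/outputs on port 12000, which ChucK can listen for and map onto an effect parameter. The delay-based effect chain and the 0-1 to delay-time mapping below are my own illustrative assumptions, not the exact patch from the demo:

```
// listen for Wekinator's default output: port 12000, address /wek/outputs
OscIn oin;
OscMsg msg;
12000 => oin.port;
oin.addAddress("/wek/outputs, f");

// assumed effect chain: guitar in -> delay -> out
adc => Delay d => dac;
0.5::second => d.max;

while (true)
{
    oin => now;                       // wait for an incoming OSC message
    while (oin.recv(msg))
    {
        msg.getFloat(0) => float v;   // Wekinator output, expected in [0, 1]
        v * 0.4::second => d.delay;   // map to delay time (illustrative mapping)
    }
}
```

The same pattern extends to any ChucK UGen parameter: train Wekinator with more outputs, add a float to the type tag, and read each one with msg.getFloat(i).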