Demo Video
Now you can control your guitar effects with your "guitar faces"!
I discovered that facial expression and gesture detection can provide more expressive, intuitive (and fun) control for electric guitar effects. I believe this is because facial expressions and gestures have a stronger emotional connection to musical performance than pressing foot pedals does.
Facial expression has been a less explored area of sound control because it is difficult to detect with traditional sensors. Thanks to convenient AI computer-vision tools, however, you can now detect facial expressions easily using just a laptop webcam or a mobile phone camera. I firmly believe that facial expressions can provide a strong, emotionally grounded mapping for a more expressive playing style.
How does it work?
FaceOSC => Wekinator => ChucK
Face detection: FaceOSC (https://github.com/kylemcdonald/ofxFaceTracker/releases)
Neural network (face detection data -> guitar effects parameters): Wekinator (http://www.wekinator.org/)
Audio signal processing: ChucK (https://chuck.stanford.edu/)
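To make the middle step concrete, here is a minimal sketch of how Wekinator's continuous outputs (floats in the 0..1 range, sent over OSC as a `/wek/outputs` message) could be scaled onto effect parameters before they reach the audio engine. The parameter names and ranges below are illustrative assumptions, not the actual mappings used in the demo.

```python
# Hypothetical sketch: scaling Wekinator's normalized outputs onto
# guitar-effect parameters. Parameter names/ranges are assumptions.

def scale(x, lo, hi):
    """Linearly map a normalized Wekinator output x in [0, 1] to [lo, hi]."""
    x = min(max(x, 0.0), 1.0)  # clamp in case the model over/undershoots
    return lo + x * (hi - lo)

def map_outputs(outputs):
    """Convert a /wek/outputs float list into effect-parameter values."""
    return {
        "delay_ms":   scale(outputs[0], 0.0, 500.0),  # echo delay time
        "reverb_mix": scale(outputs[1], 0.0, 1.0),    # dry/wet balance
        "gain":       scale(outputs[2], 0.5, 4.0),    # distortion drive
    }

# Example: a trained model reacting to, say, raised eyebrows and an open
# mouth might emit [0.8, 0.3, 0.6]:
print(map_outputs([0.8, 0.3, 0.6]))
```

In the real setup, these scaled values would be applied inside ChucK (e.g. to a delay line's delay time or a reverb's mix) each time an OSC message arrives; clamping keeps the effects stable even when the neural network extrapolates outside its training range.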