Update: Myo production and sales have officially ended as of Oct 12, 2018.

Artificial Frequency is a software tool built on the MYO's raw sensor data and Bluetooth connection. Any analog sensor array set up as a BLE peripheral device can still work with Artificial Frequency.
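As a rough illustration of what "BLE peripheral" means here, the sketch below subscribes to notifications from a sensor characteristic using the Python `bleak` library. The device address and characteristic UUID are placeholders, not the MYO's actual values, and this is not the project's own code.

```python
# Minimal sketch: subscribe to raw sensor notifications from a BLE peripheral.
# DEVICE_ADDRESS and SENSOR_CHAR_UUID are placeholders for illustration only.
import asyncio
from bleak import BleakClient

DEVICE_ADDRESS = "00:11:22:33:44:55"                        # placeholder address
SENSOR_CHAR_UUID = "12345678-1234-5678-1234-56789abcdef0"   # placeholder characteristic

def handle_packet(sender, data: bytearray):
    # Each notification carries one frame of raw sensor values.
    print(list(data))

async def main():
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.start_notify(SENSOR_CHAR_UUID, handle_packet)
        await asyncio.sleep(30)          # stream for 30 seconds
        await client.stop_notify(SENSOR_CHAR_UUID)

asyncio.run(main())
```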

The concept of this tool is to use machine learning to classify or predict gestures and tangible interactions. The raw data from the MYO armband is well structured and continuous, and a pre-trained model is available from the community, which is why I chose it in the first place.

Since MYO has officially ended, I'm now working on an updated version with a modular system and an API for other sensors. Please stay tuned. (Dec 2018)

Artificial Frequency is an audio-visual performance tool that uses electrical activity in the forearm muscles and hand gestures to trigger and modulate generative visuals and sound. It combines the MYO armband, MAX/MSP, TouchDesigner, and machine learning to provide easy access to improvised audio-visual performance and real-time data visualization.
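One plausible way to wire that pipeline together is to forward each raw sensor frame over OSC, which both MAX/MSP (via [udpreceive]) and TouchDesigner (via an OSC In CHOP/DAT) can listen to. The sketch below uses the `python-osc` package; the ports, OSC address, and ~200 Hz rate are assumptions rather than the project's actual configuration.

```python
# Minimal sketch: forward 8-channel EMG frames to MAX/MSP and TouchDesigner over OSC.
import random
import time

from pythonosc.udp_client import SimpleUDPClient

max_client = SimpleUDPClient("127.0.0.1", 9000)   # MAX/MSP listening port (assumed)
td_client = SimpleUDPClient("127.0.0.1", 9001)    # TouchDesigner listening port (assumed)

while True:
    # Stand-in for one 8-channel EMG frame from the armband.
    emg_frame = [random.randint(-128, 127) for _ in range(8)]
    max_client.send_message("/myo/emg", emg_frame)
    td_client.send_message("/myo/emg", emg_frame)
    time.sleep(1 / 200)                            # roughly the MYO's 200 Hz EMG rate
```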

Exhibition Info

WhiteBox Gallery

As part of Liminal Instruments exhibition
Opening Reception: July 7, 2018 6:00pm
July 8 – July 14, 2018

New York Hall of Science

As part of Maker Faire New York 2018
September 22 - 23, 2018

The MYO armband was released by Thalmic Labs in 2014. Artificial Frequency takes the raw sensor data and live-streams it into MAX/MSP. The machine learning process provides a "gesture record" function that lets Artificial Frequency go beyond MYO's five default gestures and classify most forearm gestures, such as snapping and clapping.
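The sketch below shows one way a "gesture record" function could work: record a few seconds of labeled 8-channel EMG per gesture, reduce each short window to mean-absolute-value features, and fit a simple classifier. The windowing, feature choice, and k-NN model (scikit-learn) are assumptions for illustration, not the project's actual pipeline.

```python
# Hedged sketch of a "gesture record" workflow with assumed features and model.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

WINDOW = 40  # samples per window (~200 ms at an assumed 200 Hz)

def mav_features(window: np.ndarray) -> np.ndarray:
    """Mean absolute value per EMG channel: shape (8,)."""
    return np.mean(np.abs(window), axis=0)

def record_gesture(label: str, frames: np.ndarray, X: list, y: list) -> None:
    """Slice a recorded stream of shape (n_samples, 8) into labeled windows."""
    for start in range(0, len(frames) - WINDOW + 1, WINDOW):
        X.append(mav_features(frames[start:start + WINDOW]))
        y.append(label)

# Training: the performer records a short take per gesture (random stand-in data here).
X, y = [], []
record_gesture("snap", np.random.randint(-128, 128, (400, 8)), X, y)
record_gesture("clap", np.random.randint(-128, 128, (400, 8)), X, y)

clf = KNeighborsClassifier(n_neighbors=3).fit(np.array(X), y)

# Live use: classify an incoming window of raw EMG.
incoming = np.random.randint(-128, 128, (WINDOW, 8))
print(clf.predict([mav_features(incoming)])[0])
```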
