blacksignal 12 hours ago

Not just a ‘theremin’: that totally downplays the power of MIDI. Someone else mentioned the MiMu gloves, and I love the idea of a vision-based controller. Almost everyone has a phone, tablet, or laptop with a camera, especially if they're making music.

I also love that this could blur the line between playing music and dancing.

Great job OP, thanks for sharing.

  • whilenot-dev 3 hours ago

    The Leap Motion Controller came out in 2014 already (11 years ago, wow!) and isn't very expensive. The SDK was lacking in the beginning, if I recall correctly, but a plain webcam still seems inferior to it. Technology hasn't been the limiting factor for quite some time now. I'm sure many projects existed to translate gestures to MIDI, some less polished, some more polished[0][1].

    Reminds me... I even used two PlayStation Eyes (EUR 5 each) with OpenCV and the EVM algorithm[2] on a ThinkPad X230 for a dance performance piece back in 2015. Movements rather than gestures and OSC instead of MIDI, but it worked great!

    [0]: https://midipaw.com/

    [1]: https://uwyn.com/geco/

    [2]: https://people.csail.mit.edu/mrub/evm/
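    That movement-to-OSC idea can be sketched roughly like this. This is a hedged sketch, not the original setup: the real piece used OpenCV and the EVM algorithm, while this toy version just measures motion as the mean frame difference, and the `/motion` OSC address is a made-up example.

```python
# Hedged sketch of movement-to-OSC: motion is approximated as the mean
# absolute difference between two grayscale frames (the real project used
# OpenCV + Eulerian Video Magnification; this is a simpler stand-in).
import numpy as np

def motion_energy(prev_frame, cur_frame):
    """Mean absolute pixel difference between two grayscale frames, scaled to 0..1."""
    diff = np.abs(cur_frame.astype(np.float32) - prev_frame.astype(np.float32))
    return float(diff.mean() / 255.0)

# Example: a static frame vs. one with a simulated moving bright patch.
prev = np.zeros((120, 160), dtype=np.uint8)
cur = prev.copy()
cur[40:60, 60:100] = 255  # "movement" in one region

print(motion_energy(prev, prev))  # 0.0 -- no motion
print(motion_energy(prev, cur))   # > 0 -- motion detected

# Sending the value over OSC (requires the third-party python-osc package;
# "/motion" is a hypothetical address, uncomment to use):
# from pythonosc.udp_client import SimpleUDPClient
# SimpleUDPClient("127.0.0.1", 9000).send_message("/motion", motion_energy(prev, cur))
```

    In a real loop you'd grab consecutive webcam frames (e.g. `cv2.VideoCapture`) and send one OSC message per frame.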

vunderba 14 hours ago

From the article: "When using AirBending for pitch control, you can lock your gestures to specific musical scales and keys. This ensures every note you play is perfectly in tune with your composition"

Reminds me of the Moog Theremini - that was a fun bit of kit.

https://en.wikipedia.org/wiki/Theremini

bjelkeman-again 5 hours ago

That looks great!

Do you have any suggestions for how to hook this up to Logic, for anyone who hasn't used MIDI before?

I wonder how this would perform under live stage lighting conditions, i.e. strong coloured lights and high contrast.

  • bepitulaz 5 hours ago

    Open the app and open Logic Pro. Create a MIDI track in Logic and try waving at the app; the track should automatically receive MIDI messages from all channels and all MIDI devices.

    Then, if you want the track to receive only a specific MIDI channel from a specific device (for example, AirBending on channel 2), you can select it in the dropdown in the MIDI inspector section of that track.
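    For anyone new to MIDI, it may help to see what "channel 2" means on the wire. A hedged sketch (not AirBending's actual code): a Control Change message is three bytes, and the channel lives in the low nibble of the status byte, which is why a DAW can filter by channel at all.

```python
# Hedged sketch: raw bytes of a MIDI Control Change message, the kind of
# thing a gesture controller might emit for a continuous parameter.
def control_change(channel, controller, value):
    """Build a 3-byte MIDI CC message. Channels are 1-16 in UIs, 0-15 on the wire."""
    assert 1 <= channel <= 16 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | (channel - 1), controller, value])

msg = control_change(2, 1, 64)  # mod wheel (CC 1) on channel 2
print(msg.hex())  # b10140
```

    A DAW filtering "channel 2" is just matching that low nibble (0x1) of the status byte.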

drcongo 43 minutes ago

I would love this as an AUv3 for iPadOS, any plans?

Obscurity4340 11 hours ago

This instrument's timbre and tone are literally dream shit to me, so wavy and, I don't know, unearthly/otherworldly

abalaji 12 hours ago

Seems like the same software could be used as a soundtrack for Tai Chi exercises. Would be pretty neat.

  • bepitulaz 5 hours ago

    It seems possible, since Apple’s Vision framework can read body pose too. Maybe I can try it for the next update.

singularity2001 13 hours ago

Great work but wouldn't the iPhone with the lidar depth sensor be a better device?

  • bepitulaz 13 hours ago

    It’s in the plan to expand this app to iPhone. But I haven’t tried LiDAR, so I decided to release for macOS first.

    Also, with iPhone I have to think about how to transmit the MIDI data to a DAW on the laptop. Most likely via USB or network.

    • docEdub 12 hours ago

      Apple devices can do MIDI over Bluetooth. I've used this in the past to send Vision Pro hand-tracking data as MIDI.

    • import 12 hours ago

      I do this with the "Apple Lightning to USB Camera Adapter". The iPad basically talks to the MIDI sequencer via USB.

irvanputra 10 hours ago

I wonder if a Linux version will be available.

  • bepitulaz 5 hours ago

    Since I developed it using Apple's Vision framework, the current focus is still on Apple devices. So no Linux or Windows versions in the near future.

    • irvanputra 2 hours ago

      Thanks for the confirmation!

johng 15 hours ago

This is really cool, thanks for sharing.