Suggestion on how to overcome Windows app "exclusive lock" issue


Hey! As we all know, using the Instrument 1 on Windows is a little frustrating: the software can't be open at the same time as your DAW or other software, which makes tweaking and adjusting the instrument while working really cumbersome. I understand this is probably due to some frustrating limitations in the Windows OS, but I have an idea for a way to work around that limitation that the app developers might want to consider!

When the Windows app is open, it requires exclusive access to the Artiphon Instrument 1, which prevents a DAW from getting MIDI data from the instrument. I'll take this as a given. However, nothing prevents the DAW from getting MIDI data from the app itself.

Basically, while the Artiphon Instrument 1 editor is open, it could initialize a software MIDI device. While you play the Instrument 1 with the app open, the app can simply relay all MIDI data from the instrument out to this software device, which your DAW or other music software can then access.
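The relay itself would be trivial. Here's a rough sketch in Python, just to illustrate the idea; the queue-based ports and tuple message format are hypothetical stand-ins for whatever MIDI API the app actually uses:

```python
import queue

def relay_midi(instrument_in: queue.Queue, virtual_out: queue.Queue) -> int:
    """Forward every pending MIDI message from the instrument's input
    port to the virtual output port, unchanged. Returns the number of
    messages relayed. (Hypothetical sketch: a real app would run this
    in a loop or callback driven by the MIDI driver.)"""
    relayed = 0
    while True:
        try:
            # a raw MIDI message, e.g. (0x90, 60, 100) for note-on
            msg = instrument_in.get_nowait()
        except queue.Empty:
            break
        virtual_out.put(msg)  # the DAW reads from this virtual port
        relayed += 1
    return relayed
```

The app keeps its exclusive lock on the hardware; the DAW only ever sees the virtual port.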

This would allow us Windows users to keep the software open and make tweaks on the fly, including saving presets and all that good stuff, because the DAW is completely isolated from the instrument!


Hi Rusty,

While this is a great approach, Windows does not have a built-in facility for apps to create virtual MIDI ports. But, along the same lines, if the app could be configured to relay the incoming MIDI to an arbitrary MIDI port, there are third-party virtual MIDI port drivers that could be used to route from the app to the DAW. Hopefully Artiphon will consider this. Of course, this would also incur some extra latency, as the MIDI messages pass through the hardware MIDI driver, into the app, back through the virtual MIDI driver, and up into the DAW (and vice versa for sending from the DAW).


I have definitely, on a number of occasions over the years, going all the way back to Windows 98, used software MIDI controllers. I'm pretty certain that it is possible.

From a latency standpoint, I'm also certain there would be zero concern. Relaying audio can induce latency because audio data needs to be buffered and transmitted in chunks; it often needs to be processed as well, and that also has to happen on chunks of data at a time, so it's difficult to just stream audio sample-by-sample. MIDI data requires no complicated processing, and the individual packets are very, very tiny. There would be no more latency induced than, for instance, using the virtual keyboard in your DAW. The latency of translating a mouse click into a MIDI signal should be roughly equivalent to the latency of translating a MIDI signal into another MIDI signal.
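For a back-of-envelope sense of scale: even the slowest link in any MIDI chain, a classic 5-pin DIN cable, takes only about a millisecond to clock out one note-on, and shuttling the same three bytes between processes in memory is far faster than that:

```python
# How long does one note-on take on the wire at classic DIN MIDI speed?
BAUD = 31250          # DIN MIDI serial rate, bits per second
BITS_PER_BYTE = 10    # 8 data bits + start bit + stop bit on the wire
NOTE_ON_BYTES = 3     # status byte, note number, velocity

wire_time_ms = NOTE_ON_BYTES * BITS_PER_BYTE / BAUD * 1000
print(f"{wire_time_ms:.2f} ms per note-on")  # 0.96 ms per note-on
```

Any overhead a software relay adds sits well below that, and far below typical audio buffer latencies of several milliseconds.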


You can definitely create virtual MIDI drivers, but they are drivers, not apps. Artiphon could do this, but it is more work than leveraging an existing system (like loopMIDI or MIDI Yoke).

The latency may prove to be negligible, but it needs to be considered in this type of implementation. It's not the same as a virtual keyboard in a DAW. A virtual keyboard within a DAW communicates directly with the DAW, without having to traverse the layers I've described; it's a GUI component which takes mouse events and sends corresponding MIDI events.


I think Microsoft actually DID provide a multi-client MIDI API. It's just that the API is exposed through UWP only, which would require every app that hooks into the MIDI instrument to be updated to use UWP. Needless to say, the chance of that happening is next to zero.

It's funny that when Apple releases something new that requires a rewrite, every developer wants to be the first one to adopt the new thing (hell, Apple has actually changed CPU architectures twice, and the third time is coming). But when Microsoft releases something new, everyone just wants the API to remain the same so that they don't need to do anything new.