How's the script coming along, Wim?
Done and in first round of testing. So far no obvious show stoppers, but I only had a little time to mess with it. It should probably be ready to try no later than this time tomorrow.
I’ve no idea whether it’ll be what you’re looking for, but it seems useful to me at least...
Not that I know of. But all I did was search for MPE.
Yes, I tried the same and nothing came up. Great news about the script, very curious to check it out!
I realize now I should have posted in this thread about having uploaded the script yesterday. I posted in a different thread...
In response to what I think 🤔 I’ve understood from some of @Gavinski ’s ideas for adapting MPE controllers to non-MPE synths, I’ve posted MIDI MultiPlexer v0.9 Beta to PatchStorage.com. The description to this one is long, so I’ll only post the first few paragraphs here for those who would like to follow up over on patchstorage.
This one is very much beta-test. I’m not even very familiar with MPE, and only have a few basic ways to test. It seems useful to me though.
MPE MUX v0.9 (Beta) - Tools for adapting MPE controllers to non-MPE synths
Sorry for the long Description, but this one takes some serious explaining.
MPE synths have the intelligence to apply expression on a per-voice basis. The MPE protocol sends MIDI CC values on different channels, along with information about which note the expression applies to. Non-MPE synths don't have the ability to sort all this out, so one must turn off MPE mode to work with them. But then all that nice expression capability is lost.
This script can put some of that expression back to use by collapsing (Multiplexing) the multi-channel MIDI CC and pitch bend stream into a single channel that apps can use. No, they CAN'T use it per-voice, but at least they can receive a single stream of MIDI CC and/or pitch bend in response to movements on the controller's keys (or other control surfaces).
(Continues ...)
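To make the multiplexing idea concrete, here's a rough Python sketch of the kind of transformation described above. This isn't the Mozaic script itself, and the "most recent note wins" policy is just one plausible way to choose which voice's expression to keep; the real script may behave differently.

```python
# Conceptual sketch only (not the Mozaic script): collapse MPE per-note
# expression onto a single output channel. Messages are represented as
# hypothetical (kind, channel, data1, data2) tuples.

OUT_CHANNEL = 0          # channel 1 in 1-based MIDI terms
active_channel = None    # MPE member channel of the most recent note-on

def multiplex(msg):
    """Return a list of messages to forward, re-stamped onto OUT_CHANNEL."""
    global active_channel
    kind, ch, d1, d2 = msg

    if kind == "note_on" and d2 > 0:
        active_channel = ch                      # expression now follows this note
        return [("note_on", OUT_CHANNEL, d1, d2)]
    if kind == "note_off" or (kind == "note_on" and d2 == 0):
        return [("note_off", OUT_CHANNEL, d1, d2)]

    # Per-note expression (CC 74 "slide", pressure, pitch bend): forward it
    # only from the most recently touched note, all on one channel.
    if kind in ("cc", "channel_pressure", "pitch_bend"):
        if ch == active_channel:
            return [(kind, OUT_CHANNEL, d1, d2)]
        return []                                # drop expression from other voices

    return [(kind, OUT_CHANNEL, d1, d2)]         # anything else: pass through
```

A non-MPE synth then sees one stream of CC and pitch bend on a single channel, which, as noted above, it can only apply globally rather than per-note.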
@wim thanks so much for getting this done in the end! I'll check it out later when I have my controller and let you know how I get on with it.
@wim I had a stab at this last night but couldn't get it working as expected. I was in a bit of a hurry, so that is probably why. I'll reread the instructions more carefully and give it a go again later today. No matter what I did I couldn't get pitch bend working. Do I need to use the ROLI Dashboard app or something to change the channels my controller is using, or just keep my Seaboard on its normal MPE setting?
Without knowing what the seaboard is sending and what app you’re trying to send to, I can’t really give much advice. I don’t have a seaboard to test with but I can say for sure that pitch bend works with other MPE controllers I’ve tried.
One test would be to use a midi monitor to compare the seaboard output and the output from the app. Block should block PB, Pass Thru should not touch it, and MUX should take any MPE pitch bend and output it on a single channel.
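Roughly, the three options treat an incoming pitch bend like this (a Python sketch of the idea, not the actual Mozaic code; the message layout is just illustrative):

```python
# Hedged sketch of the three pitch bend options described above
# (Block / Pass Thru / MUX). Messages are hypothetical
# (kind, channel, lsb, msb) tuples.

def handle_pitch_bend(mode, msg, out_channel=0):
    kind, ch, lsb, msb = msg
    if mode == "Block":
        return []                                # swallow pitch bend entirely
    if mode == "Pass Thru":
        return [msg]                             # untouched, original channel
    if mode == "MUX":
        return [(kind, out_channel, lsb, msb)]   # re-stamped onto one channel
    raise ValueError(f"unknown mode: {mode}")
```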
Remember to route the Seaboard to Mozaic, and Mozaic to the synth. Don't route the Seaboard to the synth at all.
Sure, that's how I did it, I'll take a look again later, thnx
Hey Wim, what kind of effects did you try on this in your experimentation? I got this working, but the behaviour is different from what I was hoping for! It kind of works for some things, like adding reverb for example. But if I try it on a pitch-based effect I can't get it to do what I want, so I guess it truly is impossible.
First observation: I couldn't get pitch bend working at all. Eventually I tried varying the PB channel in Mozaic and found that it works only on channel 2. Maybe this is something to do with the way the Seaboard is set up, no idea. Anyway, glad I finally got it working!
Let me explain what I am doing. Start and End points in Mozaic are set to 0 and 127 respectively. In AUM the parameter I am modifying is also set for full range (0 to 100). I put qvox in the effect slot of an MPE synth in AUM. I want to, for example, play a few notes consecutively, kind of like a strumming effect. Then I want to slide on the y axis on one of the notes only. The pitch effect (it turns the note into a chord, for example) should be triggered on that note only, but the other notes should continue to play, though without being modified by the effect.
I have tried all 3 of your settings ('use touch', 'pickup' and 'reset'). Let me tell you what happens in each. I've also tried varying what I set as 'midi control' for the channel in AUM. The descriptions below apply to both; there's no difference.
'Reset': if I strike a single key on my keyboard, starting at the bottom of the y axis and going to the top, I get some modulation, but it only goes from about 0 to 10%, so it's basically unnoticeable. If I play several notes at the same time, I get even less modulation, maybe 5% max. So for whatever reason, this setting is not going to do what I need.
'Pickup': even if I strike a key on my keyboard at the bottom of the y axis, the modulation starts from about 40% and goes up from there to a maximum of about 60% as I slide.
'Use touch': same problem as 'Pickup'.
Any idea what is wrong here? I don't understand why I am not getting the full range of modulation. I have tried varying my ROLI keyboard 'slide' parameters in the ROLI Dashboard app (which allows you to vary the responsiveness of this), and it makes no difference.
Sorry, but as I've tried to explain, this is simply impossible. There is no way anything can adapt MPE to work on individual notes in a non-MPE synth. This is because the app receiving the information doesn't have the capability to bend any individual note; pitch-bend is global. Every control is global.
That shouldn't be happening, and doesn't in my tests on any controller. Without a look at the data coming from the Seaboard before the app, and the data coming out of the app for comparison, I can't offer any help. If you know how to use a monitor like MIDI Spy (free), then you might be able to do some comparisons or send me some screenshots. Blocking Notes and Pitch Bend in the Mozaic script (tap those pads until they turn red) is a great way to filter the output.
Same answer. Works in my tests, can't test on the seaboard, need to look at data.
No, sorry, I can't guess without data.
What I'd need to see is the following:
To do this test you would want to have two instances of MIDI Spy. You would route the Seaboard to one and the Mozaic output to the other. Then you would set each MIDI Spy to only look at MIDI CC. If you're measuring MPE then you would want to block Notes and Pitchbend in Mozaic. If you're measuring Pitch Bend then block Notes and MPE.
[edit ... oh, and focus on CC-74 and Pitch Bend only. If you're getting other CCs then the Seaboard may be sending poly aftertouch. I didn't write the script to deal with aftertouch; it will just pass that through. It would be useful to know if it is sending that, though.]
A lot of data can flood into MIDI Spy, so look carefully and note the starting values in case you can't scroll back up to the top. Other monitoring apps might work better, but MIDI Spy is free and works well.
The thing is, this is all just numbers flowing through the script. It will behave the same no matter what you send at it, so there are only two ends that can be variables: what the Seaboard is sending, and how the app you're sending to handles the output from the script. I'm trying to look at the first end. If it looks right from there then we focus on the apps. If it doesn't then I focus on whether or not there's a bug or if the Seaboard can be configured differently, or if the script can be adapted to handle the Seaboard output. But without data I'm stuck.
Of course, if you wanna send me your Seaboard that would work too. I might need to keep it a long time though, just to be sure.
Don’t worry, I know you explained it would be impossible to do exactly as described, but I wanted to see what would happen. I’ll look into getting round to sending you more info at some point about the midi data the seaboard is sending out, cheers
Thanks, I’ll try to do some more research into what the seaboard puts out.
You should still be able to use the pitch bend coming from the Seaboard as a global pitch bend, or map it to any CC. That means (assuming we can work out the Seaboard range issue) your whole key surface can act like an XY pad.
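As a rough illustration (a hedged Python sketch; the CC number, output channel, and scaling are arbitrary examples, not the script's actual settings), mapping the 14-bit pitch bend value onto a 7-bit CC looks something like this:

```python
# Hedged sketch: convert an incoming pitch bend into a single global CC,
# so the key surface acts like one axis of an XY pad. CC number and
# channel here are illustrative only.

def pitch_bend_to_cc(lsb, msb, cc_number=1, out_channel=0):
    value14 = (msb << 7) | lsb      # 14-bit bend, 0..16383, centre at 8192
    value7 = value14 >> 7           # scale down to the 7-bit CC range 0..127
    return ("cc", out_channel, cc_number, value7)

print(pitch_bend_to_cc(0, 64))      # centre position -> ('cc', 0, 1, 64)
print(pitch_bend_to_cc(127, 127))   # full bend up    -> ('cc', 0, 1, 127)
```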