Omni midi control in AUM?
Does anyone know if there is a way to do this in AUM?:
I want to be able to assign modulation of various app filters to CC 74 on my Seaboard, so that I can control filters etc. with a slide up the Y axis of the Seaboard. The problem is that when I do ‘MIDI learn’ assignments, there is no way to assign an ‘Omni’ setting to a parameter, which means I can’t trigger the modulation with all keys on my keyboard. Any way round this?
Comments
If I understand what you want to do correctly, I think you could use mfxStrip, set to receive on any channel and to remap that to a single channel. Set mfxStrip to receive from your seaboard, then from mfxStrip to the midi control / learn.
There are also Mozaic and StreamByter, but those would involve writing a script. mfxStrip is quick and easy to change, with no programming.
Ignore the note mapping on the right hand side, there are no mappings set or needed on that part.
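To make the channel-collapsing idea concrete, here is a hypothetical Python sketch (my own helper, not mfxStrip's actual code) of what "receive on any channel, remap to one" does to incoming messages:

```python
def remap_to_channel(msg, target_channel=0):
    """Rewrite the channel nibble of a 3-byte channel-voice message.

    msg is a tuple (status, data1, data2); target_channel is 0-based,
    so 0 means MIDI channel 1.
    """
    status, data1, data2 = msg
    if status < 0x80 or status >= 0xF0:
        return msg  # not a channel-voice message; pass through untouched
    return ((status & 0xF0) | target_channel, data1, data2)

# CC 74 arriving on MPE per-note channels 2, 3 and 4 (status 0xB1-0xB3)
incoming = [(0xB1, 74, 10), (0xB2, 74, 64), (0xB3, 74, 127)]
merged = [remap_to_channel(m) for m in incoming]
print(merged)  # all now carry status 0xB0, i.e. CC on channel 1
```

This is exactly the many-channels-to-one collapse being recommended; as the later posts show, it merges the streams but cannot preserve which note each value came from.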
I see you are the font of knowledge when it comes to these things! I’ll look into that, sounds like a great solution, thanks man!
I already have ‘midi tools’, I’ll see first if this might be possible using that
Doesn’t look like it. Midi Tools’ Midi Clone & Filter can go the opposite way (one to many), but you need many to one, I think.
Almost a success. I did get it to map to one channel using Midi Tools’ ‘route’ function. I linked it to the ‘mix’ control of Eventide QVox, which has potential for some really interesting effects! However, now if I play several notes on my Seaboard, all of the notes are controlling this and sending conflicting messages, so it doesn’t work in the same way as it would were I playing an MPE synth. Also: on an MPE synth, CC 74 would only be triggered when I actually slide up the Y axis. But using this method, if I hit a note on the Seaboard halfway up the key, even without sliding, the QVox mix parameter is already being triggered.
So what I need is something that can trigger the QVox mix parameter in the same way as ‘slide’ works on my Seaboard in the Noise app, for example. Firstly, it shouldn’t matter where on the key I press: unless I actually slide upwards, the value sent should be ‘0’. And then, if I am sliding with one note but static on others, the static ones should not cancel out the message sent by the sliding note. This is hard to explain clearly. Can anyone suggest a solution?
I also tried using midi route sending to ‘all’ btw. This worked in exactly the same way as sending to a single channel.
Yeh, I don’t see how you’re going to get around the conflicting messages. MPE sends CC 74 on a different channel for each keypress. The only way it would work is if you played monophonically, and even then the slides wouldn’t work right.
I can imagine a Mozaic script that could sort all that out, but it would be pretty tricky to work out.
I haven’t the first clue how to write a script for Mozaic, but if someone did know how to do this, it would be so cool!! Imagine being able to control any parameters on any AU with a slide up your midi keyboard, whether virtual or real. Would be an absolute game-changer.
Like @wim suggested, you can solve your issue with https://apps.apple.com/app/mfxstrip/id1451194722 at a reasonable cost of €3.49.
If I understand correctly that you want to filter omni to a single channel you can use Mozaic.
You change the number to one below the MIDI channel you want (Mozaic channels are zero-based), so 1 means MIDI channel 2.

@OnMidiInput
  SendMIDIThruOnCh 1
@End
No, it doesn't work, as I explained in my previous post. It's not about the money, I can send to one channel or all, using 'midi route' app. It doesn't solve the problem.
Ah ok understood. Too complex for me.
Too complex for me too, but hopefully one of the mozaic experts or another midi boffin can think up a hack!
I think there is something @wim explained that you possibly aren't understanding. If you play polyphonically, there are going to be multiple conflicting messages, because every note is going to be sending that MIDI CC. The reason that isn't a problem with MPE-enabled polyphonic synths is that you essentially have one synth per note played. So, each MIDI CC is essentially going to a separate but identical synth and only influencing that one separate voice.
You only have one Qvox. Effects boxes are essentially like a mono synth. So, MPE-style control doesn't map well because you don't have a separate set of qvox controls and processors for every different note you play.
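The conflict being described can be seen in a tiny simulation (the values are made up for illustration): once per-note CC 74 streams are flattened onto one channel, the single effect parameter just takes whichever value arrived last.

```python
# Three held notes on MPE member channels each send their own CC 74.
# (channel, cc, value) triples; values are hypothetical.
stream = [(2, 74, 10), (3, 74, 90), (2, 74, 12), (4, 74, 40)]

param = 0  # the one "mix" parameter every flattened message now targets
for channel, cc, value in stream:
    if cc == 74:
        param = value  # each note's message overwrites the previous one

print(param)  # -> 40, regardless of what the other held notes were doing
```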
Also, from what you describe, your controller does send out the slide value on first touch. It is probably the receiving synth engine that determines how to respond. But until you figure out which voice's MIDI you want controlling your CC, it doesn't make sense to start trying to solve that problem.
Hopefully, some of that made sense to clarify what the underlying issues are.
Thanks, I understand a bit better now, yes I am definitely no expert on standard midi let alone mpe lol. So basically what I am trying to do is impossible perhaps, yes?
It isn't clear. I think you are at a stage where you need to go from a vague description of what you want to thinking through the specifics, to see if what you want is possible.
Think of possible cases and specify what you want to happen. If three notes are playing on your keyboard and sending different cc values, what should happen?
My understanding of what you want makes me wonder if you need one synth instance and one FX unit per voice... in which case your solution will need a poly-to-multiple-mono stage.
But maybe I am the one not clear on what you want.
I’m very clear on what I want, I just don’t know if it is possible. Basically I want the effect (in this case the mix of QVox) to behave exactly like it would if it were the target inside an MPE-compatible synth’s mod section and the source of the modulation were assigned to CC 74. That would mean that any key on the keyboard could trigger it independently of what the other keys were doing. Also, if a key were no longer pressed, or if there was a movement from the upper part of a key to the bottom, its value would return to zero (or whatever start value it had been assigned). It would also not be triggered by merely pressing the key higher up the Y axis, but only when there was actually a sliding motion from a lower part of the key to a higher part. If you use an MPE keyboard you will know what I mean, I think. Thanks for trying to help, by the way.
You still haven't said how you want your FX box to respond to multiple simultaneous control signals.
All of your audio is coming into one FX box that can only have one setting for each parameter. All of the audio for all your voices is coming into that box and being processed with one set of parameters.
Unless you have one effects box per voice (which means a separate audio output per voice), you can't do what I think you are describing.
Yes, exactly, it just isn’t possible I think.
I think I understand the desired behavior, and I speculate that it should be possible to design an algorithm that would take all those conflicting cc 74 streams and filter them down to a single stream somewhat emulating the expected movement. It could probably be broken down into a set of rules for which messages to filter out.
This is what an MPE synth must be doing when it allows MPE expression on global parameters, such as FX, which are global, not voice-specific. I don’t know how they do that, but some do.
There would have to be some logic such as:
I’m sure it’s more complicated than that, but that’s the vague idea in my head of how it might work. It’s just basically an intelligent midi cc filter and combiner. I can kind of see how it could be done, but I’m not sure I’m clever enough to figure it out and to make it work well enough.
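One such filtering rule might be "first key pressed wins". Here is a hypothetical Python sketch of that rule (the names and structure are my own, not from any existing script): track note-ons per MPE channel and only pass CC 74 coming from the earliest still-held note's channel.

```python
held = []  # MPE member channels of currently held notes, in press order

def note_on(channel):
    held.append(channel)

def note_off(channel):
    if channel in held:
        held.remove(channel)

def filter_cc74(channel, value):
    """Pass the value only if this channel has first-key precedence."""
    if held and held[0] == channel:
        return value
    return None  # drop conflicting CC 74 from later-pressed notes
```

With notes held on channels 2 then 3, only channel 2's CC 74 gets through; once the channel 2 note is released, channel 3 takes over.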
@Gavinski , It’s clear that the MIDI CC stream needs to be filtered down to a single channel, but what about notes? Should they be filtered to a single channel as well? It seems like they should, but I thought I’d ask.
I decided to take a crack at it. So far it looks like it may work... though the devil is always in the details. I’m doing this for my own interest, so no need to feel like you need to get Mozaic and/or use the script even if I do get it to work.
One last question, I think:
When you hold down two keys and move one upward and the other downward at the same time, what should happen? It seems to me like they should basically average each other out. But you could also see it as:

Give the first key pressed precedence
Give the last key pressed precedence
Give the lowest note precedence
Give the highest note precedence
Average each other out is the easiest, and is already done. But none should be super hard to do, really.
Hi Wim, very cool, I'd certainly buy Mozaic if you could figure this out. Yes, I think in the Roli Seaboard system it is one channel per note. It doesn't work like, say, Velocity Keyboard, where you could also set one channel per string, though if you were programming something it would be good to have those two options, I guess.
The slide (y axis, cc74) function on keys on seaboards generally works when you slide from bottom to top, but in some apps it can also work from top to bottom. Since these effects are generally related to pitch, there would be no cancellation.
Any key's behaviour is completely independent of other keys. Let's say the effect I was triggering was to add a 7th note on top of any given note. If I slid from bottom to top on a C note, it would add the 7th of that note, if I simultaneously slid from bottom to top (and / or top to bottom if you programmed your script so) on a D note, the 7th of that particular note would be added. If you held your finger on the note(s), the effect would sustain, along with the sound of the original note of course. Taking your finger off the note would cause release of the original note and the sustain. If you slid back down, the 7th would gradually disappear. Of course this effect is gradual, and can be tweaked by using the slide parameter in roli dashboard or the 'block dashboard' app.
I don't know if you have experimented with MPE sounds yourself, but if not you could try some of the free sounds in Roli Noise and use the onscreen keyboard. The only thing missing there compared to how an actual Seaboard works is velocity, at least on an iPad, not sure about iPhone. Not all sounds in Noise have the slide function (CC 74), and in Noise it only works bottom to top (unlike Cypher 2, which works both ways, which is ideal I think). Let me know if you need any more info, man, a lot of people will love it if you can come up with a script for this.
Of course, it would also be very good if aftertouch could be freely assignable too, though I don't know how Roli handles that or what MIDI messages it uses.
I don’t think it would matter which direction moved on the hardware keys. The script would just work based on the incoming value.
Sorry, but that part simply isn’t possible. All that can be done is to:
Sure, I’m familiar with how MPE controllers and synths work. However, that isn’t the same as what you originally mentioned ... i.e. being able to control filters, or the ribbon in the Eventide apps, etc. with your seaboard. That’s a many>one relationship (one cc on multiple channels controlling one parameter). What you’re describing is quite a bit different and I don’t think possible.
Yeh, no, I can’t make a script that would do all you say. At most, I can have it act as I described above. The smarts and expression of an MPE synth are built into it. Non MPE apps understand only incoming CC values, and those values are never per-note*.
(* Well, except for polyphonic aftertouch, but almost no apps support that.)
Yes, I thought that would not be possible after espiegel pointed out the details of all this yesterday.
As for your question about which of these would be best:
Give the first key pressed precedence
Give the last key pressed precedence
Give the lowest note precedence
Give the highest note precedence
I guess the ideal would be that your script gives an option to select which of these is chosen. If I had to choose one, I think give the first key precedence would be best, as that would stop an effect being suddenly cut off when another note started a slide up on the y axis. What do you think?
After messing with it for some time, I think I’m not going to try to use precedence. It works out a lot simpler, and I think smoother, if it doesn’t matter which keys are being pressed: the CC 74 output smoothly climbs or falls according to the direction of the last change coming from the keyboard.
So, for instance, if you’re sliding up on C2 then play E2 and slide up on either or both keys, the cc will increase in a straight line (not jump around). As long as any keys are held down, the cc moves smoothly up or down according to the direction of the slides.
The only time this gets somewhat odd is when you’re sliding up on one note and down on another. That kind of movement tends to cancel each other out and you just get a small “average” changing output, but that kind of makes sense.
Getting into the other options complicates things a lot, so for now at least I plan to try it this way. Sometimes I think of easier ways to do things later, so perhaps later, I don’t know.
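The direction-following behavior described above could be sketched like this. This is a toy Python model under my own assumptions, not the actual Mozaic script: keep one combined output and nudge it by each incoming CC 74 delta, regardless of which note's channel it came from.

```python
class SlideCombiner:
    """Collapse per-channel CC 74 slides into one smooth output value."""

    def __init__(self):
        self.last_seen = {}  # last CC 74 value observed per channel
        self.output = 0      # the single combined value sent downstream

    def feed(self, channel, value):
        prev = self.last_seen.get(channel)
        self.last_seen[channel] = value
        if prev is None:
            # First touch on this note: remember it but don't jump,
            # so merely pressing high on a key doesn't move the output.
            return self.output
        delta = value - prev
        self.output = max(0, min(127, self.output + delta))
        return self.output

    def release(self, channel):
        self.last_seen.pop(channel, None)

c = SlideCombiner()
c.feed(2, 50)         # first touch: output stays 0
print(c.feed(2, 60))  # slide up on channel 2 -> 10
c.feed(3, 40)         # second note touches: still 10
print(c.feed(3, 30))  # slide down on channel 3 cancels it -> 0
```

Note how opposite slides on two held notes cancel each other out, which matches the "small average" behavior described above.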
I actually had it working pretty well, but then decided it would be useful to try to manage pitch bend as well. That’s proving a little trickier than I thought, so will take a bit more time to get working (if possible).
Anyway, I think I like the idea. In particular, I have an MPE controller that requires me to hook up to a PC every time I want to switch between MPE and Non-MPE mode. This could be a handier way to work with that one.
Which controller is that?
Sensel Morph.
It’s not possible to make a Mozaic script for that then?
Really looking forward to trying your finished script out! This is going to be very very cool beans indeed
By the way, are there any Mozaic scripts already designed that do anything nice for mpe devices?