Using polyphonic aftertouch to control multiple effects plugins’ parameters in AUM?
I recently tried out AUM’s MIDI mapping and mapped various knobs and sliders to certain sends, FX mix levels, filter cutoffs, etc. It was heaps of fun and I can see myself doing this a lot in the future. Along these lines… are there any apps that let you use polyphonic AT to control multiple plugins’ parameters? Or, similarly, is there a way to hit a note (notes, really) and have it/them trigger an envelope that controls something like an FX send? Sorry if I’m asking this poorly…
I think to some degree I can do some of this with my sensel morph but I haven’t really had much time with it.
thanks for any insight/suggestions!
Comments
I’ve never used it, but I’m guessing Mozaic could very easily handle the note-and-envelope-to-CC idea I was thinking of. Not sure about polyAT though…..🤔
Since most AUv3 don't support PolyAT yet, Streambyter or Mozaic might be your best bet.
I would probably convert PolyAT messages within a certain key range to CC numbers.
My choice is hosting Streambyter as a MIDI plugin inside Drambo because I can freely route the messages to any module, no matter if it's an envelope, a filter, an amp etc.
Before investing any work into this, you might want to think about how you want everything to work together.
I've done this with an Arturia BeatStep which is an easier case because I can handle PolyAT for each pad separately. For a keyboard that does PolyAT though, you might expect dynamic handling of polyphony which is a lot more effort to build in Streambyter or Mozaic.
@wim has written quite a few excellent Mozaic scripts on patchstorage.com, maybe you can find something there for you already.
First we need to sort out whether you're talking about a standard MIDI Polyphonic Aftertouch event or about MPE. The Morph sends MPE, and MPE is a different animal.
Polyphonic Aftertouch is a particular MIDI message where the first byte is hex A0 + the midi channel, the second is the note number, and the third is the pressure value. Mozaic can capture and work with these messages. It doesn't have a GUI that would be practical for creating them though. I don't know if any iOS apps do (maybe Animoog?), since the touch screen can't really capture after pressure.
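For illustration only, here's roughly what that message looks like, sketched in Python with the mido library rather than Mozaic (mido calls the message type 'polytouch'):

```python
# Minimal sketch: the three bytes of a Polyphonic Aftertouch message.
# Status byte = 0xA0 + MIDI channel, then note number, then pressure.
import mido

msg = mido.Message('polytouch', channel=0, note=61, value=100)
status, note, pressure = msg.bytes()
print(hex(status), note, pressure)   # 0xa0 61 100 (channel 0 -> status 0xA0)
```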
MPE, which is what the Morph uses, is different. MPE spreads notes across multiple MIDI channels depending on how many notes are being held down at one time. There's no association between the note number and the expression message itself: a C#2 could come in on Channel 1 at one time and Channel 3 another. The MPE-capable synth itself has to have the smarts to sort out what goes where.
So, for a non-MPE synth receiving MPE data you don't have a nice tidy stream to work with, but varying numbers of streams coming in on various channels. You might be sending CC 74 / Value 100 on channel 1 at the same time you're also sending CC 74 / Value 64 on channel 2, Value 127 on channel 3, etc. Non-MPE synths have no idea how to deal with that, and you can't MIDI-map a mishmash like that anyway.
I wrote a Mozaic script, MPE Multiplexer, that tries to meld those streams together into one coherent stream for use with non MPE synths. However, right now it only supports one CC (such as the Morph can send on vertical position on each key), and Pitch-Bend. I plan to add Channel Pressure to that script soon.
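Not the actual Multiplexer code, but the simplest possible illustration of the "collapse everything onto one channel" idea, again sketched in Python/mido:

```python
# Rough sketch only (not the MPE Multiplexer script): collapse per-channel
# CC 74 values from an MPE controller onto a single channel, last value wins.
import mido

def meld_cc74(messages, out_channel=0):
    merged = []
    for msg in messages:
        if msg.type == 'control_change' and msg.control == 74:
            merged.append(msg.copy(channel=out_channel))  # rewrite to one channel
        else:
            merged.append(msg)
    return merged

mpe_stream = [
    mido.Message('control_change', channel=0, control=74, value=100),
    mido.Message('control_change', channel=1, control=74, value=64),
    mido.Message('control_change', channel=2, control=74, value=127),
]
for m in meld_cc74(mpe_stream):
    print(m)   # every CC 74 now arrives on the same channel
```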
So ... back to the original question. Do you really mean Polyphonic Aftertouch, and do you have something that actually sends it? Or are you perhaps thinking MPE?
As for the other question:
The MIDI ADSR Mozaic script is one that will do that.
A complication with Polyphonic Aftertouch is that it's associated with a specific note. In a polyphonic synth, each note is assigned to a different voice, with its own oscs, filters, envelopes, etc. Those voices are combined inside the synth to produce one (stereo) audio stream. So, if you want each note to be treated differently, according to its own aftertouch value, that processing must be done inside the synth, where the voices are still separate. Sending poly AT to FX outside the synth will not be able to treat individual notes separately.
It's actually not hard to create a setup in AUM with multiple synth instances (one for each "voice") that emulates an MPE synth pretty closely. This kind of setup has some drawbacks compared to an actual MPE synth, but one advantage it does have is that it would easily allow independent control of external FX for each voice using MPE aftertouch messages. (I think an actual MPE synth could do this, too, if it had "multi out" routing for each voice, but don't know if any do.)
Roger Linn has an interesting little article explaining MPE and how it compares to, e.g., a multi-instance synth setup in AUM (which he calls "multi-timbral"): https://www.kvraudio.com/mpe
How do you deal with the unpredictable MIDI channel though? Let's say I play a chord C-E-G and send aftertouch. The first time I play that chord my fingers hit simultaneously; the second time the C arrives first, followed by G and E; the next time E lands first, etc. In each of those scenarios the "voices" are being sent on different channels. Now, say I have a filter cutoff linked to aftertouch on three synth instances, all with the same patch. Each time I hit the chord, the aftertouch may go to a filter that started from another position.
Maybe I'm overthinking it. I mean, it could work, but there could be unpredictable behavior the way I see it.
I'm not sure what the issue you're describing is. E.g., how does it work on an actual MPE synth vs. how you think it works with a multi-instance "multi-timbral" synth in AUM? Anything that is intended to operate on all synth voices is a problem and requires routing separately to each instance in AUM. But otherwise each instance is processing exactly the same channel data as an actual MPE synth's voice would.
I don't know how to describe it any differently. nvm.
To put it another way, maybe, why is what you describe not an issue with an actual MPE synth?
I have used the multi-timbral AUM setup with my Linnstrument. With the Linnstrument I had AT on either the y-axis (vertical position on the pad) or the z-axis (pressure) attached to filter cutoff. I haven't used it in a while, but everything worked fine. I wonder whether it's because the AT gets reset upon each key press. It was never the case, if I recall, that an AT setting on one note was automatically carried over to the next, unless the next note was manipulated in the same way as the previous note had been. Whatever it was, I didn't notice that the AT behavior was any different from what it is with an actual MPE synth.
Hey, thank you everyone! Sorry I did a poor job of summing up what I was thinking... With regards to the polyAT and MPE, I was thinking of a scenario like this: "midi controller C3 aftertouch pressure controls effects mix on a delay plugin in the first AUM channel... C#3 aftertouch controls the mix of a distortion plugin on the second AUM channel... D3 aftertouch might control a bus send level on the second AUM channel... etc." A specific note's aftertouch would be tied to a specific CC mapped to some element in AUM (whether it be a plugin or AUM itself). I wouldn't be playing any synths for sounds... I mostly imagine using this to manipulate/process loops that are playing back (Gauss, Audiostretch, etc.).
As for my keyboards... I have a Hydrasynth, which outputs polyAT (and possibly limited MPE), and a Sensel Morph (MPE).
Anyway, I think the MPE idea is probably a bit impractical and ill-conceived. Mostly I started down this path because I discovered that some of my effects weren't multitouch, or were only partly so (WOOT's bands can't all be manipulated at once, Saturn 2 seems to only see one touch at a time, etc.). So then I mapped stuff to knobs and faders and that let me get past that limitation (SO MUCH FUN)... but I could only turn two knobs at once, and then I had that odd idea about using MPE so that I could perhaps modify up to 10 CC's at once ;P
Thanks for indulging my weird curiosity. I will check out Mozaic and the script you mentioned @wim
Mostly I desire to find multi-touch means of controlling lots of effects parameters across several plugins...
Thank you!
No problem, it’s an interesting thread.
The MPE Multiplexer won’t do what you’re after. It takes MPE and normalizes it down to a single channel stream of Note, CC, and Pitch Bend messages.
Your idea of converting poly AT to CC messages per note is valid, and I reckon I could do up such a script in Mozaic without much trouble. I doubt it would be of use to many though, as there aren't a lot of things out there that send Poly AT.
I don’t think you could pull this off for MPE. On the other hand, you could make a custom map for the Morph keyboard that sends Poly AT per key rather than using MPE.
.... hmm, on further thought, I think you could do it for MPE as well. You'd just need to capture the last channel each note was sent on and use that to parse the stream.
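Roughly this, sketched in Python/mido rather than Mozaic (the names here are just placeholders for the idea):

```python
# Sketch of the "remember which channel each note landed on" idea (not a real
# Mozaic script): note-ons record the MPE member channel, then channel pressure
# on that channel is re-sent as a CC whose number equals the note.
import mido

channel_to_note = {}   # MPE member channel -> note currently held there

def pressure_to_per_note_cc(msg, out_channel=0):
    if msg.type == 'note_on' and msg.velocity > 0:
        channel_to_note[msg.channel] = msg.note
    elif msg.type == 'aftertouch' and msg.channel in channel_to_note:
        note = channel_to_note[msg.channel]
        return mido.Message('control_change', channel=out_channel,
                            control=note, value=msg.value)
    return None

stream = [
    mido.Message('note_on', channel=2, note=48, velocity=90),  # C2 on member channel 3
    mido.Message('aftertouch', channel=2, value=75),           # pressure for that note
]
for m in stream:
    out = pressure_to_per_note_cc(m)
    if out:
        print(out)   # CC 48, value 75, merged onto one output channel
```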
Interesting. 🤔
I've no way to test, but this simple script should convert Poly After Touch messages to CC's. The CC number corresponds to the note number. Note 0 sends CC 0. Note 36 sends CC 36, etc.
If that works then it would be relatively simple to add customization of the CC sent per note.
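In rough pseudo form (Python/mido here, not the Mozaic code itself, and CC_MAP is just a hypothetical hook for that per-note customization), the conversion amounts to:

```python
# Sketch of the conversion described above, not the actual Mozaic script:
# each Polyphonic Aftertouch message becomes a CC whose number is the note
# number (note 36 -> CC 36). CC_MAP is a hypothetical hook for per-note
# customization of the CC number.
import mido

CC_MAP = {}   # optional per-note overrides, e.g. {61: 12} sends CC 12 for note 61

def poly_at_to_cc(msg, out_channel=0):
    if msg.type != 'polytouch':
        return None
    cc = CC_MAP.get(msg.note, msg.note)      # default: CC number = note number
    return mido.Message('control_change', channel=out_channel,
                        control=cc, value=msg.value)

print(poly_at_to_cc(mido.Message('polytouch', note=36, value=90)))
# control_change channel=0 control=36 value=90
```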
But the Launchpads do! And many have acquired them after the LK+Atom integrations.
There was actually a previous discussion on using LPX to control effects using CC, started from a thread on LPX for MPE: https://forum.audiob.us/discussion/comment/960244#Comment_960244. And here is a Mozaic script from that: https://patchstorage.com/launchpad-cc-control/
Good to know. Thanks for the info.
I think I should have a basic script that will handle Poly Aftertouch, MPE Aftertouch, MPE CC, and MPE Pitch Bend done sometime tomorrow. The first iteration will convert any of those to a CC stream with the CC number corresponding to the note number. If it seems to work then I can flesh it out with assignable channels and CC's per note.
Basically the potential for up to four CC controllers times 128 notes, all from a single keyboard, if it works out.
Wow @wim thank you for continuing to explore this! I’m deep in overtime crunch at work, but when I emerge in a couple weeks I will buy mozaic and check this stuff out! Thanks!!
I didn't finish up the script today. Got it working last night but planned to put the spit and polish on it today and got distracted with another project. Maybe tomorrow.
Damn. This is turning out to be a great script. @stown, your idea was brilliant. An aftertouch-enabled controller will be able to control up to 128 CC's for the aftertouch and 128 for velocity. A full-function MPE controller such as a Seaboard Block, up to 768 parameters! Even a plain ol' keyboard will be able to send a CC for each key hit based on the velocity.
I need to do testing tomorrow, but it's looking good IMO.
Sounds promising. One comment from the other thread was to keep in mind that the CC value will drop to zero when you lift your finger. May or may not be what you wanted.
@wim Thank you!! That does sound incredible!
And really glad you are so excited about the idea 
@bleep You are totally right. I suppose if one wanted to get really involved with it they could set some sort of decay rate so the CC's don't snap back to zero. The CC value would accumulate for that note and drain at a defined rate... sort of like smoothing, I guess, but I wouldn't even be sure... that's just what I would do in computer animation. So, just guessing: an array stores all the notes and their current values, and the current value is set to whichever is greater, the instantaneous CC value for the note or the previously stored value minus the decay amount... so if the CC drops faster than the decay amount, the decay limits the drop. I thiiiiiink. Makin' stuff up after a couple glasses of wine!
But dropping to zero wouldn't be so bad if you were just momentarily pushing a note down to feed an fx send.
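For what it's worth, a toy sketch of that max-of-(incoming value, previous value minus decay) idea, with made-up names and numbers just to show the shape:

```python
# Toy sketch of the decay idea above (names and numbers are made up):
# each tick the stored value falls by DECAY, but incoming pressure can always
# push it back up, so lifting a finger fades the CC out rather than snapping
# it to zero.
DECAY = 4          # amount removed per tick (arbitrary)
values = {}        # note -> current smoothed CC value

def tick(incoming):
    """incoming maps note -> instantaneous pressure for this tick."""
    for note in set(values) | set(incoming):
        decayed = values.get(note, 0) - DECAY
        values[note] = max(incoming.get(note, 0), decayed, 0)
    return dict(values)

print(tick({60: 100}))   # finger down: {60: 100}
print(tick({}))          # finger lifted: {60: 96}, then 92, 88, ... down to 0
```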
Hey @stown, I didn't have any time to test this further. It seems to be working, but without further testing and input I don't want to put it up on Patchstorage yet. If/when you get Mozaic, feel free to give it a spin by copying the code, pasting it into Mozaic, and hitting the upload button.
Please don't feel obligated to get Mozaic or to try the script out. I'll polish this one up soon-ish and publish it either way. 👍🏼
https://www.dropbox.com/s/me6h381bab89ajf/Expression Kersploder v0.1.mozaic?dl=0