Yes
AUv3 parameters don't work that way. AUv3 parameter control is handled by the host; the plugin only receives the changes to the parameter as provided by the host. MPE is per-note expression, and there's no way for a host to pass per-note messages like that to a plugin. MPE has to go directly to the plugin, and the plugin has to sort out (based on MIDI channel) which note the expression values apply to.
Salome does send its MPE out. But MPE isn't straight MIDI: it sends CCs, aftertouch, and pitch bend on a different channel for each note being held down. AUv3 parameter changes are controlled by just one channel/CC message combination. You can't map several channels of incoming data to a single parameter.
So, unfortunately, what you're hoping for isn't possible for a plugin to implement. It would have to be at the host level. If it were tackled at the host level, then all those per-note messages coming from different keys would have to be simplified down to a single value.
There's a Mozaic script that attempts to multiplex MPE output down to single channel cc messages so that you can use it for what you've mentioned. You might want to give it a try. https://patchstorage.com/expression-redirector/ whoops ... wrong script. Here's the right one: https://patchstorage.com/mpe-multiplexer/
I would describe MPE somewhat differently. It is straight MIDI applied in a somewhat new way to achieve per-note modulations of polyphonic data. It achieves this by using MIDI channel to separate out the data for each voice.
The MIDI channel/voice allocation is often dynamic, which makes overdubbing MPE a tricky problem and also complicates trying to de-MPE an MPE stream.
It is non-standard in the sense that pre-MPE MIDI didn't have this convention, but it still uses MIDI 1.0 pieces.
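To illustrate the dynamic channel/voice allocation described above, here's a toy Python sketch (my own illustration, not code from Salome or the MPE spec): each new note takes a free member channel so its expression data can travel independently, and the channel reuse after note-off is what makes overdubbing and de-MPE-ing tricky.

```python
# Toy MPE lower-zone voice allocator (illustrative only).
class MPEChannelAllocator:
    def __init__(self, member_channels=range(2, 17)):
        self.free = list(member_channels)  # lower-zone member channels 2..16
        self.held = {}                     # note_number -> channel

    def note_on(self, note):
        ch = self.free.pop(0)              # each note gets its own channel
        self.held[note] = ch
        return ch

    def note_off(self, note):
        ch = self.held.pop(note)
        self.free.append(ch)               # channel is recycled for later notes;
        return ch                          # nothing ties a channel permanently
                                           # to a key, hence the trickiness
```

So two notes played together land on channels 2 and 3, but once a note is released its freed channel gets grabbed by some later note.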
I'm not sure how that differs from my description in substance, but thanks for expanding and clarifying for those wanting the technical details. I was trying to keep the post from getting too long and overly technical, obscuring the main point, which was to explain why it's not possible to implement MPE control of non-MPE plugins from Salome.
Yes, demuxing MPE to make it usable in non MPE situations is a bit involved and requires some compromises, as I discovered in writing that script.
Thank you for all your technical explanations.
I would still love to hear from @brambos, since I'm still not 100% convinced that if a developer wants it, having the "faders" on the keys send out a message (call it MPE or anything you want) that can be learned by an fx parameter is technically "not possible".
He hasn’t answered yet, therefore I’m still hoping
You'd need to explain how you'd envisage it working, given that you can play more than one note at a time.
E.g., you could ask for it to output normal MIDI rather than MPE, and just have it send a normal MIDI message from the latest note played, or the highest note played, or the lowest note played, in the event that more than one is played at the same time.
Or you could ask for each key to output, for example, different cc messages. But that would be a bit weird, with each key altering a different fx parameter.
All of these sorts of options may confuse people though, and are not really in the spirit of what this sort of playing surface is designed for, which is MPE, which has per note expressivity at its heart. And fx are not per note.
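The note-priority options above are easy to sketch. This is a hypothetical Python illustration (mine, not code from any of the apps discussed): several held notes each carry their own fader value, and one value must be chosen to send as a single normal (non-MPE) MIDI message.

```python
# Hypothetical sketch of the note-priority options described above.
def pick_value(active_notes, priority="latest"):
    """active_notes: list of (note_number, fader_value) in played order.
    Returns the single value to forward, or None if no notes are held."""
    if not active_notes:
        return None
    if priority == "latest":
        return active_notes[-1][1]                      # most recent note wins
    if priority == "highest":
        return max(active_notes, key=lambda n: n[0])[1]  # highest pitch wins
    if priority == "lowest":
        return min(active_notes, key=lambda n: n[0])[1]  # lowest pitch wins
    raise ValueError(f"unknown priority: {priority}")
```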
Yes, the multiple note thing could be tricky 👍🏼
Here’s a MIDI trace of how the app behaves.
So, if you want the sliding events to be useful for AU parameters, you need a script to catch all CC 74 events and add an associated CC 74 on channel 16, which MPE reserves for control signals. @Brambos could optionally add these events to the MIDI output, or you can put the MIDI stream through StreamByter to add the extra CC 74 events on channel 16. It's a StreamByter "one-liner" and StreamByter is free.
Someone will provide the correct Streambyter one liner while I go look for the correct answer and test it.
Here’s a tracing of the Salome MIDI output… it’s in reverse chronological order. I added the ! statements to show the 5 events listed above:
Note Off = 1 36 57
CC74 = 2 74 63
Note Off = 2 38 63
Pitch Bend = 2 11 64 8203
Pitch Bend = 2 10 64 8202
Pitch Bend = 2 8 64 8200
Pitch Bend = 2 7 64 8199
Pitch Bend = 2 5 64 8197
Pitch Bend = 2 4 64 8196
Pitch Bend = 2 2 64 8194
Pitch Bend = 2 1 64 8193
Pitch Bend = 1 0 64 8192
Pitch Bend = 1 2 64 8194
CC74 = 1 74 57
CC74 = 2 74 62
CC74 = 2 74 61
CC74 = 2 74 60
CC74 = 2 74 59
!5 - channel 2
Pitch Bend = 2 0 64 8192
!4
CC74 = 2 74 60
!3
Note On = 2 38 60
CC74 = 1 74 56
CC74 = 1 74 55
CC74 = 1 74 54
CC74 = 1 74 53
CC74 = 1 74 52
!5 - channel 1
Pitch Bend = 1 0 64 8192
!2
CC74 = 1 74 53
!1
Note On = 1 36 53
Oh that implementation may have issues too.
MPE has a concept of two zones. One zone uses channel 1 for global messages, and channels 2 upwards for notes and the per-note control signals like CC74, pitch bend and channel aftertouch.
The other zone uses channel 16 for global messages, and channels 15 downwards for notes and the per note control signals.
But many MPE-compatible synths probably aren't used to that second zone where channel 16 is the global channel. They're more likely built for a simpler MPE era that lacked the nuance of an alternative zone: they expect the lower zone, where channel 1 is the global channel, and they don't expect to see notes with CC74 etc. data on channel 1; they expect that on channels 2 upwards. So if this app is using channel 1 for MPE notes and control signals, some MPE instruments probably won't like it.
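For reference, the two zone layouts described above can be written out as plain data. This is just an illustrative sketch of the full-width zones; real MPE configurations may use fewer member channels.

```python
# The two MPE zones described above, as plain data (channels are 1-based).
LOWER_ZONE = {
    "global_channel": 1,                        # global messages on channel 1
    "member_channels": list(range(2, 17)),      # notes + per-note CC74 etc.
}
UPPER_ZONE = {
    "global_channel": 16,                       # global messages on channel 16
    "member_channels": list(range(15, 0, -1)),  # notes on channels 15 downwards
}
```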
EDITED AFTER "Salome" Update: @Brambos changed the default Y-axis slider to Channel Pressure, so the StreamByter one-liner to associate vertical slides with CC74 on channel 16 is:
DX = BF 4A X2
DX detects Channel Pressure messages on any channel.
BF substitutes a CC status byte on channel 16 = F (actually it's 15, but MIDI counts channel 1 as '0').
4A is decimal 74 in hexadecimal.
X2 copies the original Channel Pressure value over into the CC74 value. I had to Google for this solution.
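For anyone curious what that one-liner does in more familiar terms, here's a rough Python equivalent (my own sketch, treating a MIDI message as a plain list of status/data bytes):

```python
# Rough Python equivalent of the StreamByter rule `DX = BF 4A X2`.
def channel_pressure_to_cc74(msg):
    status = msg[0]
    if 0xD0 <= status <= 0xDF:         # DX: Channel Pressure on any channel
        pressure = msg[1]              # X2: reuse the original pressure byte
        return [0xBF, 0x4A, pressure]  # BF: CC on channel 16; 4A: CC 74
    return msg                         # everything else passes through
```

For example, Channel Pressure on channel 2 comes out as CC 74 on channel 16 with the same value.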
The original StreamByter one-liner, when the previous rev defaulted to CC74s on each MIDI channel, was:
BX = BF +C
Every CC command will be cloned on channel 16.
If you have Mozaic, the script I linked above will do what you're envisioning for Salome and also for any app or hardware that outputs MPE. In the case where you have different values coming from various keys, it averages the values to a single output.
[edit] whoops! I linked the wrong script. The right one is: https://patchstorage.com/mpe-multiplexer/
The one I originally linked is a lot different, but also very useful in a different way.
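To give a rough idea of the averaging approach (my own sketch, not the actual Mozaic script): keep the latest CC74 value on each member channel and emit their mean as the single output value.

```python
# Sketch of averaging per-note MPE expression down to one value.
def multiplex_cc74(per_channel_values):
    """per_channel_values: dict of MIDI channel -> latest CC74 value for
    each sounding note. Returns one 0-127 value, or None if no notes."""
    if not per_channel_values:
        return None
    vals = per_channel_values.values()
    return round(sum(vals) / len(vals))  # average across held notes
```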
Look at the MIDI trace above. I think the StreamByter script will provide CC 74 events for any particular note "slider" events. You don't have to clone the CC events either; rather, create a StreamByter output of only CC74 events on some channel:
BX = BF
Then send only this streambyter output to AU parameters.
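In plain terms, `BX = BF` reroutes every Control Change to channel 16. A hypothetical Python equivalent (my sketch, with messages as lists of raw bytes):

```python
# Hypothetical equivalent of the StreamByter rule `BX = BF`.
def reroute_cc_to_ch16(msg):
    if 0xB0 <= msg[0] <= 0xBF:   # BX: match Control Change on any channel
        return [0xBF] + msg[1:]  # BF: same CC number/value, now on channel 16
    return msg                   # non-CC messages pass through untouched
```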
Oooops…I think I’ve opened the proverbial Pandora’s box (without the negative aspect of it). ☺️
Well… all this discussion seems to point to some interest in the thing.
Maybe I could refine my suggestion: what about a side control (à la the pitch bend/mod wheels) that sends MIDI out to control fx parameters? (If the "per key" thing is too complex for the aim of this app.)
To clarify my vision even further: what I've been dreaming about for some time is an app that allows you to do what the Expressive E Touché does in the physical world, with the ability to use all the fantastic fx apps we already have. Seeing Salome's "sliders" on the keys, it was the first thing that came to mind.
It could also be a suggestion to @brambos for a further app, without modifying the vision of an "immediate and easy" workflow at the basis of Salome. But I'm not so sure an app like this is up brambos' alley.
Unless I’m completely missing your point, it’s very easy to make such a thing with any midi control surface app and customize it to your liking. Loopy Pro, surface builder and TouchOSC come immediately to mind.
There are more esoteric apps such as TC-Data that come to mind as well.
I tried… I haven't yet found something that allows the expressiveness a Touché gives.
None of the apps you mentioned lets you add the variations you can accomplish by sliding up, then pushing in and out to create a sort of wobble with e.g. a cutoff, while also adding increasing reverb by sliding slightly to the left, and so on and so forth.
If you manage to create something like this with a MIDI control app not designed for it, please send me the results ☺️
And… very easy? I LOVE building control surfaces with those apps, but an app designed for one precise task can't be beaten by arranging something in more general-purpose apps.
And… even if it can somehow be DIYed, an app made by a professional, with functions, GUI, and interactions designed for that precise task… again, can't be beaten.
In the end… we have apps with which we could do even what Salome does… or apps that can recreate piano sounds without needing to buy Ravenscroft… but we're still here buying them. As I said, if bram's app workflow is better, or Ravenscroft sounds more realistic, there's a reason, and that reason is why we buy them.
Plus…let’s not forget laziness: sometimes we love building things ourselves, sometimes we just want to throw the money at someone who did it (probably better) for us 😂
Or let's call it immediacy vs. having to work on it (calling ourselves lazy doesn't sound so good for our egos ☺️). It's a bit of the contrast between "loving making music" and "loving learning things inside out". Sometimes it's one, sometimes the other. If I spend all my time building things, very little music will get made with them. Sometimes we need something ready to open and start playing. Isn't that what we should buy music apps for? The love you and I have for building things is fine… but it tends to overshadow the other love we have for making music (at least in my case).
I bought Mozaic with exactly your mindset of "hey, I can do it myself and buy fewer apps". But many reasons made me change my mind (the main one being always ending up with the same, awful (sorry Bram ☺️) interface and GUI) and, if I'm honest with myself, a little laziness here and there was the nail in the coffin, bringing me back to wanting well-made apps, conceived and designed by professional developers and not by the DIY lover in me.
I could try to design and build a Le Corbusier chaise longue myself… but if his is exhibited in many museums around the world and I'm sitting on an awful patchwork of materials and colors, there's a reason 😏.
The same holds in the music world: I could build myself a Stratocaster, but buying it from the experts will (almost) surely give me a better product, made with the expertise of someone who does it every day and studied to reach that level of competence. It also means I don't have to buy the 20 different tools and machines needed to build it.
BTW: TC-Data is not that esoteric at all…just visually appealing for a touchscreen, but not giving many more weapons to your arsenal than others do ☺️
Now…something like this is something I would really love to play! With the app to the right controlling various parameters of all the fxs opened in the chain (Of course with its own GUI and not a pasted image of a Touché 😂. And maybe with a midi keyboard plugged to the iPad)
I sampled the oscillator app and it still created a very unusual "instrument". I think Hainbach has had a significant impact on @brambos.
As always, the UI is wonderful. But it makes unusual sounds, very much like e-l-s-a by Erik Sigth but with a more logical UI.
Salome has basically become my Fairlight after ripping the samples from the Vogel CMI app (with help from Samu).
/sarcasm font
I can’t believe you didn’t include a Page R sequencer onboard, bram. What an oversight. But I’ll forgive you if it comes in the next update.
/s
Thanks. Though in that particular post I wasn't even dealing with the fx scenario; I was describing a potential problem with using this app's MPE output for its originally intended purpose.
An update is incoming where I have changed the Y-Axis controllers to Channel Pressure to make it more compatible with hardware controllers.
Edit: update 1.0.1 is out.
iOS is already available, I'm waiting for Apple to approve the MacOS Intel/ARM versions.
Any XY controller can send one cc output for each axis and those cc's can be used to control the wobble of a cutoff and a reverb wet/dry mix. That's standard stuff. Some XY controllers can also send a message on touch and on release. Those can be used as on/off triggers for other FX.
The only thing you mention that isn't (very) practical on iOS is pushing in and out and sensing that pressure to send control changes because we don't have pressure sensitive screens. Velocity Keyboard tries to use the width of finger touch to imitate this. I don't know how successful it is.
Anyway. I understand now that you want to engage with developers directly about your app ideas and you don't want someone pointing out where you've expressed ideas in ways that can't be accomplished on iOS. When I do so it's not to shoot down ideas but to try to help refine those requests. If I think there's a way to do some of the things people want with existing tools I try to be helpful by pointing them out. But I understand and have noted that you don't want that.
I wish you the best in grabbing the interest of developers with your ideas. 👍🏼
Ha I just commented on another thread saying I wish the Fairlight app would get some TLC. A big update and AU would make my year probably lol
That and the SP1200 finally coming to iOS.
No no… the discussion was very interesting; I'm just wondering whether it's really not possible for a developer.
And I was just pointing out that I've never been able to achieve this with the app suggestions you gave, probably because of a lack of tools in them (you named one: width of finger touch to imitate pressure; is it available in any of the MIDI controller building apps?) or a lack in my skills. And also that sometimes it's more comfortable/better to have apps done and ready vs. having to build things.
I'm sorry if you felt left out. I answered that I'd tried and failed, not thinking it could make you feel that I didn't want your suggestions. If anything, the opposite: maybe knowing that I tried, you could give even more suggestions on how to succeed where I failed.
I’m open to follow any instruction about how to build that, if you have any.
Did my answer sound so uninterested? I need to review my English; sorry about that.
In any case it was just a question/suggestion… and he's not answering, so… case closed 😉
It is a Sunday… and you can never rush the Dutch! 😎
Just chiming in because I finally bought the app (been a busy weekend). It’s really really great so far. It also fills a hole in my setup that I’ve been wanting. Simpler than AudioLayer but still powerful and fun. Seems like one of those magic sauce apps that just spits out cool stuff no matter what you throw at it.
One thing I really like is latch mode and being able to change reverse, looping, and pitch per note. Makes for some cool drones.
The only app that I know of that attempts pressure sensitivity is Velocity Keyboard. You may want to check it out because it seems to be expressive in the ways you've said you'd like to see. I just can't say whether this pseudo method of detecting pressure is effective or not. I don't have the app. None of the custom surface building apps have this feature nor any way of detecting finger pressure.
A point that may have been obscured in my admittedly overlong walls of text: no matter what app you use, the communication between apps to control parameters can only happen via MIDI. There is just no other way.
The hangup is MPE. If an app such as Salome outputs MPE then it can't be used to control other apps as such. This is because the output is spread over multiple midi channels that vary according to how many notes are held down. To make use of MPE requires multiplexing that output down to a single channel. I linked a Mozaic script that can do that, though I understand you prefer a purpose built app to cobbling something together. That's perfectly reasonable.
My purpose in pointing all that out is to help in framing the request better. What is impossible to have is an app that directly controls other apps' parameters. What you don't want is MPE output. It sounds like what you want is something with at least one XY controller with some kind of pressure detection. tbh, Velocity Keyboard seems like it might fit the bill. I don't own it so I can't say for sure.
No, I didn't feel put aside, but did conclude that any further input I might have is off point to what you're after.
There are no tools to build all that you want. As I see it you have Velocity Keyboard as a possible option, or use the script I recommended to try converting MPE output from something like Salome to use all but pressure as an expression. I was sincere in wishing you luck engaging a developer on this score.
No, not at all. I just had said all that I thought was useful by that time.
Reposted for anyone using the older suggested StreamByter hack to put slides out as CC74 on channel 16.
@Brambos changed the default Y-axis slider to Channel Pressure, so the StreamByter one-liner to associate vertical slides with CC74 on channel 16 is:
DX = BF 4A X2
DX detects Channel Pressure messages on any channel.
BF substitutes a CC status byte on channel 16 = F (actually it's 15, but MIDI counts channel 1 as '0').
4A is decimal 74 in hexadecimal.
X2 copies the original Channel Pressure value over into the CC74 value. I had to Google for this solution.
The original StreamByter one-liner, when the previous rev defaulted to CC74s on each MIDI channel, was:
BX = BF +C
Every CC command will be cloned on channel 16.
Nicely done @brambos. I particularly like the way you handled crossfade even when the loop point is at the start point, and how it takes the crossfade at the start (unless set to start+loop). It's a small consideration you don't really see in samplers with crossfade.
The MPE feature is interesting for a sampler. @brambos, if I play this using an MPE player like GeoShred in AUM, will it respond to x-axis slides for pitch bend? Or is the x-axis slide limited to inside this app only?
With another sampler app I used, whenever I slid along the x-axis for smoother pitch bending, it always sounded like a chipmunk if I slid more than 2 semitones. Can someone do a video of how the sliding sounds using an MPE player (GeoShred, Velocity Keyboard, LinnStrument)? Will really appreciate it.