Which synths are directly controllable via MIDI CC when running as AUs?
In a different thread, @espiegel123 said:
Some synths have built-in MIDI mappings regardless of whether they are standalone or AU . . .
So far I've checked with Brambos' Ripplemaker, which publishes its MIDI mappings in its docs and is indeed directly controllable in AUM, independently of AUM's AU Parameter MIDI Control system. Zeeon, apparently on the other end of the spectrum, does not publish default mappings, and even if you know them they are superseded when running as an AU within AUM (not sure about this, maybe someone can confirm).
Is there any list of synths, and where they fall with these behaviors? I think Brambos has a policy of enabling this until AU Midi Control is perfectly standardized and implemented by all AUs (which seems like it might be a while).
For now, directly controlling with MIDI CC in AUM has some flexibility that using AUM's MIDI Control does not (though vice versa is also true, I think), and I'm looking for more synths that can be controlled directly as AUs. Bonus points go to synths that also publish their MIDI mappings.
If there is no list of synths like this, could we start one in this thread?
Ripplemaker (and all Brambos synths) --
publishes midi mappings -- YES
native midi mappings work when bypassing AU Host parameter control -- YES
Zeeon
publishes midi mappings -- NO
native midi mappings work when bypassing AU Host parameter control -- NO
Comments
What about apps that have MIDI learn? That's more useful than published CCs, IMO. That is, provided the mappings are saved so that the same mappings will work in another host and in the standalone app, if there is one.
That is by far my preference. I'd rather take some time mapping midi controls than be forced to adapt my controller to published mappings.
IMO, in AU apps with MIDI learn, the learned mapping should be able to be overridden by the corresponding AU parameter. For instance, if I have CC 20 mapped to a filter cutoff but then I map something to the AU parameter in AUM, the AU parameter should have the priority.
The reason for this is that AU parameters are far more fine-grained than MIDI CCs. It's also a non-destructive way of using other mappings instead of the MIDI-learned ones when you need them.
MIDI Learn is fine if it exists. Effectively, then, I can set up any MIDI mapping I want. What I really want is a way to control AU synths via MIDI CC directly, without going through the AU host's parameter control system. So whatever MIDI mapping I have (whether default/native or set up via MIDI Learn), I want it to work within an AU host and be able to bypass the host's parameter control.
Yes, there are advantages to using an AU host's MIDI Control system. But for the particular thing I'm trying to do (roll my own MPE implementation within AUM), the AUM system is not flexible enough, and you can't automate AUM's MIDI Control settings themselves; they have to be changed manually. This is all to create an AUM/Mozaic/multi-synth-instance system that works together as essentially a single MPE synth: a way to MPE-enable non-MPE synths. Setting things up to play as an MPE synth is not hard; the problems arise when you want to automate editing, so that changes made to one "master" synth instance are automatically reflected in all the others.
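The "reflected in all the others" part is the piece I'd expect a small translator script to handle. As a very rough Mozaic sketch (assuming each synth instance listens on its own MIDI channel, 0-3 here, and with event and variable names that should be double-checked against the Mozaic manual), it's just re-broadcasting every incoming CC edit:
@OnMidiCC
  //sketch only: four synth instances, one per MIDI channel (0-3);
  //any CC edit coming from the controller gets re-sent to all of them
  for ch = 0 to 3
    SendMIDICC ch, MIDICC, MIDIValue
  endfor
@End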
Yep, I agree. I was just pointing out that publishing fixed MIDI mappings to controls isn't as good as having MIDI learn, IMO. But also, if there is MIDI learn, the mappings need to be remembered so that you don't have to re-map in each instance or when you use the app in a different host.
Yeah, but maybe that gets at why I don't really care about MIDI Learn for this purpose. All I really want are some mappings I can use within AUM. MIDI Learn seems to add an extra step, where I need to do the learning to set up the initial mappings. It's easier to just have a list of default mappings used by the synths, without going through the MIDI Learn step. This gets at something I'm not clear about: do synths with MIDI Learn sometimes have no default MIDI CC mappings? That is, are there some synths where you can't control them at all unless you've first created mappings via MIDI Learn?
See, the problem I have with fixed CCs rather than midi learn is my controllers. It's way easier to map the knobs to controls within a synth than it is to re-program my controllers to send specific CCs with specific knobs. In fact, it's a pain in the butt.
Also, I only ever care about tweaking a few controls for each synth from my controllers. I prefer mapping a few controls to having to keep listings around of every CC on every synth to look up each time I want to control something. I want to always be able to know that knob 1 can control cutoff, knob 2 controls resonance, knob 3 controls ... etc.
Sometimes too I'd like one knob to control more than one parameter.
Many synths have more than 128 parameters now, too, and there are only 128 CC numbers to choose from.
Let's say I'm not using a controller, but sending automation from Xequence 2. I set my automation up with one CC, but then I decide to change the app it's pointing to. Do I need to re-do the automation because the other app doesn't use that same CC?
Yes, synths with MIDI Learn often don't have default midi cc's. I prefer it this way because it reduces the chance that some stray CC will end up modulating something unexpected when I load a synth.
@wim: IMO, instead of remapping controls on my controllers (which is a royal pain), I use Mozaic or StreamByter scripts (for this application SB gets my vote) that sit between my controller and destinations and remap the CCs coming from the controller. Once you have set up one such translator (in StreamByter it would be just one line per CC), creating new ones with different mappings takes little time.
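The Mozaic version isn't much longer per CC, either. A rough sketch (the CC numbers are just examples, and the exact event and variable names are from memory, so check the Mozaic manual):
@OnMidiCC
  //example: the controller knob sends CC 20, but this synth wants cutoff on CC 74
  if MIDICC = 20
    SendMIDICC MIDIChannel, 74, MIDIValue
  else
    SendMIDIThru
  endif
@End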
Shouldn't there just be some middle layer that handles CC translation (and note translation too, for instance for drum pads)? @wim, maybe you have something like this? Would be cool to be able to change input and/or output devices independently of each other.
I thought about creating a Mozaic script that would be extensible for different drum mappings, but abandoned it because it really only made sense if maps could be imported from external files rather than modifying Mozaic script code every time.
Then I set about trying to code an iOS AUv3 that would do this, but gave up in defeat at trying to learn AUv3 coding. I might pick up the idea again some day, but for now that idea is dead.
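For anyone who wants the quick-and-dirty route in the meantime, the hard-coded version of the note translation is tiny in Mozaic. A sketch only, with made-up pad and drum note numbers and just one pad shown:
@OnMidiNoteOn
  //example: the pad sends note 38, but the drum app expects its snare on note 40
  if MIDINote = 38
    SendMIDINoteOn MIDIChannel, 40, MIDIVelocity
  else
    SendMIDIThru
  endif
@End

@OnMidiNoteOff
  //mirror the same mapping for note-offs so notes don't hang
  if MIDINote = 38
    SendMIDINoteOff MIDIChannel, 40, MIDIVelocity
  else
    SendMIDIThru
  endif
@End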
I do that sometimes too, but for my purposes it's simpler to have fewer plugins and more direct routing.
mfxConvert can be a handy way to map controls while avoiding scripting as well.
This is the sort of approach I tend to like. It has the advantage of clarity. With MIDI Learn you link things up, which is fine, but I like having a text list of control types and cc#'s that I can read easily all at once. It's also good to be able to see all at once just what synth functions are mapped and which are not.
So, what synths, pray tell, have midi learn?
A lot of them.
I'm too lazy to bother sorting through my apps to list them. It's easy enough to discover whether a given app has MIDI learn or not, as long as you're pretty good at poking around looking for likely places for the activation button to be. I'll admit it isn't always obvious.
I think you'll have a lot of trouble finding synth apps with pre-defined MIDI mappings, let alone those that still implement them as AUv3. First, as @wim mentioned, many modern synths have too many parameters to include them all as individual CCs. Second, if they implement MIDI Learn when stand-alone, and the parameter interface when AUv3, they've provided MIDI CC access without needing a separate mapping. Third, pre-defined mapping requires a naive user to change her controller to comply; only hackers like us will know how to handle that in StreamByter.
That said, the Burns Audio synths included in the Spectrum bundle do have pre-defined maps (they are relatively simple units). Don't know if those controls can be accessed in AUv3; you could test that yourself. Edit: these are AUv3 only, so the MIDI CCs must be available in AUv3.
Yeah, that's no problem. MIDI Learn is fine in itself, for me it just involves an extra step of creating default mappings. And for my particular problem I need to bypass the host's AU parameter interface.
Thanks. I think those are already MPE-enabled, so I wouldn't have this issue with using them, but I'll check.
Model D’s standalone assignments carry over to the AU, with fast learn. I wish that were standard.
I think iSEM's standalone MIDI learn assignments also carry over to the AU.
KQ Dixie has a very good system for controlling its parameters via MIDI CC. You can specify whatever CC you want anywhere in the sound chain or for global parameters.
The Rosetta suite has support for a lot of different drum apps and common drum mappings.
GeoShred also has support for a lot of MPE synths and apps. I hope they add more.
While it might not be the most enjoyable task in the world, it certainly is possible to create a set of mappings you can control from Mozaic. You can create a set of data that the script loads in; you simply update a few global variables in your script and append the data for your synth. A spreadsheet could be used to create these synth-specific mappings, so you could easily sort the info by synth in a way that makes sense to you and then export the data you need for the MIDI mappings.
I used these sorts of techniques when I created a script which used Scala file information to convert MIDI input to the selected scale. The data structure took into account the two different methods of specifying Scala scale degrees: by frequency expressed in cents, or as a ratio, as in the data structure for one of the scales shown below:
@Scl7
//Partch, Partch 43T
//from Wilsonic app scala file.
InfoScl = [42, 69, 440, 1, 7] //Tones, RootNote, rootFreq, RatioType, SclSel
numerator = [1,81,33,21,16,16,12,11,10,9,8,7,32,6,11,5,14,9,21,4,27,11,7,10,16,40,3,32,14,11,8,18,5,27,12,7,16,9,20,11,15,40,64,160]
denominator = [1,80,32,20,15,15,11,10,9,8,7,6,27,5,9,4,11,7,16,3,20,8,5,7,11,27,2,21,9,7,5,11,3,16,7,4,9,5,11,6,8,21,33,81]
@End
You could create data structures to reflect your controllers and the types of synths they’re used for.
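For instance, a per-synth block in the same style might look like the sketch below. The numbers are invented; in practice you'd fill them in from the synth's docs or your own MIDI learn setup, and a Call @MapSynthA in @OnLoad (or from a menu control) would load the table for an @OnMidiCC handler to look incoming CCs up in.
@MapSynthA
  //hypothetical synth "A"
  knobCC = [20, 21, 22, 23]    //what the hardware knobs send
  synthCC = [74, 71, 73, 72]   //what this synth expects: cutoff, resonance, attack, release
@End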
If you don't want to deal with the extra layer of selecting controls, you could instead create a different Mozaic preset for each synth you want to map, so you can use the same set of controls with your MIDI hardware or MIDI app by doing a find-and-replace on the relevant parameters in the script. It would be nice if Mozaic had support for file path info, so you could organize scripts into folders that could then be navigated via a corresponding preset system. That would help with these sorts of solutions where you want to modify a basic script rather than build a menu system into your Mozaic script.