AUv3 hard limits
The user and all related content has been deleted.
Comments
Perforator ?
AUv3, audio FX and MIDI CC output
It's definitely possible, because Layr's arpeggiator throws out its MIDI while playing the audio to Cubasis. I wish it wouldn't, tbh.
That's 2 then
I think the limits that @Dawdles is hitting are maybe imposed by the host and not AU ?
@Dawdles
I think the technical limits of each AU are only just becoming clear to the devs, as I seem to recall many devs saying that there is little to no documentation. You will even find devs in slight disagreement over what they believe can or cannot be done.
So, my ideas are usually based on what we already know from current apps:
We know that some MIDI is obviously part of any AU app, as they need to be controlled by the hosts.
We know that we now have AU MIDI that can be many types of controller - really the limit has not been anywhere near reached yet. Imagine the control surface of any app you have, be it Patterning, KRFT, Animoog's keyboard, Shoom's slider keys, Seline Redux's keyboards, TC-Data etc. - all these interfaces could be designs for AU MIDI controllers. You like Patterning's wheel interface? Easy: connect it up and use it to send MIDI to any compatible AU sound source.
We know sound sources are many and varied, but some just seem stuck with really boring keyboards or really unintuitive drum pad set-ups - this no longer needs to be the case. You want to use Animoog-style keys? Load its AU MIDI plugin, load an AU sound source, either select the appropriate MIDI settings or pick an already preprogrammed set, and all of it is saved for later use by the host!
We just need appropriate hosts, and many are already taking AU MIDI on board. In the future we will see more hosts, more documentation, more choice and more ease of use.
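To make the "any controller driving any sound source" idea concrete, here's a toy sketch of the raw MIDI bytes such a control surface emits. This is plain Python, not actual AUv3 plugin code; the status-byte values come from the MIDI 1.0 spec, and the function names are just for illustration:

```python
# Toy illustration of the MIDI wire format a control-surface plugin emits.
# Status bytes: 0x90 = note-on, 0xB0 = control change; low nibble = channel.

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """3-byte MIDI note-on message, e.g. from tapping a pad."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def control_change(channel: int, cc: int, value: int) -> bytes:
    """3-byte MIDI CC message, e.g. from a wheel or slider."""
    return bytes([0xB0 | (channel & 0x0F), cc & 0x7F, value & 0x7F])

# A Patterning-style wheel mapped to CC 1 (mod wheel) on channel 0:
print(control_change(0, 1, 64).hex())  # b00140
# Middle C at full velocity:
print(note_on(0, 60, 127).hex())       # 903c7f
```

Whatever the on-screen interface looks like (wheel, slider keys, pads), what reaches the sound source is just these few bytes, which is why any controller can in principle drive any compatible AU instrument.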
In the future, if you want to open an individual program, that may just be a host. Imagine you open AUM or AudioBus 2000 and select which save you want - everything loads. Then you decide you want a different complex set-up - you look on the AudioBus forums, find one of the many macro set-ups designed by one of us tweakers, and Bob's yer uncle!
Separating the controller from the sound source has major advantages though. Hosting has major advantages.
I know it's tough sometimes for people to get their heads around, but why the need for internal sounds? You say imagine Patterning with internal sounds and running external sounds - but why is this needed when an AU MIDI interface can drive as many sets of sounds as the host allows you to throw its MIDI at? You can then have a different controller that's programmed to change any or all of those sound sets in real time. You then have a host that can save all of this and even show multiple controller screens on the iPad at once!
Keeping sources and controllers permanently linked is just not anywhere near as adaptable as a separate approach.
As things stand today, iOS music production hasn't reached desktop/laptop maturity.
AU is the closest to that unified workflow. So you will have to use only AU instruments, FX and MIDI devices in a host of your choice. There are still issues with some AUs which are difficult to render/freeze properly, so you'll sometimes have to make choices and avoid some AUs, or use IAA/Audiobus to record them as audio depending on the host; this is what I do with GarageBand. You can also sample some of them in BM3.
For the moment you will also still need to route AU MIDI to AU instruments. And concerning all the nice IAA apps, you will need to choose: avoid/forget them, accept using AB for its state saving, or save songs in different apps. If you record/render things as audio in BM3, the saved sessions will serve as references in case you need to rework something, but you can have your BM3 project full of audio recordings without needing to reopen the other apps.
iOS is also virtual-studio oriented, as an alternative to the AU workflow before it arrived. We're in a transition. Audiobus/AUM have their strengths; I tend to use everything that I like, including IAA, except for FX, where I use only AU.
The big question is whether Apple's approach will give devs a way to replicate the desktop VST workflow. We're close, but app prices and iOS's lack of optimisation give me some doubts about that. I try to take iOS music production for what it is now, find creative ways to do my music stuff with it, and hope for some improvements in the AU workflow most of us want.
Separate approach - I forgot to mention the savings in development time! This could really be helpful. Even hosts don't need to have multiple control surfaces when they could just load AU MIDI controllers that someone else designed. No more need for built-in synths or even some of the sequencer parts.
We are talking a totally modular environment for music making!
No more big cumbersome music programs. You buy what bits you want and put it all together!
AU mixing audio and MIDI is definitely possible. Some already do: Perforator and FAC Envolver, Moog Model 15 and Model D are just some examples.
However, separating MIDI and audio makes a lot of sense in my opinion. Sequencer and sound source are different responsibilities for music apps and being able to mix and match them per project makes all the sense in the world. Sometimes you don't want an XOX sequencer for your drums. Or you want a step-sequencer for your bass instead of a piano roll. Being able to swap them out whenever you like with the lowest possible resource overhead is only logical.
There's a reason all Rozeta plugins only use 512KB per instance - it's because they only contain what they need, without unused sound engines, needless abstraction layers, etc.
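The "different responsibilities" point can be sketched in a few lines. All class names here are hypothetical and this is plain Python, not AUv3 code; the point is only that a sequencer produces events, a sound source consumes them, and the host can wire any one to any other:

```python
# Sketch: sequencer and sound source as separate, swappable components.
from typing import Iterable, Protocol

class Sequencer(Protocol):
    def events(self) -> Iterable[tuple[float, int]]: ...  # (beat, note)

class XOXSequencer:
    """16-step drum-style sequencer: a step pattern for one note."""
    def __init__(self, note: int, pattern: str):
        self.note, self.pattern = note, pattern  # e.g. "x...x...x...x..."
    def events(self):
        return [(step / 4, self.note)            # 4 steps per beat
                for step, hit in enumerate(self.pattern) if hit == "x"]

class Synth:
    """Stand-in sound source: just records what it is asked to play."""
    def __init__(self):
        self.played = []
    def play(self, beat: float, note: int):
        self.played.append((beat, note))

def host_connect(seq: Sequencer, source: Synth):
    """The host's job: deliver any sequencer's events to any source."""
    for beat, note in seq.events():
        source.play(beat, note)

synth = Synth()
host_connect(XOXSequencer(note=36, pattern="x...x...x...x..."), synth)
print(synth.played)  # [(0.0, 36), (1.0, 36), (2.0, 36), (3.0, 36)]
```

Swapping the XOX sequencer for a piano roll (or the synth for a sampler) only changes which object is plugged in; nothing else in the chain cares.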
I point you at BramBos answer. We are essentially on the same page.
Every time a new IAA (that is not a host) is released, a tear falls from my aging face lol
At this point, I would say it's the hosts holding back the standard. For a while there were almost more AUv3 hosts than plugins, but right now critical mass has been reached in terms of plugins, but hosts are a bit inert.
First off, I am very happy that we're seeing a relatively fast uptake of AU MIDI (in spite of Rozeta being virtually the only plugins out there). Especially for AU MIDI instruments/generators.
But having said that, MIDI and audio routing between AU plugins is still clunky - if possible at all - in many hosts. For example:
some AU hosts still don't properly support tempo/transport/song-position sync with AU plugins
I think most hosts are willing and definitely not consciously blocking AU progress. It's often a matter of legacy design, backwards compatibility with IAA eating up resources, Apple not being particularly helpful wrt documentation, and having a lot of other things on their priority lists.
Thankfully we have the brilliant AUM to fall back onto... our unofficial reference host
Not sure I'll be here by then, we're only on 3 at the moment
Spot on - a dev who is a master of UI but maybe not so much on DSP can concentrate on the UI for control, while someone else who is brilliant at DSP but not so much at UI can concentrate on the DSP...
Not to mention the easier 'unit' testing of the separated components.
Imagine AU midi controllers designed by the makers of your hardware gear, mirroring them for better hardware and iPad integration!
You can do it all in one app already: use Cubasis or Gadget and you can do pretty much everything you want. If however you want to use sound sources or FX that are not available in those apps, then what? You have to have a mechanism for plug-ins... which is AU.
What I think @Fruitbat1919 is saying, and what I agree with is that if ALL synths and controllers were AUv3 then we could use any controller with any synth.....
All we need is a host that supports all forms of AU... I've not tried Sequencism, but of the others I have tried, AUM is still the most flexible in terms of routing, though as a result it gets quite messy. Cubasis on the other hand remains nice and tidy, but you have the limit of only being able to use MIDI-out AUs in the MIDI FX slots and audio-out AUs in instrument slots (which may be the restriction @benkamen is referring to).
Hardware is mostly software now anyway. The essence of how a single piece of hardware runs can be emulated just as easily with a separate source and controller as it can with one standalone IAA app.
Even as the poor App Store stands, you can get apps in bundles.
Already have
I want a Bass Station II, Mininova and Circuit (One drum, one synth) AU MIDI Control Surfaces please @AmpifyxNovation
Bingo, this is why I also feel that 'instruments' & 'audio-tracks' should be kept separate from the 'sequencer/arranger'. (Not necessarily in the UI but 'behind the scenes').
The 'sequencer/arranger' could be used to 'trigger & control' the 'instruments' & 'audio-tracks' or other 'objets' which could be hosted in a separate 'object rack'.
A routing matrix could be used to define the 'inter-connections' between the objects.
The sequencer/arranger could then 'feed the matrix' with the required information.
Multiple tracks from the arranger could then feed the same or multiple objects if needed.
Data from multiple sequencer/arranger tracks could be 'combined/split/processed' by a 'rack-object' etc. etc.
It could quickly get quite complex due to the fully modular nature but it would be insanely flexible
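A minimal sketch of that matrix idea (all names hypothetical, plain Python): the connections are just a set of (track, object) pairs, the arranger "feeds the matrix", and each event fans out to whichever rack objects are connected:

```python
# Sketch of the proposed routing matrix between arranger tracks
# and rack objects. Names are illustrative, not from any real app.
class RoutingMatrix:
    def __init__(self):
        self.connections = set()          # {(track, object)}
    def connect(self, track: str, obj: str):
        self.connections.add((track, obj))
    def targets(self, track: str):
        return sorted(o for t, o in self.connections if t == track)

def feed(matrix: RoutingMatrix, track: str, event: str):
    """The arranger 'feeds the matrix'; events fan out to all targets."""
    return [(obj, event) for obj in matrix.targets(track)]

matrix = RoutingMatrix()
matrix.connect("drum track", "drum synth")
matrix.connect("drum track", "sampler")   # one track, multiple objects
matrix.connect("bass track", "bass synth")

print(feed(matrix, "drum track", "note-on C1"))
# [('drum synth', 'note-on C1'), ('sampler', 'note-on C1')]
```

The same structure covers the "multiple tracks feeding one object" case too: nothing stops several tracks from connecting to the same object name.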
It makes so much sense to me. Lots of little hardware boxes all controlled neatly by my iPad with mirrored interfaces. Just need Apple to produce a pro midi breakout box that costs less than a second hand car lol
One of the benefits of this more modular way of thinking is that, if done right, it could help keep certain long-term hosts more manageable.
Look at Cubasis. They have been developing that for years under free updates. This is probably unsustainable long term. I do believe making hosts slimmer by being modular would help devs adopt more cost-effective long-term strategies.
So in summation:
AU is developing and will continue to.
We already have a good basis for better product with the current state of AU.
We need more continued development of the AU hosts to benefit what AU already can do.
We need to look forward to how this new model can improve development and make the most of scarce resources.
We need to have more open discussions like this without getting bogged down with those that are attached to certain apps so firmly that we all get labelled heretics lol
USB works fine (with CCK and hub) for those hardware units that have it, but for 'real' MIDI gear with 'proper' 5-pin DIN, you do need a decent MIDI interface if you're going to be generating a lot of traffic using controllers. I have a couple of the cheap USB-to-MIDI interfaces that I got for less than £5 on eBay... OK for notes, but not great with controllers and definitely crap when using MIDI Clock as well.
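Some back-of-envelope numbers on why DIN MIDI is so easy to saturate: the wire runs at a fixed 31,250 baud with 10 bits per byte, so about 3,125 bytes/s total, and clock plus dense controller sweeps eat into that budget (cheap interfaces with tiny buffers are what jitter first). A rough Python sketch of the arithmetic:

```python
# Back-of-envelope DIN MIDI bandwidth budget (MIDI 1.0 figures:
# 31250 baud, 8N1 framing = 10 bits per byte on the wire).
DIN_BYTES_PER_SEC = 31250 / 10            # 3125 bytes/s, hard ceiling

def clock_bytes_per_sec(bpm: float) -> float:
    """MIDI clock: 24 one-byte real-time ticks per quarter note."""
    return 24 * (bpm / 60.0)

def cc_sweep_bytes(events: int) -> int:
    """Controller traffic: 3 bytes per CC message (no running status)."""
    return events * 3

bpm = 120
used = clock_bytes_per_sec(bpm) + cc_sweep_bytes(200)  # 200 CCs in a second
print(f"clock: {clock_bytes_per_sec(bpm):.0f} B/s")         # clock: 48 B/s
print(f"total: {used:.0f} of {DIN_BYTES_PER_SEC:.0f} B/s")  # total: 648 of 3125 B/s
```

One dense controller stream plus clock is fine on paper, but a few simultaneous sweeps multiply that quickly, and interfaces that can't buffer the bursts are where notes and clock ticks start arriving late.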
We maybe need a MIDI control surface (MDP/TouchOSC/Lemur) that runs as a MIDI AU (full app with editor / AU player-only type thing)!!
Along with templates this would sort out the hardware controller thing.
What would also be neat, though it would probably not bring any financial benefit to the manufacturers, would be to not only have an AU MIDI control surface for the hardware, but an AU of the sound engine too... maybe when you buy the gear, in the box you get a code for the app(s) (codes can only be redeemed once, right?)
That way I could be using, say, Cubasis, with tracks sequencing my synths... unplug the CCK and use an AU version of the hardware instead, so I can carry on with my tune while out and about, then when I'm back home, plug the CCK in again and be back using the hardware...
Hi have you any links to the Drambo app mentioned in your post?
A good example of the kind of AU MIDI out I'm interested in is a drum AU like Roli Noise. It has its own sounds, but I can use its touch surface to record its MIDI events direct to the timeline.
Modular sequencer plugins are also very cool, maybe even cooler. But basic MIDI out in other plugins is also very useful.
I think Patterning 2 is an epic missed AUpportunity — but that’s how things go.
I see what you did there