Comments
I've read this multiple times over the last few days ... Maybe I'm missing something, but to me it looks like NS2's MIDI/audio routing is capable of the same complexity as AUM/ApeMatrix, and more ... It seems people really don't know about the true routing possibilities in NS2...
Ok - there are current bugs in MIDI AUfx routing (MIDI is not passed from one plugin to another) - but this is just a bug which will be fixed in an upcoming update ... Regarding pure MIDI/audio routing between channels, there is literally NO LIMIT - NS2 is the most capable host app in terms of MIDI/audio routing available at the moment.
Of course maybe i'm missing something in AUM/ApeMatrix ...
Possibly. Though I think I may be missing it too. The audio routing in Nanostudio is very good, I think, once you wrap your head around it. I like the Reaper model, and Nanostudio does a good job implementing it. The only limitation I can really think of is that you are limited on send/aux channels. Oh, and automation for effects doesn't seem to work.
There are also a couple of things that I found confusing. It took me a while to realize that if I wanted to add AUv3 plugins, I had to add them in the audio strip rather than in Obsidian/Slate. It makes sense - but it wasn't intuitive.
ApeMatrix has some interesting tricks for midi that I don't think any other app can match (BM3 is a possible exception here). LFOs for parameters being a big one. It also has an interface that makes it far easier to visualize what you're doing and switch between apps. And the interface for midi routing in both ApeMatrix and AUM is superior to anything in Nanostudio - though I'm not sure that Nanostudio really needs that complexity.
Hm, that sounds interesting - what is this? I haven't tested ApeM that deeply ... I guess you can control any plugin parameter with a dedicated LFO? If that's it, then this is pretty cool!
Ok ok, of course. I was talking just about pure MIDI/audio routing. Btw, you can automate the built-in FX...
Yeah.. basically just sends and aux (group) channels. But don't forget there isn't any limit on the number of sends or on the number of group channels (and also no limit on their parenting - you can have groups of groups of groups of groups ...). Another thing is that you also have MIDI sends, not just audio sends.
And every channel is able to propagate audio not just to its parent channel but also directly to the main device HW output. And you can set MIDI to be sent to the parent channel as well if you want.
In terms of routing, there is almost nothing you can't do.
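To make the "no limit" claim concrete, here's a toy Python model (my own sketch, not anything from NS2's code) of channels that can nest inside groups arbitrarily deep, carry any number of sends, and also feed the main output directly:

```python
class Channel:
    """Toy mixer channel: audio flows to the parent group, plus any sends."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.sends = []          # extra destinations; no limit on how many

    def add_send(self, dest):
        self.sends.append(dest)

    def depth(self):
        # groups of groups of groups... nesting is unbounded
        return 0 if self.parent is None else 1 + self.parent.depth()

main = Channel("Main")
grp_master = Channel("Master sends", parent=main)
grp_delays = Channel("Delays", parent=grp_master)  # a group inside a group
delay1 = Channel("Delay 1", parent=grp_delays)
delay1.add_send(main)            # any channel can also feed Main directly
print(delay1.depth(), len(delay1.sends))
```

The names here ("Master sends", "Delays") just mirror the delay-send-group example from the thread; the point is only that parenting and sends compose without any hard limit.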
But I understand that, especially for simpler projects and for live jams, the matrix view of AUM/ApeMatrix is a lot more intuitive, no doubt about that. Actually, we discussed some kind of "matrix" overview of all mixer routings during development, but obviously there were a lot of other, more important tasks to do, so the idea faded away.
Yeah, the developer is definitely of the school that if you have a control, then that control should have an LFO. And to be clear, I don't expect Nanostudio to have those features (AUM and Audiobus don't), just giving you an idea of quite how insanely flexible that thing is. It's basically a plugin host reinvented as a modular synth.
Sure, but isn't there a limit on how many you can have? Or am I misremembering this? If aux and sends are unlimited then yeah - it's basically Reaper minus audio lanes. Which is certainly my preferred paradigm.
It’s unlimited. For example, I have 3 different delay sends that are grouped so I can apply a single filter on all of them. And that delay send group is a member of the master send group. Then I can do stuff like automate the volume of the master send and do instant all send muting for interesting breakdown effects.
I had 20 sends on this tune before I scaled back and realized I was just getting out of hand.
What is the general consensus of the best all-around beginners orientation of NS2? Not one that gets into the nitty gritty, but a solid one that goes through most of the main stuff? Hopefully something under an hour?
Maybe start here?
@cian sounds pretty interesting, going to give ApeMatrix another round
@skiphunt there is nothing better than the Platinumaudiolab videos; they are thematic, so you can watch just some based on what interests you.. they are also relatively short - one hour to watch all the important parts is more than enough..
https://www.blipinteractive.co.uk/learn/
Clear, concise, doesn't meander. Doesn't sound like he's wingin' it while trying to figure it out as he goes. Perfect.
Thanks!
Just btw, Steven, who made all those tutorial videos, is also the author of all the IAP packs ;-) That guy knows how to work hard.
I wish he'd pump out some more IAPs (he says like it's nuttin'), I like them, but I also liked buying them as small payments for the excellent video series.
Right. The old who-does-and-who-doesn't-publish-a-virtual-MIDI-port thing. Wish all apps did! I use MIDI Flow's custom virtual ports for this. They're dead simple to set up and you can have as many as you want. You then, for example, point AUM MIDI OUT to your new port and NS MIDI IN to the same.
If the MIDI-sending app also doesn't publish a virtual port, a workaround is required (or just use AB3 for hosting). Otherwise, I reckon NS2 is quite ideal for controlling from the outside world. I use it with my hardware sequencers and Quantum regularly like this—Q and my MIDI interface both just show up in NS, so no workarounds required.
I mean, I totally agree that NS should expose a virtual port but using it with apps that do publish a virtual port or creating your own with MIDI Flow (or MIDI Fire or MIDI Bridge or...) is so quick and simple it never feels 'in the way' or kludgy to me.
Yes. You must’ve missed my follow up post. Using midiflow is exactly what I did to get external stuff working.
Moving on to the sampler after finishing up the vids on send fx, etc.
Ha. I guess I should have read the rest of the thread before replying.
Glad you got it sorted.
HA 2x. I even missed your follow up about your follow up to my follow up. Closing the browser...
Well, it was your original post about driving Obsidian with apeMatrix mini doodads that set me off on the wild goose chase.
Playing with the obsidian sampler now... pretty cool.
Cool. Sampler OSC + FM OSC is my favorite.
One fun thing that you prolly already know, but I'll say it aloud anyway because it's part of its strengths as a standalone multi-timbral synth: you can set multiple Obsidian tracks to the same MIDI channel to layer stuff.
You can also set a key range per track. Obvious use case for this is splitting a keyboard but it can be used for fun stuff too. For instance, adding some oomph to the bass notes in an otherwise thin sound. Or making a 3 octave arpeggio play different sounds. Or, copying the same sound, having entirely different FX chains for different note ranges...
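As a toy illustration (plain Python, nothing Obsidian-specific, and the track names and split points are made up), a per-track key range is just a note filter in front of each layered instrument; overlapping ranges give you layering, disjoint ranges give you a keyboard split:

```python
# Each track listens to the same MIDI channel but keeps only its key range.
# Note numbers are standard MIDI (0-127); these ranges are hypothetical.
tracks = {
    "Sub bass": (0, 47),     # adds oomph to low notes only
    "Main pad": (36, 96),    # overlaps the bass range to layer/thicken it
    "Sparkle":  (84, 127),   # only the top of the keyboard
}

def route_note(note):
    """Return the names of the tracks that will play this note."""
    return [name for name, (lo, hi) in tracks.items() if lo <= note <= hi]

print(route_note(40))   # low note: layered by bass and pad
print(route_note(100))  # high note: sparkle only
```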
And beyond... Some rabbit holes:
Another multi-timbral rabbit hole... on the OSC VEL panel, you can set up velocity switching for each of the three oscillators. A non-obvious thing you might try:
Velocity switching is somewhat explained here: https://www.blipinteractive.co.uk/nanostudio2/user-manual/Obsidian.html#osc-panel
Of course, if the sounds aren't too complex, most of the above can be done within a single instance of Obsidian since it has three oscillators.
Oh yes! I'll dive into these rabbit holes later as I become more familiar with NS2, but I just tried setting up 5 instances and driving them all externally via KB-1 (instant connection). Awesome! Does Obsidian support MPE?
EDIT: I don't see it listed anywhere on the NS2 site specs.. so I'm guessing it doesn't. No matter, was mostly just curious.
@skiphunt No MPE at the moment.
Another layering thing to consider—you can set the AMP envelope's level to track velocity in reverse (Env->Scaling). So you could have one sound track velocity in the 'normal' way and another sound do the opposite in order sort of cross-fade them with playing dynamics (or seq vel)
You need a shed in the middle of the forest, with a blackboard, we shall come and bring cushions and tea.
It took me half an hour to figure out how to set this up... and yes, that’s a spectacular trick. Thanks!
That's a pretty cool list of tricks and tips !
For randomisation you can also use another method, which saves the LFO for other, more fancy stuff:
You can use the "Rand1" or "Rand2" modulation sources. These mod sources basically generate, for every voice, a random value from 0 to x, where x is the modulation depth:
This is deeply connected to one of my most favourite tricks. By default, Obsidian's oscillators are phase-synced (every voice starts with the same phase). This is cool in many cases (especially for percussive sounds or leads where you want to start with a significant transient). In combination with unison it produces a "phasing" effect (especially with just a small detune) - which again is great in many cases.
BUT.
In case you want to reproduce a more "analog-like" sound, you need so-called "free running" oscillators - they start with a different (semi-random) phase for every voice.
To obtain this, just set in the mod matrix, for example, Rand1 > Osc1 (or Rand1 > All Osc - it depends on what character of sound you want to get, how much randomness you want).
How does it affect the resulting sound? Significantly. The example above is exactly the same patch (2 oscillators, 4-voice unison with small detune and stereo spread, playing 4-note chords). In the second round, Rand1 > Osc1 (40%) and Rand2 > Osc2 (40%) modulation is added. The difference is very noticeable - the first round is synced, there is a noticeable phasing effect, the sound is sharp and cutting; the second round is nice and smooth, the voices are more blended, it sounds more "analog-like phat".
Such a significant change of sound character, obtained just by changing the oscillators from phase-synced to phase-random.
https://www.dropbox.com/s/2b19j3o1t4pz3on/SyncedAndUnsynced.wav?dl=0
Note: For the "Sample" oscillator, if you use "single cycle" waveforms, you can get the same effect by modulating "Rand > Oscillator sample start" instead of "Phase".
Note 2: For long samples (not "single cycle" waveforms), use just a very small amount of modulation (1-5). Try putting the same sample on Osc1 and Osc2, pan one to the left and the other to the right, and then apply Rand1 > Osc1 sample start (3%) and Rand2 > Osc2 sample start (2%) - instant huge stereo sound
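Here's a toy pure-Python sketch (not Obsidian's actual DSP) of why this works: phase-synced unison voices all peak together at note-on, giving a sharp transient and phasing, while random start phases blend from the first sample, like free-running analog oscillators:

```python
import math
import random

def render_unison(n_voices=4, base_freq=110.0, detune_cents=8.0,
                  sr=44100, n_samples=256, random_phase=False, seed=1):
    """Sum n detuned cosine voices, phase-synced or free-running."""
    rng = random.Random(seed)
    voices = []
    for v in range(n_voices):
        # spread detune symmetrically around the base frequency
        cents = detune_cents * (2 * v / (n_voices - 1) - 1)
        freq = base_freq * 2 ** (cents / 1200)
        phase = rng.uniform(0, 2 * math.pi) if random_phase else 0.0
        voices.append((freq, phase))
    return [sum(math.cos(2 * math.pi * f * (n / sr) + p) for f, p in voices)
            for n in range(n_samples)]

synced = render_unison(random_phase=False)
free = render_unison(random_phase=True)
# Synced voices all hit their peak together at sample 0 (sharp transient);
# random-phase voices start partially cancelled, i.e. more blended.
print(synced[0], abs(free[0]))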
I'd take it one step further: if you have Blocs Wave, use it to time-stretch the recording you have in AudioShare to the target tempo and THEN import into Slate.
Also, to answer an earlier question you had: it actually is possible to send MIDI from AUM hosting MIDI AUs and record it in NS2. You just need an intermediary MIDI port like the mf adapter. Since I already owned it, I just used it. Here are some example pictures:
@ExAsperis99 this might interest you too. Since both apps have Ableton Link, the MIDI is recorded in perfect time.
Now I just need to figure out if there is a way to make each track listen to a particular midi port and channel..... any help with this @dendy ?
Edit:- nevermind. Found it
I think @cian covered a lot of the differences in apeMatrix, but another key aspect is that FX react differently when they're set up as channel FX compared to when they're set up on send/return busses. Both of those aspects of routing can be matched in most audio hosts. Where things change in apeMatrix is that you can then set up both parallel and series FX paths, for both insert FX and send/return busses.
Add to that the same level of routing flexibility with regard to MIDI devices, and the result is routing flexibility I've not seen outside of high-end Eventide devices. And that's the way I view apeMatrix: it's the hub that enables me to roll my own sophisticated multi-FX where each discrete element is powered by an app of my choosing.
Once you get your head around the separate matrices, it's a walk in the park to use.
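The series vs parallel distinction above can be sketched in a few lines of Python (an illustrative toy, with a stand-in `halve` "effect", not how any of these hosts are implemented):

```python
def series(signal, fx_chain):
    """Insert-style: each FX processes the previous FX's output."""
    for fx in fx_chain:
        signal = [fx(s) for s in signal]
    return signal

def parallel_send(signal, fx_chain, send_level=0.5):
    """Send/return-style: FX run on a tapped copy, wet mixed back with dry."""
    wet = series([s * send_level for s in signal], fx_chain)
    return [d + w for d, w in zip(signal, wet)]

halve = lambda s: s * 0.5   # hypothetical stand-in for a real effect
dry = [1.0, 1.0]
print(series(dry, [halve, halve]))        # insert chain: fully processed
print(parallel_send(dry, [halve, halve])) # dry signal + attenuated wet copy
```

The same two effects give a very different result depending on the topology, which is why a host that lets you freely mix both (per insert slot and per bus) feels so flexible.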
apeMatrix is so good I wish I had it available to me in Ableton on the desktop. That's not to say Ableton can't do everything apeMatrix can, it's just more of a headache to achieve it. I use FL Studio as a VST within Ableton to achieve similar flexibility, as the workflow is an improvement on Ableton's, but it's still not as good as apeMatrix.
But it's worth saying here, Nanostudio 2 is very close to being my favourite DAW on iOS. It has a few critical things that don't work for me as yet which stop me from using it, but I'm very much looking forward to the next update.
Hm, still, all of this you can do in NS2 too.. you can send audio/MIDI from one channel to any number of other channels, every target channel can again send to any number of other channels, etc etc .. I really can't imagine any thinkable audio/MIDI route which would not be possible to do in NS2... Don't get me wrong, I'm not saying "NS2 is better" - I'm just curious and trying to understand if there is some routing (just pure routing) which is not possible to do in NS... (maybe then I can talk with Matt about improving NS routing capabilities ))
Give me some real example and I'll try to reproduce it in NS
Still - the main advantages of ApeMatrix to me are:
Those 3 things are really, really cool.. but just in terms of pure audio/MIDI routing possibilities, I don't see anything which would be possible in ApeMatrix and not in NS2
As I mentioned with regard to Ableton and FL Studio, it's not just a matter of what is and isn't possible - most hosts will get you there in the end. It's a matter of the workflow and the resulting visual representation of the complex routing network.
I've got a lot on my plate at the moment (and I've promised to write some bits for the Audiobus wiki) but multi-fx routing is part of what I'll be covering on the AB wiki so you'll see some real world examples when that gets posted.
The main reason I use aM, AUM, and AB is that I like to write my FX chains once and then have them available in all iOS DAWs. And one of my issues with Nanostudio 2 at the moment is that it's too much of an island that's attempting to be all things to all people. One of the main reasons I use iOS devices is the modular communication between apps. Back in the day when @Michael created AB, modularity was a necessity, but it's grown to become a flexible platform USP. Being an island in a networked world is most definitely a limitation (one that Gadget suffers from too). But the new update brings AB compatibility, and that can host AUM/aM, so that should open things up a fair bit.
As I've mentioned elsewhere, I use iOS devices differently to most here on the AB forum, in the sense that I have 4-5 permanently hooked up to my desktop DAW as external sound modules. I own all of the iOS DAWs (as I like to keep on top of their developments) and none of them excel in all areas, but Nanostudio 2 is the closest to being the rounded package suited to my workflows when I'm mobile. The reason it's head and shoulders above Cubasis and Auria is that it has a refined UX that's built for iOS. Both Cubasis and Auria suffer from desktop-style UXs badly translated to iOS. But Cubasis is still my DAW of choice when mobile, as it will host most things you throw at it without problem. And whilst its automation workflow isn't very good for non-native 'plugins', it will automate any parameter the plugin in question exposes.
ModStep/Studiomux is what I use day in day out, but only to stream audio to my desktop DAW. When ModStep 2 comes maybe that will be my iOS DAW of choice, who knows (I am an Ableton user after all). If it has fully functioning Plugin Delay Compensation from day one, that will be a major plus point. At the moment it appears that only Auria Pro has implemented glitch free PDC (this came with the last update). Really hoping that NS2 isn't far behind AP as PDC is another of those things that will bring iOS DAWs closer to desktop DAW performance and flexibility.
Yeah .. I understand this factor .. I mentioned it above .. the matrix view of routing (same in AUM) is very cool ..
I'm glad I have some great info about this issue, but first I need to get my hands on the latest beta and check if everything works as planned - and then I'll be back with some great new info )
NS implements sample-accurate plugin delay compensation on all parts of the audio graph. It even displays information if some AUfx plugin introduces bigger latency into the audio graph (for example, FAC Transient does that, which is obvious given what this plugin is doing).
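The core idea behind PDC can be sketched in a few lines (a simplification of my own, not NS2's implementation): each parallel branch is delayed up to the worst reported plugin latency, so the branches line up sample-accurately when mixed instead of combing/phasing:

```python
def compensate(paths):
    """paths: list of (samples, reported_latency_in_samples).
    Pad each lower-latency path with silence so every branch carries
    the same total delay as the slowest plugin chain."""
    max_lat = max(lat for _, lat in paths)
    return [[0.0] * (max_lat - lat) + samples for samples, lat in paths]

# one branch through a zero-latency chain, one through a plugin
# that reports 3 samples of lookahead latency (hypothetical numbers)
aligned = compensate([([1.0, 0.5], 0), ([1.0, 0.5], 3)])
print(aligned)
```

After compensation both branches are the same "distance" from the input, which is exactly the property that lets a host mix parallel chains glitch-free.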
I'm in agreement with @jonmoore here.
One of the benefits of the MIDI routing flexibility of apeMatrix is that it is agnostic in terms of what type of plugin it is. That's why there have never been any compatibility issues with MIDI plugins in apeMatrix. All the DAWs have had problems because they have to categorise everything - which has its advantages in a DAW, but in modular routing platforms the openness means that you can easily send MIDI from one app to another without any worry about what category it is.
It's also the number of taps to do things as well - apeMatrix has its advantages here. Just much faster to experiment with things. I'm also wishing there was a quicker way in Ableton Live as well.
That's why I like doing my mucking about in iOS modulars; it's more fun to try things out quickly, and most of the time I have Ableton hooked up to my iPad via either AUM/apeMatrix or AB3, so I can send stuff between them easily, using the power of a desktop DAW when I need it. The iPad DAWs are mainly designed to be worked on as standalones, and that idea is less appealing to me.
If I'm wanting to sketch out a full track on iOS, though, then NS2 is a great option. But I like a workflow between platforms. I'd like to see Ableton Live export in NS2. But one nice thing it does have is the WebDAV connection to the desktop, which is a great feature
I'd like to see that in the modulars too.