nanostudio 2 - recording / capturing midi. please help
i can’t figure it out, i know the solution is somewhere in this forum but i can’t find it, and i’m sure others are as lost as me lol
so i have sunrizer up, went to midi fx, added SPA ... everything sounds great...
but the midi isn’t being captured when i record the track...
what do i need to do!!
i feel like this should just work but i don’t understand the insides of these apps
Comments
I still can’t record midi in it so it must be a thing. I thought that was one of the “requested features”?
on ipad just put the MIDI Tools Route plugin after SPA, then create a new (temporary) track, select it and hit record - it records notes to this track... (NS has protection against recording duplicate notes, so you can record in a loop and it will record each note just once; see the sketch below)
after it is recorded, just move the clip to the original track and disable SPA
on iphone, unfortunately, the Route plugin is not available, but there is another (a bit more complicated) workaround:
https://www.blipinteractive.co.uk/community/index.php?p=/discussion/989/how-to-record-aufx-midi-out-on-iphone
Midi Tools
https://itunes.apple.com/app/id1446209019
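To make the loop-record behaviour dendy mentions a bit more concrete, here is a minimal Swift sketch of the duplicate-note idea. It is purely illustrative and not NS2's actual implementation: a note that has already been captured at the same loop position with the same pitch is simply ignored on later passes.

```swift
// Illustrative sketch only; not NS2's actual code.
// While loop-recording, a note already captured at the same loop position
// with the same pitch is ignored on later passes, so each note lands once.

struct NoteEvent: Hashable {
    let loopTick: Int   // position within the loop, in ticks
    let pitch: UInt8    // MIDI note number
}

final class LoopRecorder {
    private var captured = Set<NoteEvent>()
    private(set) var recorded: [NoteEvent] = []

    /// Returns true if the event was newly recorded, false if it was a duplicate.
    @discardableResult
    func record(pitch: UInt8, atTick tick: Int, loopLength: Int) -> Bool {
        let event = NoteEvent(loopTick: tick % loopLength, pitch: pitch)
        guard captured.insert(event).inserted else { return false }  // seen on an earlier pass
        recorded.append(event)
        return true
    }
}
```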
awesome thanks @dendy !
do you know if there are any plans to have something like the midi tools app built into ns2 in the future? or will it always require this 3rd party app... i do not own it and probably wouldn’t use it for anything other than with ns2, so i’m curious. of course it is cheap enough that i don’t mind the price, but if it will be part of a future ns2 update i may hold off
You can do it without the plugin by sending the midi out to Audiobus and then back in.
You can do it with AUM too. Just add NS2 into AUM, then set a routing from AUM Destination to NanoStudio. I’m not sure why you have to add NS2 into AUM, but if you don’t, it will only play when AUM is in the foreground.
I prefer the audiobus method.
lol that's my video, i forgot i made it :-)) thanks for reminder :-))
That would probably be because you have background audio turned off in NS2.
Even though you have NS2 in the foreground you need to have background audio in NS2 enabled. If you do that you don’t need to add NS2 to the AUM session.
it is on the todo list but i'm afraid there are other things with significantly bigger priority... so for now, MIDI Tools on iPad is a good way around it...
This and the Midi Tools trick would be great wiki fodder.
One thing that might catch you out in NS2 is that you set the output channel to AUM by double tapping the external MIDI track (the one that contains the MIDI AU) in the Song editor/arrange window and then tapping the keyboard icon in the top left.
If you double tap the track from the mixer page you get a different set of controls. It looks for all intents and purposes like you should be able to set up AUM/Audiobus as the output from there, but you can’t.
TLDR You get to this page from double tapping on the track in the song edit/arrange window.
Ha! I had a feeling that it probably was made by you but couldn’t tell from that YT channel or anything else within the video. Thanks for making that one.. 👍 I’ve referred to it a few times when I could not for the life of me remember how it was done.. a very common issue when using NS2.
thanks guys @wim @dendy
it looks like maybe even drambo will allow this routing perhaps, we’ll see
Funny thing is - i don't use this workaround in my production - and i'm using StepPolyArp and Rozeta Arp quite a lot - i just don't see any reason to record their output back to the sequencer, i prefer to have it realtime all the time...
i don’t want the spa / other sequencer playing thru the whole song, and changing the sequence parts throughout the song is something i prefer to do one time and record... then the part is done... i won’t remember in 3 months what i did or was doing, but if it’s recorded then it’s all good.
i like to eventually get everything mixed down to audio and delete all the apps for when i’m ready to mix everything down for an ep or live play.. can’t wait for audio to hit ns2
but ya to me recording the sequences is crucial as i don’t want to do the work every time i open the project
will this app, midi tools, let me record mononoke inside ns2 as well, using mononoke’s built-in pads? so strange imo that it doesn’t just record this like cubasis or garageband
I don’t think so. I think that workaround only works for Midi AU FX, not instruments.
You can host Mononoke in audiobus though, and record the midi out onto a track in NS2. But it has to be switched to MIDI from MPE.
hmmm that’s so weird! i guess if you aren’t recording the mpe you might as well use a midi aufx to push mononoke in ns2 .... at least until you can record audio directly into a track, not the sampler, in ns2
still love ns2 , it is a great daw , but this midi thing is strange!!
Not trying to start something but, it sure would be nice to have some sort of roadmap as to when some of the promised features and issues would be worked out. I really really dig NS2 but the above workaround for midifx is just a real bummer.
Not gonna happen though. The developer has been pretty clear that he's not making guesstimates about the timing of updates. He's a perfectionist, only able to work on it part-time, and more importantly feels super-bad if he misses a time commitment.
It's a matter of integrity, and I respect it. One thing for sure ... it's going to be a long time. Audio tracks is the next priority.
The workaround isn't really all that horrible as iOS workarounds go.
The workaround isn't that bad -- and it is really hard even for full-time developers to give accurate projections of completion dates. It is a lot harder for developers who (like most iOS music app developers) have families and full-time jobs AND an OS that is a moving target to contend with.
The NS2 developer thought he made an accurate guesstimate as to how long it would take to get the iPhone version and audio tracks rolled out and he was off by a huge amount -- due to no bad faith on his part. Wisely, he isn't sharing any projections. He doesn't want to repeat the earlier disappointment that people felt.
If one hasn't ever had to develop software like this, it can be pretty hard to appreciate how hard it is to know how long it will take you -- particularly when you also have to contend with life.
It took other DAWs a long time to get recording of AU MIDI FX too. Auria Pro took quite some time, and went through some really messed up iterations. BM3 didn't have it at first, and took a long time to get it. Cubasis got it right in version 2, but then broke (overlooked?) it in version 3.
Those apps were already pretty feature complete when they needed to deal with AU midi, whereas NS2 is still coming up to that stage. So, I really don't find this mystifying at all.
I know from reading other threads and forums that the changes needed are pretty deep in the architecture of NS2. Best to settle in and accept the situation if you otherwise love NS2. It's gonna be some time.
We would’ve had audio tracks and a highly efficient convolution reverb in NS2 by now if the dev hadn’t decided to support people’s requests for deeper AU support.
Although, I do question how much more efficient his convolution reverb could be than the current ones. It'll be great to have one built into NS2, but I don't know if there are big efficiency gains to be had over the likes of Rooms and Impulsation/FantasyVerb.
Thanks for the perspective.
I've been using Blueveek's Route plugin to route MIDI notes inside of NS2. But MIDI CCs seem to work differently. I'm trying to route CCs from things like midilfo and rozetta lfo inside of NS2. Is that possible?
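For anyone wondering why notes route fine while CCs behave differently: on the wire they are different MIDI message types, so a plugin that forwards note events will not necessarily forward controller data. A tiny Swift sketch of the distinction, purely illustrative and not tied to any particular plugin:

```swift
// Purely illustrative: note-on/off and control change are distinct MIDI
// message types, distinguished by the high nibble of the status byte.
enum MIDIMessageKind {
    case noteOn(channel: UInt8, note: UInt8, velocity: UInt8)
    case noteOff(channel: UInt8, note: UInt8, velocity: UInt8)
    case controlChange(channel: UInt8, controller: UInt8, value: UInt8)
    case other(status: UInt8)
}

func classify(_ bytes: [UInt8]) -> MIDIMessageKind {
    guard bytes.count >= 3 else { return .other(status: bytes.first ?? 0) }
    let status = bytes[0]
    let channel = status & 0x0F
    switch status & 0xF0 {
    case 0x90 where bytes[2] > 0:
        return .noteOn(channel: channel, note: bytes[1], velocity: bytes[2])
    case 0x80, 0x90:                       // note-off, or note-on with velocity 0
        return .noteOff(channel: channel, note: bytes[1], velocity: bytes[2])
    case 0xB0:
        return .controlChange(channel: channel, controller: bytes[1], value: bytes[2])
    default:
        return .other(status: status)
    }
}
```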
So can ns2 record mpe without workarounds?
Bleass and Magellan 2 have great mpe presets but i can’t seem to capture the greatness in ns2. It won’t let you capture from the au keyboard and i don’t think the in-app keyboard is mpe
Sorry, NS2 doesn’t support MPE. https://www.blipinteractive.co.uk/community/index.php?p=/discussion/1247/mpe-support
Durn it!!!
Wouldn’t help anyway, at least with Magellan 2, which doesn’t output MIDI.
I’m guessing Bleass Alpha doesn’t either. You could load it in AUM, then try pointing it at the free MIDI Spy app to see if anything comes out. If it doesn’t show up as a MIDI source to MIDI Spy then you have your answer. If it does show up, then you should see MIDI notes coming in on more than one channel when you play polyphonically, along with a bunch of MIDI CCs, pitch bend, etc.
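That check amounts to this: MPE-capable synths spread their voices across several MIDI channels (typically 2-16), with per-channel pitch bend and CC74 riding alongside the notes, which is exactly what MIDI Spy would show you. Here is a rough Swift sketch of the same idea, illustrative only, since MIDI Spy does this for you visually:

```swift
// Illustrative only. MPE-style output typically puts each voice on its own
// channel, with per-channel pitch bend (and CC74) alongside the notes.
struct MPEHeuristic {
    private var noteChannels = Set<UInt8>()
    private var bendChannels = Set<UInt8>()

    mutating func observe(statusByte: UInt8) {
        let channel = statusByte & 0x0F
        switch statusByte & 0xF0 {
        case 0x90: noteChannels.insert(channel)   // note-on
        case 0xE0: bendChannels.insert(channel)   // pitch bend
        default: break
        }
    }

    /// Looks MPE-ish if polyphonic playing produces notes and bends on several channels.
    var looksLikeMPE: Bool {
        noteChannels.count > 1 && bendChannels.count > 1
    }
}
```

If every note arrives on a single channel instead, the synth is sending plain (non-MPE) MIDI, which matches the "switch it from MPE to MIDI" advice earlier in the thread.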