And/or as a super low resource multi-timbral synth/sampler with effects. Ignore the sequencer and run it from something external (say from Quantum or a set of ApeMatrix hosted AU MIDI doodads or...) and suddenly it's the most powerful/least expensive Roland JV/XP/XV series type multi-timbral synth ever. I mean, Obsidian is considerably more powerful than the synth engine in the JV2080 or the XV5080 in pretty much every way (except for poly-aftertouch support, I guess). The JV2080 goes for $400+ on ebay to this day. Obsidian still doesn't have that Roland JV/XP/XV sound library yet but that's just a question of time.
Haha, NS2 has superfans, I’ll give it that much...
Matt has to love you all very much because you support and push his product over anything and everything.
Indeed! It’s pretty symbiotic: We love Matt because he’s brought us a product that we feel good about using, supporting and promoting. Tough to find anything bad about it, I agree. IMO It helps the entire iOS music ecosystem when you have followers that believe in all the various products and go out of their way to help others or answer questions. I appreciate all of the members here that are passionate about products. It’s definitely helped me out and got me going quickly across all of these great iOS apps
Can't wait man !
Your history of comments about:
stability
scalability
productivity
Got my attention. We all want those 3. The only areas we probably disagree on are the
aesthetics of the results and that's how we choose our paths.
Some great paths won't take me to where I'm trying to reach.
Still, I like hiking just for the joy of the outdoor experience so I buy anything that works well
to help me move. Now putting on the shoes and hitting the trail requires time and a plan.
Which, for me, I feel I’ve been fortunate enough to have so many wonderful and fulfilling paths to wander through. Seen some great country. Seen some amazing views.
But I believe we all have our favorites, or ones that we remember more vividly. If we never wander, we never find the most amazing of them all!
Decided to give that a spin, i.e. sending MIDI to Obsidian from external MIDI doodads. Tried Quantum and had it working with minimal effort. Tried sending MIDI from Fugue Machine, Riffer, and Rozeta via apeMatrix and AUM. Can’t figure out the routing. Can you or anyone else offer a little guidance? I checked the NS2 manual, but there’s not much detail there at all. And, can’t I just host those AU sequencers in NS2 like I can with all the other AU hosts?
Was able to do it easy enough from AB3 by sending to virtual midi. Still stumped with apeMatrix & AUM though
You can host them in NS2, you just can’t record the MIDI that’s coming out. If that’s not important to you, then most will work (StepPolyArp is one exception; there may be others).
NS2 doesn’t expose its virtual midi port as a destination, so you won’t see it to select from AUM or ApeMatrix. To be honest, I don’t know how to do it from them. From AudioBus you just need to route it to the Virtual MIDI Bridge. Audiobus will then show as an input source in NS2.
what @wim said.
A few notes:
The StepPolyArp issue will be fixed in an upcoming update. Additionally, there is a known bug where MIDI is not propagating from one AU FX to another - this will be fixed too, which means basically all available MIDI plugins should work after the update.
I finally picked up NS2. Well, it would be rude not to...
I am loving the Obsidian osc and filter visualizations. The FM synthesis is really fun as well. Very nice design feel to the synth.
Not so into the presets or the sound of the IAPs but that might just be a genre thing.
More to explore!
Awesome! Will be interested to hear what you like/don’t like. Did you use NS1?
Ooh. Ooh...now that’s a useful thing I didn’t know. Right, that’s tonight’s project...
Will midi recording work?
Hmmm.... I’ve never had this much trouble trying to figure out how to send/receive MIDI in a host. The manual is really sparse. I’m just trying to send Riffer to an Obsidian track and can’t even figure out that. I’ve got Riffer on its own track, and I think I’ve got all the ins/outs set for both the Riffer and Obsidian tracks.
I don’t really want to have to watch hours of demo videos or ask for instructions on a forum every time I want to do something simple. I don’t see how anyone considers this all “intuitive”. It’s not like I haven’t done this all before in 6 other hosts without even having to crack a manual.
I’m going to put this on the back burner for now, I think. Maybe later it’ll start making sense.
Not in this update. But the Photon AU MIDI recorder will work after the update - so that would be a solution for recording MIDI. You can then use "open in" from Photon to copy the recorded MIDI into NS2 and drag-and-drop it onto the timeline.
Yeah, if controlling NS2 from outside with various MIDI sequencers is what you want to do, then at the moment it's not ideal - you need to wait until NS2 registers itself as a "virtual midi in" port. Then it will be easy, the same way as in other hosts.. definitely on the todo list..
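For anyone curious what "registering as a virtual midi in port" involves under the hood, here's a rough sketch using Apple's CoreMIDI API. This is purely illustrative - the client and port names are made up, and it's not NanoStudio's actual code - but it shows the mechanism: an app publishes a virtual destination, and that destination then appears as a selectable MIDI output inside hosts like AUM, apeMatrix, or Audiobus.

```swift
import CoreMIDI

// Hypothetical example: create a CoreMIDI client for the app.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("ExampleHost" as CFString, &client) { _ in }

// Publish a virtual destination. Other apps on the device will now see
// an output port named "ExampleHost MIDI In" in their routing menus.
var destination = MIDIEndpointRef()
MIDIDestinationCreateWithProtocol(client,
                                  "ExampleHost MIDI In" as CFString,
                                  ._1_0,
                                  &destination) { eventList, _ in
    // Packets sent by external sequencers (Riffer, Rozeta, etc.)
    // arrive in this receive block; a host would forward them to
    // the currently targeted instrument track.
    _ = eventList
}
```

Until NS2 does something like this, it can only pull from sources that advertise themselves (like Audiobus's Virtual MIDI Bridge), rather than being a target other hosts can push to.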
It's probably just because AU MIDI isn't supported yet. Maybe after the next update it will work.
Not having sustain pedal support is such a major disappointment. Can’t play awesome AU plug-ins in it until it gets put in.
That’s not completely correct. It is supported, it’s just not working for some apps, and can’t record the midi output, and is missing some important features. Many apps such as the Rozeta suite work.
@skiphunt there’s a good manual should you want to take a look at NS2 again some time. (Hint - AU MIDI plugs go in the same channel they are driving. There’s a button on the mixer channel to get to where they go. It’s a bit like Cubasis in the way MIDI AU FX are handled.)
But yeh, no reason to use it if it doesn’t click with ya. The midi AU stuff was kinda bolted on at the end relatively speaking. It’s definitely not a strong point of the app.
I’ve tried using IAA from external hosts to NS2 also. I must be doing something wrong... like when I forget to arm a track in Auria.
I also don’t like how the AU lists aren’t even in alphabetical order.
Frustrating.
I think I’ll punt and just go back to being content to record some loops to export into other hosts instead, and revisit NS2 after the next update. The interface logic just isn’t clicking with me. I want to get into it like others who are evidently having a ball with it, but I haven’t figured out its interface syntax yet... and I can do everything I want quickly and easily in my 5 other hosts.
Is there a better manual somewhere besides the in-app help guide? That hasn’t been helpful for me at all yet.
NS2 doesn't support IAA.
Can you confirm if riffer AU is supposed to work driving an obsidian synth within NS2? I was able to use Rozeta as a midi fx, but can’t for the life of me figure out how to use riffer inside of NS2. I couldn’t get riffer AU to NS2 to work from either AUM or apeMatrix. Only from AB3 sending it to a virtual midi bridge.
Is NS2 just not really hooked up yet like all these other more mature hosts?
Someone else will need to comment on that. I don’t have Riffer. It’s very possible it doesn’t work.
That is correct.
On the other hand, AUM and Ape Matrix could make this easier by advertising their virtual midi output port. Right now, only AudioBus does this. It’s funny how this need has popped up for a couple of other app routing issues in the past few days.
Nope. But be sure to check the tips and tricks section. Lotta good stuff there.
But it sounds like this just ain’t your thing. Thats OK.
I got Riffer to work, I think, though I had help on here. It was a pain in the arse if I remember rightly which is why I haven’t bothered since.
I sense that it will be my thing... once connectivity and more broad app support is there. It’s just frustrating when you’re trying to do something that’s really simple in the other hosts, but it simply isn’t working. You don’t know if it’s you that just doesn’t get it, or... the app doesn’t actually do that yet.
I’m not giving up on it yet. But, I figure I have plenty of other tools that already do what I want, without having to wrestle with being an early adopter.
Ok, I got it set up so that I can send/receive AU midi sequences from external hosts. I had to set up a quick local to local network on the iPad using midiflow. Send out my midi to my “local” midiflow network, and then receive from the “local” network from within NS2.
Works fine, but kind of a PITA compared to other hosts... but not really that bad. Still, it’s a way I can use all my MIDI sequence toys from external hosts to drive and record NS2’s Obsidian synth.
Or, just use AB3 with its virtual midi bridge instead.
I was left scratching my head for a while trying to record the MIDI out from Noir until I read that NS2 couldn't do that. It's little things like that and auv3 automation which when implemented will make a big difference.
At the moment I'm finding that there are a few too many obstacles to using NS2 (beyond the obvious missing features we all know about), but I'm looking forward to future updates to address these things.
In the meantime I am using it a little more sparingly than I thought I would when I first got it. It does have a lot of potential though.
No single app has really stuck for me yet on iOS but Logic is the foundation for me and I'm not looking to go all in on iOS. I am just enjoying using the iPad to its strengths.
Oh okay. A little annoying but workable.
The thing I like most about Nanostudio 2 is the elegance of the user experience. The thing I like least is that it doesn't allow for the hosting of AUM and apeMatrix (or Patterning 2 for that matter).
In the short term we'll have a workaround, seeing as Audiobus will soon work within NS2, but I suppose my longer-term wish is that AUv3s on iOS are allowed to host other AUv3s, much like they are on OS X. This would mean the likes of apeMatrix and AUM could function as AUv3s. But I also wish that IAAs weren't regarded simply as iOS dinosaurs. There are plenty out there that are still musically relevant but that would be very difficult to remake as AUv3s under the current AUv3 interface requirements (Patterning 2 being a good case in point).
The one thing that apeMatrix and AUM provide that NS2 doesn't is greater flexibility with FX routing. apeMatrix is particularly strong in this regard, as you can easily set up FX chains that use parallel as well as series IO. They're both also great for saving FX and instrument/FX chains in a manner that isn't limited to a single iOS DAW.