When I work on the iMac (Ableton, Bitwig), I use the iPad mostly as a tool or recording source.
I use the iPad as:
The above options are manageable with IDAM, but I use the iCA4+.
One way to be able to make use of iOS fx in AUM is to use them on a bus. Send the signal to a separate channel with fx and record that separately. Then you have separate 100% wet and 100% dry recordings. Sometimes I have iOS fx that just work better than the desktop replacements — partly because you get used to them being part of the sound I guess.
These 100% wet fx only channels can also come in very handy for sound design — I’ve used just the fx returns from an iPad session without using the dry signal at all with great results...
NS2 also works well for this as it includes the return fx AUX tracks as if they were normal tracks when you bounce the project.
Never thought about that. Thanks for the tip
I basically think that it doesn't do any good to benchmark the iPad. OS- and hardware-wise, the new M1 iPad can do pretty much exactly what the M1 MacBook Air can do in terms of audio computation. The one big exception right now is that iOS can't do multiple audio interfaces and can't use third-party drivers. That alone makes macOS much more flexible for audio work.
The rest of it comes down to DAWs. It's the workflow in the DAW that does what you want or doesn't. There are way more DAWs available on macOS, at prices ranging from essentially free to incredibly overpriced. There are all sorts of other audio tools available there that you can't get on iOS, at least for now. On macOS you don't get the touch-based interface or the sit-on-the-couch-and-use-it factor, and for me that's the big draw of iOS. But I expect more things like Logic Remote to be developed.
Unless you really foresee a need for the 1TB of storage on the iPad, it makes a lot more sense to me to have the smaller storage and a MacBook Air. At least for the next couple of years, it seems to me that this is going to have a lot more flexibility than the larger storage iPad.
That’s really interesting. So that means when there are RT threads the performance cores do not go to sleep, right? The two RT threads of GB seem to be a good separation of concerns. I wonder how Steinberg is doing that.
Hmmm… threads running sequentially?? I'm a Java backend/server-side guy, and in my world you use threads to run stuff in parallel. Well, my applications are never real-time; that might make the difference. But coming from my background I would create one RT thread per performance core and then let them handle the workloads. I have no experience with RT DSP, though.
@NeonSilicon of course the M1 has the same capabilities, iPad or MacBook. But I think the software (iPadOS as well as the AU hosts) is not really ready to make use of these capabilities. My impression is that this is mostly an issue with the AU hosts, and that matches your observation of sequential execution of threads. iPadOS recently removed a few roadblocks, like multiple audio threads, etc., but the AU hosts did not catch up. Can this be true? Cubasis being the most advanced in this area so far? For that reason a benchmark with a real-world workload seems like a good idea to me. It's not about blaming but about seeing the progress and also helping people make the right choices for their projects.
The performance cores do block for extended periods of time. The real time threads don't need to run for the majority of the time the DAW and AU processes are running. The real time part is just that they must complete each processing block in the time the D/A needs the samples (CoreAudio is basically a pull model).
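That pull-model time budget is easy to put numbers on. Here's a minimal Python sketch; the buffer size and sample rate below are just example values, not figures from this thread:

```python
def render_deadline_ms(frames: int, sample_rate_hz: int) -> float:
    """Soft real-time budget (in ms) for one render pass: the callback
    must fill `frames` samples before the D/A converter needs them."""
    return frames / sample_rate_hz * 1000.0

# A 256-frame buffer at 48 kHz leaves roughly 5.3 ms per pass;
# the RT threads can sleep for the rest of the cycle.
budget = render_deadline_ms(256, 48_000)
```

The RT threads only need to be awake for the fraction of each ~5 ms window that the actual DSP takes, which is why the performance cores can still block for long stretches.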
There's different reasons to run multiple threads and even for real time threads it has a lot to do with programming logic and not just pure processing power. It's much easier to program with a relatively simple mentality of I'm writing this to run on a single realtime thread that needs to complete in this time allotment than it would be to have to think about how to multiplex a whole bunch of different processing blocks to sit across one or multiple CPU cores. In modern systems especially, only the OS really has the information to schedule efficiently anyway.
From an energy efficiency standpoint, if the OS can schedule the real time threads sequentially on one CPU core and still have everything complete on time, this is better than using two cores in parallel. That's pretty much the reason that on both macOS and iOS you don't get to ask for cores to specifically run threads on. The OS is in charge of this and it should be.
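The scheduler's sequential-vs-parallel decision boils down to a simple feasibility check. This is a toy Python sketch of the idea (the per-thread times are made-up numbers, and the real OS heuristics are of course far more involved):

```python
def fits_on_one_core(thread_times_ms, deadline_ms):
    """Sequential execution on a single core is fine if the summed
    per-buffer processing times still meet the buffer deadline."""
    return sum(thread_times_ms) <= deadline_ms

# Two RT threads needing 1.5 ms and 2.0 ms of a ~5.3 ms buffer budget
# can run back-to-back on one core; no need to wake a second one.
one_core_ok = fits_on_one_core([1.5, 2.0], 5.3)
```

Only when the summed work no longer fits the deadline does spreading across cores buy you anything; otherwise a second awake core is just wasted energy.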
There's some historical context behind running one thread per core for audio that doesn't really matter on the Apple Silicon machines. On systems with hyper-threaded (I hate that name) cores and OSes that allowed for thread affinity to cores, running RT audio work on the hyper-threaded cores would cause a performance drop, because these pretend cores didn't include all the execution units (like floating-point units in particular), so the multiple threads would end up stalling to get execution time on the shared processing units.
The thing that I found most interesting when looking at the profiling was how free the performance cores are to do the heavy processing. All the bookkeeping and background stuff that the OS needs to do is on the efficiency cores.
There weren't really any roadblocks to using multiple threads in hosts or AUs on iOS before, other than that it is somewhat hard to do right and there wasn't a lot of benefit to doing it in the way most people use audio on iOS.
The workgroup thread stuff that Apple has added for working with multiple audio (real time) threads has been added to both macOS and iOS. I think it has more to do with letting the OS have more information to do scheduling with on the Apple Silicon processors with the performance and efficiency core mix.
macOS will also run the realtime threads sequentially on one core if it can. In a big project with lots of realtime threads running, one core probably wouldn't be able to keep up, but you can still only run these in parallel if the project structure allows it. Lots of effects on a single track can't be run in parallel. A few effects on separate tracks may allow the host to run multiple threads, but things like mix busses can complicate that too. The DAW actually needs to be pretty smart to be able to do this for all the different types of project layouts that are possible.
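The structural constraint is essentially a critical-path problem: effects within one track form a serial chain, while independent tracks can in principle render concurrently. A rough Python sketch with invented per-effect times (real hosts also have to handle busses, sends, and sidechains, which this ignores):

```python
def per_buffer_time(tracks, parallel: bool) -> float:
    """tracks: list of per-track effect-chain times in ms.
    Effects inside a track must run serially; with enough cores,
    separate tracks can run in parallel."""
    chain_times = [sum(chain) for chain in tracks]
    # Parallel cost is bounded below by the longest single chain.
    return max(chain_times) if parallel else sum(chain_times)

# Three tracks, each a serial two-effect chain:
tracks = [[1.0, 0.5], [0.75, 0.75], [1.25, 0.25]]
serial_cost = per_buffer_time(tracks, parallel=False)   # 4.5 ms on one core
parallel_cost = per_buffer_time(tracks, parallel=True)  # 1.5 ms, set by the longest chain
```

Note that even with unlimited cores the longest chain sets the floor, which is why stacking effects on one track can't be sped up by threading.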
Sounds promising. Curious to hear where the road takes you. I'm currently gravitating toward the Logic / GarageBand / NI Komplete universe on the Mac, but I'm interested in what your impressions will be.
@NeonSilicon I did not mean that threads should be programmatically assigned to a certain core. I agree that no program should mess with these things and should leave that to the OS. But I know from server-side systems that, to achieve non-blocking processing of data, they create as many threads as there are hardware cores so the OS can distribute them effectively 1:1 to these cores and they won't be blocked (assuming the machine runs just this software and the OS). I assumed this could be similarly beneficial for RT threads.
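For reference, that server-side sizing pattern looks something like the following Python sketch (the squaring workload is just a stand-in for real request handling):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# One worker per hardware core, letting the OS map threads
# roughly 1:1 onto cores; no manual core affinity anywhere.
workers = os.cpu_count() or 1

with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(lambda x: x * x, range(8)))
```

As discussed above, this sizing only pays off when each task can actually saturate a core, which is rarely the case for soft real-time audio work with its short, deadline-bounded bursts.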
I do not really understand why it should be better to utilize a single core as much as possible and probably drive that core's utilization into saturation. The only advantage I can think of is energy efficiency, but at the price that you risk serious hiccups if a thread takes longer than expected. How can you be sure of that? Because they're RT threads and therefore have deterministic runtime behaviour?
Update: I guess that's why they are deterministic:
I must say that this thread has already influenced my vision. Thanks to this great community. I think it is actually best to use each device where it excels. The laptop is superior for big linear arrangements, IMHO also for working efficiently with a piano roll editor. I now believe that I should rather see the iPad as an instrument that is tactile, utilizing the ease of touch operation. Sound design by tweaking virtual knobs with the fingers. Easy jamming with AUM.
Yes, it is primarily for energy efficiency. This is a big deal on any mobile/battery-operated system and is a primary focus of iOS, but it does matter on macOS too. The threads are supposed to have a set runtime. But this is only soft real-time. The system doesn't crash and nothing is forcing the behavior. If a certain process doesn't meet the deadline, then the audio glitches. It's not like a hard real-time setting where the OS or some hardware is enforcing the deadlines with failover or something. macOS isn't really good enough in its threading model to handle something like being a CAM controller.
In a server setting, the threading is often pretty obvious and somewhat easy to break up. If you have lots of users (people or hardware) initiating what's pretty much the same task then you can launch threads per task pretty easily. But even there, I was never in a setting where one task could keep a core completely busy. So, the number of threads I used was never tied directly to the number of cores.
The only setting where I've ever had a direct tie between cores and threads was in HPC, when I knew that each task would completely use up the available processing of the core or CPU, and so I would only launch as many threads as I had CPUs/cores, plus usually one supervisor thread that was mostly asleep or waiting.
I don't think that iOS or macOS will try to saturate the single performance core before it wakes up a second core. I'm guessing there is a threshold point where it will do this, sort of like a load-balancing system. I didn't see a situation where one core was completely scheduled before a second core came in.
The only thing that is really important for the RT thread situation is that they complete on time. It doesn't matter if they run sequentially or in parallel really. One way to look at it is that macOS was using basically the same audio threading model back when the systems only had one core. The threading model helps with the logic of the construction of the programs as much as it does performance.
I meant something similar, actually. Sloppy wording (or, according to autocorrect, "slippy welding" 😅) on my side. I love my Maschine/MK3, too, although I should use it more. By Komplete I meant the NI collection, not the Komplete Kontrol sw itself, which I find a bit awkward tbh.
For fun I figured I'd go do the same test I did on the iPad with my old Intel i5 (4 cores) iMac, using MainStage as the host. In this one I only have one synth running plus a couple of AUv3s (sandboxed) and two busses for reverb, plus a built-in EQ. So, basically smaller than the iPad projects and on a worse CPU. MainStage is running 24 threads, of which 10 are high enough priority (97) to be real time threads. At the same time, there are at least four other realtime threads running: one for coreaudiod, one for the MOTU drivers, and one each for the AUv3 audio units.
The threads are running as a mix of parallel and sequential, which seems to line up with how the MainStage busses and AUs can work.
One other fun note is that one of the AU's I had in the project was Spirangle and I know from testing before on the iPad that it completes each pass for processing about twice as fast on my old first gen iPad Pro than it does on the i5 in my iMac.
Dylan Paris laments his iPad M1 purchase.
I'd be interested in people's thoughts on his view.
I really like that guy, I love the humble and positive tone and approach in his vids.
In the current state of iOS, that seems a reasonable choice, and it boils down to your usage of iOS.
Well, actually, I placed an order yesterday for a 1TB 12.9 iPad for around 1620€ and cancelled it right after asking myself: "will that make my music 1600€ better?" Sure it won't!
I still don’t regret that I returned my 1TB iPad Pro; I already got my money back from Apple and bought a 256 GB model. But I must say that I have not ordered the MacBook Air yet. I installed a trial version of Logic Pro on my work MacBook Pro and made my first steps with it. My experiences are mixed, and honestly I came back to the iPad for jamming.
First impression is that with Logic I have tons of options, professional features and great stock instruments, BUT the UI is horrible. It feels totally old-fashioned, the looks are ugly, there's an overwhelming number of items on the screen and a lot of colorful distraction. I instantly missed my iPad apps, their simplicity and focus, and beautiful design like the apps from audiomodern or Bram Bos. I could not even find out by myself how to do something simple like routing MIDI to external channels. MIDI editing is great, but tweaking synths with the mouse is not. Logic‘s drum synth looks great, but honestly I could not get the sound out of it that I wanted. My first try at getting a techno beat going with some sidechained hats was a bit disappointing. That is much easier with Ruismaker Noir or the new FAC drum app. These experiences made me substantially doubt my combo vision. Studiomux beta is working great, though.
Anyway my new smaller iPad delivers what I need. Even if the pro apps are coming I believe it is capable enough. I will continue to combine it with Logic but jamming out ideas seems to be much easier on the iPad. But I’m a total noob with Logic and I will continue to dive into it.
@krassmann You can also give Ableton Live 11 a try on the MacBook Pro. You can get the full Suite edition free for 90 days. Worth a try.
Yeah, right. I already have Live Lite installed that came with my Launchpad and I like it. The truth is I find it much more logical than Logic (sic!). BUT my plan is to buy an M1 MacBook Air and I have the hope that more and more iOS AUv3s will work on it, and Ableton does not support AUv3. Moreover, ATM Ableton is still Intel-only. That might change of course.
Probably my whole idea is BS and it’s not worth picking a DAW only because of the hope of reusing my iPad plugins.
Bitwig is also M1-native, and thanks to its sandboxed hosting it can mix both native and Rosetta-translated plugins. Plus the v4 update looks AWESOME with comping and operators. And it has supported MPE for a long time.
Try Reaper. It does support sandboxed AUv3. I don't really like its UI either, and it has some issues on Retina displays. But it is really usable and has some features I definitely prefer over Logic. The general state of UI/UX in desktop DAWs is pretty bad. Personally, I'd take an AUM Pro on macOS over a Logic Pro on iOS any day.
Very happy with my M1 iPad still. Everything feels smoother, from scrolling websites, switching between multiple apps, navigating around Cubasis, and especially now that I’m getting into video editing in LumaFusion. Crazy how much 4K video you can move around with no lag.
It’s not a totally different experience compared to my last iPad Pro, but I don’t get how people are not feeling a difference with the new M1s. It’s definitely noticeable here
Same here. I own an iPad Pro 12.9" M1 and a MacBook Pro M1. The display of the iPad Pro M1 is very beautiful and it has the same CPU speed that the MacBook Pro M1 offers. Connect the two of them via IDAM and you can send MIDI/audio both ways.
This video shows it :
I was on the first gen 9.7" pro before getting the 2021 12.9". It's a totally different experience for me. The display is incredible -- the best display I've ever seen and I've worked on some high end graphics workstations in the past. To me, it is truly mind blowing that I have this screen in a device like this.
Pretty much like the performance on my M1 Mac Mini, the iPad is totally smooth and nothing I do to it has any impact on the performance. Very cool.
The thing that has been most interesting to me though is that the app I want to play with the most is Animoog. It's making me completely rethink the way I want to use the iPad and what I want to develop for it.
I would be very interested to see what you make inspired by Animoog.
I had a similar discomfort with Logic when I started using it a few years ago, but now I find it the most comfortable of any of the DAWs I’ve tried. You may want to check out the videos on the channel “Why Logic Pro X Rules”; I find his videos very helpful and to the point. Most of my frustrations initially with Logic went away after realizing how to do things, such as adjusting clip gain, assigning consecutive inputs or outputs, etc. I’m running it on a 2012 MacBook Pro and it is still very smooth; I understand the M1s run it really well.
Just food for thought, and it will definitely depend on what you intend to do with it. I am primarily recording and mixing live musicians with some occasional MIDI augmentation, and for that iOS is really not the easiest platform. But if I want to make EDM or something just for fun (I’m not very good at it), then I love Gadget on the iPad or some of the other iOS tools more than I love doing that in Logic.
@krassmann UI is not Logic's strong point, I agree. However, any DAW will take some getting used to and none are perfect, so I'd consider persisting with it for a bit.
I'm thinking about the iPad Air too, but I'd like to use my existing iPad Pro for MIDI generation and route that to Ableton or Logic.
@mrufino1 thanks for the tip, I will watch the videos. I‘m not a total noob when it comes to desktop DAWs, as I was a Cubase user in the 1990s and early 2000s. As I wrote, I also installed Ableton Live Lite and had no difficulty finding out how to do things. If I couldn't figure it out myself, I never needed anything more than the really good help function. But with the MIDI routing of Logic I could not even extract the know-how from the manual, and only a YT video helped 😆 As I am also more into making EDM tracks, I believe there is a lot of potential in jamming on the iPad and then finally arranging it on the laptop.
@soundtemple yes, I agree that Logic Remote is fantastic. It makes Logic touchable to a large degree. That is definitely an edge over Ableton. I love my iPad sequencers a lot and they benefit a lot from the touch UI. I‘ll also explore that combo. I must admit that I never dived into your deep Mozaic scripts, but I guess that‘s what’s on your mind to use with Logic. Sounds like a good idea. BTW, did you ever consider doing something similar with Max4Live?
I have this to use the iPad with Ableton, but I only used it briefly as I got the Launchpad Pro MK3 soon after. Seemed pretty slick.
https://zerodebug.com/#/
@krassmann have you had a look at what Touchable can do as an iPad controller for Ableton? It's got pitfalls, but it is pretty amazing: full piano roll editor, mixer controls and device controls, even for VSTs.
My idea to use the Mozaic scripts with Logic or Ableton is only because my iPad Pro can't handle as many plugins as I would like. I thought an M1 iPad would be the answer to my woes, but your experience and others I have read about have made me second-guess.
Btw... I have just finished a V2 of my Mozaic scripts with full Launchpad integration and new devices. Hence, I'm keen to stay with the iPad so I can use them.