
Multi-core iOS DAWs besides Cubasis 3?


Comments

  • edited June 2023

    Just wanted to say that Logic on iPad is multicore too..

    @seonnthaproducer
    Yeah, but that’s the thing… that’s a selling point for a specific group of people.
    Multicore is a very specific feature. Only available to those with super expensive iPads.

    Well, that is just not true. With the existence of the Mini and Air with M1 CPUs, those are definitely not “super expensive iPads”.

    Also, if you want to use top-quality synths and FX, and you make more synth-based than sample-based music (music which relies heavily on advanced synth/FX sound design and layering), then multicore support may actually be the crucial thing which allows you to finish a track at all.

    Freezing tracks is just not an ideal solution, especially if you use lots of parameter automation. It works, but it’s a very annoying obstacle.

    You know, not everybody makes music which consists of a few channels of samples with near-zero automation ;) There are other, very HW-demanding genres 🤣

  • @fearandloathing said:
    Does this also mean M1 multi-core iOS divides the CPU workload even if the DAW isn’t optimized? It sounds like multi-core efficiency is optimized by the dev, but then there are also things iOS is doing. Assuming you are only running a DAW, no other processes, and internet is disabled, shouldn’t all the load go to iOS and the DAW?

    If that’s the case, and you are only running one instance of one DAW, does it really make a difference which processor you are using, based on the OP? Like, can an iPad Air 2 and an iPad Air M1 basically have the same performance on a DAW that is still running on a single core (BM3)?

    Or does the M1 chip distribute CPU workload on its own when it’s running one app? And how do you know when you’re maxing out, aside from crackling, if the meters aren’t really accurate?

    I posted in the AEM thread; I have an old iPad Air 2 and indeed I get triple+ performance when enabling multi-core.

    But the architecture (performance plus efficiency cores) certainly plays a role. The Air 2 had 3 identical CPU cores; I don’t know if the boost is the same on modern iPads.

  • n-Track supports multi-core.

  • The user and all related content has been deleted.
  • @tja said:

    @dendy said:

    I wrote it above: Cubasis 3 requires 3 (or more) performance cores for using multicore!

    If true, that's a major design flaw.

  • The user and all related content has been deleted.
  • @tja said:

    @Korakios said:

    @tja said:

    @dendy said:

    I wrote it above: Cubasis 3 requires 3 (or more) performance cores for using multicore!

    If true, that's a major design flaw.

    No, it's a totally valid decision:

    https://forums.steinberg.net/t/cubasis-3-2-multicore-rendering-latency-settings/678593

    On devices with 2 high performance CPU cores, multi-core rendering is disabled in order to keep one of the cores available for all the non-audio stuff (UI, touch, the operating system…)

    If they beta tested and actually saw that the non-audio stuff can't run acceptably on the efficiency cores, OK.
    Also, judging from M1 CPUs and DAW benchmarks on desktop, the efficiency cores are very capable.

  • edited June 2023

    @tja said:

    @dendy said:
    Just wanted to say that Logic on iPad is multicore too..

    @seonnthaproducer
    Yeah, but that’s the thing… that’s a selling point for a specific group of people.
    Multicore is a very specific feature. Only available to those with super expensive iPads.

    Well, that is just not true. With the existence of the Mini and Air with M1 CPUs, those are definitely not “super expensive iPads”.

    I wrote it above: Cubasis 3 requires 3 (or more) performance cores for using multicore!
    That's it. No need for any M* CPU ...

    I didn’t say you NEED an M1 CPU… I reacted to the claim that multicore support is just for “those with super expensive iPads” with the example that the M1 Mini/Air definitely aren’t super expensive, but multicore support in a DAW makes a huge difference in how much you can run on them…

    I had a Mini 5 (A12 CPU) before, and multicore in Cubasis was like: OK, it was possible to enable it, but it wasn’t very reliable… on M1 it works fantastically; I can run MUCH more Model D instances than in a non-multicore DAW… I think 3x as much, but not sure, need to try it again.

  • OK, tried it:
    NS2 / buffer “high” (10 ms) / 8 instances, load around 85% / no crackles

    C3 / buffer 10 ms / 26 instances, load around 85% / no crackles

    Adding one more instance in both cases, and crackles start appearing here and there.

  • The user and all related content has been deleted.
  • @tja said:
    Going to compare Pro 12.9 2nd gen. with Air 3 ....

    Just a short MIDI loop, default / init preset?

    Yes, default preset playing one long note, C3.

  • edited June 2023

    Logic (with the same buffer, 512 samples, which is 10 ms) handles a bit less than Cubasis… when I loaded 26, just switching into mixer view caused a popup about insufficient CPU, whereas Cubasis was rock solid at 26 instances.

  • edited June 2023
    The user and all related content has been deleted.
  • edited June 2023
    The user and all related content has been deleted.
  • edited June 2023
    The user and all related content has been deleted.
  • wimwim
    edited June 2023

    @fearandloathing said:
    Does this also mean M1 multi-core iOS divides the CPU workload even if the DAW isn’t optimized? It sounds like multi-core efficiency is optimized by the dev, but then there are also things iOS is doing. Assuming you are only running a DAW, no other processes, and internet is disabled, shouldn’t all the load go to iOS and the DAW?

    To answer that, multi-threading vs. multi-core need to be clarified.

    Multi-core processing has always been part of iOS and is handled in the background by it. The operating system allocates CPU cores as it sees fit to optimize battery and reduce heat. When the operating system has less to do, tasks are shifted to fewer and lower-performance cores. More and higher-performance cores are brought online when performance starts to suffer. This is why CPU % meters can be misleading: they report % of the processing power available at that moment, not of the chip's full capability.

    Multi-threading is a programming technique. Rather than having one thing happen after another linearly, tasks can be set up so that they run independently without interrupting each other. It's possible to request higher performance cores for high priority threads, but the operating system ultimately decides.

    Multi-threading techniques have always been available to iOS programmers. But audio DSP is special in that great care has to be taken not to "block" the audio thread. Blocking happens when a high-priority task such as rendering audio is delayed to do something like a GUI update, causing audio glitches.
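    To picture why, here is a tiny C sketch (not from any real app; `render` and `set_gain` are made-up names): the render callback reads a shared parameter through an atomic instead of taking a lock, so the audio thread never waits on the UI thread.

    ```c
    #include <stdatomic.h>
    #include <stdio.h>

    /* Shared parameter: written by the UI thread, read by the audio thread.
       Atomic loads/stores never block, unlike locking a mutex. */
    static _Atomic float gain = 1.0f;

    /* Runs on the high-priority audio render thread: must never block. */
    static void render(float *buf, int frames) {
        float g = atomic_load_explicit(&gain, memory_order_relaxed);
        for (int i = 0; i < frames; i++)
            buf[i] *= g;
    }

    /* Runs on the UI thread, e.g. when a volume slider moves. */
    static void set_gain(float g) {
        atomic_store_explicit(&gain, g, memory_order_relaxed);
    }

    int main(void) {
        float buf[4] = {1, 1, 1, 1};
        set_gain(0.5f);
        render(buf, 4);
        printf("%g\n", buf[0]); /* prints 0.5 */
        return 0;
    }
    ```

    If `render` instead took a mutex that the UI thread happened to be holding, the render deadline could be missed and you'd hear a glitch; the atomic avoids that entirely.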

    What was added to iOS was not multi-core processing. That has always been there. What was added is multi-threaded audio processing. Prior to multi-threaded audio processing, there could only be one audio render thread and therefore could only run on a single core at a time. Multi-thread audio processing allows that rendering to be broken up into parallel threads so that they can potentially run on more than one core.
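    A rough sketch of that fan-out/fan-in idea in C, using plain pthreads as a stand-in for Apple's actual audio-workgroup machinery, and trivial arithmetic in place of real DSP:

    ```c
    #include <pthread.h>
    #include <stdio.h>

    #define FRAMES  256
    #define VOICES  8
    #define THREADS 2
    #define PER_THREAD (VOICES / THREADS)

    /* Each worker renders only its own voices' buffers, so the
       parallel phase needs no locking at all. */
    static float voice_buf[VOICES][FRAMES];

    static void *worker(void *arg) {
        int first = *(int *)arg;  /* index of this worker's first voice */
        for (int v = first; v < first + PER_THREAD; v++)
            for (int i = 0; i < FRAMES; i++)
                voice_buf[v][i] = 0.1f;  /* stand-in for real DSP */
        return NULL;
    }

    int main(void) {
        pthread_t t[THREADS];
        int first[THREADS];

        /* Fan out: render the voices in parallel, potentially on several cores. */
        for (int k = 0; k < THREADS; k++) {
            first[k] = k * PER_THREAD;
            pthread_create(&t[k], NULL, worker, &first[k]);
        }
        for (int k = 0; k < THREADS; k++)
            pthread_join(t[k], NULL);

        /* Fan in: mix every voice into the output buffer on one thread. */
        float out[FRAMES] = {0};
        for (int v = 0; v < VOICES; v++)
            for (int i = 0; i < FRAMES; i++)
                out[i] += voice_buf[v][i];

        printf("%.1f\n", out[0]); /* 8 voices x 0.1 = 0.8 */
        return 0;
    }
    ```

    The hard part in a real DAW is deciding how to split the graph (effects chains create dependencies between nodes), which is why this is nontrivial for devs to add.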

    So. Finally to the answer: Multi-threaded audio processing has no automatic benefit to audio apps. They have to be specifically written to take advantage of the feature. That isn't a simple task and isn't of benefit to all apps. It's more benefit, say, to a DAW that has many audio streams to process than a reverb that has only one.

    If that’s the case, and you are only running one instance of one DAW, does it really make a difference which processor you are using, based on the OP? Like, can an iPad Air 2 and an iPad Air M1 basically have the same performance on a DAW that is still running on a single core (BM3)?

    Just as a Lamborghini is faster than a Lada, faster processors perform better and can do more before bogging down. Of course, if you're not over-taxing the processor, then it makes no difference. A Lamborghini and a Lada will arrive at the same time if sticking to a speed limit they can both attain.

    Or does the M1 chip distribute CPU workload on its own when it’s running one app? And how do you know when you’re maxing out, aside from crackling, if the meters aren’t really accurate?

    Hopefully the answer to the first question is clear now. The answer is "yes" but is also "yes" for any other multi-core iOS chip.

    The answer to the second question is "no". Because of the way iOS dynamically allocates processing resources, you can't really know when you're maxing out other than when you start to get audio dropouts or GUI freezes. In fact, you can get higher CPU % readings when doing less if the operating system has dialed back cores to save battery and reduce heat.

  • @wim you should be a teacher, you explain things excellently! Good job 👍

  • @dendy said:
    @wim you should be a teacher, you explain things excellently! Good job 👍

    Thanks. The key is acting like you know what you're talking about, whether you do or not. 😉
    There are probably some inaccuracies in there, but directionally it should be OK.

  • @wim said:

    […]

    Thanks so much for that. You really taught me something !

  • @wim said:

    […]

    Very informative, thanks Wim!

  • edited June 2023

    @hes said:

    @MisplacedDevelopment said:
    Looks like we can add MTS to the list as well. I guess this thread may have caused people to reach out to Giel for confirmation!

    https://www.multitrackstudio.com/forum/viewtopic.php?f=4&p=11419&sid=c4b3cce697e4d36a0e93b6991455c67b#p11419

    Is he really talking about the same kind of multicore support as in Cubasis? He says it's been supported "since version 1.0", but that would predate, by years, Apple's 2020 publication of the API supporting multicore iOS apps.

    It's a full-on port from desktop, and the code was ready before the handshake from Apple appeared. It handles 25 Model D at 70% CPU, tapping out at 29 instances at 94% CPU.
