
Do We Need More than 8GB for iOS Production?


Comments

  • edited April 2021

    @NeonSilicon said:

    @krassmann said:
    What about the DSP performance? I always thought that for audio it is even more important than the CPU. And I thought the DSP is a piece of hardware on the SoC chip, right?

    It depends on the DSP you need to run. The DSP is on the SoC. More RAM, at least for my DSP usage, isn't going to help. Mostly, you only send small, roughly audio-buffer-sized chunks to the DSP at a time.

    Yeah, right - RAM is not much of an issue; I think we are all on the same page about that. It must be sufficient to load the host and the plugins, and most AUs are rather small. But I don't clearly see the relation between the CPU (single- and multi-core) and the SoC's DSP regarding the actual audio performance of a device. I'm an application developer - I understand software, but I don't know signal processing.

    What does the actual DSP load in AUM mean - the hardware DSP? Would more CPU cores give you better DSP performance, or is the hardware DSP the only limiting factor? For a virtual analog synth, I guess the model of the signal flow through the synth's components is rather CPU, while the calculation of the actual signal simulation is DSP, right? So the audio performance is determined by both CPU and DSP?
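
The "audio-buffer-sized chunks" point above can be made concrete with a little arithmetic. A minimal sketch (the 256-frame buffer and 48 kHz rate are illustrative defaults, not anything AUM-specific):

```python
def callback_interval_ms(buffer_frames=256, sample_rate=48_000):
    """Time between audio render callbacks: the entire DSP chain for
    one buffer must finish within this window or playback glitches."""
    return 1000.0 * buffer_frames / sample_rate

# At 256 frames / 48 kHz the whole plugin chain has about 5.3 ms per buffer.
```

This is why per-buffer CPU headroom, not RAM, is usually the first wall you hit.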

  • @krassmann said:

    @NeonSilicon said:

    @krassmann said:
    What about the DSP performance? I always thought that for audio it is even more important than the CPU. And I thought the DSP is a piece of hardware on the SoC chip, right?

    It depends on the DSP you need to run. The DSP is on the SoC. More RAM, at least for my DSP usage, isn't going to help. Mostly, you only send small, roughly audio-buffer-sized chunks to the DSP at a time.

    Yeah, right - RAM is not much of an issue; I think we are all on the same page about that. It must be sufficient to load the host and the plugins, and most AUs are rather small. But I don't clearly see the relation between the CPU (single- and multi-core) and the SoC's DSP regarding the actual audio performance of a device. I'm an application developer - I understand software, but I don't know signal processing.

    What does the actual DSP load in AUM mean - the hardware DSP? Would more CPU cores give you better DSP performance, or is the hardware DSP the only limiting factor? For a virtual analog synth, I guess the model of the signal flow through the synth's components is rather CPU, while the calculation of the actual signal simulation is DSP, right? So the audio performance is determined by both CPU and DSP?

    I don't know how AUM calculates its DSP value. I think it is unlikely that it is a direct measure of anything to do with the actual vector unit on the SoC. I think it is more of a rough estimate of the processing power available to AUM on the whole CPU.

    Lots of AUs and VSTs never touch the vector units. The computation path for some effects and synths doesn't really work with vector computations. For example, LRC7 is 100% on the vector units. On the other hand, the AU I'm working on now is 0% on the vector unit; its computations need to go sample by sample. (I'll note that if I can't figure out a way to re-architect it to use the DSP processor somehow, it'll never be released. Right now, it'll eat 100% of the processor on even a fast iPad.)

    Some audio software nowadays will actually use the main CPU, the vector unit, the AI processor, and the GPU. So, yeah, it can all get involved. (The GPU isn't generally useful because, even though it is incredibly fast for these kinds of things, the bit size is lower and the latency is high. But there are audio use cases, like generating a whole batch of wavetables off the main DSP loop.)

    One other thing to note is that DSP code is going to run much faster if the entire thing can stay in cache on the CPU. This also leads to trying to use less RAM and tighter loops for the processing side of things.

    Also, I'm talking about one AU. If something like Logic does come to the iPad, it'll have tons of threads doing DSP, with lots of AUs and other DSP going on. That will eat RAM.
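
To illustrate the sample-by-sample point above: a feedback filter can't be handed to the vector unit wholesale, because each output depends on the previous one, while a plain gain has no such dependency. A rough Python sketch of the two shapes (illustrative only - real AUs do this in C/C++ on float buffers):

```python
def one_pole_lowpass(samples, a):
    """Feedback filter: each output depends on the previous output,
    so the loop is inherently sequential and hard to vectorize."""
    out = []
    y = 0.0
    for x in samples:
        y = a * x + (1.0 - a) * y  # recurrence on y blocks SIMD
        out.append(y)
    return out

def gain(samples, g):
    """No feedback: every sample is independent, so a vector unit
    (NEON/vDSP) can process the whole buffer in parallel."""
    return [g * x for x in samples]
```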

  • @dendy said:

    @krassmann said:
    This is an interesting post from another thread:

    @Charlesalbert said:
    Got a MacBook 16 with 32GB of RAM and it works identically to my MacBook Pro 13 with 8GB in every single audio DAW, from Reason to Logic, VCV Rack, Gadget, Ableton.

    Never experienced problems related to low RAM 😂😂😂 Despite that, I have a Windows PC with 128GB of it, just to get some nuclear-submarine result in benchmarks 😂😂😂

    Totally not surprised... it's something I've been repeating like a parrot for ages: for music apps, RAM is not that important. Especially on iOS it is almost totally irrelevant whether you have 2, 3, or 1000 GB of RAM - in 99% of use cases you run out of CPU much sooner than out of RAM. Also, some people think more RAM means more speed - this is absolutely not true (and if you feel like it does, it's a placebo effect).

    Literally EVERYTHING you should care about when you are picking hardware for your music (no matter if it is an iPad or a computer) is how fast its CPU is. The rest is almost irrelevant. If you plan to work with large sample libraries and long audio tracks, then it's wise to choose a fast SSD (on a desktop computer, obviously; on an iPad you can't choose this).

    This is true, and it was exactly my reasoning when purchasing my current iPad. I was buying used and had to choose between a regular iPad (whatever the current version was at the time) with 128GB, or an iPad Pro 11” (2018) which only had 64GB of storage.

    I went with the Pro as it had more processing power and more RAM (4GB vs 2GB on the regular iPad).

    A little over a year later and I still feel like I made the right choice. The new Pros that are coming out with the M1 chip are right on time for me as I’ve been waiting patiently to upgrade. I’ve run completely out of storage space on this 64GB and have been working with it, deleting things and moving things around to squeeze that newest app in.

    Things start acting weird when your storage is full, though. The OS needs room to breathe, as we all know - apps will start crashing when they have no room to write, etc. So it is well past time for me to upgrade (I'm trying to motivate myself here into going and spending some money, lol), especially now that I've got a couple of projects in DSA that are just beyond my current hardware's capabilities. A newer, faster processor and more RAM will be the ONLY fix for these problems. Otherwise I'd have to remove elements from projects, and who wants to do that?

  • edited April 2021

    @NeonSilicon, thanks for the explanations. I suspected that many resources are used in parallel. Cubasis 3 obviously benefited a lot from multi-core support, so the M1 will for sure be a winner. Since the user I quoted on page 1 of this thread stated that he notices no difference between the 32 GB MBP and the 8 GB MBP regardless of the DAW, I'm optimistic that the 8 GB iPad could be good enough. I know this depends on the complexity of the projects he is running. But anyway.

  • @NeuM said:

    @McD said:

    @NeuM said:
    16 GB would be awesome.

    Sure. But 32GB would be useful for musical holograms and neural fishnets to catch ideas floating in the ether. We will be replaced by our software at some point.

    You joke, but we’re getting closer and closer to real artificial intelligence. It’s not that far off... about another 15-20 years.

    AI is a misnomer. Machines are just an extension of the human mind and body.

  • @Apex said:

    @NeuM said:

    @McD said:

    @NeuM said:
    16 GB would be awesome.

    Sure. But 32GB would be useful for musical holograms and neural fishnets to catch ideas floating in the ether. We will be replaced by our software at some point.

    You joke, but we’re getting closer and closer to real artificial intelligence. It’s not that far off... about another 15-20 years.

    AI is a misnomer. Machines are just an extension of the human mind and body.

    Children are an extension of their parents, but no one would say that as they grow up they aren't capable of independent thought. So it will be with A.I.

  • Let's see what's in an M1 "System on a Chip" looking for the DSP:

    CPU = Central Processing Unit

    • four high-performance 'Firestorm' cores
    • four energy-efficient 'Icestorm' cores
      > This combination allows power-use optimizations not possible with previous Apple–Intel architecture devices. Apple claims the energy-efficient cores use one-tenth the power of the high-performance ones. The high-performance cores have 192 KB of L1 instruction cache and 128 KB of L1 data cache and share a 12 MB L2 cache; the energy-efficient cores have a 128 KB L1 instruction cache, 64 KB L1 data cache, and a shared 4 MB L2 cache.

    GPU = Graphics Processing Unit

    • eight-core graphics processing unit
    • Each GPU core contains 128 ALUs (Arithmetic Logic Units)

    The GPU contains up to 128 EUs or 1024 ALUs, which Apple claims can execute nearly 25,000 threads simultaneously, with a maximum floating-point (FP32) performance of 2.6 TFLOPS.

    Other features

    Rosetta 2 dynamic binary translation technology enables M1-equipped products to run software built for Intel x86 CPUs.

    The M1 uses 4266 MT/s LPDDR4X SDRAM in a unified memory configuration shared by all the components of the processor. The SoC and RAM chips are mounted together in a system-in-a-package design. 8 GB and 16 GB configurations are available.

    The M1 also contains:

    • dedicated neural-network hardware in a 16-core Neural Engine
    • an image signal processor (ISP)
    • an NVMe (Non-Volatile Memory) storage controller
    • Thunderbolt 4 controllers
    • a Secure Enclave (where you can hide your precious from the Dark Lord)

    DONE.

    OK? No DSP.

    I think of DSP as a verb - Digital Signal Processing.

    I am Digital Signal Processing right now... call back in 10 milliseconds.

    So, AUM is telling us what it thinks the CPU is spending most of its time doing when we load it up with our apps.

    Did you notice we get to choose between two optional SoPs?
    I'm curious to know how long the battery lasts for 8 GB vs 16 GB.
    If you see a clue... let us know. Choosing 1TB vs 2TB probably impacts battery too.

    How long does a busy 16GB 2TB model run without re-charge?

    I've been leaving a lot of apps running on my phone lately and it runs out pretty quickly
    when I forget to kill them all.

  • @NeuM said:

    @Apex said:

    @NeuM said:

    @McD said:

    @NeuM said:
    16 GB would be awesome.

    Sure. But 32GB would be useful for musical holograms and neural fishnets to catch ideas floating in the ether. We will be replaced by our software at some point.

    You joke, but we’re getting closer and closer to real artificial intelligence. It’s not that far off... about another 15-20 years.

    AI is a misnomer. Machines are just an extension of the human mind and body.

    Children are an extension of their parents, but no one would say that as they grow up they aren't capable of independent thought. So it will be with A.I.

    Nope, sorry. The only thing happening is that actual humans are getting better at programming their own creations.

  • @Apex said:

    @NeuM said:

    @Apex said:

    @NeuM said:

    @McD said:

    @NeuM said:
    16 GB would be awesome.

    Sure. But 32GB would be useful for musical holograms and neural fishnets to catch ideas floating in the ether. We will be replaced by our software at some point.

    You joke, but we’re getting closer and closer to real artificial intelligence. It’s not that far off... about another 15-20 years.

    AI is a misnomer. Machines are just an extension of the human mind and body.

    Children are an extension of their parents, but no one would say that as they grow up they aren't capable of independent thought. So it will be with A.I.

    Nope, sorry. The only thing happening is that actual humans are getting better at programming their own creations.

    You don’t have to take my word for it, just watch what happens.

  • @McD said:
    Let's see what's in an M1 "System on a Chip" looking for the DSP:

    CPU = Central Processing Unit

    • four high-performance 'Firestorm' cores
    • four energy-efficient 'Icestorm' cores
      > This combination allows power-use optimizations not possible with previous Apple–Intel architecture devices. Apple claims the energy-efficient cores use one-tenth the power of the high-performance ones. The high-performance cores have 192 KB of L1 instruction cache and 128 KB of L1 data cache and share a 12 MB L2 cache; the energy-efficient cores have a 128 KB L1 instruction cache, 64 KB L1 data cache, and a shared 4 MB L2 cache.

    GPU = Graphics Processing Unit

    • eight-core graphics processing unit
    • Each GPU core contains 128 ALUs (Arithmetic Logic Units)

    The GPU contains up to 128 EUs or 1024 ALUs, which Apple claims can execute nearly 25,000 threads simultaneously, with a maximum floating-point (FP32) performance of 2.6 TFLOPS.

    Other features

    Rosetta 2 dynamic binary translation technology enables M1-equipped products to run software built for Intel x86 CPUs.

    The M1 uses 4266 MT/s LPDDR4X SDRAM in a unified memory configuration shared by all the components of the processor. The SoC and RAM chips are mounted together in a system-in-a-package design. 8 GB and 16 GB configurations are available.

    The M1 also contains:

    • dedicated neural-network hardware in a 16-core Neural Engine
    • an image signal processor (ISP)
    • an NVMe (Non-Volatile Memory) storage controller
    • Thunderbolt 4 controllers
    • a Secure Enclave (where you can hide your precious from the Dark Lord)

    DONE.

    OK? No DSP.

    I think of DSP as a verb - Digital Signal Processing.

    I am Digital Signal Processing right now... call back in 10 milliseconds.

    So, AUM is telling us what it thinks the CPU is spending most of its time doing when we load it up with our apps.

    Did you notice we get to choose between two optional SoPs?
    I'm curious to know how long the battery lasts for 8 GB vs 16 GB.
    If you see a clue... let us know. Choosing 1TB vs 2TB probably impacts battery too.

    How long does a busy 16GB 2TB model run without re-charge?

    I've been leaving a lot of apps running on my phone lately and it runs out pretty quickly
    when I forget to kill them all.

    They do have a DSP unit in them (as in Digital Signal Processor). Or, at least, the CPU has special hardware to handle SIMD instructions. They don't talk about it much, but it is good, and Apple provides excellent libraries to use it. The main component of Apple's library is vDSP. The current ARM name for the instructions is NEON. The next instruction set ARM is moving to improves on these with a much faster set of instructions.

    Lots of DSP is actually done on the main CPU instructions, but it does make sense to me that people refer to and think of the DSP usage as being on a special unit. The NeXT Cube came with a DSP coprocessor just like an FPU.
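
For a sense of what a vDSP call does conceptually: functions like vDSP_vmul are elementwise buffer operations that the hardware can execute several samples per instruction. A plain-Python stand-in (not the real API, just the semantics):

```python
def vmul(a, b):
    """Elementwise multiply of two equal-length buffers - the semantics
    of Accelerate's vDSP_vmul. On Apple silicon the real call is
    executed as NEON SIMD instructions on several floats at once."""
    return [x * y for x, y in zip(a, b)]
```

Because every output element is independent, this is exactly the shape of computation SIMD units are built for - unlike the sample-by-sample feedback paths discussed earlier in the thread.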

  • @NeonSilicon said:
    They do have a DSP unit in them (as in Digital Signal Processor). Or, at least the CPU has special hardware to handle the SIMD instructions.

    I think since M1's are "ARM processors" they have "NEON SIMD" instructions.

    Apple's vDSP library is available to programmers to save them from writing hardware instructions.

    But there's no specific component assigned to audio work, as there is for graphics, for example.

    I suspect there will be uses for the 16-core Neural Engine that could benefit DSP creators.
    Like @chowdsp... maybe he will comment here to bring some expertise to this question.

    I'd expect he has a pent-up desire to get access to M1 hardware for his research into DSP programming. We should encourage a Patreon or GoFundMe effort to get him one, since he's a college student. His two apps, ChowCentaur and ChowTape, demonstrate great skill at DSP coding.

  • @McD said:

    @NeonSilicon said:
    They do have a DSP unit in them (as in Digital Signal Processor). Or, at least the CPU has special hardware to handle the SIMD instructions.

    I think since M1's are "ARM processors" they have "NEON SIMD" instructions.

    Apple's vDSP library is available to programmers to save them from writing hardware instructions.

    But there's no specific component assigned to audio work, as there is for graphics, for example.

    I suspect there will be uses for the 16-core Neural Engine that could benefit DSP creators.
    Like @chowdsp... maybe he will comment here to bring some expertise to this question.

    I'd expect he has a pent-up desire to get access to M1 hardware for his research into DSP programming. We should encourage a Patreon or GoFundMe effort to get him one, since he's a college student. His two apps, ChowCentaur and ChowTape, demonstrate great skill at DSP coding.

    For me the best thing about the Accelerate framework and vDSP has been the portability across several platforms and changes with Apple. I've been using much of the same code in my apps for more than a decade (almost two now) and every time Apple changes processors or makes a new platform the code continues working.

    I think the neural engine in the M1 is the same as the one in the A14. I'd guess there are going to be some musical audio applications for sure. But I haven't had a chance to play with it yet, since I didn't have access to Apple's AI processor until I recently got my Mac Mini; my iPad is too old. I've been wanting to see if I could do some polyphonic pitch recognition with it and see if it could get up to real-time usage. It would be pretty slick if someone could pull this off.

  • edited April 2021

    Damn, and to think I started this discussion! Lol. I spoke to @Michael beforehand and he thought 8GB was probably enough for audio. I should have left well enough alone... like on that other thread I posted
    😳😳😳😳👀👀👀👀

  • @LinearLineman said:
    Damn, and to think I started this discussion! Lol. I spoke to @Michael beforehand and he thought 8GB was probably enough for audio. I should have left well enough alone... like on that other thread I posted
    😳😳😳😳👀👀👀👀

    8GB will be great for those who can afford the upgrade, for sure. But it's all above $1,000 to play. In time, the entry-level M1 products will drop in price to ensure dominance in the education market. Maybe another year or two.

  • @McD said:

    @NeonSilicon said:
    They do have a DSP unit in them (as in Digital Signal Processor). Or, at least the CPU has special hardware to handle the SIMD instructions.

    I think since M1's are "ARM processors" they have "NEON SIMD" instructions.

    Apple's vDSP library is available to programmers to save them from writing hardware instructions.

    But there's no specific component assigned to audio work, as there is for graphics, for example.

    I suspect there will be uses for the 16-core Neural Engine that could benefit DSP creators.
    Like @chowdsp... maybe he will comment here to bring some expertise to this question.

    I'd expect he has a pent-up desire to get access to M1 hardware for his research into DSP programming. We should encourage a Patreon or GoFundMe effort to get him one, since he's a college student. His two apps, ChowCentaur and ChowTape, demonstrate great skill at DSP coding.

    Yes, the M1 chips definitely use the ARM NEON instruction set for their SIMD instructions (as opposed to SSE or AVX instruction sets supported by Intel and some others).

    Actually both of my iOS plugins currently have neural network inferencing happening under the hood, which is running on the CPU using some vDSP calls internally. I wonder what kind of overhead it would incur to try running some of that stuff on the Neural Engine instead...

    My partner has actually been talking about getting one of the new M1-based Macs to use for her video editing projects, so if/when that happens I could definitely experiment with some audio processing, and see what some of the possibilities are.
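
The "inference via vDSP calls" point comes down to the fact that a dense neural-network layer is just dot products - the same multiply-accumulate loops audio DSP already uses. A toy sketch (hypothetical weights; real inference uses optimized vector/matrix routines, not Python loops):

```python
def dense_layer(weights, biases, x):
    """One fully connected layer: output[i] = dot(weights[i], x) + biases[i].
    These multiply-accumulates map directly onto vDSP/NEON vector ops,
    which is why CPU-side inference is practical for small audio models."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, biases)]
```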

  • One sentence sums it up for me: hamsters on a wheel.

    I understand all the tech ins and outs of how 8GB, then 16GB, and then 32GB are going to change my life. The thing is that I could do what I wanted on a Pro 10.5 with 4GB of RAM, and now I own a 2020 Pro with 6GB.

    You can probably tell that I’ve been saturated by all the next big thing game changer talk. ;).

    When it comes to multitracking, the iPad still doesn't hack it for me due to screen size. I do it on my 27” 2011 iMac. Same goes for video editing.

    I doubt the M1 chip or anything above 6GB of RAM is going to make any difference to my iPad music-making any time soon.

    I just did a back-of-the-envelope calculation: 1GB is about 23 minutes of 32-bit/96kHz stereo audio - plenty for caching, but keeping all the audio/samples for a project in RAM could plausibly use a good chunk of 16GB. Whether the impact of streaming from SSD is enough to justify pre-loading is an open question...
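
That back-of-the-envelope figure checks out: uncompressed 32-bit stereo at 96 kHz runs 96,000 × 4 × 2 = 768 KB per second.

```python
def minutes_per_gib(sample_rate=96_000, bytes_per_sample=4, channels=2):
    """How long one GiB of uncompressed audio lasts at the given format."""
    bytes_per_second = sample_rate * bytes_per_sample * channels  # 768,000
    return (1024 ** 3) / bytes_per_second / 60

# Roughly 23 minutes per GiB at 32-bit/96 kHz stereo, as estimated above.
```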

  • @Apex said:

    @NeuM said:

    @Apex said:

    @NeuM said:

    @McD said:

    @NeuM said:
    16 GB would be awesome.

    Sure. But 32GB would be useful for musical holograms and neural fishnets to catch ideas floating in the ether. We will be replaced by our software at some point.

    You joke, but we’re getting closer and closer to real artificial intelligence. It’s not that far off... about another 15-20 years.

    AI is a misnomer. Machines are just an extension of the human mind and body.

    Children are an extension of their parents, but no one would say that as they grow up they aren't capable of independent thought. So it will be with A.I.

    Nope, sorry. The only thing happening is that actual humans are getting better at programming their own creations.

    For now, that's correct. IMHO it will remain correct for quite a while. But I've heard no good reason to rule out the possibility of a SkyNet singularity point some time in the future. :)
