
How can devs make use of the 3rd CPU core (Air 2)?

edited November 2014 in General App Discussion

Title says it. I'm just wondering if Apple has to "enable" it or if the devs need to update their apps for it. As it is, I get roughly a 30% CPU boost (coming from the Mini Retina) in every app where I can watch a CPU meter (Auria, Cubasis, etc.). That is in line with the benchmarks for two-core performance of the Air 2 compared to previous models. But what about the 3rd core? On Mac/PC the devs are (always?) responsible for it, but on iOS? I want the power I paid for :)

Comments

  • edited November 2014

    Don't worry, it is being used. There is always a lot of multi-tasking going on, even within audio apps, from disk I/O, waveform calculations, pre-calculated audio processing, the user interface itself, and the main audio pipeline. Given that and all the potential background processing from other apps and the system services, you can rest assured those CPUs are being used for something.

    So even though the critical audio processing path is rarely multi-cpu, any relief that the system can provide to our realtime audio callbacks is a good thing.

  • Does that mean devs don't need to update, since iOS does the allocation of CPU load itself?

  • I would say yes; it's the system's job to allocate the CPU.

  • edited November 2014

    @sonosaurus said:

    Don't worry, it is being used. There is always a lot of multi-tasking going on, even within audio apps, from disk I/O, waveform calculations, pre-calculated audio processing, the user interface itself, and the main audio pipeline. Given that and all the potential background processing from other apps and the system services, you can rest assured those CPUs are being used for something.

    So even though the critical audio processing path is rarely multi-cpu, any relief that the system can provide to our realtime audio callbacks is a good thing.

    Thanks for your answer, although I'm not really satisfied with it ;) Auria is a really good stress test/performance meter, and of course I force quit ALL other apps in the background (plus a soft reset), even switching Wi-Fi off before starting serious work or even testing things. So there must be way more power available. I just can't believe that the system now eats up a whole core (or even more) when all the other devices (the newer ones like the iPhone 5s, Mini Retina, Air 1 and the new phones) handle the same tasks with no real noticeable difference. Yes, it is a little snappier than my Mini Retina in daily usage, but the only place I notice a big difference is RAM. That really is a jump I'm glad to have now.

  • Don't forget about the 8-core GPU. GPUs can also be used for audio processing, according to this series:

    https://itunes.apple.com/us/itunes-u/programming-massively-parallel/id384233322?mt=10

  • GPU is not OK for audio because of latency.

  • edited November 2014

    @Goozoon said:

    GPU is not OK for audio because of latency.

    That's right, but maybe for rendering? That should work.

    (Processing: mastering, normalize, trim...)
