
Most important news from WWDC: audio apps can now run multiple audio threads!

edited June 2020 in General App Discussion

New API, new possibilities ... this looks really cool

https://developer.apple.com/videos/play/wwdc2020/10224/


Comments

  • Great, new audio unit features :)
    I hope this means we should see improved performance in iOS 14 if developers update their apps.

  • If it's finally for real, that's great news.

  • @Michael ? Any comments?

  • wimwim
    edited June 2020

    I love it when I have to eat my pessimistic words! B)

  • Fuck, I hope this doesn’t break everything.
    Again.

  • wimwim
    edited June 2020

    @CracklePot said:
    Fuck, I hope this doesn’t break everything.
    Again.

    It changes nothing unless programmers adapt their code to use it. And that would be a big job to do. Most likely only newly developed apps will go there.

    Just buying some ghost peppers doesn't automatically make all your food too spicy.

  • @dendy said:
    New API, new possibilities ... this looks really cool

    https://developer.apple.com/videos/play/wwdc2020/10224/

    Haven't had a chance to watch this yet, but this is of most interest to hosts, right? So they can separate chains across cores, like on desktop?

  • @wim said:

    @CracklePot said:
    Fuck, I hope this doesn’t break everything.
    Again.

    It changes nothing unless programmers adapt their code to use it.
    Just buying some ghost peppers doesn't automatically make all your food too spicy.

    Yeah, unless Apple decides for everyone, like they are so fond of doing.
    I just don’t want to have all my apps break, need updates, and a bunch get abandoned.
    Already went through that. It sucked.

  • wimwim
    edited June 2020

    @CracklePot said:

    @wim said:

    @CracklePot said:
    Fuck, I hope this doesn’t break everything.
    Again.

    It changes nothing unless programmers adapt their code to use it.
    Just buying some ghost peppers doesn't automatically make all your food too spicy.

    Yeah, unless Apple decides for everyone, like they are so fond of doing.

    That is literally impossible in this case.

    If your grocer starts stocking Ghost Peppers, and even sneaks a jar into your bag when you're not looking, they still don't end up in your food unless you put them there.

  • @wim said:

    @CracklePot said:

    @wim said:

    @CracklePot said:
    Fuck, I hope this doesn’t break everything.
    Again.

    It changes nothing unless programmers adapt their code to use it.
    Just buying some ghost peppers doesn't automatically make all your food too spicy.

    Yeah, unless Apple decides for everyone, like they are so fond of doing.

    That is literally impossible in this case.

    Thank you for the reassurance.
    👍🏻

  • @dendy said:
    New API, new possibilities ... this looks really cool

    https://developer.apple.com/videos/play/wwdc2020/10224/

    Well, hopefully a certain developer of a certain DAW releases an audio tracks update first before trying to work this new API into said certain DAW. ;) LOL!

  • edited June 2020

    @jwmmakerofmusic said:

    @dendy said:
    New API, new possibilities ... this looks really cool

    https://developer.apple.com/videos/play/wwdc2020/10224/

    Well, hopefully a certain developer of a certain DAW releases an audio tracks update first before trying to work this new API into said certain DAW. ;) LOL!

    Wait, what? A DAW without audio tracks? :trollface:

  • @AudioGus said:

    @jwmmakerofmusic said:

    @dendy said:
    New API, new possibilities ... this looks really cool

    https://developer.apple.com/videos/play/wwdc2020/10224/

    Well, hopefully a certain developer of a certain DAW releases an audio tracks update first before trying to work this new API into said certain DAW. ;) LOL!

    Wait, what? A DAW without audio tracks? :trollface:

    LOL! I still manage to make it work somehow (just like I made Fruity Loops work prior to it having audioclips). :mrgreen: But Audiotracks would up my level immensely. ;)

  • what does this mean for apps? i dont really understand

  • heshes
    edited June 2020

    @itsaghost said:
    what does this mean for apps? i dont really understand

    I'm not sure. I think it may not mean much for plugins, but be very important for AU hosts. Right now any apps producing audio are contained in the same "thread", which means they will all run on a single processor core, even if the computer has 4 cores (most iPads) or 8 cores (iPad Pros, with their A__x chips). So there's a lot of unused processing power on these machines if all you've got running is audio apps. E.g., if you've got three different plugins running inside AUM, they're all (all the plugins plus the AUM host) using the same processor core, so only one of them can be working at a time; the others all stop while one is running, and iOS manages dealing out slices of processor time to each app. Meanwhile, all the other processor cores may be sitting unused.

    My understanding is that with the upcoming multiprocessing ability a host app (e.g., a DAW, AUM) that's loading multiple AU plugins could be enabled to schedule the plugins to do their processing on different cores, and manage bringing the results of their processing back into the main thread that the host is run in. And the apps on different cores can all process at the same time; they don't "block" each other. So you can get a big increase in processing performance without upgrading hardware at all, because the processor cores that were formerly unused can now be used to process audio. It's not a terribly efficient process, so, e.g., you don't get a full doubling of power if you enable two cores to do processing instead of one, but you would get a respectable increase.
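The fan-out/fan-in pattern described above can be sketched generically. This is a minimal, hypothetical C++ illustration using plain `std::thread` (not the actual Core Audio workgroups API); `renderChain` and the gain-only "DSP" are made-up stand-ins for real plugin chains:

```cpp
// Hypothetical sketch (NOT the real Core Audio API): a host renders several
// independent plugin chains in parallel, one per worker thread, then mixes
// the results back together on the host's own render thread.
#include <cstddef>
#include <thread>
#include <vector>

constexpr std::size_t kFrames = 512;

// Stand-in for one plugin chain rendering a buffer of samples.
static void renderChain(float gain, std::vector<float>& out) {
    for (std::size_t i = 0; i < kFrames; ++i)
        out[i] = gain;  // a real chain would run its DSP here
}

std::vector<float> renderAllChains(const std::vector<float>& gains) {
    std::vector<std::vector<float>> buffers(gains.size(),
                                            std::vector<float>(kFrames));
    std::vector<std::thread> workers;
    // Fan out: each chain renders on its own thread (ideally its own core).
    for (std::size_t c = 0; c < gains.size(); ++c)
        workers.emplace_back(renderChain, gains[c], std::ref(buffers[c]));
    for (auto& w : workers) w.join();  // host waits for every chain

    // Fan in: mix the per-chain buffers on the host's thread.
    std::vector<float> mix(kFrames, 0.0f);
    for (const auto& buf : buffers)
        for (std::size_t i = 0; i < kFrames; ++i)
            mix[i] += buf[i];
    return mix;
}
```

The key point the explanation makes is visible here: the chains never block each other while rendering, and the host only synchronizes once per buffer, at the join/mix step.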

  • @hes said:

    @itsaghost said:
    what does this mean for apps? i dont really understand

    I'm not sure. I think it may not mean much for plugins, but be very important for AU hosts. Right now any apps producing audio are contained in the same "thread", which means they will all be processed on a single processor core, even if the computer has 4 cores (most ipads) or 8 cores (ipad pros, with their A__x chips). So there's a lot of unused processing power on these machines if all you've got running is audio apps. E.g., if you've got three different plugins running inside AUM, they're all using the same processor core so only one of them can be processing data at a time, the others all stop while one is running, and iOS manages dealing out slices of processing time to each app. Meanwhile, all the other processing cores may be sitting unused.

    My understanding is that with the upcoming multiprocessing ability a host app (e.g., a DAW, AUM) that's loading multiple AU plugins could be enabled to schedule the plugins to do their processing on different cores, and manage bringing the results of their processing back into the main thread that the host is run in. And the apps on different cores can all process at the same time; they don't "block" each other. So you can get a big increase in processing performance without upgrading hardware at all, because the processor cores that were formerly unused can now be used to process audio. It's not a terribly efficient process, so, e.g., you don't get a full doubling of power if you enable two cores to do processing instead of one, but you would get a respectable increase.

    thank you so much for explaining that! that helped clear things up for me a lot. it definitely seems like it’s gunna help boost the performance of apps that take advantage of it.

  • @jwmmakerofmusic said:

    @AudioGus said:

    @jwmmakerofmusic said:

    @dendy said:
    New API, new possibilities ... this looks really cool

    https://developer.apple.com/videos/play/wwdc2020/10224/

    Well, hopefully a certain developer of a certain DAW releases an audio tracks update first before trying to work this new API into said certain DAW. ;) LOL!

    Wait, what? A DAW without audio tracks? :trollface:

    LOL! I still manage to make it work somehow (just like I made Fruity Loops work prior to it having audioclips). :mrgreen: But Audiotracks would up my level immensely. ;)

    This with track-bouncing has the potential (if it has decent enough splitting and fades etc on clips) for me to load in old projects from other places (BM3, desktop etc) and finish them off on iPad. Sigh.

  • @hes thanks a lot for this explanation! Big deal indeed, hope it is not too hard for devs to adapt to.

  • @hes said:

    @itsaghost said:
    what does this mean for apps? i dont really understand

    I'm not sure. I think it may not mean much for plugins, but be very important for AU hosts. Right now any apps producing audio are contained in the same "thread", which means they will all be processed on a single processor core, even if the computer has 4 cores (most ipads) or 8 cores (ipad pros, with their A__x chips). So there's a lot of unused processing power on these machines if all you've got running is audio apps. E.g., if you've got three different plugins running inside AUM, they're all using the same processor core so only one of them can be processing data at a time, the others all stop while one is running, and iOS manages dealing out slices of processing time to each app. Meanwhile, all the other processing cores may be sitting unused.

    My understanding is that with the upcoming multiprocessing ability a host app (e.g., a DAW, AUM) that's loading multiple AU plugins could be enabled to schedule the plugins to do their processing on different cores, and manage bringing the results of their processing back into the main thread that the host is run in. And the apps on different cores can all process at the same time; they don't "block" each other. So you can get a big increase in processing performance without upgrading hardware at all, because the processor cores that were formerly unused can now be used to process audio. It's not a terribly efficient process, so, e.g., you don't get a full doubling of power if you enable two cores to do processing instead of one, but you would get a respectable increase.

    Thank you for your explanation. So this probably means we have to wait for iPadOS 14 to get a GarageBand update. I'm sure they want to incorporate all the new technologies.

  • @Jimmy said:
    Thank you for your explanation. So this probably means we have to wait for iPadOS 14 to get a GarageBand update. I'm sure they want to incorporate all the new technologies.

    Not saying it's not gonna happen, but GB hasn't exactly been a way of showcasing the latest technologies, for example it doesn't support MIDI AUv3 plugins.

  • @NoiseFloored said:

    @Jimmy said:
    Thank you for your explanation. So this probably means we have to wait for iPadOS 14 to get a GarageBand update. I'm sure they want to incorporate all the new technologies.

    Not saying it's not gonna happen, but GB hasn't exactly been a way of showcasing the latest technologies, for example it doesn't support MIDI AUv3 plugins.

    GarageBand definitely never seems to go for the cutting edge ... understandably since it's free. However, this could be a way of laying the groundwork for Logic on iOS ... someday. I'm not in the tinfoil hat camp that thinks it's right around the corner, but do see plenty of indication that it's headed that direction within the next few years.

  • edited June 2020

    @hes
    I'm not sure. I think it may not mean much for plugins, but be very important for AU hosts.

    @Turntablist said:
    Haven’t had chance to watch this yet, but this is of most interest to hosts right, so they can separate chains across cores, like desktop ?

    Not just hosts, also plugins. Both host and plugin will be capable of creating as many parallel threads as they want - although it's suggested not to create more threads than the number of physical cores. The good thing is there's a simple method to check the number of cores, so an app (host or plugin) can run just as many threads as is ideal for the given device...

    From that description it sounds very cool, like Apple took a real leap forward.

    Note: Currently all plugins on iOS run "off thread" - they use their own audio thread, independent from the host's audio thread. All instances of the same plugin then share one main thread (which is where the UI runs) - that's why a crash of one instance in many cases takes down all instances.

    What that guy describes in the video is even more interesting: he mentioned (if I got it correctly) that it will be possible to group the main thread with the audio threads too - which, based on my understanding, should lead to a more stable system, and a lot less chance of one instance taking down all the other instances or even the whole host...

    What looks most amazing to me about this new concept is that some of those threads can run in sync with the main system audio thread, but their work cycles can actually run a lot longer - even across multiple buffer rounds of the main audio thread! That may drastically simplify all the synchronization issues between the main and audio threads... This looks like a game changer for audio app development to me :-)

    It would be interesting to hear opinions about this from @j_liljedahl @Michael @brambos and others. I'm new to this area, still just learning all this stuff, so I may not be understanding all of it precisely right.

    @wim
    As I mentioned in another thread, they also moved the Swift standard library below the Foundation framework in the stack, which means you can use Swift for low-level APIs where previously C/C++ was needed...

    Not sure if it's also relevant to real-time audio thread code - but if it means no more locks when calling Swift methods, that would be even more amazing than multiple threads. [Take this just as my speculation; there's a good chance I'm understanding it totally wrong.]
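The "check the number of cores so you don't spawn more threads than the device can run" advice mentioned above can be sketched portably. On iOS you'd query the platform directly; here is a minimal, generic C++ sketch (the function name `workerThreadCount` is made up for illustration):

```cpp
// Sketch: cap the number of render worker threads at the hardware core
// count, reserving one core's worth of budget for the host's own audio
// thread. Portable stand-in for a platform-specific core-count query.
#include <algorithm>
#include <cstddef>
#include <thread>

std::size_t workerThreadCount(std::size_t chains) {
    std::size_t cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;  // the query may return 0 when unknown
    // Keep one core free for the host's main audio thread when possible.
    std::size_t budget = (cores > 1) ? cores - 1 : 1;
    return std::min(chains, budget);
}
```

So with, say, 10 plugin chains on a 4-core iPad, you'd spin up 3 workers and let each one render several chains in turn, rather than creating 10 threads that fight over the scheduler.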

  • wimwim
    edited June 2020

    @dendy said:
    What that guy describes in that video is even more interesting: he mentioned (if I got it correctly) it will be possible to group the main thread with the audio threads too, so the whole app will run "out of process" - which to my understanding would mean a lot more stable system, and a lot less chance of one instance taking down all other instances or even the whole host...

    I haven't watched the video yet, but do recall that MacOS Audio Units could already be set to run out of process. This was a feature not available on iOS. Do you recall if iOS was specifically mentioned as being able to do this with the new changes?

    I also wonder if there will be any backwards iOS version compatibility for this feature. It could be very tricky writing a plugin that doesn't only run on iOS 14 without that.

  • edited June 2020

    @wim said:

    @dendy said:
    What that guy describes in that video is even more interesting: he mentioned (if I got it correctly) it will be possible to group the main thread with the audio threads too, so the whole app will run "out of process" - which to my understanding would mean a lot more stable system, and a lot less chance of one instance taking down all other instances or even the whole host...

    I haven't watched the video yet, but do recall that MacOS Audio Units could already be set to run out of process. This was a feature not available on iOS. Do you recall if iOS was specifically mentioned as being able to do this with the new changes?

    No, that part already exists now.

    On macOS, a plugin can run on-thread or off-thread. It's up to the plugin developer to decide (on-thread has slightly better performance, but a crash very likely takes down the whole host; I think the host can reject an on-thread run even if the plugin wants it).

    On iOS, ONLY off-thread running is possible. That's the current situation. Multiple instances of the same plugin share the same main (UI) thread, but they do have their own audio threads.

    I also wonder if there will be any backwards iOS version compatibility for this feature. It could be very tricky writing a plugin that doesn't only run on iOS 14 without that.

    This is basically just an enhancement of the current state. The old APIs will still work, so this will not affect old plugins in any way.

  • heshes
    edited June 2020

    @dendy said:
    Note: Currently all plugins on iOS run "off thread" - they use their own audio thread, independent from the host's audio thread. All instances of the same plugin then share one main thread (which is where the UI runs) - that's why a crash of one instance in many cases takes down all instances.

    The plugins may be on a different thread, but they run on the same processor core as the host thread, at least as I understand it. Which means only one can be running at once. I.e., all the threads on a processor core receive time slices from the OS, can run only during their scheduled time slice, and are blocked during all time slices other than their own.

    I expect a plugin could be made to manage multiprocessing in the new system, which, as I understand it, would mean that a plugin instance could run threads of its own on different cores. But I don't see much advantage to that. There's a huge advantage to a host app with, say, 5 plugin instances running each plugin instance on a separate core. Beyond that, having a plugin itself manage multiple cores would not seem to provide much advantage; the host will presumably already have scheduled things out efficiently. Would gains from enabling this multiprocessing in a plugin outweigh the complexity of implementing it? I'm certainly no expert on this stuff, but I doubt it. Are there plugins that manage their own multiprocessing on desktop OSes?

  • edited June 2020

    Beyond that, having a plugin itself manage multiple cores, would not seem to provide much advantage; the host will presumably already have scheduled things out efficiently

    Nope :-) The host isn't scheduling anything. The host uses its own audio thread for its own stuff, but cannot affect the threading of the plugin in any way... it just sends the plugin some data (parameters + audio/MIDI stream) and gets the result back, but has zero impact on the plugin's threading.
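The host/plugin boundary described above can be sketched in a few lines. This is a generic, hypothetical C++ illustration (the `Plugin` interface and `GainPlugin` are invented for the example, not any real AU API): the host sees only a process-this-buffer contract, so whatever threads the plugin uses internally are invisible to it.

```cpp
// Sketch of the host/plugin boundary: the host hands the plugin an input
// buffer and receives an output buffer. Any internal threading the plugin
// does behind process() is opaque to the host.
#include <cstddef>

struct Plugin {
    virtual ~Plugin() = default;
    // The only contract the host sees: audio in, audio out.
    virtual void process(const float* in, float* out, std::size_t n) = 0;
};

struct GainPlugin : Plugin {
    float gain;
    explicit GainPlugin(float g) : gain(g) {}
    void process(const float* in, float* out, std::size_t n) override {
        // A real plugin could parallelize this loop internally;
        // the host calling process() couldn't tell the difference.
        for (std::size_t i = 0; i < n; ++i) out[i] = in[i] * gain;
    }
};
```

This is why, as dendy says, a host can't "schedule" a plugin's internal work: it only ever calls the opaque entry point and consumes the result.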

  • edited June 2020

    . deleted - not 100% sure with what i wrote here .
    :-)

  • How many instances of Model D?

  • @hes said:
    Are there plugins that manage their own multiprocessing on desktop OS's?

    No, it is all dealt with by the host.
