
Moog goes Apple Vision Pro

Comments

  • @MoogMusicInc said:
    Our director of software development recorded this unscripted latency and responsiveness exploration today, if you're interested:

    ❤️ your friends at Moog

    Thanks for this! I’d been wondering about the system’s limit for responding to repeated triggers, in the context of real-time drum triggering on an MPC-style grid; this certainly suggests that the AVP should be able to keep up.

  • Anyone got $4000 I can borrow?

  • @Stuntman_mike said:
    Anyone got $4000 I can borrow?

    …or, maybe you could live the rest of your life with just one kidney? 🤔🤔

  • edited February 3

    @HolyMoses said:

    @Stuntman_mike said:
    Anyone got $4000 I can borrow?

    …or, maybe you could live the rest of your life with just one kidney? 🤔🤔

    🤣 I’ll probably wait until Temu starts selling it for $39.99 🙃

  • This article on CDM indicates that you can use the app as an MPE MIDI controller for other synths: https://cdm.link/2024/02/moog-animoog-galaxy-apple-vision-pro/

  • @MoogMusicInc said:

    @HolyMoses said:
    I just wonder what the latency is when “playing” software synthesizers or piano apps on Vision Pro?

    There are two reasons I will never buy this kind of hardware.

    1. In its current iteration, 3500–4000 bucks is way more than I would pay… Sorry, dear Apple.
    2. I hate wearing things on my head, even when it’s super cold outside… I have a hard time wearing even Pro headphones, but in music situations I have to, albeit in small doses…

    Our director of software development recorded this unscripted latency and responsiveness exploration today, if you're interested:

    ❤️ your friends at Moog

    That’s absolutely stunning. Well done

  • @celtic_elk said:
    This article on CDM indicates that you can use the app as an MPE MIDI controller for other synths: https://cdm.link/2024/02/moog-animoog-galaxy-apple-vision-pro/

    You sure can. It even has a local-off mode, so you can use the Animoog Galaxy sequencer to play its loaded presets while using the keyboard gestures to control only other iPad synths running on the same Vision Pro, or external MIDI devices.

    You can also partially use it in immersive mode while playing on a physical controller that's connected over Bluetooth MIDI and visible in passthrough.

    For example, in this picture we used a Sub25 connected with a WIDI Master plugged into its DIN MIDI ports.

    ❤️ your friends at Moog
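(For anyone wondering how that external-gear routing looks from an app's point of view: the sketch below just enumerates the CoreMIDI destinations visible to an iPad app, which is where a DIN synth bridged over Bluetooth MIDI, e.g. via a WIDI Master, would appear once paired. This is an illustrative sketch assuming CoreMIDI behaves on visionOS as it does on iPadOS; it is not Moog's code, and the "DestinationLister" client name is made up.)

```swift
import CoreMIDI

// Minimal sketch: list the MIDI destinations an iPad app can see.
// A hardware synth bridged over Bluetooth MIDI (e.g. via a WIDI Master)
// shows up as an ordinary destination once the pairing is established.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("DestinationLister" as CFString, &client, nil)

for index in 0..<MIDIGetNumberOfDestinations() {
    let destination = MIDIGetDestination(index)
    var name: Unmanaged<CFString>?
    MIDIObjectGetStringProperty(destination, kMIDIPropertyDisplayName, &name)
    let displayName = name?.takeRetainedValue() as String? ?? "unknown"
    print("Destination \(index): \(displayName)")
}
```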

  • @MoogMusicInc said:
    You sure can. It even has a local-off mode, so you can use the Animoog Galaxy sequencer to play its loaded presets while using the keyboard gestures to control only other iPad synths running on the same Vision Pro, or external MIDI devices.

    You can also partially use it in immersive mode while playing on a physical controller that's connected over Bluetooth MIDI and visible in passthrough.

    For example, in this picture we used a Sub25 connected with a WIDI Master plugged into its DIN MIDI ports.

    ❤️ your friends at Moog

    Slightly off-topic, but since y’all have apparently been experimenting with these capabilities: what can you tell us about the music-making experience on the Vision Pro? How well do iPad-native apps work in that environment? Do AUv3 hosts and plugins work as expected?

  • @celtic_elk said:
    Slightly off-topic, but since y’all have apparently been experimenting with these capabilities: what can you tell us about the music-making experience on the Vision Pro? How well do iPad-native apps work in that environment? Do AUv3 hosts and plugins work as expected?

    All iPad apps we tried worked exactly as expected. There's one annoyance: gaze tracking only works when using native UIKit controls, which can make it difficult to know what you're focusing on at times. Apple is aware of this and is looking at providing a solution in a future visionOS update.

    AUv3 in AUM and Drambo worked fine; GarageBand, Cubasis and Logic are not available. The sample rate is fixed at 48 kHz and the buffer size at 480 frames.

    One cool thing to do is to open multiple iPad synth apps and position them all around you; a standalone sequencer like Xequence then lets you sequence all the instruments while having immediate access to each full UI.

    Hope this helps.

    ❤️ your friends at Moog
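(The fixed 48 kHz / 480-frame figures above work out to roughly a 10 ms I/O buffer. Below is a minimal sketch for checking what an iPad audio app actually gets from the system, assuming standard AVAudioSession behaviour carries over to visionOS unchanged; the expected values in the comments come from the numbers quoted above, not from Apple documentation.)

```swift
import AVFoundation

// Sketch: ask for a different configuration, then read back what was granted.
// On visionOS the session reportedly ignores these preferences and pins the
// values to 48 kHz / 480 frames (~10 ms).
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playback, mode: .default)
    try session.setPreferredSampleRate(44_100)        // request is only a hint
    try session.setPreferredIOBufferDuration(0.005)   // 5 ms requested
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}

print("Sample rate: \(session.sampleRate) Hz")               // expected: 48000.0
print("I/O buffer: \(session.ioBufferDuration * 1000) ms")   // expected: ~10 ms (480 frames)
```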

  • edited February 3

    @NeuM said:

    @BiancaNeve said:

    @NeuM said:
    I will absolutely give them credit for getting this out ahead of everyone else. Great job.

    I think Apple probably paid them to develop it.

    And why do you think this? Everyone had access to the same developer tools at the same time, as far as I know. It's possible the biggest developers had access to an AVP headset earlier than others, but that would just be speculation.

    Well, Geert wrote (on LinkedIn if I remember correctly) that Animoog uses features that were not available in the simulator. So either they just released it untested, or they had access to the hardware.

    Seeing how I've often seen NAMM pictures with the Moog guys and a bunch of the other US-based iOS devs having a beer with the Apple guys, it's not out of the question that they were given time with the actual product.

    And why not? We need to see these things to get an idea of what this new platform can do. I love seeing it <3

  • @MoogMusicInc said:

    @celtic_elk said:
    Slightly off-topic, but since y’all have apparently been experimenting with these capabilities: what can you tell us about the music-making experience on the Vision Pro? How well do iPad-native apps work in that environment? Do AUv3 hosts and plugins work as expected?

    All iPad apps we tried worked exactly as expected. There's one annoyance: gaze tracking only works when using native UIKit controls, which can make it difficult to know what you're focusing on at times. Apple is aware of this and is looking at providing a solution in a future visionOS update.

    AUv3 in AUM and Drambo worked fine; GarageBand, Cubasis and Logic are not available. The sample rate is fixed at 48 kHz and the buffer size at 480 frames.

    One cool thing to do is to open multiple iPad synth apps and position them all around you; a standalone sequencer like Xequence then lets you sequence all the instruments while having immediate access to each full UI.

    Hope this helps.

    ❤️ your friends at Moog

    Yes, that’s super-helpful - thank you! I’m fairly certain that we’ll get at least a native version of GarageBand before the year is out. Steinberg would (IMO) be smart to get a version of Cubasis up and running soon - even if it’s just compatibility for the iPad version; being the first mover on a new platform is a big advantage, especially since GB will probably be free.

  • I do find it odd that Logic Pro isn’t available on it. I wonder if they’re planning something special.

  • @brambos said:

    @NeuM said:

    @BiancaNeve said:

    @NeuM said:
    I will absolutely give them credit for getting this out ahead of everyone else. Great job.

    I think Apple probably paid them to develop it.

    And why do you think this? Everyone had access to the same developer tools at the same time, as far as I know. It's possible the biggest developers had access to an AVP headset earlier than others, but that would just be speculation.

    Well, Geert wrote (on LinkedIn if I remember correctly) that Animoog uses features that were not available in the simulator. So either they just released it untested, or they had access to the hardware.

    Seeing how I've often seen NAMM pictures with the Moog guys and a bunch of the other US-based iOS devs having a beer with the Apple guys, it's not out of the question that they were given time with the actual product.

    And why not? We need to see these things to get an idea of what this new platform can do. I love seeing it <3

    One question I have about using a synth in the AVP... since one must LOOK AT whatever they want to manipulate or activate in this environment, how does this affect playing a keyboard? I think it might quickly become an impossible task unless one has a physical hardware keyboard attached.

  • @NeuM said:

    @brambos said:

    @NeuM said:

    @BiancaNeve said:

    @NeuM said:
    I will absolutely give them credit for getting this out ahead of everyone else. Great job.

    I think Apple probably paid them to develop it.

    And why do you think this? Everyone had access to the same developer tools at the same time, as far as I know. It's possible the biggest developers had access to an AVP headset earlier than others, but that would just be speculation.

    Well, Geert wrote (on LinkedIn if I remember correctly) that Animoog uses features that were not available in the simulator. So either they just released it untested, or they had access to the hardware.

    Seeing how I've often seen NAMM pictures with the Moog guys and a bunch of the other US-based iOS devs having a beer with the Apple guys, it's not out of the question that they were given time with the actual product.

    And why not? We need to see these things to get an idea of what this new platform can do. I love seeing it <3

    One question I have about using a synth in the AVP... since one must LOOK AT whatever they want to manipulate or activate in this environment, how does this affect playing a keyboard? I think it might quickly become an impossible task unless one has a physical hardware keyboard attached.

    Optical focus is a primary means of interaction, and it’s the one that everyone is raving about, but it’s not the sole means. You can physically touch virtual elements as well, and you can switch back and forth between those modalities at will - the virtual keyboard supports both look-and-tap and physically pressing keys for text input. You could easily have a virtual keyboard on a physical flat surface in your workspace, and I imagine it would function as well as (and not feel any weirder in use than), say, the capacitive keys on the MicroFreak.

  • @celtic_elk said:

    @NeuM said:

    @brambos said:

    @NeuM said:

    @BiancaNeve said:

    @NeuM said:
    I will absolutely give them credit for getting this out ahead of everyone else. Great job.

    I think Apple probably paid them to develop it.

    And why do you think this? Everyone had access to the same developer tools at the same time, as far as I know. It's possible the biggest developers had access to an AVP headset earlier than others, but that would just be speculation.

    Well, Geert wrote (on LinkedIn if I remember correctly) that Animoog uses features that were not available in the simulator. So either they just released it untested, or they had access to the hardware.

    Seeing how I've often seen NAMM pictures with the Moog guys and a bunch of the other US-based iOS devs having a beer with the Apple guys, it's not out of the question that they were given time with the actual product.

    And why not? We need to see these things to get an idea of what this new platform can do. I love seeing it <3

    One question I have about using a synth in the AVP... since one must LOOK AT whatever they want to manipulate or activate in this environment, how does this affect playing a keyboard? I think it might quickly become an impossible task unless one has a physical hardware keyboard attached.

    Optical focus is a primary means of interaction, and it’s the one that everyone is raving about, but it’s not the sole means. You can physically touch virtual elements as well, and you can switch back and forth between those modalities at will - the virtual keyboard supports both look-and-tap and physically pressing keys for text input. You could easily have a virtual keyboard on a physical flat surface in your workspace, and I imagine it would function as well as (and not feel any weirder in use than), say, the capacitive keys on the MicroFreak.

    Yes, you could have a virtual keyboard, but as soon as your gaze strayed to some other virtual surface or app, your interactions with the non-existent keyboard would stop.

  • @MoogMusicInc said:

    @abf said:

    @wim said:
    I wonder what all the people convinced inMusic would be the death of Moog are thinking these days?

    They are probably wondering if Moog is now purely a software company.

    We're most definitely not; hardware release cycles are simply a lot longer.

    ❤️ your friends at Moog

    Thanks for the reply, that is reassuring. You know we're all worried about you, worried about Moog.

    I am very excited to get on board with Vision Pro, I am impatiently waiting for a lighter weight headset, hopefully the next iteration. The existence of Animoog for Vision Pro is just fantastic. Congratulations!

  • @NeuM said:

    @celtic_elk said:
    Optical focus is a primary means of interaction, and it’s the one that everyone is raving about, but it’s not the sole means. You can physically touch virtual elements as well, and you can switch back and forth between those modalities at will - the virtual keyboard supports both look-and-tap and physically pressing keys for text input. You could easily have a virtual keyboard on a physical flat surface in your workspace, and I imagine it would function as well as (and not feel any weirder in use than), say, the capacitive keys on the MicroFreak.

    Yes, you could have a virtual keyboard, but as soon as your gaze strayed to some other virtual surface or app, your interactions with the non-existent keyboard would stop.

    I’m not certain that’s true. If the device can still locate your hands and tell that they’re intersecting with a virtual object, and if you’re not making an obvious pinch-to-click gesture to activate whatever you happen to currently be looking at, I would think that a virtual control surface would continue to function. Obviously we need more hands-on user reports on these details.

  • @NeuM said:
    Yes, you could have a virtual keyboard, but as soon as your gaze strayed to some other virtual surface or app, your interactions with the non-existent keyboard would stop.

    You only need to focus for the initial pinch; then, if the app is designed for it (like Animoog Galaxy), you can control the focused object by holding the pinch and dragging your hand while your eyes are free to do something else. This can take a little getting used to at first, since so many of us were trained to never look at our instrument while playing it, and now you have to look at your instrument :wink:

    ❤️ your friends at Moog
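(In SwiftUI terms that look-then-pinch-then-drag pattern maps onto an ordinary drag gesture: on visionOS the pinch starts the gesture on whatever view has gaze focus, and hand movement keeps driving it after the eyes move on. Below is a rough sketch of a continuous control handled that way; the knob model, the 300-point sensitivity, and the FilterKnob name are made up for illustration and are not taken from Animoog Galaxy.)

```swift
import SwiftUI

// Rough sketch of a pinch-and-drag controlled parameter on visionOS.
// Gaze selects the view, the pinch begins the DragGesture, and hand
// movement keeps updating it even once your eyes wander elsewhere.
struct FilterKnob: View {
    @State private var cutoff: Double = 0.5            // normalized 0...1
    @State private var valueAtGestureStart: Double = 0.5

    var body: some View {
        Circle()
            .fill(.blue.opacity(0.3 + 0.7 * cutoff))
            .frame(width: 120, height: 120)
            .overlay { Text(String(format: "%.2f", cutoff)) }
            .hoverEffect()                              // gaze highlight on system-backed views
            .gesture(
                DragGesture(minimumDistance: 0)
                    .onChanged { drag in
                        // Vertical hand travel maps to the parameter;
                        // 300 points of travel spans the full range.
                        let delta = -drag.translation.height / 300
                        cutoff = min(max(valueAtGestureStart + delta, 0), 1)
                    }
                    .onEnded { _ in valueAtGestureStart = cutoff }
            )
    }
}
```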

  • edited February 3

    @cyberheater said:
    I do find it odd that Logic Pro isn’t available on it. I wonder if they’re planning something special.

    Must be, because I just tried Drambo for iPad on it and it was nearly a spiritual experience. Ruismaker Noir as well.

    Maybe Logic Pro for visionOS will pop AUv3s into their own windows in the Environment… that seems a no-brainer.

  • @realdawei said:

    @cyberheater said:
    I do find it odd that Logic Pro isn’t available on it. I wonder if they’re planning something special.

    Must be, because I just tried Drambo for iPad on it and it was nearly a spiritual experience. Ruismaker Noir as well.

    Oh wow, my apps actually work on it? Last time I tried in the emulator it was mostly non functioning B)

  • @brambos said:

    @realdawei said:

    @cyberheater said:
    I do find it odd that Logic Pro isn’t available on it. I wonder if they’re planning something special.

    Must be, because I just tried Drambo for iPad on it and it was nearly a spiritual experience. Ruismaker Noir as well.

    Oh wow, my apps actually work on it? Last time I tried in the emulator it was mostly non functioning B)

    Dabbled briefly in Noir and it seemed okay. Will try the others, but I’ve done nothing extensive so far; odds are I’ll find the issues once I figure out how to connect a BTLE controller.

  • Apple Vision Pro costs $3,499.

  • @brambos said:
    Oh wow, my apps actually work on it? Last time I tried in the emulator it was mostly non functioning B)

    It's not an emulator on Vision Pro itself, apps run natively on the M2 with the same APIs in the OS available. Each app gets a floating window as if it's its own iPad. You can use as many as you want provided your CPU can hold up.

  • edited February 4

    @GeertBevin said:

    @brambos said:
    Oh wow, my apps actually work on it? Last time I tried in the emulator it was mostly non functioning B)

    It's not an emulator on Vision Pro itself, apps run natively on the M2 with the same APIs in the OS available. Each app gets a floating window as if it's its own iPad. You can use as many as you want provided your CPU can hold up.

    That's spectacular. Everything I see about this thing makes it extra appealing.

    Yeah, I mean I only tried it in the VP simulator, running via Xcode on my Mac mini, a couple of months back. I couldn't get AUv3 to run on that setup back then.

    If the Vision Pro comes to Europe for less than €4000 I may even consider it. But I fear it will end up even more expensive than that if the typical MacBook markups (compared to US prices) over here are any indication.

  • Let's hope for a headphone jack in v2 of the Vision Pro.

  • @Wrlds2ndBstGeoshredr said:
    I’m not gonna buy these glasses, but it would be great with Mazetools Mutant. That app seems built for this environment, even more so than Animoog.

    +1

  • @israelite said:
    Apple Vision Pro costs $3,499.

    😬 for 256GB

    @GeertBevin said:

    @brambos said:
    Oh wow, my apps actually work on it? Last time I tried in the emulator it was mostly non functioning B)

    It's not an emulator on Vision Pro itself, apps run natively on the M2 with the same APIs in the OS available. Each app gets a floating window as if it's its own iPad. You can use as many as you want provided your CPU can hold up.

    This photo will sell a ton of AVP’s 😂

  • @realdawei said:

    @israelite said:
    Apple Vision Pro costs $3,499.

    😬 for 256GB

    @GeertBevin said:

    @brambos said:
    Oh wow, my apps actually work on it? Last time I tried in the emulator it was mostly non functioning B)

    It's not an emulator on Vision Pro itself, apps run natively on the M2 with the same APIs in the OS available. Each app gets a floating window as if it's its own iPad. You can use as many as you want provided your CPU can hold up.

    This photo will sell a ton of AVP’s 😂

    But we'd need some kind of 'Soundflower / Audio Hijack' type of app to record the audio from all the individual apps to separate audio files.

    Guess the next Logic Pro for iPad update, whenever it drops, will also run on the AVP, but without a proper multi-window UI for the plug-ins it remains a gimmick :sunglasses:

  • edited February 4

    Seems like a virtual Moog Theremin would be an ideal pick for the next Moog app for Apple Vision Pro. I imagine it could be quite realistic, with how good the hand tracking is on this hardware. It’d also be a cool way to honor Bob Moog’s legacy, since his first product was a DIY Theremin kit.

  • @Samu said:

    @realdawei said:

    @israelite said:
    Apple Vision Pro costs $3,499.

    😬 for 256GB

    @GeertBevin said:

    @brambos said:
    Oh wow, my apps actually work on it? Last time I tried in the emulator it was mostly non functioning B)

    It's not an emulator on Vision Pro itself, apps run natively on the M2 with the same APIs in the OS available. Each app gets a floating window as if it's its own iPad. You can use as many as you want provided your CPU can hold up.

    This photo will sell a ton of AVP’s 😂

    But we'd need some kind of 'Soundflower / Audio Hijack' type of app to record the audio from all the individual apps to separate audio files.

    Guess the next Logic Pro for iPad update, whenever it drops, will also run on the AVP, but without a proper multi-window UI for the plug-ins it remains a gimmick :sunglasses:

    Hopefully this is what they have up their sleeve 👑 maybe even a video window for scoring
