@sinosoidal thank you for filling a space I had on iOS, something I've wanted for a long time. Loving it.
As a product manager in my professional life it makes me so happy to see human-centered design fueling great products in my hobbies!
The same thing is happening with a 12.9-inch 2018 iPad Pro.
Same here -
Just one question …
Is there a way to record the video? Or to export it?
We need to investigate. I think we have not been able to reproduce this on any of our devices, but we will check. The fact that we can see the handle to push the dock makes me think that this is just a matter of setting a property. Not sure about it. We will see what we can do.
Right now you need to rely on iOS screen capture. We don't have that feature yet.
But is it planned for a future update?
@sinosoidal The screenshots have piqued my interest.
You've mentioned being able to use video files?
How long can these files be?
Can we set markers on the video clips so that they can be triggered using MIDI?
What's the file length limit?
Are the files imported into VS or are they streamed?
Correct! In the background layer.
We are not quite sure yet. We have only tested with clips shorter than 2 minutes. We should have more info soon.
@Gavinski told us he was having problems with a 300 MB file. We still need to investigate whether this is an isolated issue or a hard limit.
No, but I can already see the utility of doing so. You could set notes to trigger the video at a given point. I think that would be awesome. Anything else that I'm not seeing?
The files are referenced from your drive, but theoretically it could also work from a remote endpoint. We are not supporting that at the moment, though.
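Since those two answers touch on markers triggered by MIDI notes and on files being referenced rather than imported, here is a minimal sketch of how that could look with Apple's AVFoundation. Everything in it (the marker table, handleNoteOn, and the URLs) is a hypothetical illustration, not VS's actual implementation.

```swift
import AVFoundation

// Hedged sketch, not VS internals: AVPlayer accepts local file URLs and remote
// HTTP URLs alike, which is why streaming from a remote endpoint could work in theory.
let localURL = URL(fileURLWithPath: "/path/to/background-clip.mp4")        // referenced from disk
// let remoteURL = URL(string: "https://example.com/background-clip.mp4")! // hypothetical remote endpoint

let player = AVPlayer(url: localURL)

// Hypothetical marker table: MIDI note numbers mapped to positions (in seconds) in the clip.
let markers: [UInt8: Double] = [60: 0.0, 62: 15.5, 64: 42.0]

// On an incoming note-on, jump the background layer's video to the matching marker and play.
func handleNoteOn(note: UInt8) {
    guard let seconds = markers[note] else { return }
    player.seek(to: CMTime(seconds: seconds, preferredTimescale: 600))
    player.play()
}
```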
Thanks
Cool.
Possibly video blending per marker could be quite useful. There are quite a few things that can be done with layers reacting with each other, i.e. difference, alpha channel, screen overlay, etc., standard in any video editor. Automated transitions, for instance fades, layered swipes, etc. Being able to draw in the curves would be .........
Obviously this would be heavy CPU stuff when done live.
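For reference, the blend modes mentioned there (difference, screen overlay, straight alpha) come down to standard per-channel formulas. A rough sketch on normalized 0...1 values, purely illustrative, since any live implementation would run as a GPU shader:

```swift
// Standard per-channel blend formulas on values normalized to 0...1, the same maths
// any video editor uses. Illustrative only, not how VS renders internally.
func screen(_ a: Float, _ b: Float) -> Float {
    1 - (1 - a) * (1 - b)                 // screen overlay: always brightens
}

func difference(_ a: Float, _ b: Float) -> Float {
    abs(a - b)                            // difference: highlights where the layers diverge
}

func alphaOver(top: Float, bottom: Float, alpha: Float) -> Float {
    top * alpha + bottom * (1 - alpha)    // straight alpha compositing
}

// An automated transition such as a fade is just alphaOver with alpha ramped from 0 to 1 over time.
```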
Okay cool, good to know.
I've just seen a video of VS in another thread and I'm very impressed. My rig consists of two iPads, so one could easily be dedicated to visuals, something I've been waiting to do. I had tried STAELLA, but it doesn't have support for external audio interfaces with multiple inputs.
Looks like you're going to have another happy customer by tomorrow. Thank you. This is gonna be so much fun.
@sinosoidal Yes, render to video file, that's great news that it's coming in time. Thanks for being top developers.
First off, I have to say that this is totally cool. I love the idea of visualizing synthesis.
That said, if I am trying to create trippy visuals to go with music, I think there are better options out there.
Not trying to take away from this app, it’s really fantastic, but the visuals seem limited. Faithful, but limited.
I really like the visuals and all the options you have for creative input, but without a really powerful iPad (M1) there's not much you can do. I have just a Beathawk piano and Klevgrand Pipa playing the same notes, via LK, and 5 layers of visuals in VS, and the framerate is 10 fps = choppy. I'm on an Air 3, which I think should be able to handle that at 20-30 fps at least. So I guess it will have to wait until I've saved up for an M1. Also curious about plans to be able to render to video, if there are any?
Cheers!
I really, really hope there's a plan to render video. I know they have said it's planned, but I hope they follow through.
Nice one!
Even the M1 doesn't have the performance I was expecting, but it is much faster.
Try lowering the quality setting (Menu -> Settings) until you meet your FPS target. By default it starts at medium; you might need to lower it to low.
This will always depend on the polyphony being played and, mostly, on the material being used. Some materials are heavier than others.
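To put that advice in concrete terms, the logic is essentially a step-down check: measure the frame rate and, if it falls below what you want, drop the quality one notch. The enum cases and the 30 fps target below are assumptions for illustration, not VS's actual settings API.

```swift
// Sketch of the "lower the quality until you meet the FPS" advice.
// RenderQuality and targetFPS are assumed names/values, not part of VS.
enum RenderQuality: Int {
    case low = 0, medium, high
}

func adjustedQuality(current: RenderQuality, measuredFPS: Double, targetFPS: Double = 30) -> RenderQuality {
    // Heavier materials and higher polyphony push the measured FPS down,
    // so step the quality down one notch whenever the target is missed.
    if measuredFPS < targetFPS, let lower = RenderQuality(rawValue: current.rawValue - 1) {
        return lower
    }
    return current
}

// e.g. adjustedQuality(current: .medium, measuredFPS: 10) == .low
```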
It's planned. It is a matter of time.
Brilliant, thank you very much for such a cool tool !!!
Greetings
This is really interesting, 👏
Can you manipulate existing videos or photos via MIDI? I think that's what @Identor and @lukesleepwalker are mentioning. I'd be a lot more interested in applying level changes, colorizing, blurring and so on to my own picture or video. It'd render more original material.
lotta potential here - I have been wanting a really good visualizer thing - I am getting a lot of crashes though..
So am I supposed to run this as an audio effect after a synth and then also make sure it's receiving MIDI from my keyboard? It seems cool, and I love Imaginando.
Are you using AUM? Can you send us the AUM session that replicates the crashes?
Oat, it could receive MIDI from anything. But yeah, if you're playing the synth with your keyboard, also pipe the keyboard MIDI into the hamburger menu on the left of the VS icon.
@pantsofdeath Yes, also getting occasional problems with crashes in AUM. Sometimes quitting AUM and retrying works. There's definitely some kind of bug that needs squashing.
@sinosoidal You remember I mentioned the whole screen overlay thingies etc, etc, etc??? Yeah, well... I've just had a little look. It's (playing it cool) awesome. 😁
I do have a question though. Is it supposed to have a MIDI input in AUM and Drambo? When I first instantiated it I couldn't play it using my LP X. I closed it and ran it again and it started reacting to MIDI. Which was fun.
I couldn't have asked for more. It's wow. 🙌🏾
@sinosoidal Multi-input would be useful to enable track-specific audio modulations (and triggers).
The documentation mentions multiple visual voices enabling up to four materials per layer. There’s nothing in the documentation that describes how to add additional materials to a layer though and I haven’t been able to empirically derive it either.
What is the up/down arrow at the bottom-left of the materials browser supposed to do?
What are the up/down buttons at the top-right supposed to do? I just caused VS to hang by triple- or quadruple-tapping the down button. (Not reproducible.)
Ah, those buttons aren’t part of the material browser, I really couldn’t tell.
Oh, I’m on an iPad on 14.6 in AUM.
Start with the default patch:
1. Connect your keyboard input to VS.
2. Set the layer TRIGGER MODE to MIDI.
3. Ensure that the TRIGGER channel is set to the same channel as the keyboard input.
4. Press more than one key at the same time. You will observe polyphony.
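A minimal sketch of what those steps boil down to (assumed type and property names, not VS internals): a layer in MIDI trigger mode only reacts to notes on its TRIGGER channel, and each held note gets its own visual voice, which is where the polyphony comes from.

```swift
// Assumed names for illustration only; this is not VS's actual code.
struct NoteOn {
    let channel: UInt8   // 0-15
    let note: UInt8
    let velocity: UInt8
}

final class MIDITriggeredLayer {
    let triggerChannel: UInt8                 // must match the keyboard's channel
    private var heldNotes: Set<UInt8> = []    // one visual voice per held note

    init(triggerChannel: UInt8) {
        self.triggerChannel = triggerChannel
    }

    func handle(_ msg: NoteOn) {
        // Notes arriving on other channels are ignored, hence "ensure TRIGGER channel matches".
        guard msg.channel == triggerChannel, msg.velocity > 0 else { return }
        heldNotes.insert(msg.note)
        print("active visual voices:", heldNotes.count)   // > 1 while several keys are held
    }

    func handleNoteOff(note: UInt8, channel: UInt8) {
        guard channel == triggerChannel else { return }
        heldNotes.remove(note)
    }
}
```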
Oh, I knew that. I thought it meant I could add multiple materials to the same layer, not multiple instances of the same material. D’oh.
If you want multiple materials, assign a new material to another layer!