Comments
@SevenSystems @mjcouche @wim @espiegel123 @MisplacedDevelopment
After this conversation I ended up making my own template using AUM + Xequence, with Audiobus as the glue, hosting Loopy Pro.
So I made this video:
to explain the workflow and template. Links to the templates are in the video description.
Thanks.
About the SampleStart trick with SampleWiz:
Unfortunately, I didn't find any other audio recorder that can do that (respond to the SampleStart MIDI message), so for me this is just something to keep in mind rather than a really smooth solution.
Hey,
just watching your first video and enjoying it a lot (also the improvisations are quite entertaining 😄)
Thanks for all your work! Lots of interesting technical ideas and hacks going on there!
One observation: At 16:52, you're showing the connection from Xequence to AUM using "Xequence Source" -- unless I've missed something, it would be better here to instead select "'AUM' Destination", and then select "AUM" as the MIDI Destination in Xequence. That way, you will get better MIDI timing. ("Xequence Source" is only meant as a last resort for routing Xequence to apps that don't have a virtual MIDI Input).
Oh, and I have no idea how to pronounce "Xequence" myself 🤣 I originally thought of it as "Ksequence" using my German brain, but most proper English speakers actually seem to say "Zequence" (soft 's'). 🎓
Thanks.
Oh, I didn't know it would make any difference to the MIDI timing; I'll have to look into it. The only reason I first chose "Xequence Source" was to make my Xequence projects (with all the instruments pre-loaded) kind of "universal": I can open a Xequence project and make it drive AUM, ApeMatrix, Loopy Pro, or something else without changing each instrument's MIDI destination.
But it's true that, in the end, that won't happen often.
It probably wouldn't matter in your specific case, but for rhythm-sensitive ears it is definitely noticeable when you have e.g. EDM where everything's quantized.
But I get your reasoning behind using Xequence Source.
Watched your second video as well. Thanks for demoing it all... the "Sample Offset Ramp" method is probably the closest one can get to audio tracks right inside Xequence... maybe I should make an AUv3 sample player just for that purpose 😂
Sorry if this is a naive question, but… What is the advantage of this seemingly complex setup of four apps versus just doing the project in a single app like Cubasis?
That’s what I was thinking. This could all be done in Loopy Pro for another example. Need a piano roll? Grab Atom 2 and call it a day.
Sometimes people want the better routing and mixing flexibility of something like AUM over Cubasis. Sometimes they don't like the piano roll and greatly prefer Xequence's. Basically overcoming shortcomings by stringing together the individual components that best fit their needs.
Definitely not for everyone, and probably not worth bothering with unless there's something hindering one's workflow in a single DAW.
Thanks, makes sense now… I should take another look at Xequence - bought it a while back but found it a bit opaque at first experience and left it alone
@zzrwood if you're not the person for manuals (Xequence's manual has a very good "general overview" section), think of Xequence like a traditional MIDI sequencer in the 1990s. That's basically exactly how it works. But if you have any spontaneous questions just drop them here and I'll gladly help out.
Thanks for the pointers - have started digging in and it looks very interesting, particularly piano roll editing…
I didn’t mean any disrespect, when I bought the iPad I bought a swag of software, trying to see if it could replace my Logic Pro workflow. Still haven’t settled with something that totally works for me, but love the portability and immediacy of the iPad.
And now that it has Scaler I am spending more time seeing if I can get it to make sense for me.
Gotta say I do love the community and support here! 👍
Oh, no offense taken at all
I'm obviously genuinely interested in improving the first steps for Xequence users as well, so I'm open to suggestions there.
Have fun exploring the app then. I've heard good things about the piano roll editor from users, yes, but I personally still think what sets Xequence apart is the arranger.
Well, I am not sure any justification or arguments would work for anybody satisfied with Cubasis. It also depends on how you work, how you build your projects, how you start ideas, how much time you have, etc. I guess the main argument is being able to use the apps you like the most, the ones you are comfortable with, for sequencing, mixing, etc.
In the video I showed that all the complexity of such a workflow can be digested by making templates. It is almost like building your own software out of pieces of specialized apps. From the templates it only takes a few clicks to load an instrument, listen to it, and record it. And it is stable. Plus, AUM offers infinite ways of routing things, layering instruments, mixing, etc.
Xequence offers a lot of tools oriented toward productivity; it is quick and efficient. It is then easy to turn exploratory ideas into "something".
That would be great !!
I have kind of mentally thought about this:
A clip recorder/player:
This way one could record an audio clip by setting a MIDI clip with one note within the song timeline, then play it back (with the ramp included).
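The "Sample Offset Ramp" idea above can be sketched in a few lines. This is purely illustrative (the function and variable names are hypothetical, not any app's actual API): an automation ramp rising from 0 to 1 over the clip's duration is mapped to a playback position inside the sample, so playback can be resumed from the correct point anywhere on the timeline.

```python
# Hypothetical sketch of the "Sample Offset Ramp" method.
# Names are illustrative, not any app's actual API.

def ramp_value_at(time_in_clip_sec: float, clip_length_sec: float) -> float:
    """A linear ramp rising from 0 to 1 over the clip's duration (clamped)."""
    return min(max(time_in_clip_sec / clip_length_sec, 0.0), 1.0)

def sample_offset(ramp_value: float, sample_length_sec: float) -> float:
    """Map a 0..1 ramp value to a playback offset (in seconds) into the sample."""
    return ramp_value * sample_length_sec

# If playback starts 2 s into an 8 s clip holding an 8 s sample,
# the player should resume the sample 2 s in:
clip_len = 8.0
offset = sample_offset(ramp_value_at(2.0, clip_len), clip_len)
print(offset)  # 2.0
```

The point of the ramp is that the sample position is derived from the timeline, so starting playback mid-clip still lands on the right part of the audio.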
Easy to say…
That's a neat concept!
Yeah, even when I make light-hearted jokes, they immediately turn into new projects that my users lovingly plan for me 😜
Come on… I already did 0.00001% of the work, you just need to fill the gap
😜
So I did this test:
I am getting some very small jitter, but the jitter is the same for both signals. Maybe there is still a bit of software quantization on Helium's side, but it should be very small.
I'm getting the same result when recording back into Xequence through all my mess in AUM.
@Sylvain interesting tests. However, there should be literally zero jitter when recording Xequence's output through 'AUM'. Maybe Atom 2 doesn't use the incoming MIDI packet timestamps?
I'm not in the loop with regards to AUv3 MIDI recorders... maybe repeat the test with some others and see if any of them manages to record the notes exactly on the grid when using 'AUM' as destination?
Okay, so… it seems that AUM Source/Destination is filtering out the timestamp:
In the figure below I am running FugueMachine in AUM. We are looking at two MIDI monitors: the one on top is connected directly to FugueMachine's output; the second is connected to the same FugueMachine session but passes through AUM Source / AUM Destination.
If I connect the MIDI monitor to Xequence Source I don't see the time either… I'm not sure whether this is the timestamp, but something is definitely being filtered.
Anyway, for my test it was actually a good thing that the timestamp is not used, since I wanted to compare the MIDI sources themselves. The jitter is on the order of 1/50th of a beat (where a beat is a 1/4 note), i.e. about 10 ms at 120 BPM.
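As a quick sanity check on that figure, the arithmetic works out: at 120 BPM one quarter-note beat lasts 0.5 s, and 1/50th of that is exactly 10 ms.

```python
# Checking the quoted jitter figure: 1/50th of a beat at 120 BPM.
bpm = 120
beat_sec = 60.0 / bpm            # one quarter-note beat = 0.5 s
jitter_sec = beat_sec / 50       # 1/50th of a beat
print(jitter_sec * 1000)         # 10.0 (ms)
```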
@Sylvain very interesting. I'm not sure what the "Time:" column means in that MIDI monitor... at least the timestamps I'm sending from Xequence are actually nanosecond timestamps on a "global" timeline, so that's definitely not what you're seeing there. Maybe the monitor is showing the derived offset in samples in the current audio buffer?
Anyway, it seems like this is still the case then -- apps can choose to receive MIDI packets with timestamps in advance from the MIDI destinations they created as soon as the packets are sent by another app, but not from MIDI sources they connect to (this is the more technical explanation of the whole situation).
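The timing difference being discussed can be illustrated with a toy model (this is a conceptual sketch in plain Python, not CoreMIDI): a sender emits packets slightly early with an exact timestamp attached. A receiver that schedules playback by timestamp lands dead on the grid; a receiver that plays packets on arrival inherits the transport jitter.

```python
# Conceptual sketch: why honoring packet timestamps removes jitter.
# Not CoreMIDI -- just a toy model of the two receiving strategies.
import random

random.seed(1)
grid = [i * 0.5 for i in range(4)]   # intended beat times in seconds

# Each packet carries its exact timestamp but arrives 1-3 ms early,
# with some random transport jitter:
arrivals = [(t, t - 0.003 + random.uniform(0, 0.002)) for t in grid]

by_timestamp = [ts for ts, _ in arrivals]   # scheduled by packet timestamp
on_arrival = [arr for _, arr in arrivals]   # played as soon as received

jitter_ts = max(abs(a - g) for a, g in zip(by_timestamp, grid))
jitter_arr = max(abs(a - g) for a, g in zip(on_arrival, grid))
print(jitter_ts, jitter_arr)   # 0.0 vs a few milliseconds
```

In this model, scheduling by timestamp reproduces the grid exactly, while playing on arrival is off by whatever the transport added, which matches the observation that receiving through a connection that strips timestamps reintroduces jitter.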