Comments
Awesome idea!
Excellent, that's what I suspect as well. I'll try to squeeze it in for release.
Yes, AUM listens on changes for this property. A plugin can do something like this:
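(The code that followed this post didn't survive in this excerpt. As a stand-in, here is a minimal sketch of the usual AUv3 pattern being described, assuming the property in question is an ordinary KVO-observable AUAudioUnit property; the subclass and the currentTempo property are hypothetical, since the thread doesn't show which property was meant.)

```swift
import AudioToolbox

final class MyAudioUnit: AUAudioUnit {
    private var _currentTempo: Double = 120

    // Hypothetical stand-in property; hosts such as AUM can key-value-observe
    // KVO-compliant AUAudioUnit properties like this one.
    @objc var currentTempo: Double { _currentTempo }

    // The plugin brackets the update in will/didChangeValue so that any
    // observing host gets notified of the change.
    func setTempo(_ bpm: Double) {
        willChangeValue(forKey: "currentTempo")
        _currentTempo = bpm
        didChangeValue(forKey: "currentTempo")
    }
}
```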
Fantastic!
Really appreciate your interest in our comments and suggestions. This app is really needed and could be a game-changer in the workflows of many of us!
@j_liljedahl Does AUM support AUScheduleMIDIEventBlock? Looks like it isn't null, but it seems to behave like a black hole. If it does work, what sample time does it expect? Does it support AUEventSampleTimeImmediate?

Yes, Rozeta uses both.
Are there any resources on this except for the headers? Does it deliver the MIDI automatically? Or does it schedule an event that triggers internalRenderBlock later?

Wait, my bad: AUScheduleMIDIEvent is only called by hosts, not by plugins. Plugins invoke the MIDIOutput block directly.
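(To make that host/plugin split concrete, here is a rough Swift sketch of who calls which block. Function names and MIDI values are illustrative, and the array-based byte packing is simplified rather than realtime-safe.)

```swift
import AudioToolbox

// Host side: AUScheduleMIDIEventBlock comes *from* the plugin's AUAudioUnit;
// the host calls it to deliver MIDI *into* the plugin.
func hostSendsNoteOn(to unit: AUAudioUnit) {
    guard let schedule = unit.scheduleMIDIEventBlock else { return }
    let noteOn: [UInt8] = [0x90, 60, 100]   // note on, middle C, velocity 100
    noteOn.withUnsafeBufferPointer { buf in
        // AUEventSampleTimeImmediate = "as soon as possible"
        schedule(AUEventSampleTimeImmediate, 0, buf.count, buf.baseAddress!)
    }
}

// Plugin side: MIDI going *out of* the plugin is passed to the
// host-installed AUMIDIOutputEventBlock during the render cycle.
func pluginEmitsNoteOn(from unit: AUAudioUnit, renderTime: AudioTimeStamp) {
    guard let midiOut = unit.midiOutputEventBlock else { return }
    let noteOn: [UInt8] = [0x90, 60, 100]
    noteOn.withUnsafeBufferPointer { buf in
        _ = midiOut(AUEventSampleTime(renderTime.mSampleTime),
                    0, buf.count, buf.baseAddress!)
    }
}
```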
Interesting, that was my approach so far. However, I'm wondering if there's a better way of scheduling (relatively distant) future MIDI events that's not buffer-size dependent. E.g. a buffer size of 1024 samples means an AU can only output MIDI events every so often, which might be too low resolution. I've experimented with longer offsets to the relative timestamp received by AUMIDIOutputEventBlock, but no dice. Apple docs also suggest that AUEventSampleTimeImmediate can only be used with a "small 4096 sample offset".

You can time everything with sample accuracy. Simply use timestamps. But for distant events you'll have to buffer it yourself. No one said it was going to be easy
Sure, it just seems that no matter what timestamp I use to output via AUMIDIOutputEventBlock, all MIDI is still delivered immediately (unless the relative timestamps are very small). I'll experiment some more and slip into your DMs if no luck. Thanks for the info on AUScheduleMIDIEventBlock.

Ah there you go.. the timestamps should be absolute.
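(Concretely, the fix being pointed at might look like the sketch below: add the desired offset to the running sample time from the render call, rather than passing a buffer-relative value on its own. The helper name and values are illustrative.)

```swift
import AudioToolbox

// A bare offset reads as "(almost) now"; anchoring it to mSampleTime
// makes the timestamp absolute, so the host can deliver it in the future.
func emitNoteOff(samplesAhead: AUEventSampleTime,
                 renderTime: AudioTimeStamp,
                 via midiOut: AUMIDIOutputEventBlock) {
    let noteOff: [UInt8] = [0x80, 60, 0]   // note off, middle C
    let when = AUEventSampleTime(renderTime.mSampleTime) + samplesAhead
    noteOff.withUnsafeBufferPointer { buf in
        _ = midiOut(when, 0, buf.count, buf.baseAddress!)
    }
}
```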
You can message me if you’re stuck, but I’m in Tokyo right now and I have a long ass flight coming up, so I may not be able to respond very quickly
No rush, thanks for helping out. Cheers!
Time for a new post in this series by Gene de Lisa?
That's what I do in my AU apps - maintain a FIFO buffer for future events that I can't output in the current render block.
Tricky part is maintaining timestamp order (e.g. MIDI files only have note on + duration - the MIDI note off needs to be futured). Add to this mix transposition, instancing ...
Good thing is this FIFO buffer then gives you a nice timestamp-ordered render.
I don't use AUScheduleMIDIEventBlock myself, preferring to maintain it myself, but AudioKit has an example of it using AUEventSampleTimeImmediate
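(A rough sketch of that FIFO idea, for illustration. Names are made up, and a real render-thread implementation would avoid the allocation and would need lock-free synchronization, which this simplified version ignores.)

```swift
import AudioToolbox

struct PendingEvent {
    var sampleTime: AUEventSampleTime   // absolute sample time when due
    var bytes: [UInt8]                  // raw MIDI, e.g. a "futured" note off
}

final class FutureMIDIQueue {
    private var events: [PendingEvent] = []   // kept sorted by sampleTime

    // Insert while preserving timestamp order (the tricky part above).
    func schedule(_ event: PendingEvent) {
        let i = events.firstIndex { $0.sampleTime > event.sampleTime } ?? events.count
        events.insert(event, at: i)
    }

    // Called each render cycle: emits everything due within this buffer,
    // leaving the rest queued for a later render block.
    func flush(bufferStart: AUEventSampleTime,
               frameCount: AUAudioFrameCount,
               via midiOut: AUMIDIOutputEventBlock) {
        let bufferEnd = bufferStart + AUEventSampleTime(frameCount)
        while let next = events.first, next.sampleTime < bufferEnd {
            events.removeFirst()
            next.bytes.withUnsafeBufferPointer { buf in
                _ = midiOut(next.sampleTime, 0, buf.count, buf.baseAddress!)
            }
        }
    }
}
```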
Am I the only one who finds the devs talking gibberish to each other strangely entertaining?
May I just say that seeing developers (potential competitors) helping each other warms my heart...
Individually we try and make up for Apple's silence. I learnt most of my AU knowledge from Gene de Lisa
It is precisely this, developers ensuring their products are compatible with one another through consistent coding, that makes these tools better for everyone.
http://devnotes.kymatica.com/ios_midi_timestamps
Thanks, I've already implemented all of this with AUMIDIOutputEventBlock. My original question was whether the magical AUScheduleMIDIEventBlock can be used by AUs instead of AUMIDIOutputEventBlock (in combination with AUEventSampleTimeImmediate), for more ergonomic code, but I guess not, based on @brambos's earlier answer.

It is fascinating, and it makes me think: when we all grab the pitchforks and yell 'MOAR FEATURES!!', we tend to forget just how much work goes into even the most mundane of things. Maybe next time we all start getting impatient and demanding features get added to our favorite apps, we should pause and remember that in a lot of cases it's just one person coding all this. We're lucky to get what we get; there have been some outstanding apps on iOS, and we wouldn't have shit without these guys.
P.S. @brambos @j_liljedahl @midiSequencer @blueveek where would you guys recommend starting if anyone were interested in learning to code MIDI/audio apps? Is prior coding experience essential, or could you learn just that specific aspect and pick the rest up along the way?
@Mull Not wanting to discourage you, but realtime MIDI and especially audio programming is among the hardest stuff you'll find in an already quite "extraterrestrial" field (software engineering). So be warned: you'll need a quite good grasp of general programming concepts and data structures, know several programming languages (C, Objective-C, and for efficient code, ARM assembly as icing on the cake), and you'll need to learn all the relevant specialized APIs like Audio Units, CoreMIDI, etc., whose official documentation from Apple ranges from "meh" to "docuWHAT?"
@SevenSystems I would have guessed as much given the complexity of the timings and all that arcane black magic!
@SevenSystems I would have included you on the list of resident dev gurus but didn't think you were on the thread, mate! Love Xequence, it's my daily driver 👍🏻
Thanks, much appreciated. I'm lurking on every thread here with one of my 183 personalities, so be warned
Meh.. I’m too pretty to need a personality 🤣 Btw u still thinking of making Xequence an AU host?
It's still listed as a "want", yeah... especially since the whole mixer user interface is in place anyway (from the development of the Xequence-based DAW), so "only" the actual AU hosting infrastructure would need to be added... don't take it as a promise though...