iOS music-making workflows: what's missing for you?
After a year of using iOS extensively and exclusively for music making, I've almost found my best workflow processes, but I've also reached some limits.
It all comes down to what we want to do musically and how to do it: which apps to use, which workflow suits us best.
My initial artistic approach lies in the interaction between the sounds I make, what I feel from them, and where they lead me as I create the next melodies, rhythms and sound textures to build a full song. I don't have any mental sonic representation of where I'm going musically when I start creating, except for a global idea, and I try to be as spontaneous as possible during the whole process. I truly go where the sounds lead me.
At the first stage I'll use Audiobus, AUM or apeMatrix, various AU or linked, synced IAA apps, and AU FX to shape my drum, bass, lead and keyboard sounds... usually all recorded into Blocswave as a sound receiver, which also helps me create song sections. I usually create some base sounds for the song and make variations of them with MIDI, slicing or FX. What I miss from Ableton Live here is clip envelope automation. The iOS alternative, for me, lies in LFO apps like Rozeta's, KRFT or now apeMatrix. Gadget can be useful too.
Then comes the song-structure stage, which can take several routes. I can trigger everything live directly in AUM with a MIDI controller, iOS apps or external hardware. I can export everything from Blocswave to Launchpad to trigger loops live and build a song structure. Or I can use my KRFT/AUM/Reslice setup to add some movement with live AU FX automation. Everything gets recorded as stereo or multi-channel audio files.
I also need to add my recorded sax parts. I can do this at the Blocswave stage, recording them directly inside it or using Loopy synced over Link. Or I can record my sax solo part over the song-structure stereo file in an AUM file player, or even trigger instrumental loops with Loopy and a pedalboard while playing sax and record the whole performance in AUM. That's often what I do. But if I want to add complex sax sections or solo parts, it becomes more complicated. At this stage I need a DAW timeline: either import the song-structure stereo file into that DAW, or import all the audio parts one by one, usually into GB's loops mode, and build my song structure from there. Time consuming, and it can be a workflow killer. This is where Ableton Live or desktops can save time and inspiration, since you do everything in one single environment without any export/import.
Also, when going the GB route, there aren't a lot of options as an iPhone user, and there will be AU rendering issues. So here I need to use AUM with MIDI Link sync, or apeMatrix with MIDI clock sync, as IAA inputs, mostly for audio recording with FX processing. Another big difference from desktop DAWs here: everything is destructive, you can't go back. I can manage this, but at times it's really nice to have a reliable freezing/rendering system like Ableton Live's. I know all iOS DAWs are affected, in varying proportions, by AU rendering issues at this time.
The last stage where something is truly missing is mix automation. GB only does level automation. Again, Ableton Live is so powerful here, like most desktop DAWs. Same thing for mastering level automation if needed. Adding movement to music is so important.
That's more or less my whole iOS workflow. Sometimes I miss a true DAW, where you can go back and forth between pure sound creation and song structure, audio recording, solo improvisation, fine mixing or sound shaping.
The iOS devices' form factor and modular environment offer great music-creation freedom, but do their software/app limitations also put chains around our musician necks?
And what about you: what do you find missing today in your iOS workflow creation process?
Comments
Really nice thread for MIDI-oriented workflows. Minimalism is the key: mastering a few apps is better than not knowing how to do things with too many.
About Cubasis: can it record a MIDI/audio track to a new audio track in real time, as AUM does? I use Audio Damage plugs a lot and I know there are some rendering issues with them in that host. I myself use MIDI sequencers like Xequence or Gadget, mainly for the quantize options but also for editing. At some stage, though, I almost always render MIDI as audio, for example if I want to do some slicing. I got used to that with Live too, for example when using grain-modulation envelope clip automations. What I will miss in Cubasis, however, is a session/loop mode. I don't know yet whether I'll go the iPad route or the laptop one. GB can be powerful, but I know I'll miss some features too much at some stage; it's not a professional tool.
Mostly rhythmic jams recorded in AUM, then chopped up on the desktop in Reason. Occasionally I’ll record a CPU synth monster (LayR etc.) played live and recorded directly onto the desktop.
So why do you use your desktop and Reason? Could you explain whether there are things you can't do on iOS? Workflow preferences?
Quite a few:
I don’t have enough CPU on the iPad to run apps the way I’d like
I prefer a bigger screen, and using a mouse for audio editing
Though I’m a big fan of Auria, it can’t compete with Reason feature-wise as a DAW
I’ve got better synths, samplers, FX etc. on my desktop
When bandmates are here working on stuff, it’s easier for us to plug into the desktop, and all see what’s going on on the big screen
File management is much easier, and I can burn to CD
Etc.
That’s not to say I’ve stopped loving my iPad, and a lot of the drawbacks are down to running an older Air 2.
It’s not an either/or thing for me, I find both platforms fit nicely into whatever workflow I need for a particular project.
So for example I prefer the sample manglers, drum apps and sequencers on the iPad.
I would not be able to finish a song on iOS currently. I come from a multi-instrumentalist background: I sing and play guitar, bass and keys well; drums, banjo, accordion and a few others okay.
So there's a lot of outboard gear I'm gonna use. Of course, I could record it into the iPad. BUT it's way easier on a desktop, and, unlike straight iOS music making, the portability factor is of course trashed by virtue of the instruments being larger than the iPad.
And I'm not sure I could record a full drum set... I normally use 10-12 tracks.
I also have a ton of outboard gear (rack compressors, preamps, amps, pedals) that I could probably get close to, for the most part, on iOS, but frankly the quality is not there yet for the whole array. It is for a few things, e.g. BIAS amps, which are cross-platform, but iOS doesn't have the quality of saturation tools, compressors or modulation options I'm used to, at least in AU formats that work with my host app.
Nor is there the processing power. I’m hitting 50% CPU on stuff that would be 1% on my desktop.
But I'm actually not thinking negatively about all this. It's a nice retreat for me to see what I can make sound good with a minimalistic approach. A reminder of some of the simple arrangements we did when I was a kid, totally live, in an 8-track reel-to-reel studio, ha.
Nor was it ever my thought that I COULD complete songs on iOS. Rather, I set my sights on the iPad as a live drum machine, synth module and sequencer. And it does all that in a tiny mobile package better than dedicated units (once I got over some learning curve).
Right now I just want AUM to get to 9/4, and also have /8 available for the rest as well. Further would be nice, but with all this Rozeta stuff and other step sequencers it's harder to get it all in sync properly. That said, Rozeta Bassline would really be nice if it went to 18 steps like the others can. It's not so bad chaining 18-step patterns together, but chaining 9-step patterns is kind of awful. Apple needs to rethink their file system too, and that's all I can think of for now. I'm only on iOS now (for music), so I'm not using a desktop or anything like that. Same for Cubasis and GarageBand: give us time signatures!! Also, while I'm at it, a DAW other than GarageBand that can record MPE without some extensive setup.
CPU power and management, time signatures and other advanced features, difficulty finishing complex songs: it seems most of us reach the same limits. But the great mobility of the form factor, refreshing and intuitive unique apps, touchscreen interaction, and minimalist, live approaches inspiring new creativity: this seems to be what we like about iPhone and iPad.
Now, why do some of us seem to want to do everything on iOS, despite limitations that can make everything so challenging, if not impossible to manage, at times? Is it the challenge? The device and its apps themselves? The form factor? Going back and forth between iPad/iPhone and laptop/desktop, which can be tedious too? Ego and/or obstinacy? In my case, a bit of each.