Orchestrating multiple MIDI channels with Atom -- discussion of approach and theoretical advancement
Been exploring using Atom as a persistence layer for a concept I've been working on called 'Algorithmic Orchestration' (a bit of info in this reply: https://forum.audiob.us/discussion/comment/945744/#Comment_945744). This has led me to using Atom to record 14-16 synced MIDI channels of data at a time. This works well, although I'm finding the workflow isn't entirely clear conceptually, requiring either workarounds or highly repetitive actions with a lot of window switching.
I've laid out my own thoughts on how I've approached the methodology here, along with suggestions for how the ability to track, manipulate and play back many interrelated MIDI channels might be enhanced by future functionality. I'd love to hear thoughts, suggestions and experiences from others.
Hopefully this won't come across as a list of complicated feature demands. I'm exploring the conceptual space in a public forum to widen my own understanding of the software through my own and others' perspectives and experience, and hopefully to spread ideas that might take on a life of their own and return to me as future Atom 2 functionality that enhances my workflow!
Also, full disclaimer that I didn't fully read the manual yet, so there's a chance I might state something foolish. Feel free to correct me openly!
Different approaches to this that I can imagine include:
- With a quick scan of the manual, I see that the method for multi-channel MIDI import is to assign each MIDI channel to a different pattern. A conceptually concordant approach, then, might be to allow patterns to be played in parallel as well as in series -- i.e. you toggle on the patterns you want to play and they all play together. This would allow multiple channels to be recorded, edited and played back inside one unit with great flexibility, independence and simplicity.
^^ Something resembling this method isn't really doable at the moment, as I can't see any way to capture, associate or differentiate notes on the timeline by their MIDI input or output channel.
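To make the "patterns in parallel" idea concrete, here's a minimal sketch of what toggling several channel-tagged patterns into simultaneous playback could look like. All names (`Pattern`, `toggle`, `events_at`) are hypothetical illustrations, not Atom 2's actual API.

```python
# Hypothetical sketch: patterns played in parallel rather than in series.
# A pattern carries its own MIDI channel, and every enabled pattern
# contributes events to the same playback beat.

class Pattern:
    def __init__(self, channel, notes):
        self.channel = channel      # MIDI channel this pattern records/outputs on
        self.notes = notes          # list of (beat, pitch) events
        self.enabled = False

    def toggle(self):
        self.enabled = not self.enabled

def events_at(patterns, beat):
    """Merge events from every enabled pattern at a given beat,
    tagged with each pattern's MIDI channel."""
    out = []
    for p in patterns:
        if p.enabled:
            out += [(p.channel, pitch) for b, pitch in p.notes if b == beat]
    return out

strings = Pattern(channel=1, notes=[(0, 60), (1, 64)])
brass   = Pattern(channel=2, notes=[(0, 48)])
strings.toggle()
brass.toggle()
print(events_at([strings, brass], 0))   # [(1, 60), (2, 48)] -- both channels sound together
```

The point of the sketch is that the channel tag lives on the pattern, which is exactly the association the current timeline doesn't seem to expose.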
- Another potential 'modular' approach could be to 'link' transport/timeline controls across multiple Atom 2 units (if this were possible). What I mean is that all the 'linked' units would respond to record/arm/timeline/playhead edits together, working in parallel while receiving, operating on and outputting different channels. This is pretty much the same thing as above, really, just approached more modularly.
^^ You could do this kind of thing manually now, but you'd need to open up every unit individually and make the same minor tweaks to each. This is what I'm doing with a MIDI button mapped to record on all units, to help make sure that all 16 are armed without excessive unit switching. Deleting the same parts across all the channels and retaking them takes a while, but the layers functionality is beautiful here.
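The linked-transport idea above can be sketched as a thin broadcast layer: one control fans each transport edit out to every unit. `AtomInstance` and `LinkedTransport` are stand-ins I've invented for illustration; they don't correspond to anything in Atom 2 itself.

```python
# Hypothetical: one MIDI-mapped control that broadcasts transport edits
# to every linked unit, mimicking the "record mapped to all" workaround.

class AtomInstance:              # stand-in for one Atom 2 unit; not a real API
    def __init__(self, channel):
        self.channel = channel
        self.armed = False
        self.playhead = 0.0

class LinkedTransport:
    """Fans each transport edit out to all units so they stay in lockstep."""
    def __init__(self, instances):
        self.instances = instances

    def arm_all(self, state=True):
        for inst in self.instances:
            inst.armed = state

    def set_playhead(self, beat):
        for inst in self.instances:
            inst.playhead = beat

units = [AtomInstance(ch) for ch in range(1, 17)]   # one unit per MIDI channel
rig = LinkedTransport(units)
rig.arm_all()                                       # one press arms all 16
rig.set_playhead(4.0)
print(all(u.armed and u.playhead == 4.0 for u in units))   # True
```

This is essentially what the MIDI-mapped record button already does for arming; the suggestion is to extend the same broadcast to timeline and playhead edits.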
- Another approach could simply be a version of the workflow that Koala Sampler and the BM3 sequencer use (where each channel lives on a different 'lane' or 'pad' that you 'drill down' into). This works great in many MIDI sequencers but does add complexity to the workflow. Maybe it veers away from Atom 2's design vision too -- I don't know the project well enough to say.
Ok, I'm going to go use the manual now! Awesome software and I'm enjoying exploring it!
Comments
It is deep stuff that you are doing there! It did not click until I watched the videos that you posted in the other thread that you are performing using this system rather than just composing.
I would have said the second option is the one I would go for as Atom is built today, and it looks like that is what you are doing. I read that you have 16 instances, one listening on each channel, and some co-ordination logic which controls the instances. It is the next bit that I have not understood yet. You record some MIDI in, then start playing it, and then want to make the same tweaks to some/all of the instances?
Yeah, nailed it! That's been the central idea: abstracting and automating things (code snippets, higher-order functions, etc.) to the point that you can perform, and even improvise, at the level of the cumulative output of an entire orchestra -- hundreds of years of development and hundreds of hands of work.
In the future I'm hoping to add a few live orchestral musicians and conduct from the interface, so that audiences can experience the blurring of algorithmic and material worlds in real time.
What I'm describing here is that I'm using Atom 2 as a 'persistence layer' (i.e. to capture, preserve and reproduce the cumulative output of an abstraction over a predetermined length).
The notes of each instance would then be entirely independent, as they are now, but it would be a great flow to be able to make 'global' adjustments -- loop in/out locations, total bar length, record-arm and loop status, quantisation, etc. -- which affect all instances simultaneously. As it is, if I want to adjust one of these parameters (which all instances must keep identical or lose sync on reproduction), I need to open all 16 instances one at a time, move the loop start location by the same few beats (as a hypothetical example) in each, then double-check that I actually performed the same action accurately in every instance.
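The "global adjustment" wish above boils down to two operations: apply one edit everywhere, then confirm nothing drifted. A minimal sketch, with instances modelled as plain dicts of the shared parameters (the keys are illustrative, not Atom 2's real parameter names):

```python
# Hypothetical: apply one timeline edit to every instance at once,
# then verify the shared parameters stayed identical across all 16.

def shift_loop_start(instances, beats):
    """The 'global' edit: one action instead of 16 manual ones."""
    for inst in instances:
        inst["loop_start"] += beats

def in_sync(instances, keys=("loop_start", "loop_end", "bar_length")):
    """The double-check: do all units still agree on the shared params?"""
    first = instances[0]
    return all(all(i[k] == first[k] for k in keys) for i in instances)

units = [{"loop_start": 0, "loop_end": 16, "bar_length": 4} for _ in range(16)]
shift_loop_start(units, 2)
print(in_sync(units))   # True -- no per-unit checking needed
```

Done natively, the `in_sync` check would be unnecessary by construction, which is the real win over doing the same edit 16 times by hand.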
Is the modification something you would be happy to do manually if all 16 instances were in the same Atom, or would something like being able to schedule the change of one or more parameters on some quantum get you where you want to go? So, for instance, you could send a "move loop back 1 bar" command to some/all of the 16 and have it occur on the next beat of the timeline?
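The scheduling idea in that question can be sketched as a tiny quantised command queue: an edit is requested mid-beat but only fires on the next grid point. This `Scheduler` is an invented illustration of the concept, not anything that exists in Atom 2.

```python
import math

# Hypothetical: queue a parameter change and fire it on the next beat
# boundary, so edits land in time rather than mid-beat.

class Scheduler:
    def __init__(self):
        self.pending = []            # list of (fire_beat, command)

    def schedule(self, now, command, quantum=1.0):
        fire = math.ceil(now / quantum) * quantum    # next grid point
        if fire == now:
            fire += quantum          # "next" beat, never the current one
        self.pending.append((fire, command))

    def tick(self, beat):
        """Called as the playhead advances; runs any commands now due."""
        due = [c for t, c in self.pending if t <= beat]
        self.pending = [(t, c) for t, c in self.pending if t > beat]
        for c in due:
            c()

state = {"loop_start": 8}            # 1 bar = 4 beats in this example
s = Scheduler()
s.schedule(now=2.3, command=lambda: state.update(loop_start=state["loop_start"] - 4))
s.tick(2.9)                          # not yet: command fires at beat 3
s.tick(3.0)                          # "move loop back 1 bar" lands here
print(state["loop_start"])           # 4
```

Broadcast the same queued command to all 16 instances and they would all make the jump on the same beat, which is exactly the sync guarantee the manual workflow lacks.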
Yeah good point — if the different channels were in the same Atom then it’d still require manual editing in each pattern and probably be much more complicated in the end.
I can’t imagine actually making any changes while the timeline was running, but Beatmaker 3 has an option in the sampler for ‘duplicate change on all layers’, which I think is conceptually close to what we’re discussing.