Comments
@blueveek
Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?
Yup! I've answered this in another comment, but pasting it here again:
This is planned for the immediate-to-medium-term future. The plan is to first release a polished piano roll that everyone can instantly understand, then focus my energy on finishing two additional AUs with more complex feature sets:
MIDI CC output is limited to the 128 values which MIDI provides, but automation values are also readable by hosts via the plugin's parameter values (which are floating point and have much better resolution). Routing can be done either way, using either MIDI or AUv3 parameter values, and is up to the user.
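To make the resolution difference concrete, here's a quick sketch (illustrative only, with a made-up parameter, not the plugin's actual code): the AUv3 parameter is a full-resolution float that the host reads directly, while the same value as a 7-bit CC collapses to one of 128 steps.

```swift
import AudioToolbox

// Hypothetical parameter for illustration only.
let cutoff = AUParameterTree.createParameter(
    withIdentifier: "cutoff",
    name: "Cutoff",
    address: 0,
    min: 20, max: 20_000,
    unit: .hertz,
    unitName: nil,
    flags: [.flag_IsReadable, .flag_IsWritable, .flag_CanRamp],
    valueStrings: nil,
    dependentParameters: nil)
let tree = AUParameterTree.createTree(withChildren: [cutoff])

// Hosts read the parameter as a Float32, at full resolution...
cutoff.value = 1234.56

// ...whereas the same value as a 7-bit CC is quantized to 0...127.
let normalized = (cutoff.value - cutoff.minValue) / (cutoff.maxValue - cutoff.minValue)
let cc7 = UInt8((normalized * 127).rounded())
```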
This looks like it's off to an incredible start. I'm actually surprised this vacuum went unfilled as long as it did, as I'm sure many of us have been looking for something like this for quite a while now.
P.S. This might be unrealistic to expect in any iOS DAWs/sequencers anytime soon (Ableton likely has it patented?), but I'd love to see something like their "MIDI capture" eventually picked up on other platforms. It's so nice to never miss those unexpected little inspired moments that randomly pop up during noodle/practice sessions, since it's constantly recording the 10-minute buffer as long as the DAW is running.
That's great to hear. Having access to the parameter values published to the host will make a world of difference.
@ZenKier I remember the same MIDI capture function in FL Studio, from about 5 years ago when I was still using that primarily.
You should check out the MIDI Recorder app. It can run in the background, capturing all the MIDI flowing through it, and it even has a timer to end the recording after a specific time period.
The only thing is you have to load it up and start it manually, so using it has to become a habit.
MIDI Recorder with E.Piano by Ryouta Kira
https://itunes.apple.com/us/app/midi-recorder-with-e-piano/id1448577506?mt=8
Bumping this: it really needs to be on the front page. Game changer...
I don't think there's any existing host that allows routing parameter readings from one plugin into parameter values of another plugin. That would be cool, given the floating-point resolution. However, I don't think there's any realtime-safe way to read a parameter value from the audio thread, or to get the timestamp corresponding to a reading. There would also be no way to send multiple values during the same render cycle (which can be long with large buffer sizes).
In short, I think using MIDI CC output is the way to go, since it allows multiple timestamped events during the same render cycle. Resolution could be improved by using 14-bit CC (not yet supported in AUM).
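For reference, this is roughly how a 14-bit CC pair is encoded (an illustrative sketch, not AUM code): the value is split into an MSB on CC n and an LSB on CC n+32, giving 16,384 steps instead of 128.

```swift
// Encode a normalized 0...1 value as a 14-bit MIDI CC pair (MSB/LSB).
func fourteenBitCC(channel: UInt8, cc: UInt8, normalized: Double) -> [[UInt8]] {
    let clamped = min(max(normalized, 0.0), 1.0)
    let value = UInt16(clamped * 16383)          // 0...16383
    let msb = UInt8(value >> 7)                  // upper 7 bits
    let lsb = UInt8(value & 0x7F)                // lower 7 bits
    let status: UInt8 = 0xB0 | (channel & 0x0F)  // Control Change status byte
    return [
        [status, cc, msb],                       // MSB on CC n
        [status, cc + 32, lsb]                   // LSB on CC n+32
    ]
}

// e.g. CC 1 / CC 33 on channel 0:
let messages = fourteenBitCC(channel: 0, cc: 1, normalized: 0.5)
```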
That's the way I'd expected things to be. It's much the same problem on the desktop. Only the host itself has access to host automation floating point values.
I thought it was worth asking nonetheless, as I wondered whether iOS's modern audio frameworks had solved the problem in an elegant manner.
14-bit MIDI is only implemented by a few iOS instruments (e.g. the Moog stuff), but it's a pain to set up. Yet again this isn't something that only affects iOS audio. Getting 14-bit MIDI set up on the desktop is a PITA too!
Thank goodness MIDI 2.0 is (comparatively) just around the corner, as it will apparently specify a permanent solution to MIDI CCs being quantized to 128 possible values. This is one of those areas that give DAW-based hosts an unfair advantage: they're the only environment where artists can modulate values at floating-point resolution.
Thanks for updating the thread with more accurate information.
Excellent, thanks for the clarification. I agree about synchronisation concerns.
After release, it would be good to sync up and discuss 14-bit CC values further. I had someone requesting support for it in the automation tool as well, so maybe this is the best way to go (and it's still standards-compliant).
Yay, another shout for hi-res (14-bit) CCs. My BassStation 2 is feeling a little sad at not being able to have its mixer, LFOs and filter automated as smoothly as it could.
@blueveek @j_liljedahl
Having reflected on this, I'd be interested to see how things function with ApeMatrix and AUv3 MIDI FX (its own LFOs aren't limited to 128 CC values when modulating AUv3s, AFAIK). I've been meaning to set up a 6-voice template for Ruismaker Noir in AUM, ApeMatrix, Audiobus 3 and Cubasis to compare the UX differences and modulation capabilities. On paper, Cubasis is just another IAA application that can host AUv3 MIDI FX, so it will be interesting to see how things play out.
My expectation is that Cubasis will win with regard to floating-point modulation resolution, but it wouldn't surprise me if AUv3 MIDI FX is still quantized to 128 values.
I use 14-bit MIDI with hardware synths via a Max for Live device, and with bespoke patch management plugins that facilitate modulation of parameter values at 14-bit resolution. But having owned a couple of Behringer BCR2000s for a good number of years, I've learned the hard way that having a 14-bit-capable MIDI controller doesn't necessarily mean that you'll be able to modulate 14-bit-capable plugin instruments with ease.
@blueveek This looks awesome, exactly what I've been waiting for... MIDI looping inside of AUM or AB with real-time recording and clip launching!
So just to be clear, I'd want to use this with external synths as well. Say I have my Sub 37 connected and set the input and outputs to the external ports... Can I start recording a loop into your plugin by punching in on the beat (like you said, quantized launch), then record notes and knob CCs for as long as I want (let's say 12 bars), then punch out and have it immediately start looping? If so, this is genius.
May I also suggest that there is a way to turn off MIDI input monitoring (if it is even needed) so that external synths can stay in Local On mode.
Really looking forward to this!
That's quite handy! Can't beat the price either
I know it would be another iOS hack, but couldn't you add an extra audio input to any plugin as a modulator, linkable to any AU parameter? That would be high-resolution and sample-accurate.
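Just to illustrate what I mean (a hypothetical sketch, nothing that exists today): the host would reduce each render cycle's modulation input to a normalized value and apply it to the target parameter.

```swift
import Accelerate

// Reduce one render cycle's worth of modulation samples to a control value,
// mapped from the audio range -1...1 to the normalized parameter range 0...1.
func modulationValue(from samples: UnsafePointer<Float>, count: Int) -> Float {
    var mean: Float = 0
    vDSP_meanv(samples, 1, &mean, vDSP_Length(count))
    return (mean + 1) / 2
}
```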
Will it be universal on release? Interested in an iPhone 7 Plus beta tester?
YES! Triggering of a sequencer to advance steps, not just slaved to a master clock, is awesome. It allows for really cool transformations of the same basic series of notes, but in a new rhythm and is a great way to generate variety.
Again, we go to the MIA Master of MIDI, Dr. Piz... Check out his MidiSteps sequencer. You lay out your notes on the piano roll, assign a note to trigger, and every time it gets that note, it advances the sequencer a step. This of course assumes that you treat the piano roll as a fixed grid.
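The idea boils down to something like this (a toy sketch, not MidiSteps' actual code): ignore the clock and advance one step whenever the chosen trigger note arrives.

```swift
// A sequencer that advances one step per received trigger note,
// instead of following a clock.
struct TriggerSequencer {
    var steps: [UInt8]       // MIDI note numbers laid out on the grid
    var triggerNote: UInt8   // the note that advances the sequence
    var index = 0            // current step

    mutating func handle(noteOn note: UInt8) -> UInt8? {
        guard note == triggerNote, !steps.isEmpty else { return nil }
        let out = steps[index]
        index = (index + 1) % steps.count
        return out
    }
}

// e.g. a 4-step bass line advanced by C1 (note 36) hits from a drum track:
var seq = TriggerSequencer(steps: [48, 51, 55, 58], triggerNote: 36)
```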
@MonkeyDrummer
Quantum can do the triggered advance trick too.
I think the Groove clips in Photon do something really similar, but you need the rhythm/dynamics info to be in a MIDI clip to use it. I don’t think Photon can do it with live input from the user hitting a key.
I know that. You know how I know that? Because he put it in after I bugged him enough!
But having it in an AU tool will be handier, as I almost always get MIDI hiccups when I move back and forth between AUM and Quantum. I pretty much only use Quantum stand-alone now, to sequence hardware...
Having the step trigger in an AU, and being able to feed in something like Rozeta Rhythm (multiple tracks all sending the same MIDI note to trigger, but with non-equal lengths, etc.), is cool...
BTW, WTF is your avatar anyway?
Blob sculpin (Psychrolutes phrictus)
Aka Blobfish
https://www.earthtouchnews.com/wtf/wtf/blobfish-might-be-a-gooey-mess-out-of-water-but-check-out-a-living-one-video
Lol, that WAS you. I remember the thread. I think you told me about MidiSteps having this feature at the time you were requesting it be added to Quantum.
If you have Photon, the Groove feature is similar, but you have to work from pre-existing MIDI files. I suppose you could make them on the fly by recording live into Photon, but that's still not entirely real-time. You still have to work within the buffer save/load paradigm.
My Avatar is a Blobfish. A real, deep sea creature.
I laughed when I saw this photo. I thought it looked a lot like that whore-porate asshole Ted Cruz, the Senator from Texas.
I think the fish has more backbone.
Out of water, it's a spitting image of Soros.
They both belong with the fishes...
Oh shit.
Sorry about the politics, ya’ll.
🤭
You're fired.
The way it's set up, "punch in/out" works a little differently than what you described, and I also think you have a very valid use case here. This plugin goes to great lengths to be absolutely synced up with the host: once the host starts playing, the playhead matches; if the host pauses, the playhead remains where the host paused; if the host moves its own playhead (if one exists), this plugin matches it, etc.
All of this is, of course, modulo looping duration (literally). So if the host's playhead is at bar 9 while the piano roll clip's duration is set to 4 bars, then host time is bar 9 and plugin time is bar 1, etc.
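Roughly speaking (a sketch with placeholder names, not the actual implementation), the clip-local playhead is just the host position wrapped by the clip length:

```swift
// Map the host's transport position (in beats) onto a clip of fixed length.
func clipPosition(hostBeats: Double, clipLengthBeats: Double) -> Double {
    let wrapped = hostBeats.truncatingRemainder(dividingBy: clipLengthBeats)
    return wrapped < 0 ? wrapped + clipLengthBeats : wrapped
}

// Host at bar 9 in 4/4 (beat 32), clip length 4 bars (16 beats):
// clipPosition(hostBeats: 32, clipLengthBeats: 16) == 0, i.e. bar 1 of the clip.
```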
Now of course, you can also start playing notes after the loop repeats, which get recorded normally, and which seems to me like it matches your use-case, but please correct me if I'm wrong.
Auto-extending loop duration while recording, then "punching out" to start playing, might be tricky. The question is when to stop recording and start playing. The easiest approach is to listen for some CC which says "I'm done recording, start playing". I think this is a valid approach, and I'm curious to hear your thoughts.
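A minimal sketch of that CC approach (the CC number is an arbitrary placeholder, not anything the plugin defines):

```swift
// Treat a specific, user-assignable Control Change as "I'm done recording".
let stopRecordingCC: UInt8 = 110   // hypothetical placeholder

func handleMIDI(status: UInt8, data1: UInt8, data2: UInt8,
                isRecording: inout Bool) {
    let isControlChange = (status & 0xF0) == 0xB0
    if isRecording, isControlChange, data1 == stopRecordingCC, data2 > 0 {
        isRecording = false        // punch out: the loop starts playing back
    }
}
```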
So thru off?
Universal on release, yes. The UI is responsive at any resolution or screen size.
@j_liljedahl Can AUM be notified to update the name it displays under plugin icons, based on the audioUnitShortName the plugin sends? Does it check it for each instance, and/or can it be updated at any moment? It would be nice to be able to name each instance of this Piano Roll, instead of all of them looking the same on every AUM track.
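For context, audioUnitShortName is a read-only property on AUAudioUnit that a plugin's subclass can override; whether and when AUM re-reads it per instance is exactly the open question. A rough sketch of the plugin side (class and property names are made up):

```swift
import AudioToolbox

// Hypothetical plugin-side subclass: expose a user-editable label through
// the audioUnitShortName getter. Whether a host refreshes its display after
// instantiation is up to the host.
class PianoRollAudioUnit: AUAudioUnit {
    var userLabel = "Piano Roll"            // hypothetical per-instance name

    override var audioUnitShortName: String? {
        return userLabel
    }
}
```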
I think I may have used the terms punch in and punch out incorrectly. I was just confirming that the basic function is to press record and start recording MIDI into the piano roll, then press record again at any time when finished (the loop is quantized to the next bar), and it will begin looping? Essentially that, like Ableton, you don’t have to predefine the loop length.
And yes, if you include an option to disable MIDI thru, then that would allow external instruments to use their own keyboards/pads and avoid double-triggering.
Looping happens automatically, based on a predefined number of bars ("duration") you've set up beforehand. I like the idea of auto-growing this duration, though. Sounds like there should be an option between:
1. Auto-growing the duration by 1 bar when the playhead goes past the predefined duration while recording.
2. Auto-looping back to the start while recording. This allows building up drum patterns and is a common workflow.
How does that sound?
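Roughly, in sketch form (names and structure are placeholders, not the actual implementation):

```swift
// What to do when the record playhead reaches the end of the clip.
enum RecordOverflow {
    case autoGrow   // option 1: extend the clip by one bar and keep going
    case autoLoop   // option 2: wrap to the start and overdub (drum-pattern style)
}

func advance(playheadBeats: Double, clipLengthBeats: inout Double,
             beatsPerBar: Double, mode: RecordOverflow) -> Double {
    guard playheadBeats >= clipLengthBeats else { return playheadBeats }
    switch mode {
    case .autoGrow:
        clipLengthBeats += beatsPerBar   // grow by one bar; don't wrap
        return playheadBeats
    case .autoLoop:
        return playheadBeats.truncatingRemainder(dividingBy: clipLengthBeats)
    }
}
```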
Sorry for butting into the conversation, but it sounds great to me!
If you don’t know the loop length in advance, the default "auto-growing" duration could be just 1 bar, and it would work like Ableton.