AudioVeek Piano Roll MIDI sequencer AU

Comments

  • edited March 2019

    @blueveek said:

    @Samu said:
    Care to tease what the CC editing will look like?
    (From what I can see & read this will replace most of the AUv3 'sequencers' I have on my iPad)

    The look and feel isn't 100% set in stone, so I'll tease it when I'm absolutely sure it'll be what's actually shipped. Maybe in a week or so ;)

    @blueveek

    Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?

  • @nonchai said:
    Thanks for taking the plunge, blueveek! Someone was bound to do this I guess - but congrats for being first on the starting line.

    I think this is a kind of mini game changer. I think Jonatan at Kymatica has both thought about and been pestered to do an AUM II that incorporates MIDI piano-roll - which would make it a DAW - but your plugin fills a gap meanwhile.

    I was about to suggest an option whereby, whenever more than one track has MIDI created by your plugin, all tracks can be saved to one MIDI or proprietary file - but in a sense, merely by using AUv3 State Saving, we have much of what we need.

    Can't wait to try this out and will be glad to be a beta/alpha tester. (I'm already a beta tester for AUM, AB3 and Quantiloop, so it would be good to give it a hammering with these key iOS apps.)

    Yup! I've answered this in another comment, but pasting it here again:

    This is planned for the immediate-to-medium-term future. The plan is to first release a polished piano roll that everyone can instantly understand, then focus my energy on finishing two additional AUs with more complex feature sets:

    • Multi-tracks that mimic classic DAW workflows (multiple clips per track, each clip has midi notes, each clip has access to the piano roll editor etc.)
    • Scenes/clip launcher that mimics Ableton-like workflows (similar to the above, but for clip launcher heads)
  • @jonmoore said:

    @blueveek said:

    @Samu said:
    Care to tease what the CC editing will look like?
    (From what I can see & read this will replace most of the AUv3 'sequencers' I have on my iPad)

    The look and feel isn't 100% set in stone, so I'll tease it when I'm absolutely sure it'll be what's actually shipped. Maybe in a week or so ;)

    @blueveek

    Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?

    MIDI CC output is limited to the 128 values which MIDI provides, but automation values are also readable by hosts via the plugin's parameter values (which are floating point and have much better resolution). Routing can be done either way, using either MIDI or AUv3 parameter values, and is up to the user.
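
    As a concrete illustration of the AUv3-parameter route, here is a minimal host-side sketch in Swift; it is not AUM's or this plugin's actual code, and `instrumentUnit` simply stands in for an already-instantiated AUAudioUnit.

    ```swift
    import AudioToolbox

    // Minimal sketch: a host reading a hosted AUv3's floating-point parameter
    // values instead of 7-bit MIDI CC.
    func observeParameters(of instrumentUnit: AUAudioUnit) -> AUParameterObserverToken? {
        guard let tree = instrumentUnit.parameterTree else { return nil }

        // Current values: AUParameter.value is an AUValue (Float), so the
        // resolution is far finer than MIDI's 128 steps.
        for param in tree.allParameters {
            print("\(param.identifier) = \(param.value) in \(param.minValue)...\(param.maxValue)")
        }

        // Observe changes the plugin publishes. The observer fires on a
        // non-realtime thread, which is part of the sync problem discussed
        // further down the thread.
        return tree.token(byAddingParameterObserver: { address, value in
            print("parameter \(address) -> \(value)")
        })
    }
    ```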

  • This looks like it's off to an incredible start. I'm actually surprised this vacuum went unfilled as long as it did, as I'm sure many of us have been looking for something like this for quite a while now.

    p.s. This might be unrealistic to expect to see in any iOS daws/sequencers anytime soon (Ableton likely has it patented?) but I'd love to see something like their "midi capture" eventually picked up on other platforms. It's so nice to never miss those unexpected little inspired moments that randomly pop up during noodle/practice sessions since it's constantly recording the 10-minute buffer, as long as the daw is running.

  • @blueveek said:

    @jonmoore said:

    @blueveek said:

    @Samu said:
    Care to tease what the CC editing will look like?
    (From what I can see & read this will replace most of the AUv3 'sequencers' I have on my iPad)

    The look and feel isn't 100% set in stone, so I'll tease it when I'm absolutely sure it'll be what's actually shipped. Maybe in a week or so ;)

    @blueveek

    Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?

    MIDI CC output is limited to the 128 values which MIDI provides, but automation values are also readable by hosts via the plugin's parameter values (which are floating point and have much better resolution). Routing can be done either way, using either MIDI or AUv3 parameter values, and is up to the user.

    That's great to hear. Having access to the parameter values published to the host will make a world of difference.

  • @ZenKier said:
    This looks like it's off to an incredible start. I'm actually surprised this vacuum went unfilled as long as it did, as I'm sure many of us have been looking for something like this for quite a while now.

    p.s. This might be unrealistic to expect to see in any iOS daws/sequencers anytime soon (Ableton likely has it patented?) but I'd love to see something like their "midi capture" eventually picked up on other platforms. It's so nice to never miss those unexpected little inspired moments that randomly pop up during noodle/practice sessions since it's constantly recording the 10-minute buffer, as long as the daw is running.

    @ZenKier I remember the same midi capture function in FL Studio, from about 5 years ago when I was still using that primarily.

    You should check out the Midi Recorder app. It can run in the BG, capturing all the Midi flowing through it, and it even has a timer to end the recording after a specific time period.
    The only thing is you have to load it up and start it manually, so using it has to become like a habit.

    MIDI Recorder with E.Piano by Ryouta Kira
    https://itunes.apple.com/us/app/midi-recorder-with-e-piano/id1448577506?mt=8

  • Bumping. This really needs to be on the front page. Game changer...

  • @blueveek said:

    @jonmoore said:

    @blueveek said:

    @Samu said:
    Care to tease what the CC editing will look like?
    (From what I can see & read this will replace most of the AUv3 'sequencers' I have on my iPad)

    The look and feel isn't 100% set in stone, so I'll tease it when I'm absolutely sure it'll be what's actually shipped. Maybe in a week or so ;)

    @blueveek

    Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?

    MIDI CC output is limited to the 128 values which MIDI provides, but automation values are also readable by hosts via the plugin's parameter values (which are floating point and have much better resolution). Routing can be done either way, using either MIDI or AUv3 parameter values, and is up to the user.

    I don't think there's any existing host that allows routing parameter-readings from one plugin to setting parameter values of another plugin. That would be cool, due to the floating point resolution. However, I don't think there's any realtime safe way to read a parameter value from the audio thread, and/or getting the timestamp corresponding to a reading. Also there would be no way to send multiple values during the same render cycle (which can be long with large buffer sizes).

    In short, I think using MIDI CC output is the way to go, since it allows multiple timestamped events during the same render cycle. Resolution could be improved by using 14-bit CC (not supported in AUM, yet).
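
    For reference, here is a minimal Swift sketch (not the plugin's actual code) of the 14-bit CC route: one value emitted as a standard MSB/LSB controller pair through the AUv3 MIDI output block, sample-timestamped so several values can land inside a single render cycle.

    ```swift
    import AudioToolbox

    // Hypothetical helper: send one 14-bit CC value as CC n (MSB) + CC n+32 (LSB).
    // `outputBlock` is the AUMIDIOutputEventBlock the host installs on the AU.
    func send14BitCC(_ normalized: Float,
                     controller: UInt8,            // 0...31, per the MIDI 1.0 spec
                     channel: UInt8,
                     at sampleTime: AUEventSampleTime,
                     via outputBlock: AUMIDIOutputEventBlock) {
        let value14 = UInt16((max(0, min(1, normalized)) * 16383).rounded())  // 0...16383
        let msb = UInt8(value14 >> 7)      // coarse 7 bits
        let lsb = UInt8(value14 & 0x7F)    // fine 7 bits

        let status: UInt8 = 0xB0 | (channel & 0x0F)
        var msbEvent: [UInt8] = [status, controller, msb]
        var lsbEvent: [UInt8] = [status, controller + 32, lsb]

        // Two 3-byte events at the same sample offset; a receiver that understands
        // 14-bit CC recombines them as value = MSB * 128 + LSB.
        _ = outputBlock(sampleTime, 0, 3, &msbEvent)
        _ = outputBlock(sampleTime, 0, 3, &lsbEvent)
    }
    ```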

  • @j_liljedahl said:
    In short, I think using MIDI CC output is the way to go, since it allows multiple timestamped events during the same render cycle. Resolution could be improved by using 14-bit CC (not supported in AUM, yet).

    That's the way I'd expected things to be. It's much the same problem on the desktop. Only the host itself has access to host automation floating point values.

    I thought it was worth asking nonetheless as I wondered whether the modern frameworks of iOS Audio had answered the problem in an elegant manner.

    14-bit MIDI is only implemented by a few iOS instruments (e.g. the Moog stuff), but it's a pain to set up. Yet again this isn't something that only affects iOS audio. Getting 14-bit MIDI set up on the desktop is a PITA too!

    Thank goodness MIDI 2.0 is (comparatively) just around the corner, as this (apparently) will specify a permanent solution to MIDI CCs being quantized to 128 possible values. This is one of those areas that give DAW-based hosts an unfair advantage - they're the only environment where artists can modulate values at floating-point rates.

    Thanks for updating the thread with more accurate information.

  • edited March 2019

    @blueveek said:

    @Peblin said:

    @lukesleepwalker said:

    @Hansson said:
    you can basically build your own modular DAW with AUM/APE and all goodies like this one

    THE FUTURE IS NOW!

    Almost :)
    Now all we need are patch cables in AUM to connect AUv3 parameters between AUs... then we're modular for realz!

    This is something I'd really, really like as well. One of the things I'm hoping to polish enough for release is an automation editor (mentioned this in my first post as well). This automation editor can not just change things like velocity and aftertouch, but also draw automation for any other CC message, which the plugin can then output. Being able to easily connect those CC messages to other AUv3 parameters in other plugins would be excellent.

    Hoping @j_liljedahl can help us out here :)
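
    To make the CC-drawing idea concrete, here is a hypothetical sketch of turning a drawn automation lane into 7-bit CC values. All names are illustrative rather than the plugin's API, and the 0.25-beat step is an arbitrary send rate.

    ```swift
    // Breakpoints map a beat position to a normalized 0...1 value.
    struct Breakpoint { var beat: Double; var value: Double }

    // Sample the lane at a fixed beat step and quantize to 7-bit CC values.
    func ccValues(for lane: [Breakpoint],
                  upToBeat end: Double,
                  stepBeats: Double = 0.25) -> [(beat: Double, cc: UInt8)] {
        let points = lane.sorted { $0.beat < $1.beat }
        guard !points.isEmpty else { return [] }
        var out: [(beat: Double, cc: UInt8)] = []
        var beat = 0.0
        while beat <= end {
            // Linear interpolation between the surrounding breakpoints.
            let before = points.last { $0.beat <= beat } ?? points.first!
            let after = points.first { $0.beat >= beat } ?? points.last!
            let span = after.beat - before.beat
            let t = span > 0 ? (beat - before.beat) / span : 0
            let v = before.value + (after.value - before.value) * t
            out.append((beat: beat, cc: UInt8(min(127, max(0, (v * 127).rounded())))))
            beat += stepBeats
        }
        return out
    }
    ```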

    @j_liljedahl said:

    @blueveek said:

    @jonmoore said:

    @blueveek said:

    @Samu said:
    Care to tease what the CC editing will look like?
    (From what I can see & read this will replace most of the AUv3 'sequencers' I have on my iPad)

    The look and feel isn't 100% set in stone, so I'll tease it when I'm absolutely sure it'll be what's actually shipped. Maybe in a week or so ;)

    @blueveek

    Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?

    MIDI CC output is limited to the 128 values which MIDI provides, but automation values are also readable by hosts via the plugin's parameter values (which are floating point and have much better resolution). Routing can be done either way, using either MIDI or AUv3 parameter values, and is up to the user.

    I don't think there's any existing host that allows routing parameter-readings from one plugin to setting parameter values of another plugin. That would be cool, due to the floating point resolution. However, I don't think there's any realtime safe way to read a parameter value from the audio thread, and/or getting the timestamp corresponding to a reading. Also there would be no way to send multiple values during the same render cycle (which can be long with large buffer sizes).

    In short, I think using MIDI CC output is the way to go, since it allows multiple timestamped events during the same render cycle. Resolution could be improved by using 14-bit CC (not supported in AUM, yet).

    Excellent, thanks for the clarification. I agree about synchronisation concerns.

    After release, it would be good to sync up / discuss more about 14-bit CC values. I had someone requesting support for it in the automation tool as well, so maybe this is the best way to go (which is still standards compliant).

  • @j_liljedahl said:

    @blueveek said:

    @jonmoore said:

    @blueveek said:

    @Samu said:
    Care to tease what the CC editing will look like?
    (From what I can see & read this will replace most of the AUv3 'sequencers' I have on my iPad)

    The look and feel isn't 100% set in stone, so I'll tease it when I'm absolutely sure it'll be what's actually shipped. Maybe in a week or so ;)

    @blueveek

    Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?

    MIDI CC output is limited to the 128 values which MIDI provides, but automation values are also readable by hosts via the plugin's parameter values (which are floating point and have much better resolution). Routing can be done either way, using either MIDI or AUv3 parameter values, and is up to the user.

    I don't think there's any existing host that allows routing parameter-readings from one plugin to setting parameter values of another plugin. That would be cool, due to the floating point resolution. However, I don't think there's any realtime safe way to read a parameter value from the audio thread, and/or getting the timestamp corresponding to a reading. Also there would be no way to send multiple values during the same render cycle (which can be long with large buffer sizes).

    In short, I think using MIDI CC output is the way to go, since it allows multiple timestamped events during the same render cycle. Resolution could be improved by using 14-bit CC (not supported in AUM, yet).

    Yay, another shout for hi-res (14-bit) CCs - my BassStation 2 is feeling a little sad at not being able to have its mixer, LFOs and filter automated as smoothly as it could ;)

  • @blueveek @j_liljedahl

    Having reflected on this, I'd be interested to see how things function with ApeMatrix and AUv3 MIDI FX (its own LFOs aren't limited to 128 CC values when modulating AUv3s, AFAIK). I've been meaning to set up a 6-voice template for Ruismaker Noir in AUM, ApeMatrix, Audiobus 3 and Cubasis to compare the UX differences and modulation capabilities. On paper, Cubasis is just another IAA application that can host AUv3 MIDI FX, so it will be interesting to see how things play out.

    My expectations are that Cubasis will win with regards to floating point modulation resolution but it wouldn't surprise me if AUv3 MIDI FX is still quantized to 128 values.

    I use 14-bit MIDI with hardware synths via a Max for Live device, and with bespoke patch management plugins that facilitate modulation of parameter values at 14-bit resolution. But having owned a couple of Behringer BCR2000's for a good number of years, I've learned the hard way that having a 14-bit capable MIDI controller doesn't necessarily mean that you'll be able to modulate 14-bit capable plugin instruments with ease. :)

  • @blueveek This looks awesome, exactly what I've been waiting for... MIDI looping inside of AUM or AB with real-time recording and clip launching!

    So just to be clear, I'd want to use this also with external synths. So say I have my Sub 37 connected and set the input and outputs to the external ports... Can I start recording a loop into your plugin by punching in on the beat (like you said quantized launch) then record notes and knob CCs as long as I want (let's say 12 bars) then punch out and have it immediately start looping? If so this is genius.

    May I also suggest that there is a way to turn off MIDI input monitoring (if it is even needed) so that external synths can stay in Local On mode.

    Really looking forward to this!

  • @CracklePot said:

    @ZenKier said:
    This looks like it's off to an incredible start. I'm actually surprised this vacuum went unfilled as long as it did, as I'm sure many of us have been looking for something like this for quite a while now.

    p.s. This might be unrealistic to expect to see in any iOS daws/sequencers anytime soon (Ableton likely has it patented?) but I'd love to see something like their "midi capture" eventually picked up on other platforms. It's so nice to never miss those unexpected little inspired moments that randomly pop up during noodle/practice sessions since it's constantly recording the 10-minute buffer, as long as the daw is running.

    @ZenKier I remember the same midi capture function in FL Studio, from about 5 years ago when I was still using that primarily.

    You should check out the Midi Recorder app. It can run in the BG, capturing all the Midi flowing through it, and it even has a timer to end the recording after a specific time period.
    The only thing is you have to load it up and start it manually, so using it has to become like a habit.

    MIDI Recorder with E.Piano by Ryouta Kira
    https://itunes.apple.com/us/app/midi-recorder-with-e-piano/id1448577506?mt=8

    That's quite handy! Can't beat the price either ;)

  • @j_liljedahl said:

    @blueveek said:

    @jonmoore said:

    @blueveek said:

    @Samu said:
    Care to tease what the CC editing will look like?
    (From what I can see & read this will replace most of the AUv3 'sequencers' I have on my iPad)

    The look and feel isn't 100% set in stone, so I'll tease it when I'm absolutely sure it'll be what's actually shipped. Maybe in a week or so ;)

    @blueveek

    Out of interest, is there any way that you can hook directly into AUv3 floating point parameter automation values or will it be treated as 128 MIDI CC values?

    MIDI CC output is limited to the 128 values which MIDI provides, but automation values are also readable by hosts via the plugin's parameter values (which are floating point and have much better resolution). Routing can be done either way, using either MIDI or AUv3 parameter values, and is up to the user.

    I don't think there's any existing host that allows routing parameter-readings from one plugin to setting parameter values of another plugin. That would be cool, due to the floating point resolution. However, I don't think there's any realtime safe way to read a parameter value from the audio thread, and/or getting the timestamp corresponding to a reading. Also there would be no way to send multiple values during the same render cycle (which can be long with large buffer sizes).

    In short, I think using MIDI CC output is the way to go, since it allows multiple timestamped events during the same render cycle. Resolution could be improved by using 14-bit CC (not supported in AUM, yet).

    I know it would be another iOS hack, but couldn't you add an extra audio input to any plugin as a modulator linkable to any AU parameter? That would be high-resolution and sample-accurate.

  • Will it be universal on release? Interested in an iPhone 7 Plus beta tester?

  • @lukesleepwalker said:

    @blueveek said:

    @lukesleepwalker said:
    @blueveek Did you say that the user can trigger the sequence without engaging the transport via MIDI input?

    I have a separate AU which is responsible with these kinds of triggers, yes. Planning to release it more or less at the same time as this Piano Roll (they're intended as a bundle).

    At the moment it requires the host transport to play in order to actually keep things in sync. Do you have some other behaviour in mind?

    I like to take some sequences off the grid to create interesting poly rhythms by manually playing them with a controller. There are other available tools I can use for this but I like the look of that piano roll! Icing rather than cake...

    YES! Triggering of a sequencer to advance steps, not just slaved to a master clock, is awesome. It allows for really cool transformations of the same basic series of notes, but in a new rhythm and is a great way to generate variety.

    Again, we go to the MIA Master of MIDI, Dr. Piz... Check out his Midisteps sequencer. You lay out your notes on the piano roll, assign a note to trigger, and every time it gets that note, it advances the sequencer a step. This of course assumes that you treat the piano roll as a fixed grid.
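
    For anyone curious, the trigger-advance idea boils down to something like this hypothetical Swift sketch (not MidiSteps' or Quantum's actual code; all names are illustrative):

    ```swift
    // A fixed grid of notes that advances one step per incoming trigger note,
    // in whatever rhythm the player taps it.
    struct TriggerAdvanceSequencer {
        var steps: [UInt8]        // MIDI note numbers laid out on the "piano roll"
        var triggerNote: UInt8    // the incoming note that advances the sequence
        var position = 0

        // Call for every incoming note-on; returns the note to play, if any.
        mutating func handleNoteOn(_ note: UInt8) -> UInt8? {
            guard note == triggerNote, !steps.isEmpty else { return nil }
            let out = steps[position]
            position = (position + 1) % steps.count   // wrap around the fixed grid
            return out
        }
    }

    // Usage: every received C1 (note 36) plays the next step of a C minor arpeggio.
    var seq = TriggerAdvanceSequencer(steps: [60, 63, 67, 72], triggerNote: 36)
    _ = seq.handleNoteOn(36)   // 60
    _ = seq.handleNoteOn(36)   // 63
    ```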

  • @MonkeyDrummer
    Quantum can do the triggered advance trick too.
    I think the Groove clips in Photon do something really similar, but you need the rhythm/dynamics info to be in a Midi clip to use it. I don’t think Photon can do it with live input from the user hitting a key.

  • @CracklePot said:
    @MonkeyDrummer
    Quantum can do the triggered advance trick too.
    I think the Groove clips in Photon do something really similar, but you need the rhythm/dynamics info to be in a Midi clip to use it. I don’t think Photon can do it with live input from the user hitting a key.

    I know that. You know how I know that? Because he put it in after I bugged him enough! :)

    But having it in an AU tool will be more handy, as I almost always get midi hiccups when I move back and forth between AUM and Quantum. I pretty much only use Quantum as a stand-alone now to seq hardware...

    Having the step trigger in an AU, and being able to feed in something like Rozetta Rhythm (multiple tracks all sending the same MIDI note to trigger, but having non-equal lengths, etc.) is cool...

    BTW, WTF is your avatar anyway?

  • @MonkeyDrummer said:

    @CracklePot said:
    @MonkeyDrummer
    Quantum can do the triggered advance trick too.
    I think the Groove clips in Photon do something really similar, but you need the rhythm/dynamics info to be in a Midi clip to use it. I don’t think Photon can do it with live input from the user hitting a key.

    I know that. You know how I know that? Because he put it in after I bugged him enough! :)

    But having it in an AU tool will be more handy, as I almost always get midi hiccups when I move back and forth between AUM and Quantum. I pretty much only use Quantum as a stand-alone now to seq hardware...

    Having the step trigger in an au, and being able to feed in something like Rozetta Rhythm (multiple tracks all sending same midi note to trigger, but having non-equal lengths, etc. is cool...).

    BTW, WTF is your avatar anyway?

    Lol, that WAS you. I remember the thread. I think you told me about MidiSteps having this feature at the time you were requesting it be added to Quantum.
    If you have Photon, the Groove feature is similar, but you have to work from pre-existing Midi Files, or I suppose you could make them on the fly by recording live into Photon, but still not entirely real-time. You still have to work within the buffer save/load paradigm.

    My Avatar is a Blobfish. A real, deep sea creature.
    I laughed when I saw this photo. I thought it looked a lot like that whore-porate asshole Ted Cruz, the Senator from Texas.

  • @CracklePot said:

    @MonkeyDrummer said:

    @CracklePot said:
    @MonkeyDrummer
    Quantum can do the triggered advance trick too.
    I think the Groove clips in Photon do something really similar, but you need the rhythm/dynamics info to be in a Midi clip to use it. I don’t think Photon can do it with live input from the user hitting a key.

    I know that. You know how I know that? Because he put it in after I bugged him enough! :)

    But having it in an AU tool will be more handy, as I almost always get midi hiccups when I move back and forth between AUM and Quantum. I pretty much only use Quantum as a stand-alone now to seq hardware...

    Having the step trigger in an au, and being able to feed in something like Rozetta Rhythm (multiple tracks all sending same midi note to trigger, but having non-equal lengths, etc. is cool...).

    BTW, WTF is your avatar anyway?

    Lol, that WAS you. I remember the thread. I think you told me about MidiSteps having this feature at the time you were requesting it be added to Quantum.
    If you have Photon, the Groove feature is similar, but you have to work from pre-existing Midi Files, or I suppose you could make them on the fly by recording live into Photon, but still not entirely real-time. You still have to work within the buffer save/load paradigm.

    My Avatar is a Blobfish. A real, deep sea creature.
    I laughed when I saw this photo. I thought it looked a lot like that whore-porate asshole Ted Cruz, the Senator from Texas.

    I think the fish has more backbone.

    Out of water, it's a spitting image of Soros.

    They both belong with the fishes...

  • Oh shit.
    Sorry about the politics, ya’ll.
    🤭

  • @CracklePot said:
    Oh shit.
    Sorry about the politics, ya’ll.
    🤭

    You're fired.

  • edited March 2019

    @soundshaper said:
    @blueveek This looks awesome, exactly what I've been waiting for... MIDI looping inside of AUM or AB with real-time recording and clip launching!

    So just to be clear, I'd want to use this also with external synths. So say I have my Sub 37 connected and set the input and outputs to the external ports... Can I start recording a loop into your plugin by punching in on the beat (like you said quantized launch) then record notes and knob CCs as long as I want (let's say 12 bars) then punch out and have it immediately start looping? If so this is genius.

    The way it's set up, "punch in/out" works a little differently than what you described, and I also think you have a very valid use-case listed here. This plugin goes to great lengths to be absolutely synced up with the host, so once the host starts playing, the playhead matches. If the host pauses, playhead remains where the host is paused etc. If the host moves its own playhead (if one exists), this plugin matches it etc.

    All of this is, of course, modulo looping duration (literally). So if the host's playhead is at bar 9, while the piano roll clip's duration is set to 4 bars, then: host time is bar 9, plugin time is bar 1 etc.
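
    As a worked illustration of that modulo mapping (assumed names, in Swift):

    ```swift
    // The clip's local playhead is the host position wrapped to the loop length.
    func loopLocalBeat(hostBeat: Double, loopLengthBeats: Double) -> Double {
        hostBeat.truncatingRemainder(dividingBy: loopLengthBeats)
    }

    // Host at bar 9 in 4/4 (32 beats elapsed), clip duration 4 bars (16 beats):
    // 32 mod 16 = 0, i.e. the clip is back at the start of bar 1, as in the example.
    let local = loopLocalBeat(hostBeat: 32, loopLengthBeats: 16)   // 0.0
    ```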

    Now of course, you can also start playing notes after the loop repeats, which get recorded normally, and which seems to me like it matches your use-case, but please correct me if I'm wrong.

    Auto-extending loop duration while recording, then "punching out" to start playing, might be tricky. The question is when to stop recording and start playing. The easiest approach is to listen for a CC that says "I'm done recording, start playing". I think this is a valid approach, and I'm curious to know your thoughts.

    @soundshaper said:
    May I also suggest that there is a way to turn off MIDI input monitoring (if it is even needed) so that external synths can stay in Local On mode.

    Really looking forward to this!

    So thru off?

  • @Janosax said:
    Will it be universal on release? Interested by iPhone 7 Plus beta tester?

    Universal on release, yes. The UI is responsive at any resolution or screen size.

  • edited March 2019

    @j_liljedahl Can AUM be notified to update the name it displays under plugin icons, based on the audioUnitShortName the plugin sends? Does it check it for each instance, and/or can it be updated at any moment?

    It would be nice to be able to name each instance of this Piano Roll, instead of all of them looking the same on every AUM track.
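
    For reference, audioUnitShortName is a read-only AUAudioUnit property that a subclass can override, so the plugin side might look roughly like this sketch (class and property names are illustrative; whether and when a host re-reads the value per instance is exactly the open question above):

    ```swift
    import AudioToolbox

    // Hypothetical plugin-side sketch: exposing a per-instance display name.
    class PianoRollAudioUnit: AUAudioUnit {
        var userVisibleName = "Piano Roll"   // e.g. set from the plugin's UI or saved state

        override var audioUnitShortName: String? {
            userVisibleName
        }
    }
    ```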

  • @blueveek said:

    @soundshaper said:
    @blueveek This looks awesome, exactly what I've been waiting for... MIDI looping inside of AUM or AB with real-time recording and clip launching!

    So just to be clear, I'd want to use this also with external synths. So say I have my Sub 37 connected and set the input and outputs to the external ports... Can I start recording a loop into your plugin by punching in on the beat (like you said quantized launch) then record notes and knob CCs as long as I want (let's say 12 bars) then punch out and have it immediately start looping? If so this is genius.

    The way it's set up, "punch in/out" works a little differently than what you described, and I also think you have a very valid use-case listed here. This plugin goes to great lengths to be absolutely synced up with the host, so once the host starts playing, the playhead matches. If the host pauses, playhead remains where the host is paused etc. If the host moves its own playhead (if one exists), this plugin matches it etc.

    All of this is, of course, modulo looping duration (literally). So if the host's playhead is at bar 9, while the piano roll clip's duration is set to 4 bars, then: host time is bar 9, plugin time is bar 1 etc.

    Now of course, you can also start playing notes after the loop repeats, which get recorded normally, and which seems to me like it matches your use-case, but please correct me if I'm wrong.

    Auto-extending loop duration while recording, then "punching out" to start playing, might be tricky. The question is when to stop recording and start playing. The easiest approach is to listen for a CC that says "I'm done recording, start playing". I think this is a valid approach, and I'm curious to know your thoughts.

    @soundshaper said:
    May I also suggest that there is a way to turn off MIDI input monitoring (if it is even needed) so that external synths can stay in Local On mode.

    Really looking forward to this!

    So thru off?

    I think I may have used the terms punch in and punch out incorrectly. I was just confirming the basic function is to press record and start recording MIDI into the piano roll, then press record again at any time when finished (the loop is quantized to the next bar), and it will begin looping? Essentially that, like Ableton, you don’t have to predefine the loop length.

    And yes, if you include an option to disable MIDI thru, then that would allow external instruments to use their own keyboards/pads and avoid double-triggering.

  • edited March 2019

    @soundshaper said:

    @blueveek said:

    @soundshaper said:
    @blueveek This looks awesome, exactly what I've been waiting for... MIDI looping inside of AUM or AB with real-time recording and clip launching!

    So just to be clear, I'd want to use this also with external synths. So say I have my Sub 37 connected and set the input and outputs to the external ports... Can I start recording a loop into your plugin by punching in on the beat (like you said quantized launch) then record notes and knob CCs as long as I want (let's say 12 bars) then punch out and have it immediately start looping? If so this is genius.

    The way it's set up, "punch in/out" works a little differently than what you described, and I also think you have a very valid use-case listed here. This plugin goes to great lengths to be absolutely synced up with the host, so once the host starts playing, the playhead matches. If the host pauses, playhead remains where the host is paused etc. If the host moves its own playhead (if one exists), this plugin matches it etc.

    All of this is, of course, modulo looping duration (literally). So if the host's playhead is at bar 9, while the piano roll clip's duration is set to 4 bars, then: host time is bar 9, plugin time is bar 1 etc.

    Now of course, you can also start playing notes after the loop repeats, which get recorded normally, and which seems to me like it matches your use-case, but please correct me if I'm wrong.

    Auto-extending loop duration while recording, then "punching out" to start playing, might be tricky. The question is when to stop recording and start playing. The easiest approach is to listen for a CC that says "I'm done recording, start playing". I think this is a valid approach, and I'm curious to know your thoughts.

    @soundshaper said:
    May I also suggest that there is a way to turn off MIDI input monitoring (if it is even needed) so that external synths can stay in Local On mode.

    Really looking forward to this!

    So thru off?

    I think I may have used the terms punch in and punch out incorrectly. I was just confirming the basic function is to press record and start recording MIDI into the piano roll, then press record again at any time when finished (the loop is quantized to the next bar), and it will begin looping? Essentially that, like Ableton, you don’t have to predefine the loop length.

    Looping happens automatically, based on a predefined number of bars ("duration") you've set up beforehand. I like the idea of auto-growing this duration, though. Sounds like there should be an option between:
    1. Auto-growing the duration by 1 bar when recording and the playhead goes past the predefined duration.
    2. Auto-looping back to the start while recording. This allows building up drum patterns and is a common workflow.

    How does that sound?
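
    To make the two options concrete, here is a hypothetical sketch (names are illustrative, not the plugin's API):

    ```swift
    // Two possible behaviours when recording runs past the end of the clip.
    enum RecordOverflowMode {
        case autoGrow   // option 1: extend the clip by one bar and keep recording
        case autoLoop   // option 2: wrap to the start and overdub (drum-pattern style)
    }

    // Returns the clip-local playhead and (possibly grown) duration, in beats.
    func clipState(hostBeat: Double,
                   durationBeats: Double,
                   beatsPerBar: Double,
                   mode: RecordOverflowMode) -> (playhead: Double, duration: Double) {
        guard hostBeat >= durationBeats else {
            return (playhead: hostBeat, duration: durationBeats)   // still inside the clip
        }
        switch mode {
        case .autoGrow:
            return (playhead: hostBeat, duration: durationBeats + beatsPerBar)
        case .autoLoop:
            return (playhead: hostBeat.truncatingRemainder(dividingBy: durationBeats),
                    duration: durationBeats)
        }
    }
    ```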

  • @blueveek said:

    @soundshaper said:

    @blueveek said:

    @soundshaper said:
    @blueveek This looks awesome, exactly what I've been waiting for... MIDI looping inside of AUM or AB with real-time recording and clip launching!

    So just to be clear, I'd want to use this also with external synths. So say I have my Sub 37 connected and set the input and outputs to the external ports... Can I start recording a loop into your plugin by punching in on the beat (like you said quantized launch) then record notes and knob CCs as long as I want (let's say 12 bars) then punch out and have it immediately start looping? If so this is genius.

    The way it's set up, "punch in/out" works a little differently than what you described, and I also think you have a very valid use-case listed here. This plugin goes to great lengths to be absolutely synced up with the host, so once the host starts playing, the playhead matches. If the host pauses, playhead remains where the host is paused etc. If the host moves its own playhead (if one exists), this plugin matches it etc.

    All of this is, of course, modulo looping duration (literally). So if the host's playhead is at bar 9, while the piano roll clip's duration is set to 4 bars, then: host time is bar 9, plugin time is bar 1 etc.

    Now of course, you can also start playing notes after the loop repeats, which get recorded normally, and which seems to me like it matches your use-case, but please correct me if I'm wrong.

    Auto-extending loop duration while recording, then "punching out" to start playing, might be tricky. The question is when to stop recording and start playing. The easiest approach is to listen for a CC that says "I'm done recording, start playing". I think this is a valid approach, and I'm curious to know your thoughts.

    @soundshaper said:
    May I also suggest that there is a way to turn off MIDI input monitoring (if it is even needed) so that external synths can stay in Local On mode.

    Really looking forward to this!

    So thru off?

    I think I may have used the terms punch in and punch out incorrectly. I was just confirming the basic function is to press record and start recording MIDI into the piano roll, then press record again at any time when finished (the loop is quantized to the next bar), and it will begin looping? Essentially that, like Ableton, you don’t have to predefine the loop length.

    Looping happens automatically, based on a predefined number of bars ("duration") you've set up beforehand. I like the idea of auto-growing this duration, though. Sounds like there should be an option between:
    1. Auto-growing the duration by 1 bar when recording and the playhead goes past the predefined duration.
    2. Auto-looping back to the start while recording. This allows building up drum patterns and is a common workflow.

    How does that sound?

    Sorry to interfere in the conversation, but it sounds great to me!
    If you don’t know the loop length in advance, the default “auto-growing” duration could be just 1 bar and it will work like Ableton.
