
ATOM Piano Roll update is coming soon

Comments

  • @espiegel123 CC.

    It would be very annoying to record CC and not be able to edit/delete it (e.g. if a new layer or a different phrase is recorded).

    Regarding channels, you can pick whichever channel to output on (there's a new toolbar on the left that was shown in previous screenshots). Atom listens to everything it receives, and filtering input can be done at the host level.
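
    A minimal sketch, for illustration only, of what picking an output channel and filtering input at the host level amount to at the raw MIDI byte level; the MIDIMessage type and function names below are invented for the example and are not Atom's actual code or API:

    ```swift
    // Channel-voice status bytes run from 0x80 through 0xEF; the low nibble is the channel (0-15).
    struct MIDIMessage {
        var bytes: [UInt8]              // e.g. [0xB0, 74, 100] is CC 74 with value 100
        var channel: Int? {
            guard let status = bytes.first, (0x80...0xEF).contains(status) else { return nil }
            return Int(status & 0x0F)
        }
    }

    /// Host-side input filter: keep only messages on `channel`, plus channel-less (system) messages.
    func filterInput(_ stream: [MIDIMessage], channel: Int) -> [MIDIMessage] {
        stream.filter { $0.channel == nil || $0.channel == channel }
    }

    /// Output stage: rewrite every channel-voice message onto the chosen output channel.
    func remapOutput(_ stream: [MIDIMessage], to channel: Int) -> [MIDIMessage] {
        stream.map { message in
            var copy = message
            if message.channel != nil {
                copy.bytes[0] = (message.bytes[0] & 0xF0) | UInt8(channel & 0x0F)
            }
            return copy
        }
    }
    ```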

  • I personally wouldn’t use it without automation. I automate everything :lol: But I’m not a randomization person, I’m a sequence what I want it to do person.

  • edited March 2021

    @drez said:
    I personally wouldn’t use it without automation. I automate everything :lol: But I’m not a randomization person, I’m a sequence what I want it to do person.

    To clarify: in my post, "automation" meant CC editing. Atom itself is automatable (parameters exposed, etc.).

  • @blueveek said:
    @espiegel123 CC.

    It would be very annoying to record CC and not be able to edit/delete it (e.g. if a new layer or a different phrase is recorded).

    Regarding channels, you can pick whichever channel to output on (there's a new toolbar on the left that was shown in previous screenshots). Atom listens to everything it receives, and filtering input can be done at the host level.

    I disagree somewhat. I would strongly recommend an option to be able to record all received information and play it back, even if it weren't editable (other than, say, maybe selecting all and moving everything in time).

    It is the single biggest hole in the MIDI-recording AU world: the ability to simply capture and play back everything that comes in.

    Sure, there are lots of times when one wants to edit the CC stuff, but I'd rather be able to capture and play back everything without editing than not have that.
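
    A minimal sketch of the "capture everything and play it back, no editing" idea described above; the types and names are invented for illustration and are not any app's actual recording API:

    ```swift
    // Store every incoming event with its position in the loop; replay in time order.
    struct RecordedEvent {
        let beatOffset: Double          // position within the loop, in beats
        let bytes: [UInt8]              // raw MIDI: notes, CCs, pitch bend, aftertouch, MPE, ...
    }

    final class CaptureBuffer {
        private(set) var events: [RecordedEvent] = []

        /// Record anything that arrives, with no interpretation and no per-event editing.
        func record(bytes: [UInt8], at beatOffset: Double) {
            events.append(RecordedEvent(beatOffset: beatOffset, bytes: bytes))
        }

        /// Events falling inside [from, to) for one playback pass, in time order.
        func playback(from: Double, to: Double) -> [RecordedEvent] {
            events.filter { $0.beatOffset >= from && $0.beatOffset < to }
                  .sorted { $0.beatOffset < $1.beatOffset }
        }

        /// The one "edit" suggested above: shift the whole take in time.
        func shiftAll(by beats: Double) {
            events = events.map { RecordedEvent(beatOffset: $0.beatOffset + beats, bytes: $0.bytes) }
        }
    }
    ```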

  • Definitely release it! The current feature set looks amazing, and music will be made!

  • I vote for going with option #1. While automation is great, it's not like there aren't other ways we can do it right now on iOS, so that's not really a deal breaker. Plus, it means that we'll get multiple updates, which is always more exciting! It's why I always like Hanukkah over Christmas - more days of presents!

  • Number one: out now, improve over time.

    I want to be able to IMPORT and edit MIDI without having to jump to a full DAW.
    That's enough for me right now. I know you'll get the rest right eventually, and it sounds like you need some time to think about it passively and inspirationally, without forcing a decision.

  • @espiegel123 said:

    @blueveek said:
    @espiegel123 CC.

    It would be very annoying to record CC and not be able to edit/delete it (e.g. if a new layer or a different phrase is recorded).

    Regarding channels, you can pick whichever channel to output on (there's a new toolbar on the left that was shown in previous screenshots). Atom listens to everything it receives, and filtering input can be done at the host level.

    I disagree somewhat. I would strongly recommend an option to be able to record all received information and play it back, even if it weren't editable (other than, say, maybe selecting all and moving everything in time).

    It is the single biggest hole in the MIDI-recording AU world: the ability to simply capture and play back everything that comes in.

    Sure, there are lots of times when one wants to edit the CC stuff, but I'd rather be able to capture and play back everything without editing than not have that.

    I 100% agree with this

  • @espiegel123 I'll think about it!

  • @blueveek said:
    @espiegel123 I'll think about it!

    That would be great! In this 'mode', ATOM could be really dumb.

  • @blueveek said:
    @espiegel123 I'll think about it!

    There is a lot of interest in recording MPE at the moment, after the release of the SWAM sounds. LoopBud was released, but it currently does not record CCs or record all channels; whichever one of you does this will definitely get sales. I'll certainly be happy to explain in a video why a mode like this would be so useful, even without editing.

  • Unless doing option 1 delays releasing the version with modular automation by a lot, it seems to be the optimal option. It will make people like me, who don't need the full-fledged version right now, very happy. And for people who are willing to wait, it would be no different.

  • @Gavinski said:

    @espiegel123 said:

    @blueveek said:
    @espiegel123 CC.

    It would be very annoying to record CC and not be able to edit/delete it (e.g. if a new layer or a different phrase is recorded).

    Regarding channels, you can pick whichever channel to output on (there's a new toolbar on the left that was shown in previous screenshots). Atom listens to everything it receives, and filtering input can be done at the host level.

    I disagree somewhat. I would strongly recommend an option to be able to record all received information and play it back, even if it weren't editable (other than, say, maybe selecting all and moving everything in time).

    It is the single biggest hole in the MIDI-recording AU world: the ability to simply capture and play back everything that comes in.

    Sure, there are lots of times when one wants to edit the CC stuff, but I'd rather be able to capture and play back everything without editing than not have that.

    I 100% agree with this

    +1!

  • MPE is becoming essential. More and more apps and DAWs support it at the moment, so please give it support. And a +1 for macOS M1 support.

  • Option 1 for me.

    MPE would be great in a future update.

    Thanks for all your hard work.

  • Selfishly I’d like option 1, mostly because I wanna start experimenting with how it fits into my workflow.

    But there are some good points raised here re: option 2, and if that feels right to you then it's the right option.

    PS: FWIW, LK shipped with very basic automation, which works just fine. I have every faith it'll improve, but it hasn't stopped me finishing a track with it.

  • edited March 2021

    Needs CC too, so option 2.

  • One 👊🏼🙃

  • edited March 2021

    Option 2, or release the future versions as ATOM 3 with CC editing and ATOM 4 with MPE.

  • edited March 2021

    @blueveek said:

    @drez said:
    I personally wouldn’t use it without automation. I automate everything :lol: But I’m not a randomization person, I’m a sequence what I want it to do person.

    To clarify: in my post, "automation" meant CC editing. Atom itself is automatable (parameters exposed, etc.).

    Yeah, that's what I figured. If I can't automate the thing I am sequencing, it's useless for my use case. I see that what other people are describing would be useful for importing MIDI, but my plan was to use it as a sequencer in AUM to sequence MIDI notes and automate AU parameters like I can in a DAW. If I can't do that, then yeah, it's not doing anything for me. That doesn't mean people don't need the stuff available in option 1, but for me, I will be waiting.

  • I voted for option 2 initially, but in reading others' comments I realize that further updates would stoke interest in the app. I really do want automation, but I can live without it for now. Whatever, it will be great either way.

  • Option 1. No matter what features are present at launch, you will still get flooded with requests here. This crowd is never satisfied, lol.

  • Can we define what we mean by automation?

    For instance: I have a synth that is being sent MIDI from an instance of Atom. As this clip plays back, I manually open and close the filter on the synth. That gesture could not be recorded by Atom 2, correct? Or would it be recorded, but editing it would not be possible? I would love to see that in the future.

    I still vote for No. 1. By a mile.

  • @ExAsperis99 said:
    Can we define what we mean by automation?

    For instance: I have a synth that is being sent MIDI from an instance of Atom. As this clip plays back, I manually open and close the filter on the synth. That gesture could not be recorded by Atom 2, correct? Or would it be recorded, but editing it would not be possible? I would love to see that in the future.

    I still vote for No. 1. By a mile.

    What you'd like to achieve won't be possible, even later, not without lengthy mapping, etc. Software synths rarely transmit or generate MIDI; there is nothing Atom can do about that.

  • I vote for option 1...

    I'd like to have an AU piano-roll MIDI editor that will enable me to record MIDI notes or open a MIDI file made by another app, edit the MIDI notes, and then export a MIDI file from it that I can use in other apps.

  • @0tolerance4silence said:

    @ExAsperis99 said:
    Can we define what we mean by automation?

    For instance: I have a synth that is being sent MIDI from an instance of Atom. As this clip plays back, I manually open and close the filter on the synth. That gesture could not be recorded by Atom 2, correct? Or would it be recorded, but editing it would not be possible? I would love to see that in the future.

    I still vote for No. 1. By a mile.

    What you'd like to achieve won't be possible, even later, not without lengthy mapping, etc. Software synths rarely transmit or generate MIDI; there is nothing Atom can do about that.

    He may have meant opening and closing it via a CC event from a controller, which for me is the critical case: capturing the CCs generated by the knobs, wheels, and sliders on my controllers (and aftertouch, etc.).
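
    For illustration, a minimal sketch of how a recorder could tell those controller-generated events apart when capturing a raw stream; these are the standard MIDI channel-voice status nibbles, nothing specific to Atom:

    ```swift
    // The high nibble of a channel-voice status byte says what kind of event it is.
    // A synth's on-screen knob emits no MIDI at all, which is why that gesture can't be
    // captured without first MIDI-mapping a hardware control to the parameter.
    enum CapturedKind {
        case noteOff, noteOn, polyAftertouch, controlChange, programChange
        case channelPressure, pitchBend, other
    }

    func classify(statusByte: UInt8) -> CapturedKind {
        switch statusByte & 0xF0 {
        case 0x80: return .noteOff
        case 0x90: return .noteOn
        case 0xA0: return .polyAftertouch   // per-note aftertouch
        case 0xB0: return .controlChange    // e.g. CC 1 (mod wheel) or CC 74 from a filter knob
        case 0xC0: return .programChange
        case 0xD0: return .channelPressure  // channel aftertouch
        case 0xE0: return .pitchBend        // pitch wheel, MPE per-note glides
        default:   return .other            // system messages (0xF0 and up)
        }
    }
    ```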

  • Option 1 for sure. People will always want more. From an agile perspective, big releases are few and far between for many companies. More frequent, smaller releases are the way to go when you don't have an army of devs or, more importantly, QA folks to ensure a big release has all the bugs worked out.

  • @espiegel123 said:

    @0tolerance4silence said:

    @ExAsperis99 said:
    Can we define what we mean by automation?

    For instance: I have a synth that is being sent MIDI from an instance of Atom. As this clip plays back, I manually open and close the filter on the synth. That gesture could not be recorded by Atom 2, correct? Or would it be recorded, but editing it would not be possible? I would love to see that in the future.

    I still vote for No. 1. By a mile.

    What you'd like to achieve won't be possible, even later, not without lengthy mapping, etc. Software synths rarely transmit or generate MIDI; there is nothing Atom can do about that.

    He may have meant opening and closing it via a CC event from a controller, which for me is the critical case: capturing the CCs generated by the knobs, wheels, and sliders on my controllers (and aftertouch, etc.).

    You are right!
    But most of these things can already be done. The whole point is to deliver a complete experience; we have enough half-solutions. If the dev needs to get some weight off his chest, 👍, but releasing it too early has its risks.
    I think this is a completely different case from, e.g., LK's open-ended development.
    Atom was built with a specific goal; 'adding missing features' is not the same as 'adding new features', and that will be felt on the user end.

  • @0tolerance4silence said:

    @espiegel123 said:

    @0tolerance4silence said:

    @ExAsperis99 said:
    Can we define what we mean by automation?

    For instance: I have a synth that is being sent MIDI from an instance of Atom. As this clip plays back, I manually open and close the filter on the synth. That gesture could not be recorded by Atom 2, correct? Or would it be recorded, but editing it would not be possible? I would love to see that in the future.

    I still vote for No. 1. By a mile.

    What you'd like to achieve won't be possible, even later, not without lengthy mapping, etc. Software synths rarely transmit or generate MIDI; there is nothing Atom can do about that.

    He may have meant opening and closing it via a CC event from a controller, which for me is the critical case: capturing the CCs generated by the knobs, wheels, and sliders on my controllers (and aftertouch, etc.).

    You are right!
    But most of these things can already be done. The whole point is to deliver a complete experience; we have enough half-solutions. If the dev needs to get some weight off his chest, 👍, but releasing it too early has its risks.
    I think this is a completely different case from, e.g., LK's open-ended development.
    Atom was built with a specific goal; 'adding missing features' is not the same as 'adding new features', and that will be felt on the user end.

    This!

This discussion has been closed.