Audio Evolution Mobile Studio


Comments

  • @tja said:
    Ahh, great!

    And you used my tip about a developer tip 😅🤗

    I did send a tenner... the first, I presume

    Thanks! :smiley:

  • edited January 2023

    How does the MIDI and audio file editing in AEM compare to that in Cubasis?
    And how easy is it to change tempo and time signatures?

  • edited January 2023

    @TimRussell said:
    How does the MIDI and audio file editing in AEM compare to that in Cubasis?
    And how easy is it to change tempo and time signatures?

    Or in general:
    http://www.extreamsd.com/index.php/video-tutorials

    As the author, I can't comment on comparisons; I'll leave that to others. :)

  • I love the big fader and I used the new tip feature.
    You’re doing amazing work, thanks for it.
    Any chance you can increase the frame rate in AEM?
    Graphics-wise it feels “unnatural” because of the low frame rate when moving around the timeline/MIDI piano roll.

  • @bargale said:
    I love the big fader and I used the new tip feature.
    You’re doing amazing work, thanks for it.
    Any chance you can increase the frame rate in AEM?
    Graphics-wise it feels “unnatural” because of the low frame rate when moving around the timeline/MIDI piano roll.

    Thanks. There is no frame rate adjustment; it just draws at the maximum speed possible. Perhaps one day I can convert it to JUCE. There are currently 3 different graphics systems involved: the timeline is drawn with Skia, because in the initial revisions CoreGraphics was used and it was painfully slow (software rendering). Skia is what Android uses in the back end for graphics, so porting the graphics from Android to iOS was made much easier by using Skia. And Skia uses OpenGL, I believe, so it was much faster. The piano roll is still CoreGraphics, but there is usually less to draw there than in the timeline. But I agree, the graphics in Cubasis, for example, feel faster. There is just no easy fix.
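
To make the rendering explanation above a little more concrete, here is a minimal, hypothetical sketch of the kind of per-column waveform drawing a timeline view does every frame, written against Skia's C++ canvas API (SkCanvas, SkPaint and SkCanvas::drawLine are real Skia types/calls). The drawWaveform helper, its peak-data layout and all parameters are invented for illustration; this is not AEM's actual drawing code.

```cpp
// Illustrative sketch: draw one track's min/max peak columns onto an SkCanvas,
// roughly the kind of per-frame work a waveform timeline does. The peak data,
// geometry and colour are invented; only SkCanvas/SkPaint are real Skia types.
#include "include/core/SkCanvas.h"
#include "include/core/SkColor.h"
#include "include/core/SkPaint.h"
#include <cstddef>
#include <utility>
#include <vector>

void drawWaveform(SkCanvas* canvas,
                  const std::vector<std::pair<float, float>>& peaks, // {min, max} per pixel column
                  float left, float top, float height)
{
    SkPaint paint;
    paint.setColor(SK_ColorCYAN);
    paint.setStrokeWidth(1.0f);
    paint.setAntiAlias(false); // one vertical line per pixel column, keep it cheap

    const float mid = top + height * 0.5f;
    for (std::size_t col = 0; col < peaks.size(); ++col) {
        const float yMax = mid - peaks[col].second * height * 0.5f; // max sample (higher on screen)
        const float yMin = mid - peaks[col].first  * height * 0.5f; // min sample (lower on screen)
        canvas->drawLine(left + col, yMax, left + col, yMin, paint);
    }
}
```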

  • @dwrae said:

    @bargale said:
    I love the big fader and I used the new tip feature.
    You’re doing amazing work, thanks for it.
    Any chance you can increase the frame rate in AEM?
    Graphics-wise it feels “unnatural” because of the low frame rate when moving around the timeline/MIDI piano roll.

    Thanks. There is no frame rate adjustment; it just draws at the maximum speed possible. Perhaps one day I can convert it to JUCE. There are currently 3 different graphics systems involved: the timeline is drawn with Skia, because in the initial revisions CoreGraphics was used and it was painfully slow (software rendering). Skia is what Android uses in the back end for graphics, so porting the graphics from Android to iOS was made much easier by using Skia. And Skia uses OpenGL, I believe, so it was much faster. The piano roll is still CoreGraphics, but there is usually less to draw there than in the timeline. But I agree, the graphics in Cubasis, for example, feel faster. There is just no easy fix.

    I appreciate the fast response and you taking the time to explain it.
    I got wiser just by reading it.

  • @bargale said:

    @dwrae said:

    @bargale said:
    I love the big fader and I used the new tip feature.
    You’re doing amazing work, thanks for it.
    Any chance you can increase the frame rate in AEM?
    Graphics-wise it feels “unnatural” because of the low frame rate when moving around the timeline/MIDI piano roll.

    Thanks. There is no frame rate adjustment; it just draws at the maximum speed possible. Perhaps one day I can convert it to JUCE. There are currently 3 different graphics systems involved: the timeline is drawn with Skia, because in the initial revisions CoreGraphics was used and it was painfully slow (software rendering). Skia is what Android uses in the back end for graphics, so porting the graphics from Android to iOS was made much easier by using Skia. And Skia uses OpenGL, I believe, so it was much faster. The piano roll is still CoreGraphics, but there is usually less to draw there than in the timeline. But I agree, the graphics in Cubasis, for example, feel faster. There is just no easy fix.

    I appreciate the fast response and you taking the time to explain it.
    I got wiser just by reading it.

    Oh, and I just got an idea by rethinking this.. hmm.. I'll check it out.

  • edited January 2023

    @bargale said:
    I appreciate the fast response and you taking the time to explain it.
    I got wiser just by reading it.

    Ok, so I did some measurements on an iPad Pro 1st gen from 2015 (in release mode) with a project with 9 stereo audio tracks, almost filling the entire timeline, so lots to draw.
    The time it takes to render a frame is under 1 millisecond, so around 400 to 800 microseconds. This would mean a possible frame rate of well over 100 fps (on the 2015 device!). The render function is actually called about 40 times per second, so 40 fps, when swiping a finger left and right through the project like crazy.

    On my iPad Pro M2 (with a slightly smaller screen, 11" compared to the 12.x" of the older device), the render times are more like 150 microseconds and the render function is called about 120 times per second.

    I had an idea to improve the render times: let's say the timeline area has a resolution of 2000x1500 and the scale factor of most iPads is 2. For historical reasons, the app draws the waveforms at the full high resolution. With JUCE, you would get an area of 1000x750 and would therefore render at half the resolution, basically drawing the lines 2 pixels thick. This would speed up drawing, but as you can see, the render time on the old device is already below a millisecond, so I'm not sure this is going to help. Still, I can try it by rendering the lines in Skia with 2-pixel thickness and reducing the amount of calculation by a factor of 2.

    Edit: setting the line thickness to 2 and drawing half as many lines resulted in the render times increasing 10 times, so it got much worse.
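
The frame-time numbers quoted above can be gathered with a plain wall-clock timer around the draw call. Below is a generic C++ sketch; renderTimeMicroseconds and the empty lambda are stand-ins for illustration, not AEM functions.

```cpp
// Illustrative sketch: time one frame's render call with a wall clock and
// derive the frame rate it could theoretically sustain. 'renderOneFrame' is
// a stand-in for whatever function actually draws the timeline.
#include <chrono>
#include <cstdio>

template <typename RenderFn>
double renderTimeMicroseconds(RenderFn&& renderOneFrame)
{
    const auto start = std::chrono::steady_clock::now();
    renderOneFrame();
    const auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::micro>(stop - start).count();
}

int main()
{
    // Replace the lambda body with a real draw call to get meaningful numbers.
    const double us = renderTimeMicroseconds([] { /* draw one frame */ });
    std::printf("frame time: %.1f us -> theoretical max ~%.0f fps\n",
                us, us > 0.0 ? 1e6 / us : 0.0);
    // A 400-800 us frame implies a theoretical rate far above 60 fps; the
    // observed rate is still capped by how often the OS display callback fires.
}
```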

  • The user and all related content has been deleted.
  • @dwrae said:

    @bargale said:
    I appreciate the fast response and you taking the time to explain it.
    I got wiser just by reading it.

    Ok, so I did some measurements on an iPad Pro 1st gen from 2015 (in release mode) with a project with 9 stereo audio tracks, almost filling the entire timeline, so lots to draw.
    The time it takes to render a frame is under 1 millisecond, so around 400 to 800 microseconds. This would mean a possible frame rate of well over 100 fps (on the 2015 device!). The render function is actually called about 40 times per second, so 40 fps, when swiping a finger left and right through the project like crazy.

    On my iPad Pro M2 (with a slightly smaller screen, 11" compared to the 12.x" of the older device), the render times are more like 150 microseconds and the render function is called about 120 times per second.

    I had an idea to improve the render times: let's say the timeline area has a resolution of 2000x1500 and the scale factor of most iPads is 2. For historical reasons, the app draws the waveforms at the full high resolution. With JUCE, you would get an area of 1000x750 and would therefore render at half the resolution, basically drawing the lines 2 pixels thick. This would speed up drawing, but as you can see, the render time on the old device is already below a millisecond, so I'm not sure this is going to help. Still, I can try it by rendering the lines in Skia with 2-pixel thickness and reducing the amount of calculation by a factor of 2.

    Edit: setting the line thickness to 2 and drawing half as many lines resulted in the render times increasing 10 times, so it got much worse.

    Would there be any difference if you used Metal rendering instead of OpenGL?
    Although I’m just speculating that it’s OpenGL, and I might be wrong.
    I saw on the Skia GitHub page that it’s possible to use runtime checks to only use the Metal backend.

  • @dwrae
    Found a bug
    Blank project => loaded the AUv3 instrument Animoog Z, recorded playing from its UI to the timeline; it plays back fine, but recording another take on top doesn’t let me hear the previous recording (for instance, if I want to record the automation separately for the same pattern, I’d like to listen to it while playing). Also, when I try to record again, sometimes it works and sometimes I get an error message; I attached a screenshot.
    The error message happens either when I record on top of an existing pattern or when recording in a blank space on the timeline.

  • The user and all related content has been deleted.
  • @tja said:

    @bargale said:

    @dwrae said:

    @bargale said:
    I appreciate the fast response and you taking the time to explain it.
    I got wiser just by reading it.

    Ok, so I did some measurements on an iPad Pro 1st gen from 2015 (in release mode) with a project with 9 stereo audio tracks, almost filling the entire timeline, so lots to draw.
    The time it takes to render a frame is under 1 millisecond, so around 400 to 800 microseconds. This would mean a possible frame rate of well over 100 fps (on the 2015 device!). The render function is actually called about 40 times per second, so 40 fps, when swiping a finger left and right through the project like crazy.

    On my iPad Pro M2 (with a slightly smaller screen, 11" compared to the 12.x" of the older device), the render times are more like 150 microseconds and the render function is called about 120 times per second.

    I had an idea to improve the render times: let's say the timeline area has a resolution of 2000x1500 and the scale factor of most iPads is 2. For historical reasons, the app draws the waveforms at the full high resolution. With JUCE, you would get an area of 1000x750 and would therefore render at half the resolution, basically drawing the lines 2 pixels thick. This would speed up drawing, but as you can see, the render time on the old device is already below a millisecond, so I'm not sure this is going to help. Still, I can try it by rendering the lines in Skia with 2-pixel thickness and reducing the amount of calculation by a factor of 2.

    Edit: setting the line thickness to 2 and drawing half as many lines resulted in the render times increasing 10 times, so it got much worse.

    Would there be any difference if you used Metal rendering instead of OpenGL?
    Although I’m just speculating that it’s OpenGL, and I might be wrong.
    I saw on the Skia GitHub page that it’s possible to use runtime checks to only use the Metal backend.

    If I remember correctly, Audulus was changed to use Metal instead of OpenGL because of performance issues.

    I was just suggesting it since he said he is using Skia.
    I have no clue whether it's using Metal or not, so I might be wrong.

  • In case anyone hasn't noticed yet, Yovop is a bot, albeit a very entertaining one.

  • @bargale said:
    Would there be any difference if you used Metal rendering instead of OpenGL?
    Although I’m just speculating that it’s OpenGL, and I might be wrong.
    I saw on the Skia GitHub page that it’s possible to use runtime checks to only use the Metal backend.

    @SevenSystems said:
    In case anyone hasn't noticed yet, Yovop is a bot, albeit a very entertaining one.

    I was about to write that this looked like a ChatGPT answer :smiley:

  • @dwrae said:

    @bargale said:
    Would there be any difference if you used Metal rendering instead of OpenGL?
    Although I’m just speculating that it’s OpenGL, and I might be wrong.
    I saw on the Skia GitHub page that it’s possible to use runtime checks to only use the Metal backend.

    @SevenSystems said:
    In case anyone hasn't noticed yet, Yovop is a bot, albeit a very entertaining one.

    I was about to write that this looked like a ChatGPT answer :smiley:

    Bots are becoming too good these days

  • @dwrae said:

    @bargale said:
    Would there be any difference if you used Metal rendering instead of OpenGL?
    Although I’m just speculating that it’s OpenGL, and I might be wrong.
    I saw on the Skia GitHub page that it’s possible to use runtime checks to only use the Metal backend.

    @SevenSystems said:
    In case anyone hasn't noticed yet, Yovop is a bot, albeit a very entertaining one.

    I was about to write that this looked like a ChatGPT answer :smiley:

    😊 I've set it a trap which readily gave it away:

  • Nice to have @dwrae and @SevenSystems in the same thread! Any chance of a tech swap? It’d be nice to get the Xequence piano roll and instrument controllers in AEM! That’d be a god-tier collab! A $50 MIDI pro suite IAP? No problem, done!

  • @ipadbeatmaking said:
    Nice to have @dwrae and @SevenSystems in the same thread! Any chance of a tech swap? It’d be nice to get the Xequence piano roll and instrument controllers in AEM! That’d be a god-tier collab! A $50 MIDI pro suite IAP? No problem, done!

    +1, it would be amazing.

  • Unfortunately, not as easy as an oil filter swap 😉😭

  • @ipadbeatmaking said:
    Nice to have @dwrae and @SevenSystems in the same thread! Any chance of a tech swap? It’d be nice to get the Xequence piano roll and instrument controllers in AEM! That’d be a god-tier collab! A $50 MIDI pro suite IAP? No problem, done!

    Exactly, “shut up and take my money” ;)

  • I’m giving AEM a serious chance and noting all the little UX things that drive me mad, but I think this one makes it unusable for me.

    AEM doesn’t auto-fade audio clips, making them click when you loop them. All audio should have a micro fade that ensures no clicks happen when pasting takes.

    I created this with the repeat feature.

    @dwrae is there a way of addressing this that I’m missing? Maybe there is a setting? But if there is, it should be activated by default. I’ve tried manually adding fades in and out, but it is very tedious work and it doesn’t work all the time :(
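
For anyone curious what the requested "micro fade" amounts to technically, it is just a few milliseconds of gain ramp at each clip boundary so the audio starts and ends at (or near) zero. Below is a rough, generic C++ sketch; the interleaved-float buffer layout, the applyMicroFades name and the 5 ms default are assumptions for illustration, not AEM internals.

```cpp
// Illustrative sketch: apply a short linear fade-in and fade-out to an
// interleaved float buffer so a looped or pasted clip doesn't start or end
// on a non-zero sample (which is what causes the click). ~5 ms is typical.
#include <algorithm>
#include <cstddef>
#include <vector>

void applyMicroFades(std::vector<float>& samples, int channels,
                     int sampleRate, double fadeMs = 5.0)
{
    const std::size_t frames = samples.size() / channels;
    const std::size_t fadeFrames =
        std::min<std::size_t>(frames / 2,
                              static_cast<std::size_t>(sampleRate * fadeMs / 1000.0));
    for (std::size_t i = 0; i < fadeFrames; ++i) {
        const float gain = static_cast<float>(i) / static_cast<float>(fadeFrames);
        for (int c = 0; c < channels; ++c) {
            samples[i * channels + c] *= gain;                // fade-in at the head
            samples[(frames - 1 - i) * channels + c] *= gain; // fade-out at the tail
        }
    }
}
```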

  • @cokomairena said:

    I’m giving AEM a serious chance and noting all the little UX things that drive me mad, but I think this one makes it unusable for me.

    AEM doesn’t auto-fade audio clips, making them click when you loop them. All audio should have a micro fade that ensures no clicks happen when pasting takes.

    I created this with the repeat feature.

    @dwrae is there a way of addressing this that I’m missing? Maybe there is a setting? But if there is, it should be activated by default. I’ve tried manually adding fades in and out, but it is very tedious work and it doesn’t work all the time :(

    It would be a serious disaster for most people if the app started to fade clips automatically.

  • The user and all related content has been deleted.
  • @dwrae said:

    @cokomairena said:

    I’m giving AEM a serious chance and noting all the little UX things that drive me mad, but I think this one makes it unusable for me.

    AEM doesn’t auto-fade audio clips, making them click when you loop them. All audio should have a micro fade that ensures no clicks happen when pasting takes.

    I created this with the repeat feature.

    @dwrae is there a way of addressing this that I’m missing? Maybe there is a setting? But if there is, it should be activated by default. I’ve tried manually adding fades in and out, but it is very tedious work and it doesn’t work all the time :(

    It would be a serious disaster for most people if the app started to fade clips automatically.

    Only if it faded clips that don't sit on top of other clips;
    otherwise it would be a blessing.
    All DAWs on desktop work this way, and even the latest FL Studio update highlighted this feature.

    It's not a feature for a small niche; it has been a standard for many years already.

    While we want it, we also understand you're only human and you already have plans.

    I think the best thing to do would be to make development transparent by creating a roadmap like the one Loopy Pro has.
    That way you could also see what is more important to us users and what would only serve a niche audience.

    This is the Loopy Pro roadmap, as a reference:
    https://loopypro.com/roadmap

  • @bargale said:

    @dwrae said:

    @cokomairena said:

    I’m giving AEM a serious chance and noting all the little UX things that drive me mad, but I think this one makes it unusable for me.

    AEM doesn’t auto-fade audio clips, making them click when you loop them. All audio should have a micro fade that ensures no clicks happen when pasting takes.

    I created this with the repeat feature.

    @dwrae is there a way of addressing this that I’m missing? Maybe there is a setting? But if there is, it should be activated by default. I’ve tried manually adding fades in and out, but it is very tedious work and it doesn’t work all the time :(

    It would be a serious disaster for most people if the app started to fade clips automatically.

    Only if it faded clips that don't sit on top of other clips;
    otherwise it would be a blessing.
    All DAWs on desktop work this way, and even the latest FL Studio update highlighted this feature.

    It's not a feature for a small niche; it has been a standard for many years already.

    While we want it, we also understand you're only human and you already have plans.

    I think the best thing to do would be to make development transparent by creating a roadmap like the one Loopy Pro has.
    That way you could also see what is more important to us users and what would only serve a niche audience.

    This is the Loopy Pro roadmap, as a reference:
    https://loopypro.com/roadmap

    That's the thing: if you move a clip on top of another clip, it will ask whether to cross-fade or not. Just putting clips one after another should never do this.

    I am very sorry, I won't publish any roadmaps.
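
As an aside on what the cross-fade prompt mentioned above involves: conceptually it is an equal-power blend of overlapping material from both clips, as in the generic C++ sketch below (the crossfade function and buffer layout are invented for illustration, not AEM's code). It also hints at why clips merely placed back to back cannot be cross-faded: the blend needs real audio from both clips across the overlap, a point dwrae makes later in the thread.

```cpp
// Illustrative sketch: equal-power crossfade between the tail of clip A and
// the head of clip B (mono, non-interleaved for simplicity). It consumes
// 'overlap' frames of real audio from BOTH clips, so two clips that simply
// sit back to back, with no overlapping material, have nothing to blend.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

std::vector<float> crossfade(const std::vector<float>& tailA,
                             const std::vector<float>& headB)
{
    const double kHalfPi = 1.5707963267948966;
    const std::size_t overlap = std::min(tailA.size(), headB.size());
    std::vector<float> out(overlap);
    for (std::size_t i = 0; i < overlap; ++i) {
        const double t = (overlap > 1)
            ? static_cast<double>(i) / static_cast<double>(overlap - 1)
            : 1.0;
        const float gainA = static_cast<float>(std::cos(t * kHalfPi)); // A fades out
        const float gainB = static_cast<float>(std::sin(t * kHalfPi)); // B fades in
        out[i] = tailA[i] * gainA + headB[i] * gainB;
    }
    return out;
}
```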

  • @Yovop said:
    @bargale
    Mobile DAWs that you can use on your iPhone or iPad to make music have a special feature called "fading clips" which lets you adjust the sound of the music you are making. This makes it easier to make professional sounding music from your phone or tablet.

    Fading clips is especially useful for making smooth transitions between different parts of a song, or for creating a gradual fade in or out. With this feature, you can easily adjust the volume of specific parts of a song, allowing you to create a desired effect. Additionally, you can use fading clips to transition between different audio effects, such as reverbs or delays, for a more professional sounding mix.

    Fading clips can also be used to create a more complex soundscape within the song, by blending different elements together. For example, you could use a fade in and out on a vocal sample, and then layer it with a guitar track for a unique effect. This technique can be used to create a variety of textures, making it a great tool for sound designers and producers. Furthermore, you can also use fading clips to transition between different sections of a track, adding an interesting dynamic to the song.

    One of the issues you might face when using fading clips is that the transition may sound too abrupt or too slow. To avoid this, you should use automation to precisely control the speed and intensity of the fade. Additionally, you should make sure that the two elements you are blending together are complementary, otherwise the transition may feel unnatural or jarring. Finally, you should be aware of the context of your song and how the fading clips will fit into the overall structure. By taking the time to consider all these factors, you can ensure that your fading clips add to the overall soundscape of the song.

    To that end, you should experiment with different fade lengths and speeds to determine what works best for your song. For instance, if your track is more laid back and mellow, you may want to use a longer and slower fade. Alternatively, if your track is more upbeat and energetic, you may opt for a shorter and more abrupt fade. Additionally, you should be mindful of the sound of your fading clips and adjust the volume accordingly. For example, if your fading clip is louder than the section it is transitioning into, you may want to use a faster fade to ensure that the volume levels don’t become too overwhelming. By taking the time to experiment, you can ensure that your fading clips create a smooth and balanced transition.

    Can you/this thing be turned off??

  • The user and all related content has been deleted.
  • @dwrae said:

    @bargale said:

    @dwrae said:

    @cokomairena said:

    I’m giving AEM a serious chance and noting all the little UX things that drive me mad, but I think this one makes it unusable for me.

    AEM doesn’t auto-fade audio clips, making them click when you loop them. All audio should have a micro fade that ensures no clicks happen when pasting takes.

    I created this with the repeat feature.

    @dwrae is there a way of addressing this that I’m missing? Maybe there is a setting? But if there is, it should be activated by default. I’ve tried manually adding fades in and out, but it is very tedious work and it doesn’t work all the time :(

    It would be a serious disaster for most people if the app started to fade clips automatically.

    Only if it faded clips that don't sit on top of other clips;
    otherwise it would be a blessing.
    All DAWs on desktop work this way, and even the latest FL Studio update highlighted this feature.

    It's not a feature for a small niche; it has been a standard for many years already.

    While we want it, we also understand you're only human and you already have plans.

    I think the best thing to do would be to make development transparent by creating a roadmap like the one Loopy Pro has.
    That way you could also see what is more important to us users and what would only serve a niche audience.

    This is the Loopy Pro roadmap, as a reference:
    https://loopypro.com/roadmap

    That's the thing: if you move a clip on top of another clip, it will ask whether to cross-fade or not. Just putting clips one after another should never do this.

    I am very sorry, I won't publish any roadmaps.

    I have no malicious intentions; it's your software and your choice.
    P.S.
    For the faded clips, adding it as an option in the settings and being able to set it as the default could be a good compromise, thus not affecting anyone who doesn't want it.

    P.S.
    Did you see the post I added earlier regarding a bug when recording MIDI from Animoog Z?
    I posted the steps to recreate it.

  • @bargale said:

    P.S.
    For the faded clips, adding it as an option in the settings and being able to set it as the default could be a good compromise, thus not affecting anyone who doesn't want it.

    Sure, I just don't understand the need or the reasoning yet. If you fade in the start of a kick loop, you lose the attack. If you put two loops one after the other, you cannot cross-fade, since cross-fading requires data that isn't there. If you purchase loops like the ones within AEMS, the loops are made in such a way that they start and end properly, so no clicks can occur.

    P.S.
    Did you see the post I added earlier regarding a bug when recording MIDI from Animoog Z?
    I posted the steps to recreate it.

    Yes, it's just incredibly busy, sorry. I'll need to look at it.
