Sound Quality: Audio Evolution vs Cubasis 3

Hello, dear iOS music producers and artists. Have any of you ever noticed that recorded tracks in AEM (Audio Evolution Mobile) sound much better than in Cubasis 3? I've been an avid user of Cubasis 3, but a couple of days ago I wanted to check out the other DAWs.

I might be wrong, but please kindly look into the matter. I've also decided to shift my focus towards AEM. Cubasis is pretty fast, but I don't know why it delivers such poor-quality recorded audio. My interface is an M-Track Air (2-in/2-out).

Comments

  • It “technically” should be the same (ya know…just 1’s and 0’s).

    I don’t have any experience with AEM but I have noticed that the volume in Cubasis 3 seems to be lower than other apps. Could that be what you’re hearing? In audio, your brain almost always thinks the louder signal sounds better.

  • There’s been discussion about this in the past; I actually asked the AEM developer why this was the case. He told me he had no idea, and he suspects all iOS DAWs probably just have a direct input for audio like he does.
    ??????

  • @NoiseHorse said:
    There’s been discussion about this in the past; I actually asked the AEM developer why this was the case. He told me he had no idea, and he suspects all iOS DAWs probably just have a direct input for audio like he does.
    ??????

    Would you mind linking that discussion? I can’t seem to find it

  • @mtenk said:
    It “technically” should be the same (ya know…just 1’s and 0’s).

    I don’t have any experience with AEM but I have noticed that the volume in Cubasis 3 seems to be lower than other apps. Could that be what you’re hearing? In audio, your brain almost always thinks the louder signal sounds better.

    Dear mtenk, exactly. In addition, it's not just about the loudness: I've also noticed differences in clarity and in the overall texture of the recorded tracks. I'm almost afraid to say the sound quality changes, because logically I know that shouldn't be possible, but that's what I hear. I should also mention that I'm running an iPad Pro M1. I haven't got a clue.

  • Please don’t kill me, but I’ve also noticed a pretty deep difference in how AUs are processed in the two DAWs. AEM nails them. What’s the sorcery here? I wish someone could explain this.

  • wimwim
    edited September 2022

    @helzfire said:
    Please don’t kill me, but I’ve also noticed a pretty deep difference in how AUs are processed in the two DAWs. AEM nails them. What’s the sorcery here? I wish someone could explain this.

    Can you clarify what you mean? What do you think AEM does better regarding AUs than Cubasis?

  • The user and all related content has been deleted.
  • @ehehehe said:
    Post some a/b examples then, please.

    Yes, this ☝️

  • @Sabicas said:

    @NoiseHorse said:
    There’s been discussion about this in the past; I actually asked the AEM developer why this was the case. He told me he had no idea, and he suspects all iOS DAWs probably just have a direct input for audio like he does.
    ??????

    Would you mind linking that discussion? I can’t seem to find it

    https://forum.audiob.us/discussion/25776/which-daw-gets-you-your-sound-easiest#latest

  • A good test is to render each DAW with an identical setup out to audio. Make sure that neither is clipping / going over 0 dB on the master, and that both peak at roughly the same level. Then, using AudioShare or something else, normalize each exported file. Then do your best to A/B compare them. A blind test is best, if you can get someone to randomly play them back for you.

    It's likely that one DAW is simply playing back louder than the other, which invariably sounds "better". The exported files should be indistinguishable. If not, then there's something worth looking into further.

    If the exported files sound the same, but the two sound different live after doing your best to match the volume levels, then it may be that the two use different quality for live playback. This could be the case if one is trying to optimize CPU use during live play.
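To make the normalize-then-compare step concrete, here is a minimal Python sketch. Synthetic sine tones stand in for the two DAW exports; in practice you would load the rendered WAV files instead (e.g. via the `wave` module):

```python
import math

def peak_normalize(sig):
    """Scale a signal so its absolute peak is 1.0 (full scale)."""
    peak = max(abs(s) for s in sig)
    return [s / peak for s in sig] if peak > 0 else sig

# Synthetic stand-ins for two DAW exports: same audio, different gain.
n, sr, freq = 44100, 44100, 440.0
tone = [math.sin(2 * math.pi * freq * i / sr) for i in range(n)]
daw_a = [0.9 * s for s in tone]   # the "louder" export
daw_b = [0.5 * s for s in tone]   # the "quieter" export

a_norm = peak_normalize(daw_a)
b_norm = peak_normalize(daw_b)

# Any difference that survives normalization is real, not just loudness.
residual = max(abs(a - b) for a, b in zip(a_norm, b_norm))
print(f"residual after normalizing: {residual:.2e}")
```

Here the residual is vanishingly small because the underlying audio was identical; with real exports, a nonzero residual after level-matching is what would be worth investigating.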

  • @MobileMusicPro : What’s your take on this?

  • edited September 2022

    @Telstar5 said:
    @MobileMusicPro : What’s your take on this?

    Volume is generally low in any DAW until you begin applying a channel strip; AEM might have a channel strip enabled by default, for example. The sound quality, though, should be essentially the same: it's just 1s and 0s, and digital gear has been processing audio the same way since the synths of the '80s, only at higher bit depths now. I've used all the major DAWs, though never AEM, and they all sound slightly different in volume but the same in quality. Different pieces of hardware can certainly color a sound differently, like Maschine vs. MPC for example, but that's subjective taste. @wim's A/B test should put to rest any doubt or confusion.

  • Even easier than an aural A/B test: Simply invert one of the (normalized) recordings, and then mix the two. If they're identical, there should be total silence. If there's not, then you'll actually hear the differences between them (roughly speaking).
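For illustration, the null test fits in a few lines of Python (synthetic signals again; with real exports you would load and time-align the two files first):

```python
import math

sr = 44100
tone = [math.sin(2 * math.pi * 440 * i / sr) for i in range(sr)]

# Identical recordings: invert one, mix (sum), and you get total silence.
null_identical = [a + (-b) for a, b in zip(tone, tone)]
print("identical null peak:", max(abs(s) for s in null_identical))  # 0.0

# A recording that differs (here: 2% quieter) leaves a residual; the
# nulled mix literally plays back the difference between the two takes.
altered = [0.98 * s for s in tone]
null_different = [a + (-b) for a, b in zip(tone, altered)]
print("different null peak:", max(abs(s) for s in null_different))
```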

  • @MobileMusicPro: Thanks for your response here. Makes sense; however, I just had a play with FL Studio Mobile, and damned if those bass drums weren't the punchiest and fullest-sounding I've heard on iOS.

  • @SevenSystems said:
    Even easier than an aural A/B test: Simply invert one of the (normalized) recordings, and then mix the two. If they're identical, there should be total silence. If there's not, then you'll actually hear the differences between them (roughly speaking).

    Yep, a null test with the normalised recordings would be a good way to reveal any differences, if they exist.

  • Nulling will only work if the latency is precisely the same in the two signals. If it is off by even one sample, it won't work. A blind test of the normalized signals will work.

    It also only works if you start with identical signals. Synths and whatnot may not generate the same signal even from identical MIDI: a synth may randomize the phase of its oscillators.
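As a rough Python sketch of the latency problem: a one-sample offset wrecks the null, but a constant offset can be estimated by brute-force cross-correlation and removed before nulling (assuming the offset really is constant, which it may not be):

```python
import math

sr = 4410  # short signal keeps the brute-force search quick
sig = [math.sin(2 * math.pi * 440 * i / sr) for i in range(sr)]
late = [0.0] + sig[:-1]  # identical audio, delayed by just one sample

def null_peak(a, b):
    """Peak of the difference signal (what a null test leaves behind)."""
    return max(abs(x - y) for x, y in zip(a, b))

print("unaligned null peak:", null_peak(sig, late))  # large: the null fails

def best_lag(a, b, max_lag=8):
    """Lag (in samples) that maximizes correlation when b is advanced."""
    n = len(a) - max_lag
    return max(range(max_lag + 1),
               key=lambda lag: sum(a[i] * b[i + lag] for i in range(n)))

lag = best_lag(sig, late)      # recovers the 1-sample offset
aligned = late[lag:]
print("aligned null peak:", null_peak(sig, aligned))  # ~0: null works again
```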

  • @espiegel123 said:
    Nulling will only work if the latency is precisely the same in the two signals. If it is off by even one sample, it won't work. A blind test of the normalized signals will work.

    Of course, I assumed that the comparison would be done in a fully technically correct way, which includes exactly aligning the signals ;)

    It also only works if you start with identical signals. Synths and whatnot may not generate the same signal even from identical MIDI: a synth may randomize the phase of its oscillators.

    That's true. This could be somewhat mitigated by comparing the spectra instead (but this won't easily catch any time-dependent components, like transient response etc.)
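A toy Python example of why comparing spectra helps when phases are randomized: two takes of the same oscillator with different starting phases fail a sample-by-sample comparison, yet their magnitude spectra match (a naive DFT is fine for a short demo signal):

```python
import cmath, math

def mag_spectrum(sig):
    """Magnitude of a naive O(n^2) DFT; fine for a 256-sample demo."""
    n = len(sig)
    return [abs(sum(sig[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

n = 256
# Same oscillator, but the second take starts at a different phase:
take_a = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
take_b = [math.sin(2 * math.pi * 8 * t / n + 1.3) for t in range(n)]

sample_diff = max(abs(x - y) for x, y in zip(take_a, take_b))
print("sample-by-sample diff:", sample_diff)   # large: a null test fails

spec_diff = max(abs(x - y)
                for x, y in zip(mag_spectrum(take_a), mag_spectrum(take_b)))
print("magnitude-spectrum diff:", spec_diff)   # ~0: same spectral content
```

As noted above, this catches steady-state spectral content but not time-dependent behavior such as transient response.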

  • @SevenSystems said:

    @espiegel123 said:
    Nulling will only work if the latency is precisely the same in the two signals. If it is off by even one sample, it won't work. A blind test of the normalized signals will work.

    Of course, I assumed that the comparison would be done in a fully technically correct way, which includes exactly aligning the signals ;)

    It also only works if you start with identical signals. Synths and whatnot may not generate the same signal even from identical MIDI: a synth may randomize the phase of its oscillators.

    That's true. This could be somewhat mitigated by comparing the spectra instead (but this won't easily catch any time-dependent components, like transient response etc.)

    Honestly, I think an easier and more useful test is to just record and normalize the signals and compare them blindly. It's a simple process, and what matters is whether the signals sound identical rather than whether they are identical, which in this sort of situation isn't guaranteed.
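For an honest blind comparison, even a tiny script can randomize the playback order of each trial and withhold the answer key until all votes are in. The file names below are placeholders for your two normalized exports:

```python
import random

# Placeholder names; substitute your two normalized exports.
trials = [("cubasis.wav", "aem.wav")] * 5   # 5 A/B trials

answer_key = []
for i, pair in enumerate(trials, 1):
    a, b = random.sample(pair, 2)            # random left/right order
    answer_key.append((i, a, b))
    print(f"Trial {i}: play file A, then file B")  # names hidden from listener

# Reveal which was which only after all votes are in:
for i, a, b in answer_key:
    print(f"Trial {i}: A = {a}, B = {b}")
```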

  • @espiegel123 said:

    @SevenSystems said:

    @espiegel123 said:
    Nulling will only work if the latency is precisely the same in the two signals. If it is off by even one sample, it won't work. A blind test of the normalized signals will work.

    Of course, I assumed that the comparison would be done in a fully technically correct way, which includes exactly aligning the signals ;)

    It also only works if you start with identical signals. Synths and whatnot may not generate the same signal even from identical MIDI: a synth may randomize the phase of its oscillators.

    That's true. This could be somewhat mitigated by comparing the spectra instead (but this won't easily catch any time-dependent components, like transient response etc.)

    Honestly, I think an easier and more useful test is to just record and normalize the signals and compare them blindly. It's a simple process, and what matters is whether the signals sound identical rather than whether they are identical, which in this sort of situation isn't guaranteed.

    That goes against my scientifically minded mindedness... 😜

  • Btw, at some point (because it was relevant for a couple of apps I was comparing), I built some test files with carefully placed impulses at known distances from the start of the file and from each other, and it is not uncommon for hosts to have slightly different offsets from each other. This would be exacerbated if one is triggering the sounds from sequencers, even setting aside the variations in the sounds the same synth produces when triggered by the same sequence from the same sequencer.
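A sketch of that impulse-file technique in Python: place single-sample clicks at known positions, render the file through each host, and read back where the clicks landed. The 3-sample delay below simply simulates a hypothetical host offset:

```python
def impulse_train(length, positions):
    """A test signal: silence with single-sample impulses at known positions."""
    sig = [0.0] * length
    for p in positions:
        sig[p] = 1.0
    return sig

def measure_offset(rendered, expected_positions):
    """Compare where the impulses landed vs. where they were placed."""
    found = [i for i, s in enumerate(rendered) if abs(s) > 0.5]
    return [f - e for f, e in zip(found, expected_positions)]

positions = [1000, 5000, 9000]
test_file = impulse_train(12000, positions)

# Simulate a host that renders everything 3 samples late:
rendered = [0.0] * 3 + test_file[:-3]
print(measure_offset(rendered, positions))  # [3, 3, 3]: a constant host offset
```

A constant offset like this is easy to compensate for; differing offsets between hosts would also explain why a naive null test between their renders fails.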

  • I use Cubasis, and the renders I've checked against the original do sound different in terms of texture. There is a latency issue, so I always need to micro-drag the audio off-grid into place to have it synced properly. But the sound is indeed slightly different. Hard to tell what could be causing this...

  • I started out bouncing between a cassette deck and a reel-to-reel. These kinds of discussions always bring a smile to my face. 😁

  • @anickt said:
    I started out bouncing between a cassette deck and a reel-to-reel. These kinds of discussions always bring a smile to my face. 😁

    Indeed. It's important to note that with a proper mastering chain in place, anything rendered by Cubasis should be just as good as any other DAW, especially when using desktop-class plugins like Toneboosters and FabFilter.

  • @Telstar5 said:
    @MobileMusicPro: Thanks for your response here. Makes sense; however, I just had a play with FL Studio Mobile, and damned if those bass drums weren't the punchiest and fullest-sounding I've heard on iOS.

    Yes, if you look, they have a limiter enabled on the master channel by default.

  • edited October 2022

    @MobileMusicPro said:

    @Telstar5 said:
    @MobileMusicPro: Thanks for your response here. Makes sense; however, I just had a play with FL Studio Mobile, and damned if those bass drums weren't the punchiest and fullest-sounding I've heard on iOS.

    Yes, if you look, they have a limiter enabled on the master channel by default.

    Where are you seeing that? There’s no limiter on a default new project.

  • @anickt said:

    @MobileMusicPro said:

    @Telstar5 said:
    @MobileMusicPro: Thanks for your response here. Makes sense; however, I just had a play with FL Studio Mobile, and damned if those bass drums weren't the punchiest and fullest-sounding I've heard on iOS.

    Yes, if you look, they have a limiter enabled on the master channel by default.

    Where are you seeing that? There’s no limiter on a default new project.

    Sorry, it looks like that's only on the default demo project, which comes up on first use.

  • @MobileMusicPro said:

    @anickt said:

    @MobileMusicPro said:

    @Telstar5 said:
    @MobileMusicPro: Thanks for your response here. Makes sense; however, I just had a play with FL Studio Mobile, and damned if those bass drums weren't the punchiest and fullest-sounding I've heard on iOS.

    Yes, if you look, they have a limiter enabled on the master channel by default.

    Where are you seeing that? There’s no limiter on a default new project.

    Sorry, it looks like that's only on the default demo project, which comes up on first use.

    I never looked at any of the demo projects. 😆
