
Sloppy timing in Cubasis!


Comments

  • @SevenSystems
    Learning a lot right here.
    Definitely NOT annoyed.
    Thank you for the insight.
    B)

  • I've added the 16th note pattern from BeatMaker 3.

    So far Cubasis 2 is the ONLY one that suffers from timing problems. The other apps play back quite solidly: their recorded audio and rendered audio have the same timing.

    Track 1: Cubasis recorded from the iPad's audio interface to my laptop
    Track 2: Cubasis Mixdown
    Track 3: Caustic recorded from the iPad's audio interface to my laptop
    Track 4: Caustic Mixdown
    Track 5: BeatMaker 3 recorded from the iPad's audio interface to my laptop
    Track 6: BeatMaker 3 Mixdown

  • @SlowwFloww: Excellent post. I hope that you will submit your graphic and test results to the Steinberg folks.

  • edited August 2018

    @SlowwFloww said:
    I've added the 16th note pattern from BeatMaker 3.

    So far Cubasis 2 is the ONLY one that suffers from timing problems. The other apps play back quite solidly: their recorded audio and rendered audio have the same timing.

    Track 1: Cubasis recorded from the iPad's audio interface to my laptop
    Track 2: Cubasis Mixdown
    Track 3: Caustic recorded from the iPad's audio interface to my laptop
    Track 4: Caustic Mixdown
    Track 5: BeatMaker 3 recorded from the iPad's audio interface to my laptop
    Track 6: BeatMaker 3 Mixdown

    I know this is frustrating for you, but thanks for bringing this topic up and making the effort to post an overview.

  • edited August 2018

    @SlowwFloww oh, Beatmaker is right on the spot! Not bad! (actually Beatmaker 2 (!) is still the only DAW on iOS which can sync to MIDI clock/start/stop/SPP/continue, so maybe it's not too surprising...)

  • The user and all related content has been deleted.
  • The user and all related content has been deleted.
  • The user and all related content has been deleted.
  • @SevenSystems said:
    @SlowwFloww oh, Beatmaker is right on the spot! Not bad! (actually Beatmaker 2 (!) is still the only DAW on iOS which can sync to MIDI clock/start/stop/SPP/continue, so maybe it's not too surprising...)

    Well - there's Loopy, Genome and Gadget. Not DAWs in the classic sense but they all sync tighter than BM2.

    I guess it's because most developers struggle with knowing/understanding how to properly implement rock-solid slave sync of audio tracks, not least thanks to the minimal SDK documentation and the lack of sophisticated examples.

    Why has it been mostly different with Ableton Link?
    I guess it's because
    A) Part of the timing-critical stuff is handled inside the library itself and
    B) The documentation states in detail what you have to do to make your app sync tightly.

    Loopy is a great exception with its two-phase time-stretching concept and apparently sane UI vs audio vs MIDI scheduling.

  • @SlowwFloww said:
    I've added the 16th note pattern from BeatMaker 3.

    So far Cubasis 2 is the ONLY one that suffers from timing problems. The other apps play back quite solidly: their recorded audio and rendered audio have the same timing.

    Track 1: Cubasis recorded from the iPad's audio interface to my laptop
    Track 2: Cubasis Mixdown
    ...

    Would you mind sending me the Cubasis project file so I can check whether timing is different on older iOS versions? I'm really curious to know.

  • edited August 2018

    I did a new, more thorough test... couldn't sleep last night... Plus, I couldn't stand that Caustic showed slightly off timing :-)
    (The wave files of the last test weren't aligned properly in Ableton - in this new test they are, as you can see.)

    This time I used a square wave as the sample, since it makes the wave files easier to align. After aligning them in Ableton I made close-ups of 3 sections (markers 1, 2, 3) to show the timing of the 16th notes.

    Outcome of this new test:
    • Only Cubasis 2 shows sloppy timing when playing back a project... my ears were right.

    iPad 2017
    iOS 11.3.
    Cubasis polyphony: 24
    Cubasis hardware latency: medium
    Used 1 track only with internal Mini Sampler
    Audio interface: Zoom u22
    Only 1 app active at the time
    iPad is in flight mode

    Total view of the wave files: [image]

    1. Zoom-in at marker 1 [image]

    2. Zoom-in at marker 2 [image]

    3. Zoom-in at marker 3 [image]
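
    For anyone who'd rather measure this than eyeball waveforms in Ableton: below is a rough Swift sketch that finds the rising edges in a recording of the square-wave test tone and prints the spacing between consecutive onsets, so jitter shows up as numbers. It's illustrative only; the function name, the 0.5 threshold, and the single-channel assumption are mine, not part of the original test.

        import AVFoundation

        // Rough sketch: find rising-edge onsets in a recording of a square-wave
        // test tone and print the spacing between consecutive onsets.
        // Steady spacing = solid timing; wandering spacing = jitter.
        func printOnsetSpacing(url: URL, threshold: Float = 0.5) throws {
            let file = try AVAudioFile(forReading: url)
            let format = file.processingFormat
            guard let buffer = AVAudioPCMBuffer(
                pcmFormat: format, frameCapacity: AVAudioFrameCount(file.length))
            else { return }
            try file.read(into: buffer)
            guard let samples = buffer.floatChannelData?.pointee else { return }

            let sampleRate = format.sampleRate
            var previousOnset: Int? = nil
            var wasAbove = false
            for i in 0..<Int(buffer.frameLength) {
                let isAbove = abs(samples[i]) > threshold   // inside a pulse?
                if isAbove && !wasAbove {                    // rising edge = onset
                    if let prev = previousOnset {
                        let spacingMs = Double(i - prev) / sampleRate * 1000.0
                        print(String(format: "onset %8.1f ms  spacing %7.2f ms",
                                     Double(i) / sampleRate * 1000.0, spacingMs))
                    }
                    previousOnset = i
                }
                wasAbove = isAbove
            }
        }

    For a straight 16th-note pattern at 120 BPM, every spacing should read 125.00 ms; a recording with jitter will wander around that value.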

  • edited August 2018

    Here's a zip file with the project files.

  • edited August 2018

    I see a discrepancy between recording and mixdown in Caustic, too. Obviously, Cubasis needs fixing for the EDM crowd, but as a workaround, what if you either freeze the MIDI tracks, or reimport a mixdown of the MIDI tracks and use that as a backing while you work on the song?

  • The user and all related content has been deleted.
  • @SevenSystems Getting MIDI events ahead of time is not so easy, as far as I know. CoreMIDI automatically delivers the packets "on time", which is actually the same as "too late" if you want to be prepared to produce audio at the correct time in a coming render cycle. One can, however, configure a virtual MIDI input to receive the packets directly as they are sent.

    This gives two other problems, though. First, the received packets might not arrive in chronological order, so you need to implement your own realtime-safe priority-list scheduler or the like, so you can buffer the events and act upon them at their correct time later. Second, it only works for virtual MIDI inputs, so you can't just connect directly to some other app's virtual MIDI output, as is usually done. The sender must either select your virtual input as the destination, or you use a trick with CoreMIDI "MIDIThru" objects. That is also the only way one can receive events ahead of time from hardware sources, at least in theory (not tested).

    (I'd love to discuss this in more detail and see what solutions can be found, feel free to send me an email!)
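
    For reference, a minimal (untested) sketch of the virtual-input part described above. The client and endpoint names are made up; the key call is setting kMIDIPropertyAdvanceScheduleTimeMuSec to a non-zero value on the virtual destination, so packets stamped in the future are handed to the read block immediately instead of being scheduled by the MIDI server:

        import CoreMIDI

        var client = MIDIClientRef()
        MIDIClientCreate("MySynth" as CFString, nil, nil, &client)  // made-up name

        // A virtual destination that other apps (or MIDIThru connections)
        // can send into.
        var virtualIn = MIDIEndpointRef()
        MIDIDestinationCreateWithBlock(client, "MySynth In" as CFString, &virtualIn) {
            packetList, _ in
            // Packets arrive here the moment the sender sends them, possibly
            // stamped in the future -- buffer them and act at the right time.
        }

        // Non-zero = deliver future-stamped packets immediately, rather than
        // having the MIDI server schedule ("on time" = too late) delivery.
        MIDIObjectSetIntegerProperty(virtualIn, kMIDIPropertyAdvanceScheduleTimeMuSec, 1)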

  • The user and all related content has been deleted.
  • edited August 2018

    @j_liljedahl said:
    @SevenSystems Getting MIDI events ahead of time is not so easy, as far as I know. CoreMIDI automatically delivers the packets "on time", which is actually the same as "too late" if you want to be prepared to produce audio at the correct time in a coming render cycle. One can, however, configure a virtual MIDI input to receive the packets directly as they are sent.

    This gives two other problems, though. First, the received packets might not arrive in chronological order, so you need to implement your own realtime-safe priority-list scheduler or the like, so you can buffer the events and act upon them at their correct time later. Second, it only works for virtual MIDI inputs, so you can't just connect directly to some other app's virtual MIDI output, as is usually done. The sender must either select your virtual input as the destination, or you use a trick with CoreMIDI "MIDIThru" objects. That is also the only way one can receive events ahead of time from hardware sources, at least in theory (not tested).

    (I'd love to discuss this in more detail and see what solutions can be found, feel free to send me an email!)

    This could be an awesome sharing of knowledge! The best thing about the Audiobus forum is when chats like this happen :smile:

  • edited August 2018

    @tja said:
    Fantastic work, @SlowwFloww
    Many thanks for that!

    Let me ask for clarification:

    Mixdown is a mixdown in Cubasis.
    But what exactly does 'recording' mean?

    And does a track freeze differ from a mixdown?

    @tja Hey thanx !

    'Recording' is just a realtime recording of the iPad's audio output while playing a Cubasis project. It's the audio that I hear when I'm working on my track. Let's say I want to record a bassline over my drum track... the drum track's timing sucks because it sounds as if it's not playing right... the tempo is off. It's like lightly touching a vinyl record while it plays, so it slows down. But in reality there is nothing wrong with the drum track at all... it's just the software that has problems playing back at a constant speed.

    So if you are doing a live performance playing your Cubasis project, the timing of your songs will suck... the same goes for playing MIDI events to external hardware synths.

    So I recorded the headphone/audio output (Zoom u22) of the iPad playing my Cubasis project into Audacity on my MacBook, so I could check the timing visually.

    'Mixdown' is an audio rendering by the software and doesn't suffer from tempo changes... it's just printing the project to an audio file.

    A 'track freeze' is the same but per track: you're rendering the audio to a file so the CPU doesn't have to calculate the audio that a plugin generates in realtime.

  • edited August 2018

    @j_liljedahl said:
    @SevenSystems Getting MIDI events ahead of time is not so easy, as far as I know. CoreMIDI automatically delivers the packets "on time", which is actually the same as "too late" if you want to be prepared to produce audio at the correct time in a coming render cycle. One can, however, configure a virtual MIDI input to receive the packets directly as they are sent.

    This gives two other problems, though. First, the received packets might not arrive in chronological order, so you need to implement your own realtime-safe priority-list scheduler or the like, so you can buffer the events and act upon them at their correct time later. Second, it only works for virtual MIDI inputs, so you can't just connect directly to some other app's virtual MIDI output, as is usually done. The sender must either select your virtual input as the destination, or you use a trick with CoreMIDI "MIDIThru" objects. That is also the only way one can receive events ahead of time from hardware sources, at least in theory (not tested).

    (I'd love to discuss this in more detail and see what solutions can be found, feel free to send me an email!)

    On the virtual input, I assume we're both talking about kMIDIPropertyAdvanceScheduleTimeMuSec being non-zero to get MIDI packets immediately when they're sent.

    About the problems: the order in which you receive events should already be chronological, because the CoreMIDI docs say that senders are supposed to use chronologically increasing timestamps. For example, Xequence sends a new packet list once every 300 milliseconds (and makes sure at least the next 600 milliseconds of MIDI data are included), and it sorts the events by their time in the song prior to inserting them. And of course the order the MIDI packet LISTS themselves get sent in is "automatically" chronological by definition... The only problem I can see is when you receive packets from several SEPARATE sources (apps) on the same destination, but that's rather unlikely?

    And regarding "the other way around": yes that is a problem, but I think the much more "normal" way to connect MIDI apps is that the sender chooses the destination, not that the receiver chooses the source. So I guess must apps (should) already do it the "good" way...

  • @SevenSystems said:
    The only problem I can see is when you receive packets from several SEPARATE sources (apps) on the same destination, but that's rather unlikely?

    In AUM's case, if all apps bombard the AUM virtual MIDI port with data, it can quickly become havoc...

    Say, for example, you use Chord PolyPad (or other 'controller type' apps) to trigger a synth hosted in AUM while also sending data from Xequence or ModStep: all the data ends up at AUM's virtual port. AUM would then have to first analyse the incoming packets and let the user decide which source app to prioritise, etc.

  • @SevenSystems said:

    @j_liljedahl said:
    @SevenSystems Getting MIDI events ahead of time is not so easy, as far as I know. CoreMIDI automatically delivers the packets "on time", which is actually the same as "too late" if you want to be prepared to produce audio at the correct time in a coming render cycle. One can, however, configure a virtual MIDI input to receive the packets directly as they are sent.

    This gives two other problems, though. First, the received packets might not arrive in chronological order, so you need to implement your own realtime-safe priority-list scheduler or the like, so you can buffer the events and act upon them at their correct time later. Second, it only works for virtual MIDI inputs, so you can't just connect directly to some other app's virtual MIDI output, as is usually done. The sender must either select your virtual input as the destination, or you use a trick with CoreMIDI "MIDIThru" objects. That is also the only way one can receive events ahead of time from hardware sources, at least in theory (not tested).

    (I'd love to discuss this in more detail and see what solutions can be found, feel free to send me an email!)

    On the virtual input, I assume we're both talking about kMIDIPropertyAdvanceScheduleTimeMuSec being non-zero to get MIDI packets immediately when they're sent.

    Yes. And it only works for virtual inputs, meaning that any app's virtual output is useless in this regard. Connecting virtual outputs to destinations is quite natural as a user experience, though, for example in AUM's MIDI matrix. (But there is a trick using MIDIThru, though it feels a bit convoluted/hackish. Especially if one needs to have one individual virtual input per connected source.)

    About the problems: the order in which you receive events should already be chronological, because the CoreMIDI docs say that senders are supposed to use chronologically increasing timestamps. For example, Xequence sends a new packet list once every 300 milliseconds (and makes sure at least the next 600 milliseconds of MIDI data are included), and it sorts the events by their time in the song prior to inserting them. And of course the order the MIDI packet LISTS themselves get sent in is "automatically" chronological by definition... The only problem I can see is when you receive packets from several SEPARATE sources (apps) on the same destination, but that's rather unlikely?

    Unfortunately that's not unlikely at all. As @Samu mentioned above, you could have multiple sources connected to AUM's "MIDI Control" input. Or a synth receiving from both Xequence and a hardware MIDI controller for knobs and faders, etc. The CoreMIDI system scheduler takes care of all that, by collecting and sorting events before their time, but this kind of thing needs to be done ourselves if we use kMIDIPropertyAdvanceScheduleTimeMuSec, and it's not trivial to get right regarding realtime safety etc. (perhaps a pool of pre-allocated linked-list nodes that get inserted at the correct positions in a linked list? See the sketch after this post. I wonder how CoreMIDI implements it.)

    And regarding "the other way around": yes that is a problem, but I think the much more "normal" way to connect MIDI apps is that the sender chooses the destination, not that the receiver chooses the source. So I guess must apps (should) already do it the "good" way...

    I agree that's normal for a stand-alone sequencer app like yours. But for a host like AUM it's not, because it can host both the source and destination and provide a UI for connecting them.
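
    Since we're speculating, here is one way the pre-allocated node pool mentioned above could look. This is only a sketch of the idea (array indices instead of pointers, single-threaded for clarity), not AUM's or CoreMIDI's actual implementation; a real version would still need a realtime-safe handoff between the MIDI and audio threads.

        // Pool of pre-allocated linked-list nodes: everything is allocated up
        // front, so schedule/popDue never allocate or free memory.
        struct ScheduledEvent {
            var timestamp: UInt64 = 0                    // host time the event is due
            var data: (UInt8, UInt8, UInt8) = (0, 0, 0)  // up to 3 MIDI bytes
            var length: Int = 0
        }

        final class EventScheduler {
            private var nodes: [ScheduledEvent]
            private var next: [Int]       // index of the following node, -1 = end
            private var head = -1         // earliest pending event
            private var freeHead = 0      // head of the free list

            init(capacity: Int = 1024) {
                nodes = Array(repeating: ScheduledEvent(), count: capacity)
                next = (0..<capacity).map { $0 + 1 < capacity ? $0 + 1 : -1 }
            }

            // Insert sorted by timestamp: an O(n) walk, but no allocation.
            func schedule(_ event: ScheduledEvent) -> Bool {
                guard freeHead != -1 else { return false }   // pool exhausted
                let idx = freeHead
                freeHead = next[idx]
                nodes[idx] = event
                var prev = -1
                var cur = head
                while cur != -1 && nodes[cur].timestamp <= event.timestamp {
                    prev = cur
                    cur = next[cur]
                }
                next[idx] = cur
                if prev == -1 { head = idx } else { next[prev] = idx }
                return true
            }

            // Pop every event due before `deadline` (e.g. the end of the coming
            // render cycle). The caller should pre-reserve `out`'s capacity.
            func popDue(before deadline: UInt64, into out: inout [ScheduledEvent]) {
                while head != -1 && nodes[head].timestamp < deadline {
                    out.append(nodes[head])
                    let idx = head
                    head = next[idx]
                    next[idx] = freeHead
                    freeHead = idx
                }
            }
        }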

  • edited August 2018

    @j_liljedahl ah sorry, of course you're right in that AUM obviously receives from several different MIDI sources at once on its single "AUM destination".

    BUT, as we're talking about that: I always wondered why AUM doesn't create a SEPARATE MIDI destination for each hosted app (if a single app can create multiple destinations, but I guess so)? That way, the 16 channel limitation would be gone, it would all be less confusing (as you could actually NAME the Virtual destination using the name of the hosted app), and this chronological order problem would also be a bit simpler because you would be able to assume that at least while on the same destination, the order would be chronological...

    Regarding how CoreMIDI does this... wasn't CoreMIDI ripped from an open-source project by Apple sometime in the past?
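
    On the "can a single app create multiple destinations" question: yes, one CoreMIDI client can own many virtual endpoints, each with its own name. A trivial sketch, with invented app names:

        import CoreMIDI

        var client = MIDIClientRef()
        MIDIClientCreate("AUM-sketch" as CFString, nil, nil, &client)

        // One named virtual destination per hosted app, so senders can pick
        // the app directly and the 16-channel limit no longer matters.
        var destinations: [String: MIDIEndpointRef] = [:]
        for appName in ["SynthA", "SynthB", "DrumApp"] {   // hypothetical hosted apps
            var endpoint = MIDIEndpointRef()
            let status = MIDIDestinationCreateWithBlock(
                client, "AUM: \(appName)" as CFString, &endpoint) { packetList, _ in
                // forward packets to this particular hosted app only
            }
            if status == noErr { destinations[appName] = endpoint }
        }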

  • Regarding CoreMIDI, my main gripe is how it's implemented in many, many apps.

    It's like this 'always on' thing that cannot be turned off, causing havoc when the app is also connected through IAA.
    (Meaning a CoreMIDI-enabled app always listens to both the 'hardware' and the data coming from an 'IAA-connected host'.) This is one of the reasons why it's next to impossible to use Korg iM1 or iOS SoundCanvas with an external controller and let the 'host' do the MIDI routing, as iM1 & iOS SoundCanvas will listen to both ports with no option to disable one or the other...

    At some point Audiobus implemented a 'Disable CoreMIDI' feature for when the app was connected using Audiobus, but I never got it to work properly...

  • @TheOriginalPaulB said:
    but as a workaround, what if you either freeze the MIDI tracks, or reimport a mixdown of the MIDI tracks and use that as a backing while you work on the song?

    Yeah, that may be a workaround... but every MIDI track that you record has to be frozen this way...
    I'd rather have a sequencer that performs solidly, so I don't need to freeze tracks when my CPU isn't stressed at all...

  • The user and all related content has been deleted.
  • Can any of you check this drum track I made? Do you hear any timing problems? I'm curious if my ears are still alright... it's a simple four-to-the-floor beat... no shuffle whatsoever...

    The zip file contains an mp3 file.

  • @SevenSystems said:
    @j_liljedahl ah sorry, of course you're right in that AUM obviously receives from several different MIDI sources at once on its single "AUM destination".

    BUT, as we're talking about that: I always wondered why AUM doesn't create a SEPARATE MIDI destination for each hosted app (if a single app can create multiple destinations, but I guess so)? That way, the 16 channel limitation would be gone, it would all be less confusing (as you could actually NAME the Virtual destination using the name of the hosted app), and this chronological order problem would also be a bit simpler because you would be able to assume that at least while on the same destination, the order would be chronological...

    Regarding how CoreMIDI does this... wasn't CoreMIDI ripped from an open-source project by Apple sometime in the past?

    I've thought about that as well, but it would be a bit messy in case an IAA app exposes its own virtual input. However, it wouldn't solve the issue - there would still be situations where you want more than one source connected to one synth, for example!

  • @SlowwFloww said:
    Can any of you check this drum track I made? Do you hear any timing problems? I'm curious if my ears are still alright... it's a simple four-to-the-floor beat... no shuffle whatsoever...

    The zip file contains an mp3 file.

    Yes, it has the timing errors (jitter) mentioned in this thread.

  • @Samu said:
    At some point Audiobus implemented a 'Disable CoreMIDI' feature for when the app was connected using Audiobus, but I never got it to work properly...

    In Xequence, you will have separate buttons for each source, including "Xequence Destination" and "Audiobus", so if you want to mess up, you have to do so manually :)

    @SlowwFloww said:
    Can any of you check this drum track I made? Do you hear any timing problems? I'm curious if my ears are still alright... it's a simple four-to-the-floor beat... no shuffle whatsoever...

    The zip file contains an mp3 file.

    Yes, immediately and extremely obvious to me even on iPhone internal speaker.

  • @SevenSystems said:

    @Samu said:
    At some point Audiobus implemented a 'Disable CoreMIDI' feature for when the app was connected using Audiobus, but I never got it to work properly...

    In Xequence, you will have separate buttons for each source, including "Xequence Destination" and "Audiobus", so if you want to mess up, you have to do so manually :)

    @SlowwFloww said:
    Can any of you check this drum track I made? Do you hear any timing problems? I'm curious if my ears are still alright... it's a simple four-to-the-floor beat... no shuffle whatsoever...

    The zip file contains an mp3 file.

    Yes, immediately and extremely obvious to me even on iPhone internal speaker.

    But I don't think you can 'override' CoreMIDI if the app doesn't have any port settings, or?
