Tasty Pixel Midi Clock Sync

Comments

  • To me (despite the fact that there's another solution that 'provides the most accurate outgoing clock'), the current state of MIDI sync in my current configuration is as good as broken. This is why I will support anybody who gets their hands dirty trying to sort it out. If the solution is out there, why isn't it in here? Why don't we have a copyright fight while we keep using wonky sync?

  • edited January 2015

    The midiBus library does not currently support MIDI Time Code (full-frame and quarter-frame messages, with the full frame sent over sysex), just MIDI beat clock (start/stop/continue/tick - but this is accurate).

    What's the difference? MTC supplies the frame position in the message, so it is bulky and really only as accurate as the sending system. If that system is iOS then, as Michael says, iOS/OS X are not realtime systems (I can't get anything better than 1ms sometimes, and I have spent weeks looking for solutions).

    With beat clock there is no positioning - it's all relative - you just count the ticks from the start.

    If this solution is just beat clock, well, yes, we already have this in midiBus - just not on input - but the fix for that is easy (process MIDI clock ticks fast, don't hold up the thread, and display the BPM). I do this in midiSequencer for slave mode.

    So if the master clock app sends "start" and then sends MIDI beat clock ticks (at whatever rate the slave app can receive), and the slave app is up to snuff, then there should be a fixed and easily counted number of ticks after "start" is received, and that is all you need for an accurate timeline. Right?
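
    By way of illustration, here is a minimal sketch of that slave-side tick counting. The names are made up for this example (they are not from the midiBus SDK), and the CoreMIDI plumbing that actually delivers the status bytes is omitted - the point is only that position and tempo both fall out of counting 0xF8 ticks at 24 per quarter note:

    ```swift
    import Foundation

    // Minimal beat-clock slave sketch: count ticks after Start and derive
    // position and tempo from the count and the inter-tick interval.
    struct BeatClockCounter {
        private(set) var running = false
        private(set) var tickCount = 0        // ticks received since the last Start
        private(set) var bpm: Double = 0
        private var lastTickTime: Double?     // arrival time of the previous tick, in seconds

        // MIDI beat clock runs at 24 ticks per quarter note.
        var beats: Double { Double(tickCount) / 24.0 }

        // Feed each realtime status byte together with its arrival timestamp.
        mutating func handle(statusByte: UInt8, at time: Double) {
            switch statusByte {
            case 0xFA:                        // Start: reset position and begin counting
                running = true
                tickCount = 0
                lastTickTime = nil
            case 0xFB:                        // Continue: resume without resetting the count
                running = true
            case 0xFC:                        // Stop
                running = false
            case 0xF8 where running:          // Timing clock tick
                tickCount += 1
                if let previous = lastTickTime, time > previous {
                    // One tick is 1/24 of a quarter note, so BPM = 60 / (24 * interval).
                    bpm = 60.0 / (24.0 * (time - previous))
                }
                lastTickTime = time
            default:
                break
            }
        }
    }
    ```

    In practice you would smooth the tempo over a window of ticks rather than a single interval, since every tick carries the jitter discussed further down this thread.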

  • My concern is that if what everyone wants is a standard that all apps should conform to, then one SDK solution is preferable to a number of them (which are all bound to be slightly different).

    Can you imagine what a mess there would be if there were 5 different MIDI-like protocols?

  • @ehauri said:

    My concern is that if what everyone wants is a standard that all apps should conform to, then one SDK solution is preferable to a number of them (which are all bound to be slightly different).

    Can you imagine what a mess there would be if there were 5 different MIDI-like protocols?

    Exactly my original point.

  • edited January 2015

    @ehauri @BiancaNeve I think Sonorsaurus' point was that which SDK you use doesn't matter so long as each SDK implements the existing MIDI Standard properly.

    @midiSequencer that was actually my poorly stated point—dedicated boxes vs 'non-realtime' OSes like iOS. Though I don't exactly get what 'real time' is in reference to—I took it to mean dedicated hardware.

  • @syrupcore - by realtime I mean it can process without lag or delay - iOS is far from accurate as a timekeeper.

  • At 44.1kHz, sample-accurate timing means +/- 0.023ms. If the audio buffer is 512 samples then the jitter can be as large as ~11.6ms (or sample-accurate, depending on how well timing is handled within the buffer) - the arithmetic is sketched at the end of this post.

    It seems to me that what is needed is an iOS benchmark: a very trim app that publishes all its available ports and that other apps can connect to in order to test how well they sync; a light & fast audio engine that produces simple pulses which can either be measured on an oscilloscope or dumped into a DAW to measure the resulting jitter.

    Something like this kind of test, but applied to iOS.
    http://www.eigenzone.org/2012/12/04/midi-jitter
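
    For reference, the arithmetic behind those figures is nothing more than sample rate and buffer size. A quick sketch, assuming 44.1kHz and a 512-frame I/O buffer (the values above):

    ```swift
    let sampleRate = 44_100.0        // Hz
    let bufferFrames = 512.0         // audio I/O buffer length, in frames

    // Best case: events are timestamped to the individual sample.
    let samplePeriodMs = 1_000.0 / sampleRate                   // ~0.023 ms

    // Worst case: events are only serviced once per audio buffer.
    let bufferPeriodMs = bufferFrames * 1_000.0 / sampleRate    // ~11.6 ms of possible jitter

    print("sample period \(samplePeriodMs) ms, worst-case buffer jitter \(bufferPeriodMs) ms")
    ```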

  • @ehauri great article on midi jitter. I hope the developer community can come up with some solutions for these issues.

  • edited January 2015

    Still don't understand most of the stuff being talked about here, but I do know I have 6 different MIDI controller apps on my iPad, and pretty much no common ground in terms of functionality. Some do some things, some do others, and some only do some things with certain apps while not doing those same things with other apps.

    If that last sentence confused you, then you know how I feel trying to get midi to work on my iPad.

    And if I'm reading this thread correctly, it doesn't even appear to be all that clear to developers just what is going on. Before a skirmish breaks out, let's all hold hands and sing a peaceful song.

    Edit: Just don't anyone try to include midi in that song. It probably won't work. ;)

  • @1P18 - a lot of what you would like to know would be instantly available if app developers simply included a Midi Implementation Chart in the user guide of the app. This chart is a standard thing included in the user manual of any and all midi-enabled hardware, so why not software too? It is up to the developer to include this.

    http://www.midi.org/techspecs/midi_chart-v2.pdf

  • @supadom said:

    To me (despite the fact that there's another solution that 'provides the most accurate outgoing clock'), the current state of MIDI sync in my current configuration is as good as broken. This is why I will support anybody who gets their hands dirty trying to sort it out. If the solution is out there, why isn't it in here? Why don't we have a copyright fight while we keep using wonky sync?

    That's right, supadom :-)

  • edited January 2015

    @BiancaNeve said:

    Audionics just pasted this over at Discchord

    "This video" being the one where Michael announces his sync engine.

    "What is described in this video is pretty well what the MidiBus library does now. It’s been available for over a year and around 130 developers have requested access. There is a growing number of existing apps using the library (just over 40) and more are in the works. Details at audeonic midibus. It provides the most accurate outgoing clock (this is verified) as well as setting things up MIDI-wise to make an app ‘behave nicely’ (including OMAC MIDI controlled app switching) in the iOS MIDI world out of the box with minimal effort required by the developer. It is freely available (even to Michael Tyson) for both iOS and Mac OS X. It’s actively being worked on (MTC generation is next) and diligently supported. There is also documentation on how to set up a ‘model’ app to handle incoming sync that is not tied to any particular audio engine. A number of apps are using this model and it is working well for them.

    The MidiBus utility app already measures error and latency on incoming clock signals and that has also been available for over a year.

    Tackling the MIDI ‘discrepancies’ under iOS is something I have been active in for quite some time now based on my experience with MIDI interconnectivity apps. John Walden’s Music App Blog (see Music App Blog) has a very comprehensive article on this subject for anyone interested.

    I would go so far as to say that I feel it is extremely disingenuous of Michael to claim that ‘there is nothing around’ in the video. It’s been done and is already here."

    Problem is, none of my favorite apps supports it yet.

  • Well @Crabman, none of them is supporting Michael's solution either.

    Time to bang on your dev and leave appropriate feedback via their support email, forum, FB page or (last resort) in a review on the iTunes app store. Make the suggestion to them, either MidiBus or SSE, and see what happens. There are a lot of responsive devs out there who really want to make their product work well.

  • @ehauri said:

    @1P18 - a lot of what you would like to know would be instantly available if app developers simply included a Midi Implementation Chart in the user guide of the app. This chart is a standard thing included in the user manual of any and all midi-enabled hardware, so why not software too? It is up to the developer to include this.

    http://www.midi.org/techspecs/midi_chart-v2.pdf

    Documentation is not the problem. Some apps don't have things implemented, and some implementations flat out don't work right. I've even contacted some developers about the stuff that doesn't work right, and it has been confirmed that certain apps (not going to name any names) are a problem when it comes to midi.

  • Quite apart from the lack of transport or clock send/receive features, here is a really good explanation of jitter, plus a simple test that anyone with a decent DAW can set up to check the jitter in their own workflow (a rough sketch of the measurement is at the end of this post). This is for hardware boxes, many of them almost sample-accurate, but direct analogies with desktop and iOS apps abound.

    And they have measured the jitter in many hardware devices (e.g. Name & Shame).

    http://www.innerclocksystems.com/New ICS Litmus.html
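
    For anyone who wants to try the DAW version of that test, the arithmetic on the recorded pulses is simple: export the pulse onset times, compare each inter-pulse interval to the nominal clock interval, and look at the worst deviation. A rough sketch (the onset values and tempo below are hypothetical, not taken from the linked article):

    ```swift
    import Foundation

    // Onset times of the recorded pulses, in seconds (hypothetical example data).
    let onsets: [Double] = [0.0000, 0.5003, 0.9996, 1.5011, 1.9998]
    let bpm = 120.0
    let nominalInterval = 60.0 / bpm     // expected spacing between quarter-note pulses

    // Deviation of each inter-onset interval from the nominal interval, in milliseconds.
    let deviationsMs: [Double] = (1 ..< onsets.count).map { i in
        ((onsets[i] - onsets[i - 1]) - nominalInterval) * 1_000.0
    }

    let worstJitterMs = deviationsMs.map(abs).max() ?? 0
    print("worst-case jitter: \(worstJitterMs) ms")
    ```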

  • Excuse my naivety, but shouldn't this be baked into the OS? Not that MIDI is a big deal for latency, but I wonder about having to bounce in and out of user space for this kind of functionality.

  • @ehauri said:

    Well @Crabman, none of them is supporting Michael's solution either.

    Well, how could they if it's not even ready yet? MidiBus, on the other hand, has had quite some time now to get at least the most important devs/apps on board. Anyway, I don't care who delivers a stable MIDI clock as long as it works with my apps.

  • Well, for hardware it IS baked into the device's firmware - but remember that these devices are tailored specifically for musical sequencing. And note that +/- 5ms (240 samples @ 48kHz) is considered bad in the hardware world.

    OS X/iOS are multi-purpose OSes that are (unfortunately) not primarily designed with musicians in mind, as @midiSequencer mentions above.

    But if the OS is capable of 1ms reliably (and "reliably" is the key word here), then that is still pretty good even by hardware standards.

  • 'I don't care who delivers a stable MIDI clock as long as it works with my apps'

    Hey @Crabman, stop reading my thoughts!

  • @supadom you might not care, but I have noted a ~20ms penalty for using Audiobus and Auria rather than IAA and Auria. This may be an implementation issue, but there are good reasons (performance and generality) for specific functionality to be provided by the OS.

  • @Deselby said:

    @supadom you might not care, but I have noted a ~20ms penalty for using Audiobus and Auria rather than IAA and Auria. This may be an implementation issue, but there are good reasons (performance and generality) for specific functionality to be provided by the OS.

    I think we'd all love it to be baked into the OS and agree that has the potential for optimal performance.

    Making that happen is a whole 'nother issue...

  • Please don't be defeatist. I am sure that 'the ghost in the machine' that is Apple is paying attention to this forum. By virtue of the fact that Audiobus, and by this I mean the individuals involved, posited and created a vision of what and where the technology for music production needed to be directed, I am sure that the audio minds at Apple are paying attention. I think the best evidence of this is the fact that GarageBand supports Audiobus. Kudos. And it was shortly afterwards, if memory serves, that IAA was announced. In the grand scheme of things iOS, the importance of music creation, our grail, is probably pretty low in the general ranking. But I am sure that there are individuals at Apple who are just as committed to making iOS a music creation platform/environment. It is only by expressing our commitment that they will recognize that we are with them in expressing theirs.

  • I'm going back to using a regular computer to post here. The whole iPad/Safari experience is just too stochastic ... but I do love the iPad ...

  • edited January 2015

    @Deselby I'm not a defeatist but it might fairly be suggested that you are an optimisticest. :)

    Certainly there are folks at Apple aware of the awesome that is Audiobus (see GarageBand) and aware of our concerns/needs as musicians using pro-ish tools (see Core Audio and Core MIDI, and later IAA). But whether those who may be sympathetic to our first-world plight actually have enough boardroom swagger to get resources devoted to moving the OS from 20ms to 1ms consistently (as an example), versus increasing battery life or making photos better or (...), remains to be seen. I can't say I blame them; sure, I want a rock-solid audio/MIDI system baked into iOS, but I quite like the battery life and camera on my phone!
