To me (despite the fact that there are other solutions that 'provide the most accurate outgoing clock'), the current state of MIDI sync in my current configuration is as good as broken. This is why I will support anybody who gets their hands dirty trying to sort it out. If the solution is out there, why isn't it in here? Why have a copyright fight while we keep using wonky sync?
So if the master clock app sends "start" and sends midi beat clock ticks (at whatever rate can be received by the slave app), and the slave app is up to snuff, then there should really be a fixed and easily counted number of ticks after "start" is received, and so that is all you need for an accurate timeline. Right?
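That counting logic can be sketched in a few lines, assuming the standard MIDI beat clock rate of 24 pulses per quarter note (the function name and bar/beat layout here are illustrative, not from any particular app):

```python
# Sketch: deriving a musical timeline position purely by counting
# MIDI beat clock ticks received after a Start message.
PPQN = 24  # MIDI beat clock pulses per quarter note (MIDI 1.0 standard)

def ticks_to_position(tick_count, beats_per_bar=4):
    """Convert ticks counted since Start into (bar, beat, tick), zero-based."""
    beat_total, tick = divmod(tick_count, PPQN)
    bar, beat = divmod(beat_total, beats_per_bar)
    return bar, beat, tick

# 96 ticks after Start in 4/4 is exactly one bar
print(ticks_to_position(96))  # → (1, 0, 0)
```

As long as no ticks are dropped or reordered, the receiver's position is exact; jitter only smears *when* each tick arrives, not *how many* have arrived.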
My concern is that if what everyone wants is a standard that all apps conform to, then one SDK solution is preferable to several (which are all bound to be slightly different).
Can you imagine what a mess there would be if there were 5 different MIDI-like protocols?
@ehauri said:
Exactly my original point.
@ehauri @BiancaNeve I think Sonorsaurus' point was that which SDK you use doesn't matter so long as each SDK implements the existing MIDI Standard properly.
@midiSequencer that was actually my poorly stated point—dedicated boxes vs 'non-realtime' OSes like iOS. Though I don't exactly get what 'real time' is in reference to—I took it to mean dedicated.
@syrupcore - by realtime I mean it can process without lag or delay - iOS is far from accurate as a timekeeper.
At 44.1 kHz, sample-accurate timing is +/- 0.023 ms. If the MIDI buffer is 512 samples, the jitter can be as large as ~11.6 ms (or sample-accurate, depending on how well the buffer handles the timing).
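The arithmetic behind those two figures (a minimal sketch of the numbers stated above, nothing more):

```python
# One sample period at 44.1 kHz, and the worst case if MIDI event
# timing is quantized to a 512-sample audio buffer boundary.
SAMPLE_RATE = 44_100   # Hz
BUFFER_SIZE = 512      # samples

sample_period_ms = 1000 / SAMPLE_RATE                 # ~0.023 ms
buffer_jitter_ms = 1000 * BUFFER_SIZE / SAMPLE_RATE   # ~11.6 ms

print(f"sample-accurate: +/- {sample_period_ms:.3f} ms")
print(f"buffer-quantized worst case: {buffer_jitter_ms:.1f} ms")
```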
It seems to me an iOS benchmark is what is needed: a very trim app that publishes all its available ports, and that other apps can connect to in order to test how well they sync; a light and fast audio engine that produces simple pulses which can either be measured on an oscilloscope or recorded into a DAW to measure the resulting jitter.
Something like this kind of test, but applied to iOS.
http://www.eigenzone.org/2012/12/04/midi-jitter
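The analysis half of such a benchmark could be as simple as collecting pulse arrival timestamps and summarizing the interval spread. A sketch (the timestamp list and the 120 BPM example are made up for illustration):

```python
# Hypothetical benchmark analysis: given arrival times (in ms) of
# received clock pulses, report jitter as the spread of the
# inter-pulse intervals around their mean.
from statistics import mean, pstdev

def jitter_stats(timestamps_ms):
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg = mean(intervals)
    return {
        "mean_interval_ms": avg,
        "jitter_stddev_ms": pstdev(intervals),
        "peak_deviation_ms": max(abs(i - avg) for i in intervals),
    }

# e.g. pulses meant to arrive every ~20.833 ms (120 BPM at 24 ppqn)
print(jitter_stats([0.0, 20.9, 41.6, 62.7, 83.3]))
```

Dumping the pulses into a DAW and exporting the transient times would give you exactly such a list to feed in.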
@ehauri great article on midi jitter. I hope the developer community can come up with some solutions for these issues.
Still don't understand most of the stuff being talked about here, but I do know I have 6 different MIDI controller apps on my iPad, and pretty much no common ground in terms of functionality. Some do some things, some do others, and some only do some things with certain apps while not doing those same things with other apps.
If that last sentence confused you, then you know how I feel trying to get midi to work on my iPad.
And if I'm reading this thread correctly, it doesn't even appear to be all that clear to developers just what is going on. Before a skirmish breaks out, let's all hold hands and sing a peaceful song.
Edit: Just don't anyone try to include midi in that song. It probably won't work.
@1P18 - a lot of what you would like to know would be instantly available if app developers simply included a Midi Implementation Chart in the user guide of the app. This chart is a standard thing included in the user manual of any and all midi-enabled hardware, so why not software too? It is up to the developer to include this.
http://www.midi.org/techspecs/midi_chart-v2.pdf
@supadom said:
That's right, supadom :-)
@BiancaNeve said:
Problem is, none of my favorite apps supports it yet.
Well @Crabman, none of them is supporting Michael's solution either.
Time to bang on your dev and leave appropriate feedback via their support email, forum, FB page or (last resort) in a review on the iTunes app store. Make the suggestion to them, either MidiBus or SSE, and see what happens. There are a lot of responsive devs out there who really want to make their product work well.
@ehauri said:
Documentation is not the problem. Some apps don't have things implemented, and some implementations flat out don't work right. I've even contacted some developers about the stuff that doesn't work right, and it has been confirmed that certain apps (not going to name any names) are a problem when it comes to midi.
Quite apart from the lack of transport or clock send/receive features, here is a really good explanation of jitter and a simple test that anyone with a decent DAW can set up to check the jitter in their own workflow. This is for hardware boxes, many of them almost sample-accurate, but direct analogies with desktop and iOS apps abound.
And they have measured the jitter in many hardware devices (e.g. Name & Shame).
http://www.innerclocksystems.com/New ICS Litmus.html
Excuse my naivety, but shouldn't this be baked into the OS? Not that MIDI is a big deal for latency, but I wonder about having to bounce in and out of user space for this kind of functionality.
@ehauri said:
Well, how could they, if it's not even ready yet? While MidiBus has had quite some time now to get at least the most important devs/apps on board. Anyway, I don't care who delivers a stable MIDI clock if at least it's working with my apps.
Well, for hardware it IS baked into the device's firmware - but remember that these devices are tailored specifically for musical sequencing. And note that +/- 5 ms (240 samples @ 48 kHz) is considered bad in the hardware world.
OSX/iOS are multi-use OSes that are (unfortunately) not primarily designed with musicians in mind, as @midiSequencer mentions above.
But if the OS is capable of 1ms reliably (and "reliably" is the key word here), then that is still pretty good even by hardware standards.
'i don't care who delivers a stable Midi clock if at least it's working with my apps'
Hey @Crabman, stop reading my thoughts!
@supadom you might not care, but I have noted a ~20 ms penalty for using Audiobus with Auria rather than IAA with Auria. This may be an implementation issue, but there are good reasons (performance and generality) that this kind of functionality is provided by the OS.
@Deselby said:
I think we'd all love it to be baked into the OS and agree that has the potential for optimal performance.
Making that happen is a whole 'nother issue...
Please don't be defeatist. I am sure that 'the ghost in the machine' that is Apple is paying attention to this forum. Audiobus, and by this I mean the individuals involved, posited and created a vision of what and where the technology for music production needed to be directed, so I am sure that the audio minds at Apple are paying attention to this forum. I think the best evidence of this is the fact that GarageBand supports Audiobus. Kudos. And it was shortly afterwards, if memory serves, that IAA was announced. In the grand scheme of things iOS, the importance of music creation, our grail, is probably pretty low in general ranking. But I am sure that there are individuals at Apple who are as committed as we are to making iOS a music creation platform/environment. It is only by expressing our commitment that they will recognize that we are with them in expressing theirs.
I'm going back to using a regular computer to post here. The whole iPad/Safari experience is just too stochastic ... but I do love the iPad ....
@Deselby I'm not a defeatist, but it might fairly be suggested that you are an optimisticest.
Certainly there are folks at Apple aware of the awesome that is Audiobus (see GarageBand), and aware of our concerns/needs as musicians using pro-ish tools (see Core Audio and Core MIDI, and IAA later). But whether or not those who may be sympathetic to our first-world plight actually have enough boardroom swagger to get resources devoted to moving the OS from 20 ms to 1 ms consistently (as an example), versus increasing battery life or making photos better or (...), remains to be seen. I can't say I blame them; sure, I want a rock-solid audio/MIDI system baked into iOS, but I quite like the battery life and camera on my phone!