Did you mean wanky or wonky? There may be a bit of a difference between the two.
@Sebastian said:
Shut up and take my money ^^
I was talking to giku about it and we both wondered why the fuck you don't do it ^^
Isn't the in-app purchase (multilane) doing fine?
@supadom said:
Pretty sure the OP meant wonky. :-)
Developers that don't properly support MIDI are wankers who sell wonky apps.
While we wait for the iOS MIDI revolution, at least we can use many apps that DO play well together. Look how Sonosaurus does it, to name one developer that seems to know what they are doing.
Their apps discover all the available sources and destinations, and for each you can select whether clock or notes are sent or received. If all developers did this, would we even be having this conversation, or worrying about what Apple has up their sleeves for the OS after the next one?
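The per-endpoint approach described above boils down to a small routing table. A minimal sketch (all names are hypothetical illustrations, not Sonosaurus's actual code):

```python
# Hypothetical sketch of per-endpoint MIDI routing preferences, in the
# spirit of the approach described above: for each discovered destination,
# the user chooses whether clock and/or notes are forwarded to it.

MIDI_CLOCK = 0xF8   # real-time clock tick status byte
NOTE_ON = 0x90      # note-on status byte (channel 1)

class EndpointPrefs:
    def __init__(self, send_clock=False, send_notes=True):
        self.send_clock = send_clock
        self.send_notes = send_notes

def should_send(prefs, status_byte):
    """Decide whether a given message goes to a given destination."""
    if status_byte == MIDI_CLOCK:
        return prefs.send_clock
    if 0x80 <= status_byte <= 0xEF:   # channel voice messages (notes, CCs, ...)
        return prefs.send_notes
    return True                       # pass everything else through

# One prefs entry per discovered destination (names are examples):
destinations = {
    "Sunrizer": EndpointPrefs(send_clock=True, send_notes=True),
    "Animoog":  EndpointPrefs(send_clock=False, send_notes=True),
}
```

With a table like this, filtering each outgoing message is one lookup plus one `should_send` call per destination.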
Presuming there is existing solid MIDI code out there (midibus? @sonosaurus? Funkbox? Loopy?) maybe what we need to crowdfund is:
That's likely not enough but as has been pointed out: there are good implementations out there. This isn't Audiobus 0.1 where the proverbial road had to be created. CoreMIDI already exists, great implementations of it already exist; developers need a great set of best practices, guides and easy to integrate code.
The goal here would be to make including the library a non-decision: if you're planning on implementing MIDI anyway, adopting it should come out as a cost-saving measure.
I think the community has been established within Audiobus. Most of the apps that value Audiobus would value MIDI fixing. I believe the most efficient route might be a series of meetings between Apple's Core MIDI iOS planning division and Audiobus to agree on a plan. While we are not at the top of Apple's radar, iOS music has grown and matured enough to have garnered some of their attention, and your input may be heard.
@Sebastian said:
This is sort where things get annoying. I've got an app out that runs on iOS 6/7, and the Mac, providing BLE MIDI. It works very nicely. I've invested some time in a Windows version (not quite stable yet), and have been tinkering with a hardware BLE adapter (similar to Quicco). I was going to release a free SDK that would let others integrate it into their apps, but then Apple made their announcement. Much of my time and effort, down the tubes.
So -- suppose Michael and Sebastian integrate MIDI into Audiobus, or all developers switch to MIDIbus, or some other clever group cooks up something new and awesome -- it's not impossible that Apple will roll out something that directly competes. The 800-pound gorilla of iOS audio, GarageBand, would likely only support the Apple standard (I'm stunned beyond belief that GB has Audiobus).
The success of any new platform will depend on a critical mass of developers adopting it -- and there's no guarantee that anything other than an official Apple solution will get that kind of consensus.
Further, if there's a separate app in the middle that arranges connections, it actually complicates things for simple routing (the most common case). I normally want to send MIDI from controller app A to sound generator B. 95% of the time, I can just go to app A and point it towards B. If there is a central routing application, I'd need to start A, B, and the routing app, and then go configure the routing.
--
If I were king, I'd decree that we all use the existing CoreMIDI/VirtualMIDI API in the same way. The grass is always greener in the other SDK; but if we all used the existing SDK in a consistent way, the problems would disappear. We decide on which side of the road to drive on, and stick with it.
We would all use Virtual MIDI. CoreMIDI would apply only for hardware interfaces (and that would be a great place for a single-purpose app, that listened to CoreMIDI, and then routed it to the desired Virtual MIDI destinations).
MIDI output apps would allow extensive configuration on where to send MIDI, and synth apps would play any MIDI that came in. That would leave the door open for an app that allows for more complex routing, while keeping the typical case simple. And apps should remember the last MIDI configuration they had, and be able to save and reload configurations.
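Remembering the last configuration is cheap to implement. A minimal sketch, assuming an invented JSON format (real apps would also have to cope with endpoints that no longer exist at reload time):

```python
# Hypothetical sketch: persist a MIDI routing configuration so an app can
# restore it on relaunch. The file format and app names are invented for
# illustration only.
import json
import os
import tempfile

def save_config(path, routes):
    with open(path, "w") as f:
        json.dump({"version": 1, "routes": routes}, f)

def load_config(path):
    with open(path) as f:
        data = json.load(f)
    return data["routes"]

routes = [
    {"source": "SoundPrism", "destination": "Sunrizer", "channel": 1},
    {"source": "SoundPrism", "destination": "Animoog",  "channel": 2},
]
path = os.path.join(tempfile.gettempdir(), "midi_routes.json")
save_config(path, routes)
assert load_config(path) == routes
```

A versioned file like this also makes saving and reloading multiple named configurations straightforward.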
Unfortunately, I'm not king, and no matter what I suggest, there will be at least 30% of the development community that wants to go the other way.
--
Man, I sound really grumpy. I'm actually not that grumpy about things. The iOS infrastructure for audio and MIDI is really pretty amazing. Considering all the moving parts, it's a wonder that it works so well. The functionality that gets packed into devices that cost a few hundred bucks -- astonishing. I was in the garage, and stumbled across my old Fostex X-26 4-track cassette recorder; the music making tools are so much better now, it doesn't seem reasonable to complain.
For what it's worth, I didn't get a grumpy vibe while reading your post, Patrick. I appreciate the perspective.
@SecretBaseDesign said:
this!
things could be so simple
If you have a look at the patches I posted in the patch thread, you'll find configs like two synth apps getting their input from SoundPrism, for example.
If I reload them,
I always have to reconfigure the MIDI routing.
Only saving the audio state of apps doesn't cut it for me.
"I'd decree that we all use the existing CoreMIDI/VirtualMIDI API in the same way. "
That's really all we need. It exists, there is proof that it works well, it's just on developers to implement it.
A new standard similar to Audiobus sounds great and all, but the fact that many apps DO already play well together through MIDI means that what already exists is sufficient. Maybe not optimal, but at least workable.
It is true that iOS MIDI is wanky, truly wanky sometimes. But at this stage of the development of this continuously evolving platform we are indeed very lucky. With the sheer number of apps out there, one has to find the setup that works together. I used to (still do) get ultra frustrated with MIDI, but one just has to count the blessings and get on with the things that work.
I currently have Loopy - Turnado - Effectrix - Samplr set up, and for better or worse, every time I call a preset and turn everything on in the above sequence, everything gets tempo-synced and starts and stops as it should, without the need to reset MIDI settings. I've also been using the MPK Mini for ages now and never had to change a MIDI-learned setting in Sunrizer, Animoog or any other app; it just works. iOS MIDI will eventually get streamlined, but god knows when and how. Yes, crowdfunding may be a great idea, but if Apple implements their own or borrows concepts from a five-figure crowdfunded project, I don't think it is such a great idea. Of course, people hooking up an iPad to their more complex hardware may have different needs from me. I'd love MIDI to just work, but until that happens I have to say that for the moment I'm good and things are definitely looking up. Just give me more RAM.
C'mon Apple!
ACTUALLY DOING SOMETHING
OK, here is how I would like to propose actually doing something:
Any developers with chops who are interested in tackling this primarily as a labor of love... let's talk. I can imagine lots of cheesy ways to try to monetize this, but I think the reality is trying to focus the community on some simple software that would benefit everyone.
Either that, or try to form a standards board as @zymos suggests.
I don't need a consistent MIDI UI - I think that's kind of overdoing it.
no biggy
There are enough apps that support midi just fine.
Just let me save the midi configs I use and I'll be fine.
Set, save and forget
Thanks so much for the insight, you fantastic devs and posters. Because of this info, I actually feel a little more at ease with iOS MIDI; not sure how else to put it... it seems less mysterious to me now. That may not take away the occasional frustration of trying to use it, but the frustration level will likely be reduced. Off I go to find out...
I have a rant stashed away on this subject if anybody is interested. And, apparently, this is a second part to that. Sorry, but it seems I cannot be brief on this subject. It occupies me most every day.
I understand why this dismal state of affairs exists. That is the other rant.
The solution, from a technical point of view, is also clear and stated in several ways here already.
But the understanding of why it should be adopted, not so much. That is the subject of this screed.
I see a lot of nose to the wheel complaints about personal frustrations and workflows that don't work, and all are valid, but are also all just anecdotal and, as Seb discussed, too vague to carry any weight in persuading the dev community as a whole that there is value to be had in doing MIDI "Right".
It is a philosophic, worldview, issue. Nobody has yet articulated that the iOS ecosystem is in fact a system. It has moving parts. It has multiple denizens. It has not only an ecology but also a community. And I don't mean us, humans, here. I mean all those Apps that one's imagination would love to orchestrate together to a grand design but cannot because some of them are unaware they are part of a greater whole.
And while it is true that currently only a small percentage of us have any kind of Grand Design desires, I submit that making such achievable as a matter of course will have two effects: increase the percentage who consider such things (because they will not give up early in frustration), and make everything easier for the rest with simpler aspirations. Hardware users do not have these frustrations.
We have lost sight of the well acknowledged fact that MIDI enabled the rapid growth of the electronic music market. That it did not take long for boxes without MIDI connectors to be no longer tenable. And it also did not take long for the MMA to coalesce to ensure that those connectors talked right to a reference standard. Because all those boxes are actually used as system components. Interacting system components. Rarely the only thing with power applied.
And we are repeating that growth inside the virtual world of iOS. But we are hampered by the lack of an MMA equivalent and good reference designs. I have proved to my satisfaction that the iPad is in fact a full studio in a box, equivalent to my garage studio of the '80s and '90s, and that as such, multiple performers can effectively use a single iPad for all their audio needs, needing only MIDI controllers for their input. I have 4 controller keyboards and all can play at the same time on different synths. But only if the set of apps loaded does MIDI right. We are a solitary community and that has to stop. We have to play together more; then this would all become glaringly obvious.
Well said @Dwarman! So the big question would be: how do we persuade the development community that this will help them? How do we make it worthwhile for them to come together to solve this problem? Ideas that come to mind: proper MIDI implementation has to be extremely easy, well documented, and likely to sell more apps, thus making the developers more money.
I think that part of the incentive has to be that a well designed SDK would benefit devs by not having to expend dev cycles on that aspect of their app, but the fact that Apple could trump anything thrown out there leaves me less than optimistic.
@Audiojunkie said:
I think the answer is:
To create demand, the solution would have to be elegant to use. Elegant enough that people would ask "when are you implementing fooMIDI".
Then, developers would have to be able to easily integrate. If this is a one-time rewrite with evergreen benefits, that would be a plus.
I'm still raising my hand as someone willing to DO something.
MMMA?
Hmm thinking about midi only apps.
Maybe I'm a little slow but what would u do with it?
I can think of 10 to 12 functions I could use,
So I would buy like 2 or 3 apps that cover that and then move on.
That's a pretty short list to be a market.
Someone enlighten me.
Seq, arp, a few note and control modifiers, some input apps. That's basically it.
@lala In Thesys for example, you have gating effects for time and velocity. Since MIDI is ultimately just numbers, you can do any number of manipulations on them with MIDI effects apps to get different results. The sorts of effects you get with analog you could also do in a MIDI version, such as delay, echo, etc.
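Since MIDI really is just numbers, a "MIDI echo" is nothing more than the original note events plus delayed copies at lower velocity. A minimal sketch (times in beats; the function name and event format are invented for illustration):

```python
# Illustrative sketch of a MIDI delay/echo effect: duplicate each note
# event at later times with decaying velocity, then sort by time.
# events: list of (time_in_beats, note_number, velocity) tuples.

def midi_echo(events, delay=0.5, repeats=3, decay=0.5):
    out = list(events)
    for t, note, vel in events:
        v = vel
        for i in range(1, repeats + 1):
            v = int(v * decay)       # each echo is quieter
            if v < 1:
                break                # stop once the echo is inaudible
            out.append((t + i * delay, note, v))
    return sorted(out)

print(midi_echo([(0.0, 60, 100)], delay=0.5, repeats=2))
# → [(0.0, 60, 100), (0.5, 60, 50), (1.0, 60, 25)]
```

The same pattern (map over events, emit transformed copies) covers gating, transposition, velocity curves, and most other "MIDI effect" ideas mentioned above.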
That's kind of what I'm talking about.
U would buy one good app with MIDI modifiers that u can stack on top of each other, and that's it; it's not like synth apps, for example, where they all do different things. No money to be made here (X apps to be made and sold).
If u look into the hardware world there isn't much MIDI-only stuff: some I/Os with filters and mergers, a clock modifier (seqs and arps are pretty rare in the hardware world; they are usually just a little extra in synths), some input hardware. MIDI is just the backbone of things.
And if u want to talk to hardware, that hardware is going crazy pretty quick if u fire too many NRPNs or something.
If I'm guessing right, Audiobus used to be a SysEx trick to send audio, so why not just add MIDI again?
The community is already here.
I can't see that the "let's do MIDI right" group would be a different group from the ppl already around.
My 2 cents
Oh and I am surprised to read that Sebastian is thinking about other platforms.
Windows Mobile? Nobody wants that.
Palm? Dead in the water.
Android? Too fragmented; Google hasn't fixed that. There are still new devices being sold that will never get a current OS. And ppl on Android spend less money on apps compared to iOS; they just want to click Facebook and play their little game. It's just the OS that comes with the device if u walk into the next store and say "sell me a (not so smart) smartphone."
Don't shoot it's just an idea coming from my studio setup where I had trouble on software/hardware sync and did something rather "fancy".
Instead of throwing F8 intervals for recognising the tempo and sync, why not make an OSC, 14-bit MIDI, or SysEx protocol where every app has its own (stable) clock and the master transmits the Time(Clock)Change aka "TC" to the slaves (like a normal CC command)?
That would introduce a problem:
On the master app, when going from 230 bpm to 130 bpm, the slave app will take some milliseconds to catch up with the tempo change, and if both apps play a loop they may be out of sync. In order to avoid that, the master should throw a message to the slave, the slave responds to that message, and the master calculates the latency. Then the master sends the TC with an offset of a couple of bpm: meaning going from 230 to 130, but sending the slave 125 and then 130. Hope it makes sense!
If it's a wired MIDI connection or internal software, then to simplify things there could be a manual offset (given by the user), instead of recalculating on every TC.
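The ping-and-offset idea above can be sketched in a few lines. This is only an illustration of the proposal, not an existing protocol; all names are hypothetical, and it compensates by timestamping the change rather than by sending intermediate bpm values:

```python
# Sketch of the ping-based latency measurement proposed above.
# Times are in milliseconds; names are invented for illustration.

def one_way_latency(ping_sent_at, pong_received_at):
    """Estimate master -> slave latency as half the measured round trip."""
    return (pong_received_at - ping_sent_at) / 2.0

def schedule_tempo_change(new_bpm, send_time, latency):
    """The master stamps the TC message so the slave applies it at the
    moment the master itself actually switches tempo."""
    return {"bpm": new_bpm, "apply_at": send_time + latency}

lat = one_way_latency(1000.0, 1008.0)        # 8 ms round trip -> 4 ms each way
msg = schedule_tempo_change(130, 2000.0, lat)
print(msg)  # → {'bpm': 130, 'apply_at': 2004.0}
```

A manual user-supplied offset, as suggested for wired or internal connections, would simply replace the `one_way_latency` measurement with a constant.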
That's a big problem: then u can't do ritardando and accelerando anymore, it just jumps around in steps?

I'm not trying to shoot you or oppress opinions
@lala said:
No problem
It's just that I don't understand why you can't do ritardando and accelerando. In every app you use steps of 1 or 0.01 bpm...
I don't understand what you are trying to say. Can you rephrase the problem?
Audiobus preset 'This Syncs 2': http://preset.audiob.us/3mM95z8c1kLIi2I
This works, change the tempo in sector, sunrizer will follow
The "problem" would occur if using tempo change midi/osc data instead of F8 midi clock ,and not calculating the latency of master --> slave app...
Just stick with MIDI clock?
We will only create new problems with other timecodes...
I don't think there is any hardware that supports OSC?
I think what happens now is you send a message, let's say clock pings; that message has a timestamp; that message gets to the other app, then the timestamps are reordered and u read the clock pings > u get clock sync.
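That's roughly right: MIDI clock runs at 24 ticks per quarter note, so averaging the tick timestamps gives the tempo. A minimal sketch (timestamps in seconds; in CoreMIDI they would come from the packet's host-time stamps, and the function name here is invented):

```python
# Sketch of deriving tempo from 0xF8 clock ticks. MIDI clock is defined
# as 24 ticks per quarter note, so the average tick interval gives BPM.

TICKS_PER_QUARTER = 24

def bpm_from_ticks(timestamps):
    """Average the tick interval over a window of timestamps -> BPM."""
    if len(timestamps) < 2:
        return None
    span = timestamps[-1] - timestamps[0]
    interval = span / (len(timestamps) - 1)       # mean seconds per tick
    return 60.0 / (interval * TICKS_PER_QUARTER)  # seconds/quarter -> BPM

# 120 BPM: one quarter note = 0.5 s, so a tick arrives every 0.5/24 s.
ticks = [i * 0.5 / 24 for i in range(25)]
print(round(bpm_from_ticks(ticks), 3))  # → 120.0
```

Averaging over a window like this is also what smooths out the jitter that reordering by timestamp is meant to fix.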
Somebody say "stop talking nonsense, lala" if I am wrong.
MIDI works; some developers are just lazy, or have never used MIDI? Or don't have the nerve to check it against what is known to work. And they have beta testers with no clue.