AB from a Technical PoV
Hey guys,
I haven't bought AB yet, but I'm itching to pull the trigger (I will eventually), so I can only go by the videos...
Something that has my head spinning is how they're able to put a widget-like thingy on top of the screen while other apps are open. Why hasn't any other app done that already?
And when AB calls other apps, does it do it inside AB (in which case, WOAH!!! how would it get through Apple's App Store guidelines for sandboxing?), or does it call the app via the now-familiar custom URL scheme (http://handleopenurl.com)? And in that case, again, how does it manage to stay "on top" (the widget thingy I mentioned before)?
Thanks for your time
Comments
AB wakes up other apps from within AB. The other devs use the AB SDK to make their apps compatible; it's not like the AB devs are just hijacking apps. Not only does this widget thingy (as you call it) appear on top of the screen in compatible apps, it can also be hidden with a tap.
I'm not an Audiobus developer, so I'm mostly guessing here.
The little widget in all of the apps is part of the SDK, and each app has that widget code within its own binary. There is some inter-app magic going on to let each loaded AB app know which other apps are currently loaded, and each app updates its version of the widget accordingly.
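To make that guess concrete: an SDK that ships inside each app's binary can put its panel on top of everything with completely ordinary UIKit calls, no private API needed. A minimal sketch, with invented names (this is not the actual Audiobus SDK code):

```swift
import UIKit

// Hypothetical sketch: a library compiled into a host app can draw a floating
// "connection panel" simply by adding a view on top of the app's own window.
// All names here are invented for illustration.
final class ConnectionPanel: NSObject {
    private let panelView = UIButton(type: .system)

    // Attach the panel above everything else in the given window.
    func attach(to window: UIWindow) {
        panelView.setTitle("AB", for: .normal)
        panelView.backgroundColor = .darkGray
        panelView.frame = CGRect(x: 0, y: 120, width: 44, height: 88)
        panelView.addTarget(self, action: #selector(toggle), for: .touchUpInside)
        window.addSubview(panelView)   // sits on top of the host app's UI
    }

    // Like the real panel, it can be hidden/shown with a tap.
    @objc private func toggle() {
        UIView.animate(withDuration: 0.25) {
            self.panelView.alpha = self.panelView.alpha == 1 ? 0.3 : 1
        }
    }
}
```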
There is no doubt that it is very innovative and clever, and it works very well too. It's great, for example, that you can start recording into Cubasis from the widget rather than having to switch apps and start recording in Cubasis itself. It makes the whole process much more seamless. Very well thought out.
We convinced every developer to put a little piece of software inside their apps that communicates with the Audiobus app and lets them display the connection panel when they're connected to it.
The same piece of software helps apps talk to each other so they can send and receive audio. Every developer has to adapt their app just a little to use it, which can take anywhere from less than an hour to multiple days or even weeks, depending on what they want to do with it (the most complicated implementation I know of is BeatMaker 2, which receives and sends audio and monitors it live, while also being able to apply its own effects to individual Audiobus streams).
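For a sense of what "adapting the app" might look like on the developer's side, here is a hedged sketch with made-up type and method names; the real Audiobus SDK's API is different, but the shape of the work (declaring ports that send or receive audio) is the same idea:

```swift
import AVFoundation

// Invented-for-illustration stand-in for the embedded SDK; not the real API.
struct AudioPort { let name: String }

final class EmbeddedSDK {
    // An app declares what it can do: send audio (input/instrument),
    // receive it (output), or both plus processing (effect/filter).
    func registerSender(named name: String) -> AudioPort {
        AudioPort(name: name)
    }
    func registerReceiver(named name: String,
                          onAudio: @escaping (AVAudioPCMBuffer) -> Void) {
        // A real SDK would wire `onAudio` up to the inter-app transport here.
    }
}

// A simple app registers a single port. An app like BeatMaker 2 registers
// several, monitors the incoming streams live, and runs its own effects on
// each Audiobus stream individually.
let sdk = EmbeddedSDK()
let mainOut = sdk.registerSender(named: "MySynth: Main Out")
sdk.registerReceiver(named: "MySampler: In") { buffer in
    // Feed `buffer` into the app's own audio engine.
}
```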
We're using URL-schemes to launch apps.
In other words: Magic.
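The URL-scheme part, at least, is stock iOS: the target app declares a scheme under CFBundleURLTypes in its Info.plist, and any other app can launch it. Roughly like this (modern Swift shown; "loopy" is just an example scheme, and since iOS 9 the calling app must list queried schemes under LSApplicationQueriesSchemes):

```swift
import UIKit

// Launch another app by its custom URL scheme (standard public API).
func launch(scheme: String) {
    guard let url = URL(string: "\(scheme)://") else { return }
    if UIApplication.shared.canOpenURL(url) {
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    }
}

launch(scheme: "loopy")   // example; must match the target app's Info.plist
```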
Thanks guys, especially Sebastian, for sharing the magic behind the scenes.
I'm a software dev myself, just not an iOS one, so it bugged me for a while wondering what kind of witchery you guys were doing.
But from a technical standpoint it seems simple enough yet elegant and groundbreaking (enough to get some of the big names involved!).
Unlike things like iCade, where only a few devs got into the craze of using their SDK (that, and of course the fact that most iOS games are designed around a touch-based control paradigm... but OK, I'm going totally off-topic here).
I still have one question left: this connection panel, why aren't other apps using it? Is it part of the iOS 6.x SDK? (I'm guessing probably not, since you guys must have been working on this way before that SDK, if only to get some guinea pigs to use your SDK inside their apps.)
It baffles me a lot, because it reminds me of Notification Center, and it seems weird that Apple accepts such things (being so uptight).
Or maybe you guys are using an in-house/external API that implements such a control, and it's perfectly acceptable for an app to use it? (Just like, say, Auria, to name one, implements its own UI that strays away from the common iOS look.) Is that it?
And thanks so much again for your time.
It's just something that we made. It's not part of the OS or something...
We thought about how to do it best and that's what we came up with.
I think Michael started off with the idea of wrapping audio data in MIDI system-exclusive messages, and the concept evolved from there. Apple allows inter-app communication ("open in", MIDI, etc.), so it's probably just a matter of leveraging the methods that are already permitted.
Just to be sure: Audiobus does not do that and the audio over MIDI sysex idea was quickly discarded.
I know, I did say it evolved.
And I said "just to be sure"...
That "Ohh, Audiobus works via MIDI"-myth needs to end...
Lol. Blame Michael; it came from his blog...
It was true back then, the first idea was to use that. But while MIDI is great, sysex has too many limitations for transporting an actual audio signal.
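The back-of-the-envelope numbers make the point. Classic DIN MIDI runs at 31,250 baud (virtual CoreMIDI ports are faster, but sysex still only carries 7-bit data bytes plus framing overhead), whereas even CD-quality audio needs about 1.4 Mbit/s:

```swift
// Why sysex was discarded as an audio transport, in round numbers.
let sampleRate = 44_100.0          // Hz, CD quality
let bitsPerSample = 16.0
let channels = 2.0                 // stereo
let audioBitrate = sampleRate * bitsPerSample * channels   // 1,411,200 bit/s
let classicMIDIBitrate = 31_250.0                          // DIN MIDI baud rate
print(audioBitrate / classicMIDIBitrate)                   // ~45x, before sysex overhead
```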
I think the AB guys are geniuses: sometimes my recordings have little clicks or lost pieces of audio, and Audiobus restores them automatically... it's incredible :-)
Cheers guys. I'm not an iOS software developer specifically, but I work on other Unices, and my wild guess is you're just using a damn named pipe in /tmp or something:
% cat guitarism.wav | LiveFX | Loopy
Yep... something like that...
Lol
Hehehehe
OK guys, now I understand it more "properly", if you will.
Thanks a lot