Comments
Wait. Is the connection like iDAM, where the USB host is the Mac and the iPad is connected as a USB guest, either directly or through a hub? If so, how can an audio interface, which also needs to attach to the iPad as a USB device, still be connected?
Can anyone verify that an audio interface can still be connected to the iPad while the iPad is connected to the Mac through this app? If it can, then this offers another significant advantage over iDAM.
Thank you very much for having some interaction here, I appreciate it.
This kind of transparency gives me confidence in your professionalism (even though the product itself is quite indicative of that).
I’ll get it on my next paycheck!
That would address most of the issue, but I think that like video apps, music apps often involve a scenario in which you're using the app on screen but not interacting with it all the time. You might be watching some meters change value, or the little balls fly around Animoog, etc., and you reach for the screen, which just so happens to lock, and you're screwed — even if the sound isn't interrupted, you've lost the moment.
So the screen lock override does make sense, I think. I do see the problem with completely idle apps being left on forever after the user has forgotten they're still open — I do this all the time.
I think he's asking if an iPad can receive multichannel audio via an interface (into a host running Duo) and simultaneously send all those channels to a Mac with SideRack. Is that a yes?
This is a big part of why I'm surprised this ability hasn't been exploited more.
Yeah, you can buy hardware that LIMITS your number of instances to 4 (or 8, it doesn't matter), but you can already have a DIRECT connection WITHOUT a limit on instances; it just depends on how much your iPad can handle. There's no need for a middleman.
Of course, I understand the trickiness and fragility (thanks to Apple constantly changing the rules) of getting this working, but if a couple of guys have been able to make this work in a couple of separate efforts, how is it that some of the bigger names haven't jumped on this? I suppose it's just too much of a pita... But the reward!!!
I bought it after testing out the demo - and will make time this weekend to play with it.
I have bought so many things for the iPad and now mostly use my Mac so I figured this is a great way to start using all my iOS plugins again.
It's not cheap but - as a percentage of what I've spent on those plugins that don't get used any more - it's a tiny amount.
I can confirm you are able to have audio outputting through the iPad while the plugin simultaneously outputs through a separate audio interface, which effectively gives you separate audio channels:
plugin channel
iPad speaker channel
which is very interesting!
So the routing is as follows:
iPad audio output (Remixlive) from the iPad speakers
iPad receiving charge ---> Duo (server) ---> USB hub ---> M1 Mac ---> Reaper ---> SideRack plugin (Primer2 synth) ---> audio interface
Again, the two audio outputs are independent, at least that's what I'm seeing.
(no iDAM involved)
Thanks for checking. However, if I understand the setup correctly, that isn't what I was asking. I was asking whether the audio interface can be attached to the iPad for input, not to the Mac. Currently, with iDAM, there isn't any way that I know of to have an audio interface inputting to the iPad.
With iDAM enabled you can also have separation of channels:
iPad receiving charge ---> Remixlive ---> Lightning ---> USB hub (providing charge) ---> iDAM enabled ---> Reaper ---> output to aggregated iPad outputs 1+2 ---> audio interface 1+2
iPad with iDAM enabled & Duo server ---> USB hub providing charge ---> iDAM enabled ---> Reaper ---> FX plugin SideRack (Primer synth) ---> BlackHole channels ---> audio interface 1+2
effectively allowing Remixlive to be on a separate channel from the Primer synth.
But still, with the audio interface connected to the Mac, not to the iPad, correct?
Sorry, I'm really trying to get a handle on this. I prefer playing guitar into the iPad to take advantage of the lower cost FX plugins, and can't do that with iDAM. Round trip from interface > Mac > iPad FX > Mac would introduce too much latency. Another forum member has been asking about a solution for this as well.
I see what you mean.
What I did try:
iPad ---> USB camera adapter ---> USB hub ---> audio interface (and then added)
split from USB hub ---> USB-to-USB cable ---> M1 Mac ---> Reaper ---> FX SideRack
This did not work: when opening Duo on the iPad, it was not able to connect,
but the audio interface + iPad would operate as normal.
Thanks so much for testing. That's the info I was after. I hope maybe @satoshiszk will be able to give the official word. An earlier comment seemed to indicate it was possible, but I doubt it can work based on my understanding of USB host vs. client connections with the iPad.
What could work is hooking the KQ Voice loopback plugin into AUM and Duo, basically using the technique to get IAA apps working in hosts that don't support them. This way you may get sound from your interface into a Mac DAW and still be able to process it with AUv3 plugins inserted after it in the chain/track.
In actual fact I carried out a test, which would effectively be the same as a guitar connected via the inputs of the audio interface.
The audio signal chain I have set up in Reaper is as follows:
input FX hosting a Maschine project (which comes before the mic inputs of the audio interface) ---> FX plugin (SideRack audio effect) ---> iPad ---> Duo plugin processing a delay plugin ---> audio return back into Reaper with the processed audio ---> audio interface
So that's effectively working... YES. How much latency is incurred I cannot say.
So based on where the inputs are situated in the audio signal chain, it effectively carries out what you are asking.
(iDAM connected)
Here's a little tip for anyone wanting to increase the demo time for the purposes of testing with reduced noise. The caveat is that the audio loses its high-end frequencies, but as it's mainly used for testing, you can still get a gauge on the application and carry out your tests.
If you add a denoiser that is able to record a profile of the white noise, that profile is then phase-inverted to reduce a big portion of the noise. Again, this produces artifacts and loses high-end frequencies, but for testing purposes it should be fine.
Here's an example of a JS plugin which does just that (downloaded inside ReaPack if using Reaper).
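For the curious, here is a rough offline sketch of the noise-profile idea in Python/NumPy. It is not the ReaPack JSFX mentioned above, and it uses spectral subtraction of a recorded noise profile rather than literal phase inversion; the file names and parameters are placeholders, just to show the general approach.

```python
# Minimal offline sketch of noise-profile denoising (spectral subtraction).
# Assumes mono float audio and that numpy/scipy are installed.
import numpy as np
from scipy.io import wavfile

def denoise(audio, noise_profile, frame=2048, hop=512, reduction=1.0):
    """Subtract the average noise spectrum (learned from a noise-only
    recording) from each STFT frame of `audio`. As noted above, this
    tends to dull the high end and can leave artifacts."""
    window = np.hanning(frame)
    # Learn the average magnitude spectrum of the noise-only recording.
    noise_mag = np.mean(
        [np.abs(np.fft.rfft(window * noise_profile[i:i + frame]))
         for i in range(0, len(noise_profile) - frame, hop)], axis=0)

    out = np.zeros(len(audio))
    norm = np.zeros(len(audio))
    for i in range(0, len(audio) - frame, hop):
        spec = np.fft.rfft(window * audio[i:i + frame])
        # Clamp the subtracted magnitude at zero, keep the original phase.
        mag = np.maximum(np.abs(spec) - reduction * noise_mag, 0.0)
        cleaned = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), n=frame)
        out[i:i + frame] += window * cleaned
        norm[i:i + frame] += window ** 2
    return out / np.maximum(norm, 1e-8)

if __name__ == "__main__":
    # Hypothetical file names, just for illustration.
    sr, noisy = wavfile.read("demo_take.wav")
    _, noise = wavfile.read("noise_only.wav")
    cleaned = denoise(noisy.astype(np.float64), noise.astype(np.float64))
    wavfile.write("demo_take_denoised.wav", sr, cleaned.astype(np.float32))
```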
That makes sense. I've added your request to our roadmap. Thanks for your feedback and detailed explanation.
https://forum.audiob.us/discussion/comment/1307142/#Comment_1307142
I apologize, as a software developer, I hadn't thoroughly considered the hardware limitations at that time; that was my oversight. While USB may have some limitations, other audio devices like built-in speakers or Bluetooth headphones should still be accessible when our app is running. In any case, our app does not interact with audio devices.
Does anybody know what gets installed with the desktop part? I mean besides the AU components.
Edit: ...and the VST plugins.
There's a SideRack folder in your user library/application support which contains a log file and a settings file but apart from that it's only the actual plugin files.
Setting aside instruments, synths, sequencers and such ... I have a lot of effects in iOS that I've picked up cheaply that are a LOT more expensive on desktop from vendors like FabFilter, Eventide, Klevgrand, Nembrini ... and my expectation is these are the same DSP inside mobile & desktop. I've long hoped for a way to use the iOS versions in desktop productions.
Siderack will allow me to send audio from a desktop DAW to an effects chain on iOS correct? What about instruments that originate on iOS? Can they send signal to the DAW?
Can I process DAW audio with an IOS effects chain and then round-trip back to the DAW using Siderack? Is it a one-way sink from plugin to server or a two-way channel between the two components? Or would it be necessary to bounce stems in IOS and import files back into the DAW?
And just so I'm clear, this is not over network but over USB? Will it work with a powered USB hub in the middle?
The testing I've carried out of the plugin does allow for DAW to iOS to DAW,
but you need to be realistic in your expectations in terms of latency; that is the simple reality of sending, processing and returning audio.
Look back on some of the tests I have carried out;
if you can follow the routing you should be able to gauge what's possible.
But just understand the nature of latency and processing and be realistic in your expectations with your current setup. In other words, don't expect it to reduce latency.
Thank you @triple7 ... high latency might be acceptable to me, but I guess it depends how high it is in practice. I'll look back over your notes in this thread.
We can get acceptable latency by playing with the sample rate.
For example, if I set a sample rate of 176400 Hz with a 2048-sample buffer on my audio interface (Fireface UCX), via Gig Performer plus a SideRack audio effect, I get a latency of 23.2 ms, which allows me to play an instrument without too much of a problem.
In my case, I use the Roxsyn application on the iPad with a guitar connected to my Fireface UCX.
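For anyone curious where that 23.2 ms figure comes from, here is the back-of-the-envelope buffer math as a quick Python sketch. It assumes the reported round trip is roughly two buffer lengths at the stated settings, which happens to line up with the measurement.

```python
# Rough buffer-latency math for 2048 samples at 176.4 kHz.
buffer_samples = 2048
sample_rate = 176_400  # Hz

one_buffer_ms = buffer_samples / sample_rate * 1000
print(f"one buffer : {one_buffer_ms:.1f} ms")       # ~11.6 ms
print(f"round trip : {2 * one_buffer_ms:.1f} ms")   # ~23.2 ms, matching the reported figure
```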
After multiple attempts, I have two questions:
Is there a way to prevent the iPhone from going to sleep when the "Duo" client is active?
Is there a server provided to transmit/receive both audio and MIDI information in the same plug-in?
The solution at the bottom of this thread looks promising: https://apple.stackexchange.com/questions/373189/shortcuts-app-automation-to-prevent-iphone-screen-from-going-to-sleep
The solution proposed there is to set Auto-Lock to Never under the screen brightness settings, and then make a shortcut with a condition that uses Low Power Mode when certain apps or programs are not open.
I'd try using a Focus mode (e.g. a music-making one) that includes all audio host apps (including Duo) via the schedule option, and then have the shortcut turn off Low Power Mode when an active Focus is detected.
Low Power Mode with auto-lock would be the default, and this then gets overridden by active Focus modes, which would trigger the shortcut to disable Low Power Mode (letting the Auto-Lock: Never setting in the screen brightness settings take effect).
--
Edit: Focus modes have the option to add filters where you can turn low power mode off.
I've tried it and it works on the iPad.
Set Auto-Lock under Settings > Display & Brightness to Never.
Set Low Power Mode as the default under Settings > Battery.
Create a Focus mode, add all the standalone music apps to it, and add a filter to turn Low Power Mode off when the Focus mode is active.
Now the Focus mode will kick in when those apps are in full screen, Low Power Mode is automatically turned off, and the screen will never lock.
This is great as I’ve still been sleeping on the Focus modes and this seems like the first seriously useful application of it for me, so thanks.
Though I still have to add: using an iPhone 12 mini in default mode, the FX round-trip latency is still acceptable imo. Only if I were using it in a live performance context would I be concerned about the latency. With all the mention of it here, I want to say that it's impressive and probably not worth being scared off by before hearing it for yourselves.
Thank you very much for this tip which perfectly meets my expectations!
You're welcome. I've also started to use Focus modes now. Without a problem like this to solve, I might never have really looked into them.
Update is out for app and desktop, will take it for a spin later.