According to the devs there is a bundle coming, but as always Apple takes ages with these
MIDI clock events are simply MIDI messages, so I see no reason why you could not just roll your own system to send MIDI clock messages based on system time. Or, conversely, to process MIDI clock messages that are received by TouchOSC. The system for doing this would be built in Lua, which is known for being a very fast and efficient language (it's often embedded in games to allow user scripting, where speed is essential), and would presumably be reusable, so once someone creates a MIDI clock bolt-on they could share it with other TouchOSC users. And users could collaborate to develop and improve the clock-message functionality. At least, this approach is typically how issues like this are solved in general-purpose languages like Lua. It would be preferable for the capability to be added by the TouchOSC developer, but it should be possible for TouchOSC users to create MIDI clock functionality (and similar, fairly "low-level" things) even without that.
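To make the idea concrete, a roll-your-own clock sender might look something like the following. This is only a sketch: it assumes TouchOSC's documented `update()` callback, `getMillis()` and `sendMIDI()` functions behave as the scripting manual describes, and that MIDI clock runs at the standard 24 pulses per quarter note.

```lua
-- Sketch only: update(), getMillis() and sendMIDI() are assumed from
-- TouchOSC's Lua scripting API; verify the names against the manual.

BPM = 120
CLOCK = 0xF8               -- MIDI real-time Timing Clock status byte

-- MIDI clock is 24 pulses per quarter note, so the interval
-- between clock messages is 60000 / (bpm * 24) milliseconds.
function clockIntervalMs(bpm)
  return 60000 / (bpm * 24)
end

local nextTick = nil

-- Called by TouchOSC on every update cycle (not invoked here).
function update()
  local now = getMillis()
  if nextTick == nil then nextTick = now end
  -- Catch up if update() ran late, so the average tempo stays correct
  -- even though individual ticks may jitter.
  while now >= nextTick do
    sendMIDI({ CLOCK })
    nextTick = nextTick + clockIntervalMs(BPM)
  end
end
```

At 120 BPM this works out to about 20.8 ms between clocks, which matches the 20-21 ms intervals mentioned later in the thread. The accuracy would be bounded by how often `update()` actually fires.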
Can someone please clarify this for me? I'm trying to set this up so I can run touchOSC from my iMac and my iPad. I went through the tutorials and am confused about what exactly is happening. So, do MIDI and OSC do the same stuff? Or are they actually capable of doing different stuff? As long as my device is connected to the bridge, is there any reason to use one over the other? I can connect through USB from my iPad but if I connect over Wifi, the address is different from the Wifi address on my iMac. It gets all confusing cause I'm not sure if I'm using OSC or MIDI when I have both enabled. Looks like both OSC and MIDI are being sent at the same time in Protokol. Can someone clarify how best to think about all this?
I would disable OSC unless you have a specific use for it. MIDI and OSC are two different systems that do basically the same thing, but differently. You need only one (at least if all your gear supports that system). I expect you're used to using MIDI, which is what you should continue to use. Most synth-related software and hardware isn't even capable of communicating via OSC.
Thanks. This really helps...
I wish more apps used OSC as well as midi.
It's really good for complex apps because it's address-based, so everything goes by names and you don't run out of MIDI numbers and channels, etc. That makes it useful in things like TouchDesigner and Resolume, where you can have vast numbers of controls and layers and it's very easy to get lost with MIDI mapping. OSC has much better resolution and speed as well.
OSC is a bit of an initial pain to set up, but then it's much less confusing overall.
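The address-based point can be illustrated with a small comparison. This sketch assumes TouchOSC's `sendMIDI()` and `sendOSC()` functions, and the Resolume-style OSC path is made up for the example:

```lua
-- Hypothetical helper contrasting the two protocols; the OSC address
-- is an invented Resolume-style path, and sendMIDI()/sendOSC() are
-- assumed from TouchOSC's scripting API. Nothing here runs by itself.

-- Quantize a 0.0-1.0 value to the 0-127 range MIDI CCs allow.
function midiValue(x)
  return math.floor(x * 127 + 0.5)
end

function setLayerOpacity(opacity)
  -- MIDI: you must remember that channel 1 / CC 74 means "opacity",
  -- and the value is squeezed into 7 bits.
  sendMIDI({ 0xB0, 74, midiValue(opacity) })
  -- OSC: the destination is a human-readable address and the value
  -- travels as a full-resolution float.
  sendOSC('/composition/layers/3/video/opacity', opacity)
end
```

The mapping problem the post describes is visible here: the MIDI version carries no hint of what CC 74 controls, while the OSC address documents itself.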
Very interesting reading regarding MIDI vs OSC.
What ever happened to MIDI 2.0?
Are there any apps on the App Store right now (other than this control app) that accept OSC messages?
It suits even just for silly design mockups
I guess it would be quite hard to implement it in any tracker kind of way, but it was fun.
https://www.reddit.com/r/lua/
The just-released miRack update includes OSC compatibility.
Mozaic has a timer that is "sample-accurate in milliseconds" and runs independently from host tempo. TouchOSC does not have anything like that.
You can kinda simulate a timer if you add your code to a callback function like `update()`, which is called every time a control is about to be updated, after it finishes processing user input and received messages. But it's not accurate and does not seem to have high enough resolution.

I actually found a way to make it process MIDI clock messages, and I'm getting mixed results. I'm using Ableton Live as a MIDI clock source, with the tempo set to 120 BPM. The MIDI monitor utility shows MIDI clock packets arriving every 20-21 ms, but in TouchOSC I get intervals from 15 up to 40 ms. So it's not very accurate, but probably good enough for most cases. Well, it depends on what you are trying to build. I think it looks promising.
Please note that I was doing my tests on a fast Mac using an empty project (no controls). No idea how it will behave on the iPad, or if the project contains a lot of controls.
Let me know if you wanna see my code.
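For reference, measuring incoming clock intervals like the experiment above might be sketched as follows. This assumes TouchOSC's `onReceiveMIDI()` callback delivers the raw status byte as the first element of the message table, and that `getMillis()` is available; check the scripting manual, since neither detail is confirmed here.

```lua
-- Sketch of timing incoming MIDI clock; onReceiveMIDI() and getMillis()
-- are assumed from TouchOSC's scripting API, and the message layout
-- (status byte at message[1]) is an assumption to verify.

CLOCK = 0xF8
local lastTick = nil

-- 24 clocks per quarter note, so BPM = 60000 / (interval_ms * 24).
function bpmFromInterval(ms)
  return 60000 / (ms * 24)
end

-- Called by TouchOSC for each incoming MIDI message (not invoked here).
function onReceiveMIDI(message)
  if message[1] == CLOCK then
    local now = getMillis()
    if lastTick ~= nil then
      local interval = now - lastTick
      print('clock interval', interval, 'ms ~', bpmFromInterval(interval), 'BPM')
    end
    lastTick = now
  end
end
```

Averaging several intervals before computing BPM would smooth out the 15-40 ms jitter reported above.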
I must say I like Lua. It's simple and easy to learn
Also, if you don't know, there's an iOS IDE for Lua similar to Pythonista: https://codea.io
I don’t have TouchOSC yet. But cool that you’re already experimenting with ways to enable midi-clock functionality. I obviously don’t have a good feel for the TouchOSC/Lua environment. I expect there are plenty of projects out there with Lua where Lua has been used to generate and receive midi clock. (One example I found from a quick Google search: https://github.com/okyeron/midi-demo/blob/master/midi-clock.lua ). I don’t know, maybe there’s something about Lua working within TouchOSC that creates insurmountable problems. But maybe it’s worth experimenting further and/or asking the TouchOSC developer whether an end user bolt-on of midi clock functionality seems feasible, if it is what the best approach would be for “bolting-on” the functionality, whether there are plans to add that functionality as built-in so users don’t need to bolt-on, etc.
You can try the desktop version, it seems to work fine without a license
To answer my own question - yes, it does. Though it doesn't see my Bluetooth MIDI adapters (CME WIDI Master and WIDI Jack) unless I open e.g. Gadget and connect there first. I can then close Gadget, and the connection remains visible to TouchOSC.
As an exercise I created a Korg monologue controller. I don't know why I'd ever need it, but the process was fun

Very attractive. Which was the kiss of death for MIDI Designer.
That is standard. BLE MIDI connections don't happen automatically. Some app always has to manually initiate the connection, after which it's available everywhere. You might want to hunt around in TouchOSC to see if it has its own BLE MIDI connection option. Most apps have one now. If TouchOSC doesn't, then that's something that should definitely be requested of the developer. It's not difficult to implement.
I wish BLE Midi connections could be made persistent in some way.
[edit] meh. I guess it doesn't have a connection dialog. I'll look into requesting it. Until then, any app (Audiobus, AUM, whatever) can be used to make the initial connection. Korg supplies the free BLE Midi Connect app that can do this too.
Yes, the controls look pretty nice and there are quite a few customization options
Thanks, I sent a message to the developer too. Hopefully he will add it
All of Monome's gear, e.g. Norns, uses Lua. There is a whole community over at Lines dedicated to teaching it.
Yay!
https://apps.apple.com/us/app-bundle/touchosc-generations/id1571055482
@ashh do you have a link for that? I don’t know Lines…. Thanks!
So far, Lemur seems to be the only DIY control surface that comes with a sequencing engine.
My experiments with Mozaic have quickly ended because although it supports halfway precise timers, the interpreter is simply not fast enough to achieve tight timing without eating the CPU.
Yep. I plan to have another look at Lemur. It's been a while since I touched it last time. Its editor used to crash on start, but seems to work fine now.
It looks abandoned though. I've seen numerous complaints about problems with newer versions of macOS and non-existent user support
That's quite true unfortunately but the on-device editor works well enough so that I've never needed to use the desktop editor in the past.
Make sure you get your choice of templates from the user library and grab any information you need from the community forum while you can. You never know: last time the site went down, it took maybe half a year until it was back up.
But that doesn't change the fact that Lemur is still an outstanding solution for MIDI control.
I love Lemur, but the lack of support and its apparent abandonment, plus the current price, make it feel like a waste of money. I have it for my Androids and always enjoyed using it.
Yep, here you go.
Can anyone tell me if TouchOSC supports long sysex messages? (like e.g. 500 bytes in one message?)
I've read that sysex can only be received at root level, and only via Lua, but there are no specifics listed in the non-searchable, non-PDF online manual.
is there any reason TouchOSC couldn't be used to make a Roland JD-Xi patch editor?
I think it uses sysex for most of the controls rather than NRPN/CC, but judging by things like the awesome-looking Korg controller @branis made, it should be ideal?
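If the JD-Xi follows Roland's usual DT1 (data set) sysex convention, building a parameter-write message in Lua might look roughly like this. Everything here is hedged: the model ID and address bytes are placeholders, the checksum follows the commonly documented Roland scheme, and sending assumes TouchOSC's `sendMIDI()` accepts a table of bytes. Verify all of it against the JD-Xi MIDI implementation chart.

```lua
-- Sketch of a Roland-style DT1 sysex write. Model ID and address bytes
-- below are PLACEHOLDERS, not real JD-Xi values; sendMIDI() is assumed
-- from TouchOSC's scripting API.

-- Roland checksum: sum the address and data bytes, then return the
-- value that makes the total a multiple of 128.
function rolandChecksum(bytes)
  local sum = 0
  for _, b in ipairs(bytes) do sum = sum + b end
  return (128 - sum % 128) % 128
end

function dt1Message(modelId, address, data)
  local body = {}
  for _, b in ipairs(address) do body[#body + 1] = b end
  for _, b in ipairs(data) do body[#body + 1] = b end

  local msg = { 0xF0, 0x41, 0x10 }       -- sysex start, Roland ID, device ID
  for _, b in ipairs(modelId) do msg[#msg + 1] = b end
  msg[#msg + 1] = 0x12                   -- DT1 (data set) command
  for _, b in ipairs(body) do msg[#msg + 1] = b end
  msg[#msg + 1] = rolandChecksum(body)
  msg[#msg + 1] = 0xF7                   -- sysex end
  return msg
end

-- Usage with placeholder bytes (real values come from the MIDI chart):
-- sendMIDI(dt1Message({0x00,0x00,0x00,0x00}, {0x19,0x01,0x00,0x11}, {0x40}))
```

So provided TouchOSC can send sysex tables of this length from Lua, a JD-Xi editor seems plausible in principle; the earlier question about long sysex *reception* is the part that would need checking for reading patches back.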
You press the little three lined menu to the right, then press the little book icon.