Terrifically done. Very impressed! I think I'll need to hear your music.
Mono/Poly was the first software synth I really connected with, in the Legacy Collection. I spent a lot of time fiddling with the Wurlitzer preset, if I recall (the entire Legacy Collection is crash-city in FL Studio, so these memories are fading).
Ah, thank you for doing this, Ian.
I will set some time aside and watch through them all later.
You’ll need a full hour in total, and although I don’t want people to skip anything, I’ll drop a hint that the most fun is at the end, in parts 9 and 10 (10 is just under fifteen minutes long!), and there’s a particularly poignant sign-off / wrap-up message there too, which may be of interest to some. You need an hour.
Nice work, matey. I enjoyed watching and learnt a couple of things too.
One thing: when you talk with your hands, as I do too, you have to remember that what you see is back to front compared with what your audience sees. I’m thinking of the 'finger graph' you use when describing a low-pass filter. I used to do this all the time when I was teaching.
Thanks for taking the time to put these together.
This kinda makes me wish that Korg would port the rest of the Legacy Collection to iOS as 'Gadgets', and naturally include IAA/AU support.
Yes, the finger movements were me getting caught out by technology. In teaching (I was a lecturer and, a few years ago when I could find the work, still did it occasionally), I adapt to reversing the view quite well in front of real, live students.
In my previous video presentations I’ve used a Blackmagic ATEM system, which feeds the program and multi-view video output to monitors facing me. That image is the end-result video, and I’m used to laterally reversing my actions there too (especially as it constitutes a live mix of different video streams: background video, me upstream chromakeyed on top, words from the computer downstream lumakeyed on top of that).
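For the curious, here’s roughly what that layering means, as a toy Python/NumPy sketch. It has nothing to do with how the ATEM actually implements its keyers; the thresholds and frames are made up, purely to illustrate the idea of dropping near-green pixels to reveal the background video and then laying bright caption pixels over the result.

```python
import numpy as np

def chroma_key(fg, bg, green_margin=60):
    """Composite fg over bg wherever fg is NOT strongly green (a crude green-screen key).
    fg, bg: HxWx3 uint8 RGB frames of the same size; the margin is an arbitrary guess."""
    f = fg.astype(np.int16)
    greenness = f[..., 1] - np.maximum(f[..., 0], f[..., 2])   # how far green exceeds red/blue
    keep_fg = (greenness < green_margin)[..., None]            # True = not green screen
    return np.where(keep_fg, fg, bg)

def luma_key(text_layer, base, luma_thresh=200):
    """Overlay only the bright parts of text_layer (e.g. white captions on black) onto base."""
    luma = text_layer.mean(axis=-1, keepdims=True)             # crude luminance estimate
    return np.where(luma > luma_thresh, text_layer, base)

# One frame of the "live mix": background video, presenter chromakeyed on top,
# words from the computer lumakeyed on top of that.
h, w = 720, 1280
background = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)   # stand-in frames
camera     = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
captions   = np.zeros((h, w, 3), dtype=np.uint8)

frame = luma_key(captions, chroma_key(camera, background))
```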
(This is a test using that setup, intentionally segmented into short bursts with passages of silence / fades to colour, for a web-tech experiment involving indexing into video that I was trying out.) What you see on screen there is exactly what I saw, even to the point that I was reading from it!
However, these Mono/Poly vids weren’t done using that live TV vision-mixing tech; they were just shot to in-camera SD cards and edited in Final Cut Pro X afterwards (a far more time-consuming way – I really hate post-pro; just do everything live, capture it, top and tail it, and be done).

What caught me out on these Mono/Poly vids (and I didn’t realise until I got to the edit stage) was that the cameras, when their displays are flipped around to face me, actually left-right reverse their image to make it more like a mirror. When the display faces the rear, as a person operating the camera would use it, the display is normal; only when it is flipped around for “selfie” use does it reverse, and this is entirely what fooled me.

The video was shot with two cameras: a Sony RX10 (amazingly superb lens) facing downward on a tripod crossbar overhanging the synth (which produced an upside-down image that I flipped back round in FCPX), and a Sony CX410 (truly rubbish video quality) sitting on a mini tripod on the table pointing at me (and catching part of one of the diffuser brollies from the pair of 135W 5500K CFL lights either side of the synth). The CX410’s display was flipped round to show me when I was in frame, and this is what I was drawing the graphs on screen to – inadvertently backwards.
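In case the geometry is clearer in code, here is a toy Python sketch of the two transforms involved: the 180° turn I applied to the overhead RX10 footage in FCPX, and the left-right mirror that the flipped-round CX410 display was showing me (the recorded file itself isn’t mirrored, only the preview). This has nothing to do with FCPX internals, and the arrays are just stand-ins for decoded frames.

```python
import numpy as np

def rotate_180(frame: np.ndarray) -> np.ndarray:
    """Flip an HxWx3 frame both vertically and horizontally -
    the correction for the upside-down overhead camera."""
    return frame[::-1, ::-1, :]

def mirror_left_right(frame: np.ndarray) -> np.ndarray:
    """Left-right mirror - what a flipped-round 'selfie' display shows you.
    Note the recorded file itself is NOT mirrored; only the preview is."""
    return frame[:, ::-1, :]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # stand-in for one decoded video frame
overhead_fixed = rotate_180(frame)                   # what the FCPX flip amounts to
selfie_preview = mirror_left_right(frame)            # what I was actually looking at
```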
My lav mic was a bit rustly, and I realised later (after the edit, after upload!) that the Sony RX10’s mic feed was actually quite a lot better, because the camera was positioned straight in front of me as if it were simply a superb microphone itself. I should have used that feed instead, had I realised; I only had the in-camera mic on to sync up the multicam clips in FCPX (which can sync on audio).

My own audio went into an A&H Zed 10 mixer (and then out over USB) to mix synth and lav mic together – if I’d kept synth and mic separate throughout, there’d have been no easy way to sync them up again, so the mixed feed was used to sync to both the synth sound and the voice. However, I also took an aux from the Zed 10 into another USB audio converter and used that alone for the synth once everything was synced up from the mixed feed, which by then had become superfluous.

Yes, if I do that again I’ll take the RX10’s built-in mic more seriously – it really did give unexpectedly good results (and fortunately I didn’t have it set to AGC), considering I usually dismiss a camera’s built-in mic and only use it for syncing.
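As an aside, audio-based multicam sync of the kind FCPX offers presumably boils down to something like cross-correlating the two audio tracks and taking the peak as the offset. Here’s a rough Python sketch of that idea – not a claim about what FCPX actually does internally, and the names and sample rate are just illustrative.

```python
import numpy as np
from scipy.signal import correlate

def audio_offset_seconds(ref, other, sample_rate):
    """Estimate the offset between two recordings of the same event by finding the
    peak of their cross-correlation. A positive result means 'other' should be
    shifted later by that many seconds to line up with 'ref'.
    Assumes both are mono float arrays at the same sample rate."""
    ref = (ref - ref.mean()) / (ref.std() + 1e-12)       # normalise so loudness doesn't dominate
    other = (other - other.mean()) / (other.std() + 1e-12)
    corr = correlate(ref, other, mode="full")
    lag = np.argmax(corr) - (len(other) - 1)             # lag in samples
    return lag / sample_rate

# Hypothetical usage, with mixer_feed and rx10_mic already decoded elsewhere
# to 48 kHz mono numpy arrays:
#   offset = audio_offset_seconds(mixer_feed, rx10_mic, 48000)
```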
You can tell you've put a lot of thought and work into it: good quality video, and the content is balanced just enough to have some detail but not get too boring.
And there was me thinking you just put a GoPro on.