A fantastic sequencer/synth I'd love to see on iOS
This blew my mind when I saw it pop up a few years back. Personally I thought it was a very exciting development, but work on it seems to have ceased. I was thinking it would be fantastic on the chubby iPad Pro with an Apple Pencil, the stuff of dreams. There was a PC version with Mac coming soon (2013), although for me iOS would be a perfect fit. Skip to 7:46 for the interesting part of the vid; his take on automation is by far the best I've seen.
Comments
You can sort of get a flow like this using AUM, especially with AU apps, since you can control their parameters via MIDI notes and CCs. Channel volumes, mutes, bus send amounts, panning, etc. can all be controlled via MIDI. Perhaps the missing piece would be a CC editor where you could create automation lines for looping automation tracks.
An alternative is to send various channels of CC to AUM from apps such as Patterning, Oscilab, ZMors Modular, Lemur, midiLFOs, ScriptSONIC, etc., and route them to control various parameters.
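To make the idea above concrete, here is a minimal sketch of what a "looping track of automation" boils down to: an LFO sampled into the 7-bit values (0-127) that a MIDI CC message carries. The function name and parameters are invented for illustration; in practice each value would be sent as a `control_change` message to AUM by one of the apps mentioned, or from a desktop script with a MIDI library such as mido.

```python
import math

def lfo_cc_values(rate_hz, duration_s, resolution_hz=50, depth=1.0, center=64):
    """Sample a sine LFO into 7-bit MIDI CC values (0-127).

    Hypothetical helper for illustration: rate_hz is the LFO speed,
    resolution_hz is how many CC messages per second we'd emit.
    """
    n = int(duration_s * resolution_hz)
    values = []
    for i in range(n):
        t = i / resolution_hz
        # Scale the -1..1 sine into the 0..127 CC range around `center`,
        # clamping because CC values must stay within 7 bits.
        v = center + depth * 63.5 * math.sin(2 * math.pi * rate_hz * t)
        values.append(max(0, min(127, round(v))))
    return values

# One cycle of a 1 Hz LFO, 50 CC values per second:
cycle = lfo_cc_values(rate_hz=1.0, duration_s=1.0)
```

Looping the resulting list at the track's tempo is exactly the behaviour a dedicated CC editor would give you; the editor would just let you draw arbitrary shapes instead of a sine.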
AUM is high up on my wishlist; it looks fantastic and will fill a nice hole for me workflow-wise, but I'm just on an app diet at the moment. After reading the Patterning thread and all the gushing, that too is on my wishlist. I'll check out ScriptSONIC even though I was sure I already had it, just in case.
The rest I either have or aren't a priority atm. The automation workflow I can kind of get with the apps I have, but it's the way the information is displayed and organised that has got me excited about AudioGL. For me, being able to see so much information at once and zoom around recalls the hands-on approach of buttons and sliders on analog/digital hardware synths.
DAWs in their current state remind me of digital synths with small LCDs and no external controls: capable but fiddly. AudioGL in automation mode, just the way the music is visualised, I find amazing, and the GUI work is outstanding; it makes perfect sense to my brain. It kind of reminds me of Jasuto when it's in editing/2D view.
https://www.audiogl.com/en/home
If anyone is interested, that is; it looks like the project is on ice, frozen, abandoned in the tundra, which is a shame.
It would definitely be hard to pull something like that off in an environment that wasn't integrated, but I reckon it's 100% doable in something like Audulus or perhaps SunVox. Either way, that 3D visualization is fekking cool.
@mister_rz It does seem a shame that all the work the developer put in has ended up frozen due to lack of traction.
Audulus has a zooming sort of interface with real-time color-changing nodes, which is 2D, where you have all sorts of controls and sub-patches, but it is definitely much more programming-oriented than clicking together more standard audio modules.
Perhaps the quickest route to something like AudioGL would be the development of a 3D zoomable environment where you can control AU and IAA apps by mapping their controls to the app parameters to make your own Frankensynth. When you load in an app it would create a node with valid connections you could hide and expose different controls to connect to the 3D automation tracks you'd draw or import. Being able to organize them in various levels would allow for easy popping up and down to various levels of the Frankensynth. It'd definitely be a way of leveraging the touch screen of a large iPad rather than recreating existing analog or software models on iOS.
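The Frankensynth idea described above is essentially a node graph with per-node parameter exposure. As a purely illustrative sketch (every class, field, and value here is invented; real AU parameter hosting works through the AUv3 API, not like this), the core data structure might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One hosted app in the hypothetical zoomable graph."""
    name: str
    params: dict = field(default_factory=dict)    # parameter name -> value
    exposed: set = field(default_factory=set)     # params visible in the 3D view
    children: list = field(default_factory=list)  # sub-level nodes to zoom into

    def expose(self, param):
        """Surface a control so an automation track can attach to it."""
        if param in self.params:
            self.exposed.add(param)

    def automate(self, param, value):
        """Drive an exposed parameter; hidden ones ignore automation.

        Values are clamped to 0..1, mimicking normalized AU parameters.
        """
        if param in self.exposed:
            self.params[param] = max(0.0, min(1.0, value))

# Load a (hypothetical) synth app as a node, expose one control, drive it:
synth = Node("MyFilterSynth", params={"cutoff": 0.5, "resonance": 0.2})
synth.expose("cutoff")
synth.automate("cutoff", 0.8)
```

The hide/expose split is the key design choice from the post: a loaded app would surface all its parameters as potential connections, but only the exposed ones clutter the 3D view and accept automation, which is what would make popping up and down the levels manageable.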
The experience could be like being inside of the instrument rather than just playing it from the outside. Perhaps even a merging of VR game tech with music apps, as both depend upon rhythm, context, visualization, and tactile sensation. You could have a team of networked band members playing in the Frankensynth in real time, with the computational load distributed across the devices and the cloud. The current status of iOS is like an 8-bit gaming system next to the real-time console gaming systems on the market now.
@syrupcore It kind of reminds me of both Audulus and SunVox. After watching the vids (I think there is a better one I couldn't find) and tasting the future, my 2D timelines seem a little lacklustre. Hope it inspires some devs. I sent an email to the dev asking about an iOS version; I just hope he is alright.
@infocheck A Frankensynth of such magnitude would be a religious experience for me, especially if the nodes could switch between oscillators (with being able to draw your own shapes), samples, and AUs, plus built-in FX and AU/IAA. It would be like an Audiobus/AUM/AudioGL sandwich. I'm getting a bit carried away here, inspired by your post and the possibilities.