Side thought: MidiFire can host Mozaic and also handle in/out routing duties for Drambo. You might be able to concoct some elegant way to support what you want to do with it @AlmostAnonymous .
I've used Midiflow before. The issue here is that the scripts aren't just translating and routing. They're exchanging large amounts of sysex to control lighting, encoder switching, encoder updates, etc. for hardware.
This is where I was going with the CoreMIDI thing, but I didn't know MidiFire could host Mozaic. I thought it might have only been StreamByter scripts (thus the rewrites). I only messed around with it briefly in the past. Bome Network Pro took care of most of my MIDI routing/duplicating/etc. (no scripts), but it appears my setups are outgrowing it.
Ah... now I follow. That very well might work. Merci. I'll give it a go. I probably should rethink my MIDI routing needs and see if I can handle most of it in MidiFire before it even reaches the host apps.
have a 4th post too
More Pointless Expression
MIDI Puke Explosion
My Prog-rock Experiment
Multi-Port Excrement
Must Purchase Everything
Mediocre Performance Enhancer
Musician's Penis-Envy
Merlin's Penis Enchantment
Man-splained Painful Exegesis
Machismo Puerile Emotion
MusicRadar's Platinum Endorsement
(Sorry... I got a bit carried away there!)
Obviously not a complete design as I skipped synth four.
deleted - wrong thread.
Right wim.
My crazy setup though, just for an effect: another iPad, an Audio 4c, and a Faderfox PC12.
Mad.
The £1400 effect lol.
MPE = Modwheelx3 Per Eachfingertip
They are always active on their FB, maybe try asking there?
Glad you like the series. 😁🙏🏾
Yeah, I had the same thought myself.
The quality seems really good, they like to show that off on their FB page.
I got another audio interface yesterday so it's up to you....😏
Yup.
They are cute...
I've just seen more of your requirements, incl. sysex. Sorry to say, but on iOS you might hit more than one roadblock, including limited-size sysex messages (unless Apple have improved that in recent iOS versions, dunno).
Yep, using MidiFire sounds like a good idea if you can get the MIDI part to work there.
Drambo can even be hosted as a pure MIDI effect there.
@rs2000 I've hit no roadblocks yet. The usual issue is that the devices don't respond fast enough and I have to put delays in.
My Launchpad script is sending 18-value sysex arrays, 32 arrays right after each other... no problem with no delays. Those are the largest ones I've been working with so far. I could probably go larger.
And the more I thought about it today, the more MidiFire is making sense, especially hosting Mozaic. If I can work out all of the communication between the scripts and controllers before ever touching an app/host, and just route everything out of MidiFire on a virtual port... I don't have to work around balancing that MIDI with a project template's MIDI.
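The sysex batching described above can be sketched roughly like this. The payload values and message sizes here are placeholders for illustration; a real Launchpad LED message would use Novation's documented sysex header, which isn't reproduced here.

```python
# Sketch of batching fixed-size sysex messages, as described above.
# Payload contents are made up; a real device expects its own
# manufacturer header and command bytes inside the F0...F7 frame.

def make_sysex(payload):
    """Wrap a 7-bit payload in sysex framing: F0 <data...> F7."""
    assert all(0 <= b < 0x80 for b in payload), "sysex data bytes must be 7-bit"
    return bytes([0xF0]) + bytes(payload) + bytes([0xF7])

def build_batch(num_messages=32, payload_len=18):
    """Build a batch like the 32 x 18-value arrays mentioned above."""
    return [make_sysex([i % 0x80] * payload_len) for i in range(num_messages)]

batch = build_batch()
print(len(batch), len(batch[0]))  # 32 messages, each 18 data bytes + 2 framing bytes
```

Whether the receiving hardware keeps up with back-to-back messages like these is device-dependent, which is why some scripts insert small delays between sends.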
Speaking of Victor Porof (BlueVeek, the Atom MIDI developer): has he totally disappeared from the scene?
Victor hasn't logged in here on the Audiobus forum since July 2021. What's happened to him? Too much work?
We miss you Victor!
I think the reason that some DAWs don't record MIDI from plugins is to maintain a straightforward "signal path", and in some cases they may have been designed before MIDI FX were a thing. The complication is that often we want to apply the MIDI FX to the incoming MIDI stream but don't want to record the output: we want to record the notes played on the keyboard, and when we play the track back we want to apply the effect. That is pretty straightforward; the DAW doesn't have to treat recording/monitoring any differently than playback. If we want to record the output of the effect, then we need the DAW to record the effect output and then turn off the MIDI effect, because we rarely want to apply the effect recursively.
None of that is insurmountable, but it just isn't super straightforward, especially if you didn't originally plan for it. The straightforward thing architecturally is to be able to route the output of a MIDI track to another MIDI track and have that output be post-effect. But for some reason a lot of apps didn't design that in, and it may destabilize the architecture to add it even if it seems like it should be straightforward.
If I remember correctly, he works at google now or something.
Yes it would! Easy and straightforward. As I said, automation would cover it except for the random/cycle thing.
It's funny how use cases are so different. You might see something and think "Why is this missing? It's so obvious!" when in reality no one else seems to care.
Two words: Choke. Groups.
I’ve given up on that quest.
Ok...Thanks, guys!
Seems like it's worth dropping the idea for the moment. I guess MPE is still too early in its infancy for developers to go crazy completely overhauling their apps for what might end up being a fad.
No, MPE is amazing, and there is now a standard with documentation!
https://cdm.link/2022/05/midi-polyphonic-expression-has-been-updated-with-one-powerful-feature-better-documentation/
As I've said earlier, I'm no stranger to MPE. I use it with my desktop setup.
The issue is not whether MPE is standardised, but that it's not trivial to implement, and so it's yet to gain any real traction in the iOS world. And really, just because there is a standard protocol doesn't mean it won't eventually die out (there were presumably standards for implementing 3D on TV screens, but I've not seen one in a store for years).
The problem in getting MPE off the ground is that the surfaces used to properly interact with it are still way too expensive. As such, it's going to be much harder to sell the quantities necessary to get it into everyone's hands and ensure its longevity. Ironically, it's mobile tech that could lead the way forward in this regard, given it needs no equipment other than the tablet already owned.
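For context on why MPE is "not trivial to implement": its core mechanism is giving each sounding note its own MIDI channel, so per-note pitch bend and pressure can be sent independently. Below is a minimal sketch of a channel allocator under MPE's lower-zone convention (master on channel 1, member channels 2 to 16). This is a simplified illustration, not a conforming implementation; real ones also handle channel exhaustion, note stealing, and zone configuration messages.

```python
# Minimal sketch of MPE-style per-note channel rotation (lower zone:
# master on channel 1, member channels 2-16). Channel numbers are
# 1-based as in MIDI documentation. Illustration only.

class MpeAllocator:
    def __init__(self, member_channels=range(2, 17)):
        self.free = list(member_channels)  # channels available for new notes
        self.active = {}                   # note number -> assigned channel

    def note_on(self, note):
        ch = self.free.pop(0)              # assumes a free channel exists
        self.active[note] = ch
        return ch                          # note-on, per-note pitch bend and
                                           # pressure all go out on `ch`

    def note_off(self, note):
        ch = self.active.pop(note)
        self.free.append(ch)               # recycle the released channel
        return ch

alloc = MpeAllocator()
print(alloc.note_on(60), alloc.note_on(64))  # 2 3 (each note on its own channel)
```

Multiply this by zone setup, pitch-bend ranges, and host plumbing, and it's clearer why iOS apps have been slow to adopt it.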
If you have any recommendations for a solid iOS MPE implementation, perhaps I'll start a separate thread (given Drambo's currently limited implementation, it's off-topic here).
Cheers
Is it possible for me to send audio from AUM to be recorded into the Flexi Sampler in Drambo?
So is Atom 2 abandoned?
Not exactly a pro, but I would record into a Drambo instance (loaded as a MIDI FX in an AUM effects slot), save it as a project via the AUM-hosted Drambo using the "save with samples" option, then open that project in standalone Drambo. You might then also have sequence triggers, if made via AUM's Drambo. You might also be able to route AUM's output directly?
I'll try it out. Now that I'm thinking of it, it might just be easier to record an audio file in AUM and just import that into Flexi Sampler.
Inspired by @echoopera’s excellent video from the other day, I've started a new YouTube series called 'Let's Beepity Boop'. In this, I get 10 minutes to improvise a track. Obviously I'm starting with a few vids on Drambo and this is the first. The aim is to show off various aspects of Drambo's workflow and create something... not totally shit? No talking after the tiny intro, just some beepity boops!
Who knows - after 50 of these, I might start making some good music...