The thing is, the LFO doesn't just send out a single CC message when the value changes. It sends out a constant stream of CC messages every 10ms or so, even if the value doesn't change. So even if you have, say, a really slow sample-and-hold LFO, you're not going to get a single CC converted to a PC only when the value changes, but a blast of them all the time.
Maybe it'll work for your app, but it seems like it could cause problems for most apps. If it does work, I think it would be very dependent on how the target app handles that kind of flood of messages.
I hope it works out. If not, then some more code would be needed to filter out duplicate messages.
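That filtering really only needs a "last value" check. Here's a minimal sketch of the idea in Python (not actual StreamByter or Mozaic syntax, and the watched CC number is just a placeholder):

```python
# Sketch of the duplicate-filter idea: only emit a Program Change when the
# incoming CC *value* actually changes, ignoring the constant 10ms stream.
# Hypothetical names; the real thing would live in StreamByter or Mozaic.

WATCHED_CC = 1          # assumption: the CC number the LFO sends on
last_value = None       # value of the most recent Program Change we emitted

def on_cc(channel: int, cc: int, value: int, send_pc) -> None:
    """Call for every incoming CC; send_pc(channel, program) forwards a PC."""
    global last_value
    if cc != WATCHED_CC:
        return                  # other CCs are not this filter's business
    if value == last_value:
        return                  # same value as last time: drop the duplicate
    last_value = value
    send_pc(channel, value)     # CC value 0-127 maps directly to program 0-127
```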
Look up the Squarp Pyramid. It sends PCs via an LFO. Also, look up the Eyesy. Both of these are hardware, not apps. As well, the Eyesy just handles visual scene changes via MIDI PC, and it does not crash when getting PCs at a rapid rate from the Squarp's LFO (not audio rate, but fast nonetheless).
I have done this many times already and can assure you it works fine; that's not in question. The only reason I don't want to use the Squarp for this all the time is that I'd rather use it on audio synths, as it's way overpowered MIDI-wise for the Eyesy's needs.
I have considered other options to replace the Squarp, and Rozeta's LFO is the best candidate. With an option to send PCs via LFO, it would be essentially perfect for the job.
Well, maybe the script will help. StreamByter seems to be a little more streamlined in this case once you wrap your head around its syntax. Good luck.
Thanks! Keeping my fingers crossed that Rozeta adds my feature request but this is going to be a good way for me to sharpen my coding skills.
I have a Launchpad Mini MK3 that I'm using to control several drum machines from one grid. I'd like to be able to hold a MIDI momentary button and press one of the drum pads to mute (or unmute) that note, but I can't find any practical way of doing this. This is standard behavior in Ableton's drum rack, for instance, but Atom2 lacks it, and other drum sequencers I've found are capped at a low number of drum hits/notes.
Anyone aware of a way to achieve this? I was thinking of using a Mozaic script for the "shift" behavior and a series of MIDIGATE plugins for octaves of drum kits to support setting probability to 0, but that's messy. Is there a nice way to do this with Mozaic alone?
Hi, I just updated the ATOM Pattern Switcher & Randomizer with useful and beautiful eye candy:
Update v1.5
+ Added UI Settings menu when holding SHIFT on pads layout
+ Added configurable big pad labels and corresponding icon label to distinguish between script instances. There are 26 predefined variants like Chord, Drum, Guitar, Kick, Lead, Melody, Pad, Perc, Synth, Vocals
+ Added current pattern pad color setting
Making MIDI devices in Mozaic has been my COVID crutch, my obsession, my life.... for the last year!! The jury is still out as to whether it's been a good thing or not. Today I got around to uploading 8 devices to PatchStorage.
QK:KNTRL: MIDI control centre for instruments, effects & mixer. Create your own layouts. Control 1000 parameters with 4 knobs.
QK:SCENES: Master scene controller. Set durations, sync scenes across all QK devices, and trigger scenes in apps like LK.
QK:CLIPSEQ: Trigger and sequence clips/patterns in apps like Atom2, LK, Helium etc. 8 channels, 8 scenes, 16 clip slots.
QK:NONK: 16-channel adaptive keyboard that updates its layout with the selected channel. Custom layouts. Build your own.
QK:AUTOM: Scene-based automation generator, recorder, controller & editor. Add 16 destinations with colour labels.
QK:BEATS: Drum pattern generator & step sequencer. 4 multi-note channels. 8 scenes. Sequence from LP. Use built-in patterns.
QK:NOTES: Melody/rhythm generator & step sequencer. 4 channels, variable pattern rate & length. Sequence from LP.
QK:KEYSEQ: Sequence key changes and quantise notes to the current scale. You always compose in C Major & route notes via SCALEQ to be re-harmonised; a 'C' note is always the root note of the current scale.
There are short YouTube videos for all devices except KEYSEQ.
@soundtemple Thanks a lot for developing and publishing that marvelous gem, and especially for the extensive documentation including texts, tutorial videos and demos! Sorry for the late reaction, I have been abroad/offline for the last 5 days.
Since the whole suite is greater than the sum of its script parts and they work closely together, I created a 'Quantum Compuser Suite' category on the Mozaic Script List wiki, right at the top.
Maybe link to the main QKV2 Documentation by filling in the 'Source code URL' and adding something like 'Press the blue Source button for more documentation' at the end of the Description text box on Patchstorage's 'edit patch' page for each of the scripts. This will direct more people to the script overview and video playlist.
@_ki Thanks. Honoured to be on the Wiki!! I added the links to the Docs.
> @wahnfrieden said:
> So, are you looking for something where you ... press a momentary switch, play a MIDI note to toggle its setting to "muted", and then when you release the switch, future inputs of that MIDI note are blocked from passing through Mozaic?
It could do that.
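At its core it's just a note filter with a toggle set. A minimal sketch of the bookkeeping in Python (the shift note number is a made-up placeholder; a real version would be a Mozaic script, this only shows the logic):

```python
# Sketch of the hold-shift-and-tap-to-mute behaviour described above.
# Assumptions: the momentary "shift" switch arrives as SHIFT_NOTE, and
# drum hits arrive as ordinary note-ons that we either block or pass.

SHIFT_NOTE = 36            # hypothetical note number for the momentary switch
shift_held = False
muted_notes = set()        # notes currently muted

def on_note_on(note: int, velocity: int, send_note_on) -> None:
    global shift_held
    if note == SHIFT_NOTE:
        shift_held = True
        return                          # the shift note itself is never forwarded
    if shift_held:
        # While shift is held, tapping a pad toggles its mute state.
        if note in muted_notes:
            muted_notes.discard(note)
        else:
            muted_notes.add(note)
        return
    if note not in muted_notes:
        send_note_on(note, velocity)    # normal playing: pass unmuted notes through

def on_note_off(note: int, send_note_off) -> None:
    global shift_held
    if note == SHIFT_NOTE:
        shift_held = False
        return
    send_note_off(note)                 # always pass note-offs to avoid stuck notes
```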
I’ve since built it, thanks!
New person here - well, professional software developer and occasional musician.
Just bought Mozaic as it looks absolutely perfect for my purpose, which is to reduce the number of switches I have to dedicate to controlling OnSong from my Line 6 Helix. Currently I have switches set to send notes from the Helix, which only provides simple momentary actions for MIDI.
Press a switch on the Helix and it sends e.g. C2, and in OnSong C2 is set to toggle Autoscroll. Except sometimes the timing varies with the live band, autoscroll is either too fast or too slow, and I need to send page down or page up messages. So I have had to dedicate 2 more switches to those functions, which is a waste as I only have 10 switches to start with. What I want is to use Mozaic to extend the simple momentary action into 3 options: Simple Click, Double Click and Click-and-Hold, with the latter two mapped to Page Down and Page Up. Nothing is time critical; if it takes a second to start scrolling, that doesn't matter.
My generic idea is that Mozaic listens on a (parameter) channel with a (parameter) duration, based on https://patchstorage.com/double-tap-and-hold/: if it is a single press then the note is sent as provided, if a double press then the note +12, and if a hold then the note -12.
Hopefully this would be a simple, generic addition to the pool of useful patches.
I am fully into code re-use.... Before I spend any time working on it, has anybody already got a patch that does this simple task?
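For anyone sketching the same thing, the detection boils down to a small per-switch state machine. A rough outline in Python (the timing thresholds are made-up placeholders and the +12/-12 offsets are just the ones from the idea above; this is a sketch of the logic, not Mozaic code, where a timer event would replace the tick() call):

```python
import time

# Rough outline of single / double / hold detection for one momentary switch.
DOUBLE_TAP_WINDOW = 0.40   # seconds allowed between taps for a double tap
HOLD_TIME = 0.60           # seconds held down before it counts as a hold

class TapDetector:
    def __init__(self, base_note: int, send_note):
        self.base_note = base_note
        self.send_note = send_note      # callback: send_note(note_number)
        self.pressed_at = None
        self.last_release = None
        self.pending_single = False

    def press(self) -> None:
        self.pressed_at = time.monotonic()

    def release(self) -> None:
        now = time.monotonic()
        if self.pressed_at and now - self.pressed_at >= HOLD_TIME:
            self.send_note(self.base_note - 12)          # hold -> note -12
        elif self.last_release and now - self.last_release <= DOUBLE_TAP_WINDOW:
            self.send_note(self.base_note + 12)          # double tap -> note +12
            self.pending_single = False
        else:
            self.pending_single = True                   # maybe a single tap
        self.last_release = now
        self.pressed_at = None

    def tick(self) -> None:
        # Call periodically (e.g. from a timer): flush a single tap once the
        # double-tap window has expired without a second tap arriving.
        if self.pending_single and time.monotonic() - self.last_release > DOUBLE_TAP_WINDOW:
            self.pending_single = False
            self.send_note(self.base_note)               # single tap -> note as-is
```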
Here's some code I wrote for someone to use with iRig Blueboard. It detects different tap types, then alters the channel of the incoming note based on the detection before sending it on. I never polished it up or considered it ready to publish. It might be useful for ideas for tap type detection though.
The person I wrote it for then used another script downstream to take custom actions based on the note/channel combination received. The reason for splitting the two was so he could modify the actions without worrying about altering the detection script above.
I had planned a combined version with a GUI for setting the outbound messages as well as the detection parameters, but it never got completed due to (self-inflicted) scope creep and general ADD.
Thanks - that's a lot more comprehensive than what I had put together myself.
I did some reworking of the example for pad tapping, like not having the timer fire every 1ms but instead just once at the end of the timeout. And according to the logging it is all working: different notes are being sent for single tap, double tap and hold.
I have confirmed it with a second instance of Mozaic using the Note Names script receiving from the first - I get note on and off pairs for the three notes.
But it isn't working with OnSong, which is ignoring anything not coming from connected hardware. It has a virtual MIDI source and I have directed the AUM output from Mozaic to this virtual MIDI port, but there is no response.
I increased the delay on the note off to 500ms in case it was ignoring short notes, and switched on MIDI note monitoring (where OnSong displays received notes on screen), but nothing is working. I even tried looping the physical MIDI ports on the Helix and enabling MIDI through, but even with an attempt at filtering by channel that resulted in a MIDI feedback loop.
Unless anybody knows any secret about OnSong virtual MIDI being sent from AUM, it looks as though I need to get on to OnSong support.
Sorry if these are dumb questions, but sometimes things get overlooked: does Virtual MIDI have a checkmark next to it in Settings > MIDI > Devices > Sources? Also, do you own "Premium"? I notice on the manual page for those settings that it's marked "Premium".
It's OnSong Pro with the MIDI Add-On (in app purchase from before premium)
In MIDI settings: MIDI is enabled, Virtual MIDI is enabled
In MIDI Settings/Sources: OnSong Virtual Connection is checked, Session 1 Network Session and Helix Direct Connection are unchecked. (checking Helix Direct Connection works, but without remapping of the notes)
I will be a bit upset if, with all the options active and enabled, virtual MIDI connections cannot be used. Premium in the UK is £50/year and my use case is the simplest possible: a single user without any of the fancy stuff such as lighting, video playback, etc.
The cost of Premium is such that for this use it would be cheaper to buy an additional bit of hardware to give me the extra switch logic. A Morningstar MC3 would more than cover it, including the double click and hold options.
Do you have Audiobus? If so, another test would be to send the MIDI out from Audiobus (Mozaic > Virtual MIDI Bridge). Set up this way, you should see Audiobus as a source.
It could be that OnSong has trouble receiving on its Virtual MIDI port. It wouldn't be the first app to act that way. It's rather unfortunate that AUM doesn't expose itself as a MIDI source. Makes it harder to work with some apps.
I have Audiobus 3 and its virtual port is visible and selectable in OnSong, but it is likewise ignored.
I have opened a ticket with OnSong and we are exchanging emails. I have directly asked whether Virtual MIDI is broken in OnSong Pro with the MIDI In App Purchase, so we'll see what happens.
It is perhaps unfortunate that MidiBridge with the localhost Network port is no longer available, but I don't know that MIDI Network ports work either!
Just to keep you updated: OnSong have confirmed that Virtual MIDI should be working.
Thought I should come back and give an additional update in case it helps anybody else.
I have got it working - I can use Mozaic to convert Helix simple momentary note sends into 3 notes for single press, double press and hold (with a configured delay for feet taking time to double press), and I have OnSong doing toggle autoscroll, page down and page up in response to these notes.
To achieve this I am using Audiobus 3 to route the MIDI. The input is the Helix hardware, the processor is Mozaic, and the output is (Audiobus) Virtual MIDI. OnSong is only using Audiobus 3 Virtual MIDI as a source. I also re-installed OnSong.
At this time, exactly the same setup but using OnSong Virtual MIDI doesn't work, which is why using AUM didn't work. The OnSong development team are aware of the issue and I am sure they will be addressing it.
Thanks to @wim for his very useful suggestions.
Hey - I released something I think could be pretty darn useful. Expression Redirector can convert MPE expression and also Channel Aftertouch to MIDI CC streams. It's a little hard to explain what it does, but the bottom line is it can be used to modulate up to hundreds of synth and FX parameters from a single controller. For instance, my Sensel Morph piano overlay has 24 keys and each can send five types of expression. That gives me the potential to control up to 120 synth parameters ... many more if I change octaves ... in a space the size of my iPad.
It will work with non-MPE controllers that send Channel Aftertouch as well. Even note velocity from plain controllers can be used to send CCs, one for each key.
It's kind of hard to communicate what it does. But I myself find it insanely useful and think I will be using it all the time (unlike most of my scripts 😂).
EXPRESSION REDIRECTOR v0.4
Convert Channel Aftertouch and MPE Controller Output to MIDI CC streams.
SO! What is this good for? Well, for instance it can enable an MPE controller to send up to 128 MIDI CCs for controlling non-MPE synth parameters from the "Slide" expression (usually sliding up and down on the key). Or a controller that sends Poly Aftertouch can control as many MIDI CC's as there are keys on the device. Even just the AUM keyboard can send up to 128 CC's based on where you strike the key. An MPE controller with Pressure, Pitch Bend, Expression, and Release could in theory control up to 768 parameters!
I dunno - seems like it could come in handy. Credit to Audiobus forum member @stown for this really great idea. 👏👍🏼😎👏
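To make the idea concrete, here's a rough Python sketch of the kind of per-key, per-expression fan-out involved. The base key, CC layout and expression names here are illustrative assumptions, not the script's actual mapping scheme:

```python
from typing import Optional

# Illustrative mapping from (key, expression type) to a MIDI CC number.
# Not Expression Redirector's actual layout; just the general idea of fanning
# one controller's per-note expression out to many CC destinations.

EXPRESSION_TYPES = ["pressure", "slide", "pitch_bend", "velocity", "release"]
FIRST_KEY = 48                 # assumption: lowest key on a 24-key overlay
KEY_COUNT = 24

def expression_to_cc(key: int, expression: str) -> Optional[int]:
    """Return the CC number assigned to this key/expression pair, or None."""
    if not (FIRST_KEY <= key < FIRST_KEY + KEY_COUNT):
        return None
    if expression not in EXPRESSION_TYPES:
        return None
    # Each expression type gets its own contiguous block of CC numbers,
    # one CC per key: 24 keys x 5 types = 120 distinct destinations.
    block = EXPRESSION_TYPES.index(expression)
    cc = block * KEY_COUNT + (key - FIRST_KEY)
    return cc if cc <= 127 else None
```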
I just tried it in AUM, using KB-1 and Drambo. Works great! Maybe I'll get a Sensel Morph.
(Drambo's MIDI learn function is glitchy, so the AUM project does not work great, yet.)
Should work great with the microfreak, thanks!!!
Yeh, I found it tends to assume controllers are incremental. So often, rather than an absolute controller, you end up with one that increases if the CC value is 64 or greater and decreases if it's below 64. I keep having to go in and change the controller type to absolute. You can check that by long-pressing the control in learn mode.
If you do, you'll need to have it plugged in via USB for MPE to work well. It doesn't work well over Bluetooth MIDI when in MPE mode.
I don't use Bluetooth MIDI so that's fine, and thanks for the tip about Drambo.
I'm most curious about the "innovator interface" thing, and if I could find an easy way to create surfaces with good tactile feedback.
@Skyblazer: as an FYI, I think Morphs are in short supply. Small electronics companies have been hit hard by the electronic parts shortage.
I just updated the Remap Multi-Channel Midi Drums script to v2.0
Added mappings for AR-909, Digistix, DrumComputer, Drumdrops Mapex, EDGR 606, EDGR 909, FAC Drumkit, Hammerhead, Koala, sEGments, VADrumSM, updating from 30 to 42 drum mappings
Muted pads are lit in red
Hello all, I am dipping my toes into using mozaic and have found the @wim ‘basic midi controls’ patch really useful for controlling the parameters in Rymdigare. It’s great as Rymdigare’s controls are spread over different screens and using Mozaic allows me to have them all close together in a row.
I will try to explain my question. Each time I create a new project in AUM do I need to map all of the CC numbers in Rymdigare to recognise the messages being sent from Mozaic? I’ve tried saving my own preset in Rymdigare but the parameters are all empty when I load up my preset. I then have to tell AUM to recognise ‘midi channel 1 - CC14’ etc.
Is there a quicker way, or is this how it needs to be done?
Thanks in advance, this forum is my place of worship and education 🌟