Mozaic Script Help -or- Other Solution Help Needed

Hi,
GOAL: I play a note on an instrument.
(a) A2M (audio to MIDI) turns it into a [single] MIDI note, and I send that note over to my Mozaic script. My host here is Loopy Pro.

(b) The Mozaic script does two things:
[1] It turns the single MIDI note sent via A2M into a chord and sends it off to a synth, like a pad
(in the script code it turns one note into several notes and sends several MIDI Note On messages; see the sketch below)
[2] the Mozaic script continues to send velocity change /MIDI CC messages to the exact same channels and notes to vary the volume of the notes of the chords
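
For reference, here is a minimal sketch of step [1] in Mozaic. It assumes a fixed major-triad shape and puts each chord tone on its own MIDI channel so the tones can be addressed separately later; the real chord logic is in the script linked below, and the handler/variable names (MIDINote, MIDIVelocity) are as I understand them from the Mozaic manual.

    @OnLoad
      // Hypothetical fixed chord shape: root, major 3rd, 5th (semitone offsets)
      chord[0] = 0
      chord[1] = 4
      chord[2] = 7
    @End

    @OnMidiNoteOn
      // Fan the incoming note out to three chord tones, one per channel (0, 1, 2)
      for i = 0 to 2
        SendMIDINoteOn i, MIDINote + chord[i], MIDIVelocity
      endfor
    @End

    @OnMidiNoteOff
      // Release the same chord tones when the source note ends
      for i = 0 to 2
        SendMIDINoteOff i, MIDINote + chord[i], 0
      endfor
    @End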

PROBLEM: Mozaic has LFOs, but they are "tied" to the host playback (AaaAAAaarrrggggh). I understand the need to sync to the host, but having the option to run them independently would be nice.

PROBLEM 2: While Mozaic supports sending MIDI thru, I want Mozaic to receive and "intercept" the CC values from an LFO that runs fully independent of the host playback, and then use my script to do step [2], i.e. send out the velocity/CC changes.

SPENT TWO HOURS on a worthless script because Mozaic LFOs require the host to play and/or they can't "intercept" MIDI CC and let me use an external LFO. aAAAAaarrrgggh

Should I just get Quantichord? But the problem is that after sending off the MIDI Note On messages, I want to oscillate the volume/velocity of the chord harmony notes (the 3rd, the 5th) to give it a more human feel.

ANY THOUGHTS? I have my Mozaic script posted on PatchStorage:
https://patchstorage.com/chord-maker-from-single-notes/

TWO requirements
1) Turn a single MIDI note into a chord, whatever chord I want (not what the software dictates)
2) Be able to send MIDI CC messages out to change the volume or velocity of the MIDI notes being played (see the sketch below)
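
To sketch requirement 2 without the host-synced LFOs: a simple free-running LFO can be hand-rolled in @OnTimer and sent out as a CC to the chord-tone channels, assuming Mozaic's millisecond timer keeps firing while the host transport is stopped. This is only an illustration (triangle wave, CC 11 picked arbitrarily), not my actual script:

    @OnLoad
      // Triangle-wave state: value 0..127, step added every timer tick
      lfoValue = 0
      lfoStep = 3
      // Tick every 25 ms; the timer is not tied to host playback
      SetTimerInterval 25
      StartTimer
    @End

    @OnTimer
      // Bounce lfoValue between 0 and 127 to form a triangle wave
      lfoValue = lfoValue + lfoStep
      if lfoValue >= 127
        lfoValue = 127
        lfoStep = 0 - lfoStep
      endif
      if lfoValue <= 0
        lfoValue = 0
        lfoStep = 0 - lfoStep
      endif
      // Shape the 3rd and 5th (channels 1 and 2 from the chord sketch above),
      // out of phase with each other so the voices don't move in lockstep
      SendMIDICC 1, 11, lfoValue
      SendMIDICC 2, 11, 127 - lfoValue
    @End

A CC like this affects every voice on its channel, which is why the chord tones were split onto separate channels in the first sketch.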

Frustrated in Philly

Comments

  • @Vmusic: what makes you say that the LFOs can’t be independent of the host sync?

  • Check the docs. They can be free-running.

  • I don’t understand what you mean by problem 2. Do you want to thru the MIDI or not?

    You can intercept MIDI without thru-ing it. MIDI you intercept doesn’t get passed back out… unless you send it out explicitly.
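
    For example (just a sketch, and it assumes Mozaic's @OnMidiCC event exposes the controller number and value as MIDICC and MIDIValue): incoming CCs handled by the script don't get passed back out unless the script sends them on.

      @OnMidiCC
        // The intercepted CC goes no further by itself; keep it for later use
        level = MIDIValue
        // To pass it through anyway, it would have to be sent explicitly:
        // SendMIDICC MIDIChannel, MIDICC, MIDIValue
      @End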

  • @Vmusic said

    [2] the Mozaic script continues to send velocity change /MIDI CC messages to the exact same channels and notes to vary the volume of the notes of the chords

    I just want to clarify that you cannot "modulate" note volume using velocity. A Note On message includes the velocity as the note was struck. Sending another Note On with a different velocity to a synth will have no effect, because it's already playing that note. You would need a Note Off before sending a new Note On with a different velocity, but that's likely to produce a gap and a retriggered attack. You can use aftertouch messages to modulate, either Channel Pressure (same for all notes) or Polyphonic Pressure (which few synths implement). And of course you can use a CC, but that will apply to all notes (synth voices) unless you're using an MPE synth and sending it MPE messages (in MPE, each note is on a different MIDI channel, along with the modulators for that note).

  • Incidentally, why not just use the LFOs in the synth that's sounding the notes? They can certainly be decoupled from the note starts. That would be much simpler than trying to modulate individual notes from outside, via MIDI.

  • @espiegel123

    So I was wrong. Totally wrong. Mozaic can receive MIDI messages from the host or from another MIDI app running in the host. I sincerely apologize.

    Getting apps to work or communicate with each other is often not easy.

  • McDMcD
    edited November 2023

    @wim wrote and shared a Mozaic script to turn a note into a chord with scale settings and such:

    https://patchstorage.com/the-chordulator/

    You should start there and apply the light strum that @wim added to humanize. An LFO for volume is really a tremolo, and it sounds mechanical to me. On PatchStorage there are apps to humanize MIDI, which is usually a bit of timing and maybe volume variation. @_ki wrote a randomized script (a rough sketch of that kind of velocity randomization is below). In my experience A2M picks up notes from a mic and adds extras you might not want in your chords. I should study it with the chord script and see if I can sing an orchestra.

    You can create a series of Mozaic scripts in AUM and experience no noticeable latency, and you can run a lot of them too. AUM has the best MIDI routing, and Loopy Pro might also be a good choice to make a loop that plays chords and play over it.
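
    As a rough illustration of that kind of humanizing (only a sketch, not @_ki's script, and it assumes Mozaic's Random function takes a min and a max): nudging each note's velocity by a small random amount already takes the mechanical edge off without any LFO.

      @OnMidiNoteOn
        // Randomize velocity by roughly +/- 10 and clamp to 1..127
        // (velocity 0 would read as a Note Off)
        humanized = MIDIVelocity + Random (0, 20) - 10
        if humanized > 127
          humanized = 127
        endif
        if humanized < 1
          humanized = 1
        endif
        SendMIDINoteOn MIDIChannel, MIDINote, humanized
      @End

      @OnMidiNoteOff
        // Pass the matching release straight through
        SendMIDINoteOff MIDIChannel, MIDINote, 0
      @End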

  • wimwim
    edited November 2023

    . deleted for now .
    After reading code at https://patchstorage.com/chord-maker-from-single-notes/, I need to review my post.

  • wimwim
    edited November 2023

    @uncledave said:
    Incidentally, why not just use the LFOs in the synth that's sounding the notes? They can certainly be decoupled from the note starts. That would be much simpler than trying to modulate individual notes from outside, via MIDI.

    That was my first thought as well. It's usually easier and cleaner to modulate within a synth.

    But then on more careful reading, it sounded as though not all simultaneously playing notes would get the same modulation. Only a very few synths have per-voice modulation internally. So, that sounds more like it would require not only an MPE synth, but also an MPE modulation source. That's a whole 'nother ball-game.

    [edit] After looking at @vmusic's code, it looks like the intent is to use different MIDI channels for the various chord degrees, presumably sending to different synths or instances of a synth. So maybe MPE isn't required.

  • I thought the LFO is to make it sound less robotic by introducing small changes… some LFOs do sample and hold.

    A human touch is more likely to be imprecise in time, which is what sounds human. I think this could be addressed with a randomized or probability function operating on the MIDI chord stream.

  • wimwim
    edited November 2023

    @Vmusic - is that the plan? Send the 5th and the 7th on separate MIDI channels directed to separate instances of a synth, or separate synths? If so, then I don't think you need LFOs in the script at all. Just route the CC output of any free-running LFO app to the volume control of each synth. A single instance of Rozeta LFO has all the LFOs you need for the root, 5th, and 7th.

    Rozeta does need the transport running, though. So another LFO app such as midiLFOs might be more appropriate if you don't want the transport running.

    The volume control of the synths would always be modulated. But there's no harm in that if no notes are playing on them.

    Or, better yet, don't use an external LFO at all. Just use the LFOs in the synths, if possible.
