Creation of harmonically related polyrhythms by mirroring played note relationships.
This is an idea I've been rolling around for a bit. It was something @blueveek said in his Atom 2 thread that started me thinking about it... the concept that rhythm is the same as frequency.
The theory (hypothesis):
Take the harmonic relationships of "standard keyboard play" and translate the corresponding note values into rhythms by applying a common denominator to each note's frequency.
Polyrhythms can then be constructed by live-playing familiar note ratio relationships using the keyboard (chords, scales, etc.).
Concept 1: Can the mind recognize the relationship between a note's frequency and a rhythm produced by dividing that frequency by a denominator?
Concept 2: Can the mind recognize the relationship between multiple rhythms that are all derived from their note frequencies using the same denominator?
A major chord (triad) possesses a fixed relationship of frequency ratios between its three notes. If those three frequencies are each divided by the same denominator to produce corresponding rhythms, will the resulting polyrhythm elicit pattern recognition in the listener similar to that of the chord?
Concept 3: Create an interface that translates notes to rhythms using a denominator. Allow mapping of the keyboard so that chosen keys (or groups of keys) send MIDI on a chosen channel to play different instruments.
The result should be an ability to control the play of polyrhythms on different instruments through live play of the keyboard.
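As a minimal sketch of that note-to-rhythm mapping (Python; `note_to_bpm` is a hypothetical helper, the divisor of 200 is just one possible choice, and equal-temperament tuning is assumed):

```python
def note_to_bpm(midi_note, denominator=200):
    """Map a MIDI note to a rhythm in BPM: frequency (Hz) x 60,
    divided by a common denominator to reach a playable tempo."""
    freq = 440.0 * 2 ** ((midi_note - 69) / 12)  # equal temperament, A4 = 440 Hz
    return freq * 60 / denominator

# A C major triad (C4, E4, G4) becomes three related tempos:
for note in (60, 64, 67):
    print(note, round(note_to_bpm(note), 2))
```

Each held note then simply fires a MIDI pulse at its own derived tempo on its assigned channel.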
Comments
Intriguing questions.
I'll give them some thought.
Do you have a working example?
For instance you mentioned a major triad?
Assigning different amounts of rhythms to each note?
I did something like this in my most recent Drambo project using Euclidean sequencers.
I don't know if it's in the same ball park or totally out there.
Anyway I'll continue.
So let's take a standard major triad which is root, 3rd and 5th.
Root has 10 beats
3rd has 15 beats
5th has 20 beats
Beats being the amount of steps.
Instead of using the standard keyboard I used a rotary, so that I could dial through the notes, each of which had a different number of steps assigned to it.
It didn't produce chords as such until I added delays
and those chords were passing, not obvious.
Now, to hear this in action so to speak you can clearly hear it at the beginning
of my most recent piece "Euclidean Sequencer iii - Soundcheck".
I'm not going to post it here as it's in another thread titled "YouTube subscribers 1k+"
I could exchange the dial for a keyboard so that when one plays
the keyboard each note will have a different polyrhythm assigned to them.
It could get complex if I were to map an entire keyboard; that said, it can be done.
@horsetrainer Here’s one experiment in miRack based on how I interpreted your first theory. It has three lanes of a polyrhythm where the second lane is 4/12ths faster than the first and the third lane is 7/12ths faster than the first. 12/12ths would be twice as fast. This should model a major chord. I ran it at slowest, slower and slow but couldn’t get any feeling of a chord from it. I ran it at audio rates too (not shown) and that doesn’t have any chord-like feeling either.
I’m willing to believe that this is rather too simplistic to get at your idea but it was fun to put together.
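One thing worth pinning down in a setup like this is what "n/12ths faster" means. A linear reading (1 + n/12) and an equal-temperament reading (2^(n/12)) agree at 12/12 (both give twice as fast) but disagree at 4/12 and 7/12, and only the equal-temperament reading lands near the just-intonation major-chord ratios 4:5:6. A quick comparison (my own sketch, not taken from the patch):

```python
from fractions import Fraction

def linear(n):          # "n/12ths faster" read literally
    return 1 + Fraction(n, 12)

def equal_temp(n):      # n semitones up in equal temperament
    return 2 ** (n / 12)

just = {4: Fraction(5, 4), 7: Fraction(3, 2)}  # just major third and fifth

for n in (4, 7, 12):
    print(n, float(linear(n)), round(equal_temp(n), 4), float(just.get(n, 2)))
```

The linear reading of 4/12 gives 4:3, which is a perfect fourth, not a major third, so the two interpretations would produce audibly different "chords".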
@Gravitas
The polyrhythm at the start of your video sounds good. But I'm confused about how you got the relationships for your beats, assuming a root, a 3rd, and a 5th?
As I understand the math... Using a Major triad as an example:
If a Root has 10 beats
3rd has Ratio to root of 5:4 = 12.5 beats
5th has Ratio to root of 3:2 = 15 beats
In theory, actual note frequencies can be used to convert from note to BPM.
(these are rounded Freq values)
C4 = 262 Hz
E4 = 330 Hz
G4 = 392 Hz
I'll convert to BPM by multiplying by 60:
C4 = 262 Hz = 15720 BPM
E4 = 330 Hz = 19800 BPM
G4 = 392 Hz = 23520 BPM
Divide all note BPM's by a common denominator of 200 to obtain a beat for each note:
ROOT: C4 Beat of 78.6
3RD: E4 Beat of 99
Root by Ratio of 5:4 = 98.25
5TH: G4 Beat of 117.6
Root by Ratio of 3:2 = 117.9
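The arithmetic above, reproduced as a sketch using the same rounded frequencies and the denominator of 200:

```python
# Rounded note frequencies (Hz) -> BPM (x60) -> divided by 200.
NOTES = {"C4": 262, "E4": 330, "G4": 392}
DENOM = 200

for name, hz in NOTES.items():
    print(f"{name}: {hz} Hz = {hz * 60} BPM -> beat of {hz * 60 / DENOM}")
# C4 -> 78.6, E4 -> 99.0, G4 -> 117.6
```

The small mismatches against the pure ratios (99 vs. 98.25, 117.6 vs. 117.9) come from the rounded frequencies and from equal temperament not being exactly 5:4 and 3:2.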
I've no working example yet.
I'm contemplating a Mozaic script, but I keep getting stymied by the math and code needed to recognize keyboard notes, make different notes output multiple note-based rhythms, and allow a MIDI channel to be selected for each individual rhythm.
My latest thought for a Mozaic experiment is to assign a lower octave to one channel and an upper octave to a second channel, then include one knob per octave for selecting the denominator that divides the note frequency.
The key concept I'm interested in is being able to "Live-Play in Polyrhythms" using the keyboard.
Where I think this theory may get interesting is when you play a C4 Major triad and you obtain...
(Imagine this all played on one synth patch)
A note at the pitch of C4 playing at 78.6 BPM
A note at the pitch of E4 playing at 99 BPM
A note at the pitch of G4 playing at 117.6 BPM
Next, simultaneously play a bass note of C2
(it will play a different synth with a bass patch).
Using the above formula, that C2 bass note will resolve to a rhythm of 19.5 BPM.
I think this is interesting because you can play polyrhythms using a keyboard, with note pitches that relate mathematically to both the pitch and the rhythm of each note.
But also... playing the keyboard could be used for triggering live-generated polyrhythms that play ordinary sequencer note sequences.
Imagine an ordinary step sequencer linked to this keyboard-controlled polyrhythm generator. You would not be restricted to a pre-programmed rhythm sequence. The rhythm sequence could be fluidly "played" by using the keyboard to press familiar chord patterns, which are essentially already polyrhythms in the audio range. The interface would slow the audio range down to the BPM range, but the polyrhythmic ratios would remain.
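One way to picture that fluidly played rhythm stream: merge each held note's pulse train (period = 60 divided by its derived BPM) into a single trigger timeline. A minimal sketch, where `note_to_bpm` and `trigger_timeline` are hypothetical helpers assuming the frequency-divided-by-200 formula:

```python
def note_to_bpm(midi_note, denom=200):
    freq = 440.0 * 2 ** ((midi_note - 69) / 12)  # equal temperament
    return freq * 60 / denom

def trigger_timeline(held_notes, seconds):
    """All (time, note) trigger events for the held notes,
    each note pulsing at its own note-derived tempo."""
    events = []
    for note in held_notes:
        period = 60.0 / note_to_bpm(note)
        t = 0.0
        while t < seconds:
            events.append((round(t, 4), note))
            t += period
    return sorted(events)

# First second of a C major triad (C4, E4, G4) as a polyrhythm:
print(trigger_timeline([60, 64, 67], 1.0))
```

Changing the held chord changes the whole timeline at once, which is the "live-playable" part of the idea.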
I think the cool thing about this is a person who understands how to play chord progressions, can use the same chord progressions they use to play melodies, to instead construct complex ratio related polyrhythms, which are based on standard, known, music theory.
And they can play them fluidly, and play them live.
@horsetrainer
My recorded example wasn’t for the harmonic relationship of the
notes though it can be easily adapted for such.
My example does have more notes in the scale, though I had constrained it to an octave per Euclidean sequencer, keeping within a C minor scale; otherwise it would've become unwieldy, as happened when I was initially putting together my first Light Synth.
What I was trying to point out is that each note in the
melodies had a different amount of steps per note.
The following steps are arbitrary,
C had 8 steps
D# had 17 steps
A# had 23 steps
It was more for the mechanical principle of the melodies.
My apologies for not making that clear.
I started thinking about how to achieve what
you were describing rather than the precise ratios
hence why I threw in numbers which could easily have been x,y,z.
I’ve done the same in this reply, see above.
Now that I can see in detail what you're striving for, it's even easier for me to visualise it in my mind's eye.
Basically, regardless of the ratios (because the ratios can always be adapted to the use case), you would like to be able to play the keyboard and hear instant polyrhythms which maintain their rhythmic and harmonic ratios?
Playing chords like this would be quite fascinating.
The simplest version would be an arpeggiator per note set to different divisions.
You could set note ratios using the maths modules.
Another thing about that piece is that the drums are fluid.
Using the LPX I was triggering 8 Euclidean sequencers and a 4-voice 16-step drum sequencer in realtime, adjusting the steps on the fly with the rotaries of the LC XL.
I’ve since put together a 32 step version of the drum sequencer.
That piece that I directed you to was entirely realtime.
The only thing presequenced was the bass riff.
I think instead of using code maybe think practically.
In Drambo you can use the note filter module to select ranges, and you could very well use the Euclidean sequencer or even the CV sequencer to get the rhythms.
It could get quite complex because you will need a lot of modules, and it requires patience, but you can put it together in there.
I’m thinking about it.
It’s a great concept dude.
I will add that the rotaries for my piece can easily be replaced with a keyboard.
I had set myself the challenge of being able to play melodies in realtime using the rotaries
and playing chordal arps using the LPX as well as realtime drum programming.
@Gravitas and @xor
Thanks for your interest and experimentation.
I've been trying to make experiments work using Atoms or Heliums in AUM, but I haven't yet found an accurate method.
But from my first tests I can hear the relationship in the C triad rhythm when it's played fast.
I'll try Drambo next.
This definitely seems like a question that can only be answered by building it and seeing what it sounds like.
I have a feeling that harmonic relationships at the audio range depend on many cycles per second to achieve the waveform interactions that we perceive as chords and such. The intervals may be too far apart at the BPM range for the mind to recognize.
But if true, this still leaves the possibility for modifying the intervals for each given note to a multiple that does create a BPM level that works. It just might not be based on a concept of all note frequencies divided by a common denominator.
I think the thought of being able to play polyrhythms using known keyboard note relationships, is what's drawing me into this idea.
@horsetrainer
I do think the direction that you could try would be the Euclidean sequencers
in Drambo as that would provide instant polyrhythms.
Select a range using the midi note filters and start connecting them to set notes.
The other thing which I discovered when putting together the Light Synth
was keeping the ratios of oscillators whilst changing pitch which could
once again be applied to your theoretical model.
So instant polyrhythms using the Euclidean sequencers
and keeping the ratios using maths modules?
Let’s discuss this some more over time and see what turns up.
@horsetrainer thanks for posing this question. I've wondered this exact same thing!
Here’s my take on poly rhythms that can speak to someone.
Pick 2 dissimilar numbers between 2 and 9.
Now create rhythmic voices for each of these two numbers (e.g. 2:3, 3:4, 3:7). Bongo and clave.
I think they need to align on a "one" periodically, and that 3 voices is pushing it; if you do add a third, make it a multiple of one of the first two numbers, e.g. 2:3:4, 3:4:8, 3:7:6.
The alignment on a common "one" is a clue that polyrhythms are at play.
10:12.5:15 is really 20:25:30 which reduces to 4:5:6.
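That reduction can be done with exact fractions so nothing is lost to rounding (sketch; `math.lcm` needs Python 3.9+):

```python
from fractions import Fraction
from math import gcd, lcm

# Reduce the beat ratio 10 : 12.5 : 15 to lowest whole-number terms.
beats = [Fraction(10), Fraction("12.5"), Fraction(15)]
scale = lcm(*(b.denominator for b in beats))  # clear fractional beats
whole = [int(b * scale) for b in beats]       # 20 : 25 : 30
g = gcd(*whole)
reduced = [n // g for n in whole]
print(reduced)        # [4, 5, 6]

# The voices share a common "one" every lcm(4, 5, 6) beats.
print(lcm(*reduced))  # 60
```

So a 10 : 12.5 : 15 pattern realigns on the "one" every 60 of its smallest beats.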
Any conceptual alignment with chordal frequencies is probably not unlike mixing 3 paints... you're going to get a new rhythmic color that is unrelated to the sound of the paint. Just my thoughts... I'm a KISS guy: keep it simple, sweetheart.
The game of thrones theme is a 2:3 poly rhythm in the orchestration. Most Celtic music mines this pattern too.
3:4 gets a bit more esoteric, but we can hear that triplet against a four-on-the-floor. We want to feel the triplet... or is that just me?
@horsetrainer I'm not sure I understand polyrhythm at audio rate. Once you push into audio rate the brain doesn't hear a rhythm anymore; it hears a tone. My guess at what you're thinking is to play a note 261 times per second? If you do this, the perceived pitch depends on the duration of the note being played. Relatively long durations, around 3 ms, will sound more like the note you are playing, with fast clicks. As you reduce the duration of the played note you'll begin to hear short pops of middle C. If you combine this with other rhythms you will hear a chord, but it won't really be pleasant sounding.
Here’s an example playing a C6 (1046Hz, two octaves above middle c) at a rate of C5 (523Hz, one octave above middle c) [and a C6 at E5 rate and C6 at G5 rate]. I begin with the C6 sine wave, then the C5, E5 and G5 square waves, followed by the C6 being played at the C5, E5 and G5 rates, followed by the chord.

And another playing a C6 (1046Hz, two octaves above middle c) at a rate of C2(65Hz) [and E2 and G2].
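The retriggering described above can be sketched offline: a sine at the chord pitch, restarted at the "rhythm" frequency, with a burst length that decides whether you hear the tone or the clicks. A rough pure-Python sketch that only generates sample values (writing the audio file is left out; `click_train` is my own name for it):

```python
import math

SR = 44100  # sample rate in Hz

def click_train(tone_hz, rate_hz, burst_ms, seconds=1.0):
    """A tone_hz sine retriggered rate_hz times per second;
    each retrigger sounds for burst_ms, then silence."""
    burst = SR * burst_ms / 1000          # burst length in samples
    period = SR / rate_hz                 # samples between retriggers
    out = []
    for i in range(int(SR * seconds)):
        pos = i % period                  # position within current retrigger
        out.append(math.sin(2 * math.pi * tone_hz * pos / SR)
                   if pos < burst else 0.0)
    return out

# C6 (~1046.5 Hz) retriggered at the rate of C5 (~523.25 Hz):
samples = click_train(1046.5, 523.25, burst_ms=0.5)
```

Raising `burst_ms` past the retrigger period makes the output continuous, which matches the point above that duration decides whether you hear the tone or the pops.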

Check this book out:
https://www.amazon.com/Mathemagical-Music-Production-Derrick-Heerden/dp/1494312042/ref=sr_1_1?dchild=1&keywords=mathemagical+music&qid=1622003840&sr=8-1
It's interesting to hear what these experiments sound like.
When I say "polyrhythm at audio rate" I'm comparing the mathematical relationships in polyrhythms to the mathematical relationships of complementary musical frequencies.
Two or more notes (harmonic intervals) on the chromatic scale can be selected so their waveforms interact to form the harmonic relationships we call chords and scales. The math of the chromatic scale reveals that the different frequency waveforms of complementary notes form repeating mathematical alignments (ratios) that the mind understands as musical.
Polyrhythms are similar to note-frequency relationships (ratios), such that different rhythm patterns played together can also form repeating mathematical alignments that the mind understands as musical rhythm.
My questions are...
What makes these frequency and rhythm relationships sound "pleasing"?
Can we identify similar mathematical relationships between frequency and rhythm relationships?
If so what are they?
Hence my interest in finding a method for conducting experiments where I can "transpose" note frequencies played from a keyboard into MIDI note pulses, each playing at a rhythm that is a fixed ratio of the frequency of the note it corresponds to.
The midi rhythms calculated from the played notes can then be used to play instruments.
The second level of this experiment is to use the rhythms calculated from the played notes to actually play instruments at the same pitch as the notes the rhythms were calculated from.
Hearing the pulse "rhythmed" experiments is helpful. The end use for this experiment, if successful, is to create an iPad-based method that uses "playing the keyboard" to create complex polyrhythmic instrumental and rhythm tracks.
"Mathemagical Music".
@horsetrainer I've experimented with this stuff first in 1994 on my Roland XP-80. The sequencer has an "RPS" feature that lets you trigger sequence snippets by just hitting a key. You can map a different sequence to each key and combine them spontaneously.
As for the relations between audible and slow/rhythmic frequencies: You can find all kinds of mathematical relations between them but that's not how our brain works, as rhythm and frequency are perceived differently.
What I've done a lot recently though is experiment with resolving chords into melodies, like done by a number of Max/M4L patches like Chordimist for example.

We basically have this on iOS too, hidden inside my favorite chord tool: ChordPolyPad.
I'm recording these separately and play them back at different speeds when required.
I like this idea.
I wonder, you might be able to get a working concept by sampling a short trigger or gate into Flexi Sampler. Let's say the gate has a duration of 3ms, but the sampled file is 1 second long. Set your play mode to 'loop'. Now you duplicate this 4 times, and trigger each with the Midi to Poly module. Because each Flexi contains a gate rather than a sample, you can route the output of each of the samplers to a different percussion element - kick, hat, snare, ride. When you play a chord, you should be producing a polyrhythm. The higher the 'pitch' then the faster the corresponding sampler will loop... but because the sampler contains only a gate, and that gate is triggering an external percussion element, then you don't get any pitch change - just a tempo change. The timing will be dependent on the length of the gate sample file in Flexi.
Now the cool part will come when you layer this 4 voice polyrhythmic gate sample player with an instrument. So you can layer it up with some strings, triggering both from the same midi source. When you play an A minor from A3, then you get specific polyrhythms, when you play a different chord, you get different polyrhythms.
I used to do this with samples in my Yamaha VSS-30 and MPC. You sampled your voice saying 'ahh' or a finger snap or something. Then when you played an interval of an octave, the octave would be looping at 2x the speed of the root, and when you played a 4th or a 5th then.. I don't know the rhythmic relationship, but it sounded good. My method above takes from this concept, but improves upon it because each note played is decoupled from pitch, but retains its rhythmic relationship to the other voices.
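The rate relationships in that gate-loop trick can be written down directly: pitching a looped sample up by n semitones multiplies its playback (and therefore retrigger) speed by 2^(n/12), while the percussion sound it gates stays fixed. A sketch (the one-second gate file length is taken from the example above; the function name is my own):

```python
GATE_FILE_SECONDS = 1.0  # length of the sampled gate file

def retriggers_per_second(semitones_above_root):
    """Loop rate of the gate sample for a note n semitones up:
    playback speed scales by 2**(n/12)."""
    return (2 ** (semitones_above_root / 12)) / GATE_FILE_SECONDS

# A minor triad from the root: root (+0), minor third (+3), fifth (+7)
for iv in (0, 3, 7):
    print(iv, round(retriggers_per_second(iv), 3))
```

This also explains the VSS-30 observation: an octave (+12) loops at exactly twice the root's rate, while other intervals give the in-between ratios that "sounded good".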
Eureka... It Eff-ing Works !!
Mozaic is the tool for it.
My experimental script is crude, but I got it sending MIDI notes so that each sounding note plays at a rhythm based on that note's frequency.
Yes, I can hear a relationship between the frequency and the rhythm. It gets really cool when playing multiple notes.
I'll post a video after I figure out this complex math and code.
This all sounds very intriguing
What did you decide on for a base rhythm / Hz? C2 for example is about 65 Hz, so are you dividing that down? Even divided by 32 it'd still be about 2 Hz, which = 120 bpm. Just curious how you've chosen to relate note frequency to rhythmic frequency.
edit: nevermind! saw you covered your formula above in response to @Gravitas Looks sound to me!
Awesome.
Looking forward to hearing it.
Are you a bot or for real?
😅 Indeed, that sounds more difficult than playing a violin with your feet.
Lolololol nah it's actually not that difficult. you've just got to think ahead. 😁
Build the groove and then play with the melodies. 😇🙏🏽
...from your stereo. Got it. 😁
I got to playing around with this idea a while ago; not sure how musically useful it turned out, but interesting all the same. Thought I'd upload the patch in case anyone wants to play around with it. Drambo, obviously.
https://patchstorage.com/pitch-rhythm/
There is indeed some new age woo woo in it, but it also has some practical advice for OP’s interests, in a desktop context.