I think understanding modulation routing is mandatory before understanding quantum physics.
The MS-20 in all its forms is not only a great synth but a great teacher of modulation. Even without a single patch cable, it has the default routing printed on the front panel, letting us know what's happening and what's going to happen when we twist a knob. It really asks to be messed with and explored. In iMS-20 it's helpful to just focus on the synth section and ignore the sequencer and drum controls.
+1
I’ve always wanted to know why scales are treated as important, or why they’re even a thing at all. Aren’t they simply a specific restriction on which frequencies to avoid? Why? They’re all available all the time throughout the universe, so why so many sets of disallowed frequencies at any one time? One should be able to specify any frequency one wants, and then go to any frequency one desires, at any time. What’s the point of restricting so arbitrarily? What about when the patch uses vectors between an initial frequency and a target frequency, such as when using a ramp generator to modulate the VCOs? Or even, in a basic and common case, using portamento to impart a curve from one frequency to another. The sound event can’t be said to have a specific frequency at all; it glides from one frequency to another without ever starting or settling on a fixed frequency. What use are scales in such cases, then? Am I just supposed to duck or gate or silence the parts where it glides through one of the disallowed frequencies? And how wide is the window of discrimination of what that note is? A few Hz below and a few Hz above it? More than that? Precisely the frequency of the note that is not in the scale? It’s all too arbitrary to be useful.
And come to that, what’s the point of the black keys on a conventional keyboard? They’re not only unergonomically placed, but the pattern of distribution is disconcerting. Each time I see it I keep wanting to redesign it so that the spacings are even and symmetrical, not irregular as they presently are with no meaningful pattern. It’s uncomfortable to look at. If they’re too far back like they are, how can I reach them efficiently if I ever want to press one? Is this another case of systemic exclusion and pointless austerity and denial, when the universe in fact provides availability and abundance? What if I want a frequency that is in between a white key and a black key? Where would that be?
And why is there such idiotically mixed up use of words in music — if I’m talking about keys on a keyboard and the song has a key change, what am I supposed to do? Dismantle the synthesiser and unscrew the keyboard and change the keys? What to? Just mix them about a bit? Not all of them fit together side by side, so there’ll be unsightly gaps. Or are they talking about changing the locks on my door — that kind of key change?
And why is it that when learning music we jump ahead instead of starting at the beginning? For some reason, I was pushed to learn about playing C first. Aren’t we going to do A and B? Surely they’re the ones to start with? I feel as though I missed out on the fundamentals without starting at A. On the other hand, it was all a lie anyway — the songs in C didn’t actually use just C, they used a whole collection of other neighbouring keys too, all pressed one after the other.
Honestly, I think all of music theory should be redesigned — it’s a total hotchpotch. I doubt anyone understands it.
@u0421793 Music is a reflection of cultural history and how we respond to various frequencies and their harmonies/discord. Some instruments like the piano are made with a 12-tone scale in mind. There are guitars designed with more strings and different layouts that can be played using different scales. Similarly, throughout the world there are different instruments played on different scales. Scales describe a relationship between the frequencies. There may be sine waves that produce a pure tone, and other sources like a guitar where you have a fundamental frequency and several additional overtones. Timbre refers to the characteristic sound of an instrument: a trumpet, guitar, violin, flute, and piano all sound different even if you play the same note on them.
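To make the overtone point concrete, here's a minimal Python sketch with numpy (the overtone weights are loose caricatures I made up, not measured instrument data): two tones share the same fundamental but weight their overtones differently, and that difference in spectrum is what we hear as timbre.

```python
# Same note, two different timbres: only the overtone weights differ.
import numpy as np

SR = 44100                       # sample rate in Hz
t = np.arange(SR) / SR           # one second of sample times
f0 = 261.63                      # fundamental: middle C in 12-TET

def additive_tone(f0, weights):
    """Sum the fundamental plus overtones at integer multiples of f0."""
    tone = np.zeros_like(t)
    for n, w in enumerate(weights, start=1):
        tone += w * np.sin(2 * np.pi * n * f0 * t)
    return tone / np.max(np.abs(tone))   # normalise to avoid clipping

flute_ish = additive_tone(f0, [1.0, 0.2, 0.05])           # few, weak overtones
brass_ish = additive_tone(f0, [1.0, 0.9, 0.7, 0.5, 0.3])  # strong overtone stack
# Same pitch, different spectrum: the difference is the timbre.
```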
Many people do understand music theory and it is not a static field as people continue to explore.
There are alternate keyboard layouts such as hex. The traditional black and white keys are still used as many people have learned to play piano with this layout.
Some music such as percussion is based upon rhythm rather than a scale.
You might not find scales or music theory useful for you when you create music but many people do.
Would you say that it was a bit of a mistake, in retrospect, to have MIDI based more around which key on a keyboard is being pressed, rather than which frequency is intended? In other words, if you want to record/receive/transmit and ultimately play the conventional octave of ABCDEFGH notes (well, maybe not the H), MIDI is ideal.
If we want to specify the notes in-between the notes, and the notes in-between those, or specify a gradient or vector of starting point and ending point, MIDI is clunky at that. Similarly, MIDI finds it more difficult to describe a blown instrument because it is structured more around note-on and key-up events, whereas many times a wind instrument might have started to be blown before anything is audible, then fade in, then fade out a bit, then fade back in. MIDI treats it as a sequence of note-ons and note-offs, with frantic envelope parameter alterations between each.
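For the curious, here's a hedged Python sketch of how a receiver turns MIDI's key-number-plus-pitch-bend model into a frequency. The ±2 semitone bend range is a common default rather than anything universal, and the function name is mine:

```python
def midi_to_hz(note, bend=8192, bend_range=2.0):
    """Frequency implied by a MIDI note number plus 14-bit pitch bend.

    note:       key number 0-127, where 69 = A440
    bend:       0..16383, centred at 8192 (no bend)
    bend_range: semitones covered by a full bend, commonly +/-2
    """
    semitones = note + (bend - 8192) / 8192 * bend_range
    return 440.0 * 2 ** ((semitones - 69) / 12)

print(midi_to_hz(60))          # middle C, no bend: ~261.63 Hz
print(midi_to_hz(60, 12288))   # bent one semitone up: ~277.18 Hz (C#)
# A smooth glide has to be faked as a rapid stream of bend messages,
# and classic MIDI gives you only one bend value per channel.
```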
I know on my guitar (yes, I know a guitar chord - can't remember its name, but I was taught it in the 90s), when I had my Roland guitar hex-pickup MIDI converter going into my Oberheim Matrix 1000, the separate pitchbend on each string picked up all the natural handling and fingering messiness (more so in my case). At the time I was using an Alesis MMT-8 sequencer (back then I refused to use a computer to make music — they're for proper work, like Illustrator and QuarkXPress, and I can't make music sitting on my arse and slightly moving one finger of my left hand, I've got to jump around doing it!).
I hit about four chords and the MMT-8 displayed on its LCD “Bummer dude, Memory full”. Honest! That was the error message. It occurred to me then that in the days of CV/Gate we could do bendy swoopy curvy sound synthesis a lot easier than in MIDI, which prefers everything to fit into slabs and rectangles and steps.
If only the first implementation of MIDI came from the direction of a theremin and not a Prophet 5!
@u0421793 MIDI has its limitations and is certainly a product of when and why it was made by music hardware manufacturers to connect various music hardware together, which was all based on the western 12-tone scale. OSC can be an alternative and has more flexibility and bandwidth. There are ongoing attempts to expand and update the MIDI standards.
Two words @u0421793 ;-)
Harmony & Germans
;-)
Music is organized sound. Limitations provide structure.
MIDI is one way to organize; there are others.
Check this app:
Wilsonic by Marcus Satellite
https://appsto.re/us/NrhMY.i
Check this video:
@u0421793, @Paulinko, @MusicInclusive, interesting indeed. Me, I think it is just universal uniform conformity at work, and this probably pervades everything, not just music/sound. With regard to music this affects not just pitch, but rhythm and timbre. Not to mention fashion!
@knewspeak - H is so much better dressed ;-)
OK. I shall stop now ;-)
Let's see. What else do I not understand? Hmmm. Drum humor... Ba dum bang.... Tishhhh
:-D
OK. Now I'll stop...
With you there @Flo26!
Funny though - I can imagine dancing about architecture :-)
The Power of Expectation
Hey @u0421793, maybe you'd like the Virtual ANS. You just draw in frequencies. There is a keyboard along the side just to let you know roughly where you're at. For example, Middle C is not fixed at 261 Hz but can be anywhere from 247 Hz to 279 Hz.
Great thread. The shocking thing I don't know (I imagine while sat here mewling to myself) among all the other things I don't know is what you people mean when you talk of, say, SunVox and invoke 'trackers'.
There. I feel better now.
I'm just learning so much, sometimes I wish we had 48 hours in a day. With Sunvox and Mitosynth, I'm finally learning how to tweak sounds. Like Doug said, modulation matrices drive me nuts, but like him, that's why I like presets. But lately, I'm really getting into creating my own stuff. Feels so good to hear stuff I tweaked and got to sound just the way I like it. So far Mitosynth is helping me get there. I need more time!
@Musikman4Christ I have been campaigning for 30-hour days for some time now. Will let you know if I make any progress.
No one says you need to stay within a scale. They're a thing because scales are mathematical constructs that express the intervals between notes (ok, numbers). Music is art and music is math. Named scales are a musical expression of the math part. Any set of notes repeated across octaves could be considered a scale—just might sound like shit and not have a name.
Scales are portable in 12 tone music. That is, a pentatonic scale in G has the same set of intervals as a pentatonic scale in B. Scales are a convenient way to describe a collection of notes to another musician. True, scales can provide a simple way for musicians to 'sound good' together quickly but again, just because scales exist does not mean there also exists a law that says you must confine yourself to it.
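The "same set of intervals" point is easy to demonstrate in a few lines of Python (the note names and the interval list are just the standard major pentatonic):

```python
# Scales as interval patterns: the same pattern from a different root
# gives the same scale in a new key.
MAJOR_PENTATONIC = [0, 2, 4, 7, 9]   # semitone offsets from the root
NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def scale(root, intervals):
    return [NAMES[(root + i) % 12] for i in intervals]

print(scale(NAMES.index('G'), MAJOR_PENTATONIC))  # ['G', 'A', 'B', 'D', 'E']
print(scale(NAMES.index('B'), MAJOR_PENTATONIC))  # ['B', 'C#', 'D#', 'F#', 'G#']
# Identical intervals, different root: that's what makes scales portable.
```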
I know about 4 scales on the keyboard, for the record.
Black keys are fine. Play the white keys further up and move your shoulders as you go up and down the keyboard—watch any good keyboardist. I'm not sure how you could look at an 88 key piano and not see a pattern with the black keys but perhaps I'm missing your deeper meaning.
On an instrument that isn't the piano.
Others have already discussed why we use the 12 tone equal temperament scale so I'll skip it but I'll add some other reasons: money, mechanics and replicability.
Pianos are expensive to build; things need to be fixed (standardised) in order to fix the cost.
Mechanics/trade-off: sure, a piano-type instrument could have more than 12 notes per octave, but how many octaves could you fit in? 12 may be limiting for some, but that affords us the engineering possibility — and, more importantly, the musical possibility — of playing very, very deep tones and very high tones on a single instrument.
Conforming to those 12 notes also means that I can go anywhere in the world and play music on any piano. Synths didn't start with piano type keyboards. That happened because of market demand: people want to play on interfaces they know and understand. MIDI's limitations are a natural (and nowadays unfortunate) extension of that.
I think you're having a go but I'll reply anyway: that's just English. English is stupid—ask my poor 7-year-old.
Because the black keys do form a pattern (both visually and musically), and C is a) in the center of a piano, b) easiest to spot and c) the only 'key' whose major scale falls on white keys only. But if you abhor scales, I can see how that might not be a selling point. Certainly, flopping around on white keys is a beginner-friendly way to get pleasing results.
I certainly don't but most of it is pretty well vetted and clearly/cleanly defined at this point (most of it was defined a gajillion years ago). I wouldn't call it a hotchpotch. People understand as much of it as they want/need to. It's math and is similar in that way to other mathematical studies.
That's an inspiring post. Glad to read it.
I'd think Sunrizer a good learning synth because everything is on the same page. It doesn't have a modulation matrix but instead has predefined routing, save the LFOs. All a modulation matrix does is expose all of the endpoints—anything that can be modulated and anything that can send modulation—you just connect them and sometimes set the amount. You wind up doing a lot of the same routing over and over again anyway so some synths have just predefined some of the routes.
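If it helps, here's a toy Python sketch of what a mod matrix boils down to (the source and parameter names are made up for illustration, not Sunrizer's internals):

```python
# Each row of the "matrix" is one patch cable: source -> destination * amount.
sources = {'lfo1': 0.5, 'env1': 0.8, 'velocity': 0.6}   # current output values
params = {'cutoff': 0.4, 'pitch': 0.0, 'pan': 0.5}      # base parameter values

routings = [
    ('lfo1', 'cutoff', 0.3),
    ('env1', 'pitch',  0.1),
]

def modulated(params, sources, routings):
    """Apply every routing: add source value scaled by amount to its target."""
    out = dict(params)
    for src, dst, amount in routings:
        out[dst] += sources[src] * amount
    return out

print(modulated(params, sources, routings))
# A synth with "predefined routing" effectively ships a fixed routings list.
```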
More terrible English, I'm afraid. I have no idea why they're called trackers. They usually consist of one or more monophonic vertical grids where each step can contain a note (and dynamic information like velocity and gate) or a rest (no info entered). If you've ever peeked at the MIDI event list in a desktop DAW, it looks quite a bit like that.
They traditionally (90s) offered you access to the sound-making chips on your computer's soundcard. They came into their own when soundcards started offering multitimbrality. If you know how to work them and make a type of music that's conducive to the style (say trance or techno or something), they can be an incredibly fast way to work.
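For anyone who's never seen one, here's a rough Python sketch of the data a single tracker column holds (the layout and field choices are mine, not any particular tracker's file format):

```python
# One tracker column: a vertical grid of steps, each holding a note
# event (note name, volume) or a rest. Playback just walks down the
# column at a fixed rate (the "speed").
REST = None
pattern = [
    ('C-4', 64), REST,
    ('E-4', 48), REST,
    ('G-4', 64),
    ('G-4', 32),   # same note retriggered more quietly
    REST, REST,
]

for step, event in enumerate(pattern):
    if event is REST:
        print(f'{step:02d} | --- --')
    else:
        note, vol = event
        print(f'{step:02d} | {note} {vol:02d}')
```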
Thank you.
Heard and (mostly) understood. I will feel a little better about going out in public now without the scalding shame etc. Seriously, good to learn stuff.
VividTracker is another tracker app based upon tracker software that ran on Amiga computers. You can load up files that were created in the .mod format which contains the samples and note information used in the songs. To see a tracker in action, you can view the app's video in the App Store.
Thank you Mister P. I will take a look.
Don't forget for the frequencies in-between the chromatic scale notes you have ... pitch bend! :-)
Zetagy, that video is excellent. I watched it all the way through and will do so again. A very well-produced vid, as well as very valuable content. Thanks.
By the way, I bought scalegen and gestrument a while back but never used them until this week. I made a couple of versions of a scale. Not sure what to do with it, but it went into gestrument, an app I first found infuriatingly annoying but couldn’t put my finger on why. I accidentally used it upside down and it all suddenly made sense (except the writing was upside down, so nothing made sense).
JohnnyGoodyear, I have an aversion to trackers. They came out when I was working on games magazines (as an art editor, not a games player, I've never really played a game), and some of the games reviewers on the Amiga mags started playing with them, telling me about them, thinking I’d be impressed. However, I’d spent thousands on the second incarnation of my electronic music studio full of proper gear, and I wasn’t really about to take some silly little upstart Amiga or Atari ST program seriously, if it’s so cheap and quick and …what horrible grating graphics, and can it only produce those mechanical dance tunes? Must be a toy. I dismissed trackers purely out of snobbery and this attitude has persisted.
I'm glad someone mentioned envelope followers, as I never quite understood them. At the moment most of my focus is being taken up with really understanding mixing a track: in particular understanding M/S processing, sound placement, and dealing with phase issues, as I like to stack pad and atmospheric sounds like a scooby sandwich.
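Since envelope followers came up, here's a minimal Python/numpy sketch of one, assuming the usual rectify-then-smooth design (the attack/release times are arbitrary starting points):

```python
# An envelope follower rectifies the signal and smooths the result, so
# its output traces the loudness contour rather than the waveform.
import numpy as np

def envelope_follower(signal, sr, attack_ms=5.0, release_ms=100.0):
    """One-pole envelope follower with separate attack/release smoothing."""
    attack = np.exp(-1.0 / (sr * attack_ms / 1000.0))
    release = np.exp(-1.0 / (sr * release_ms / 1000.0))
    env = np.zeros(len(signal))
    level = 0.0
    for i, x in enumerate(np.abs(signal)):      # rectify
        coeff = attack if x > level else release
        level = coeff * level + (1.0 - coeff) * x
        env[i] = level
    return env

# The output can then modulate anything: a filter cutoff driven by this
# envelope is the classic "auto-wah" use of an envelope follower.
```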
For example, I know at a rudimentary level how to push something back in a mix, but have trouble with placing sounds above or bringing things forward on the soundstage; I've done it before, but by accident, without a true understanding. I have problems dealing with phase issues, mainly mono compatibility with pad, strings and atmos sounds: in order to make them more mono-friendly I end up losing what I liked about them and making them sound too thin and lifeless.
With M/S processing, as I hardly use mics to record instruments, drum kits etc. (mainly only voice and environmental sounds), I've only just started scratching the surface of using M/S processing to give my tracks some more width. I've got a few desktop plugins on my wishlist to help me with these issues, but I have to really understand these things and not rely on plugins alone.
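For what it's worth, the maths behind M/S is tiny. A hedged numpy sketch (the width parameter is my own illustrative handle, not any particular plugin's control):

```python
# Mid is what the two channels share, side is how they differ. Widening
# or narrowing a stereo track is just rescaling the side signal before
# decoding back to left/right.
import numpy as np

def ms_encode(left, right):
    mid = (left + right) / 2.0
    side = (left - right) / 2.0
    return mid, side

def ms_decode(mid, side, width=1.0):
    """width < 1 narrows toward mono, width > 1 widens the image."""
    return mid + width * side, mid - width * side

# Mono compatibility check: a mono fold-down keeps only the mid, so
# anything living entirely in the side signal vanishes in mono.
left = np.array([1.0, 0.5])
right = np.array([-1.0, 0.5])
mid, side = ms_encode(left, right)
print(mid)   # [0.  0.5] -- the first sample disappears in mono
```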
So for me, I've been so carried away enjoying the creative process that I've overlooked many basic things when it comes to mixing and mastering. For years I tried to get a mastered sound out of my modest setup and ended up overdoing things and scratching my head, not realising I was trying to run before I could even crawl. So for me these are the things where I feel I should be a lot further ahead in my understanding than I currently am, hence where my focus is at the mo. And yes, I do feel slightly embarrassed, ashamed even, that it's taken me so long to start getting my head around the art of mixing and mastering.
On the subject of scales: it's based on the harmonic series. Frequencies that work harmonically are turned into notes on a keyboard; that's why certain notes work together, like C-E-G, and other notes sound wrong together (sharps and flats). It's the fundamental physics of frequency overtones. There are longer lectures, but here it is in 4 minutes.
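You can sanity-check that claim in a few lines of Python: list the harmonics of a low C and see which 12-TET notes they land near (the choice of fundamental is arbitrary):

```python
import math

f0 = 65.41   # C2 in 12-TET
NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
for n in range(1, 7):
    f = n * f0                           # n-th harmonic of C2
    semis = 12 * math.log2(f / f0)       # distance above the root in semitones
    name = NAMES[round(semis) % 12]
    print(f'harmonic {n}: {f:7.2f} Hz ~ {name} ({semis:5.2f} semitones up)')
# Harmonics 1, 2 and 4 are octaves of C; harmonic 3 lands near G and
# harmonic 5 near E -- which is why the C-E-G triad sounds consonant.
```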
I am omniscient - but my bandwidth is choked
Lots to this art/science I don't understand, but it's possible it wasn't an accident and was instead down to the instruments in the arrangement themselves (vs an 'accidental' mixing thing you did).
In recording anyway (vs mixing already recorded tracks), it was always expressed to me to keep it simple: If you want it in the background, use a darker mic and move the source away from it. Reverse for foreground. I realize you don't use mics to record instruments often but you could try applying the same techniques for mixing—roll the highs off the background and put a little room reverb on it, brighten up and compress the foreground element.
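As a rough translation of "roll the highs off" into code, here's a hedged Python/scipy sketch (the cutoff and gain values are arbitrary starting points, not a recipe):

```python
# A gentle low-pass darkens an element, which our ears read as further
# away; quieter plus darker pushes it toward the back of the mix.
import numpy as np
from scipy.signal import butter, lfilter

def darken(audio, sr, cutoff_hz=3000.0):
    """Low-pass a signal to push it toward the back of the mix."""
    b, a = butter(2, cutoff_hz / (sr / 2), btype='low')
    return lfilter(b, a, audio)

sr = 44100
pad = np.random.randn(sr)              # stand-in for a pad track
background_pad = 0.7 * darken(pad, sr) # darker + quieter reads as "behind"
# Reverse the idea for the foreground: keep it bright and compressed.
```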
@syrupcore
Your mic example has given me a good mental image in a literal sense, regarding the distance. I tend to overcomplicate things when I'm mixing, even in the early stages: I love layering things to the point where it can cause problems later on.
Sometimes in my tracks the instruments, drums, perc and bass sit in such a way that it's a pleasure in the mixing phase, so I understand what you're saying about the instruments and arrangement having an effect on the soundstage. Other times though I've had to do some heavy editing, or have to leave some tracks for a year or two, let them settle in my mind and come back with a fresh perspective.
Think this is where I'm going wrong: making bad decisions at critical stages. Like the other month I was trying to give my drums more presence and punch, focusing on the kick and snare, and got stuck in a quagmire; I reverted to an earlier version, focused on a few percussive elements instead, and it sounded much better. Keep it simple is great advice, which is too easy to overlook, so I appreciate it, thanks.