What's all this midi stuff about then?
Forgive my ignorance, but as a guitarist first and foremost I am slightly confused as to how I set up and use MIDI.
I am using a 3rd generation iPad and have downloaded a load of music apps, mostly with Audiobus compatibility. I am amazed how wonderful and creative these apps are. But I wish to gain some skill in combining apps in Audiobus. As an example, Glitchbreaks has just been updated with lots more MIDI implementation, and I can't find any documentation for the new features... so... am I just being an idiot and, as a musician, SHOULD I know about these things? I'm sure there are many users in the same position as me that need a bit of a MIDI overview when using Audiobus. Can you point me in the direction of a beginners' guide to iOS MIDI (clock, virtual, thru, in, out, shake it all about) or... will you enlighten me?
Comments
There are many sites that cover this. I can't recommend any but if you google "explain midi", that should keep you busy for a few years!
Oh, yes please. Seconded x 100%. For example, Glitchbreaks updated with MIDI, but I have no idea what virtual MIDI stuff it actually enables you to do.
I don't think seasoned music bods understand just how complicated some of this seems to beginners, so some kind of step by step explanation would be amazing.
I'm personally still confused when it comes to people describing apps "sending MIDI to" something, as it always seems the wrong way around to me somehow!
MIDI can be confusing at first, but once you've played with it for long enough it will become clearer and clearer just how simple it really is. But brace yourself, your head is about to explode...
A majority of apps on iOS implement 3 message types, which can be sent across all or a selection of the 16 channels.
CC (continuous controller) - Numbered 0-127 (CC00, CC01, CC02...CC127). These can be assigned to control virtual knobs and faders, and settings like volume, pan, or modulation. There was once a MIDI standard (CC7=Vol, CC11=Expression, CC1=ModWheel, etc.) but this has more or less been thrown to the wayside. What that means is you can assign whatever CC to whatever control you like, whatever suits your needs. CC7 is often assigned to control the volume setting right away, but even that is a crapshoot these days. These CCs output a value between 0-127, so if you assigned a knob to send CC7 to the volume fader on a synth, turning the knob all the way down would turn the volume of the synth all the way down (a value of 0). As you turned the knob up, it would continuously send new values to the volume fader, until you'd turned it to 100% loudness (a value of 127). CCs can also be used to turn effects or switches on and off (for instance, a mute or solo button). So let's say I have a button that I want to assign to MUTE channel 2 on my mixer. I'd set the button to send a CC message (any #, let's say CC#55) with a value of 127 (ON) when I pressed it once, and then a value of 0 (OFF) when I pressed it again. I'd then also have to tell the mixer software that I want the MUTE switch on Mixer Channel 2 (NOT MIDI channel 2) to respond to CC#55.
Note On/Off - This tells the synth/device to play or release a note (again, numbered 0-127, where note #60 = middle C and note #69 = A 440, although this can differ depending on how a developer decides to implement it).
MIDI Clock - This is generally handled automatically, and with varying success, on iOS. Often all you need to do is turn MIDI clock ON/OFF in any given app, but make sure that you have ONE app that SENDS MIDI clock while all others RECEIVE it. If you have an app that can only send MIDI clock, but not receive it, then this will likely be your MIDI MASTER. Anything receiving MIDI clock is then referred to as a MIDI SLAVE.
In MIDI, you are given 16 channels, each with 128 (0-127) CC#s and 128 possible Note On/Off messages. This allows you to assign each synth its own channel if you like. For instance, if all your synths are set to receive volume messages from CC#7, and you've got all your synths set to accept MIDI on Channel 1, then when you make an adjustment to the volume it will happen to all of your synths at the same time. This is pretty undesirable in most instances, so you could set up Synth 1 to receive MIDI on Channel 2, Synth 2 on Channel 3, etc. Then you could set up a bunch of knobs so that Knob 1 sends CC#7 on Channel 2 (affecting the volume of Synth 1 only) and Knob 2 sends CC#7 on Channel 3 (affecting the volume of Synth 2 only), and so on.
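To make the above concrete, here's a minimal sketch (in Swift) of how those messages are actually laid out as bytes on the wire. The byte layout comes straight from the MIDI spec; the helper function names are just made up for illustration:

```swift
// Each channel-voice message is a status byte (high nibble = message type,
// low nibble = channel 0-15) followed by one or two 7-bit (0-127) data bytes.

/// Control Change: status 0xB0 | channel, then controller number and value.
func controlChange(channel: UInt8, controller: UInt8, value: UInt8) -> [UInt8] {
    return [0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F]
}

/// Note On: status 0x90 | channel, then note number and velocity.
func noteOn(channel: UInt8, note: UInt8, velocity: UInt8) -> [UInt8] {
    return [0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F]
}

/// Note Off: status 0x80 | channel, then note number and release velocity.
func noteOff(channel: UInt8, note: UInt8, velocity: UInt8 = 0) -> [UInt8] {
    return [0x80 | (channel & 0x0F), note & 0x7F, velocity & 0x7F]
}

// The mute-button example from the post: CC#55 on MIDI channel 1
// (channel index 0 on the wire), value 127 = ON, value 0 = OFF.
let muteOn  = controlChange(channel: 0, controller: 55, value: 127)
let muteOff = controlChange(channel: 0, controller: 55, value: 0)

// The per-synth volume example: "Channel 2" in MIDI-speak is wire index 1.
let synth1Volume = controlChange(channel: 1, controller: 7, value: 100)
let synth2Volume = controlChange(channel: 2, controller: 7, value: 100)

// Middle C (note 60) on channel 1, full velocity.
let middleC = noteOn(channel: 0, note: 60, velocity: 127)
```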
This was a lot of info in a short space and I don't expect you to just understand it automatically, but it's difficult to describe in words, so here's a fantastic resource:
http://www.midi.org/
And don't be afraid to message me with specific questions.
See, that's the problem with MIDI: it's been around so long people think we all know what it is. It stands for Musical Instrument Digital Interface and was developed by the guy who made the old Prophet synths, Dave Richards I think. It's a way for instruments to communicate with each other, so it saves you time. In its most basic form it lets you connect a keyboard to your iPad and play all or most of the synths on there just like a hardware synth, and if you have a MIDI control unit you can set it up to control all the faders and knobs too. You also then have General MIDI sound sets, so you can record some music with various instruments, save the tiny tiny MIDI file, send it to anyone else who has the same gear (or any gear with GM) and they can play it back using the same sounds or a variation of those sounds.
Virtual MIDI is similar to real MIDI except that it's done inside the iPad or computer. For instance, if you set up virtual MIDI in Cubasis with, say, Animoog, you can record in Cubasis using Animoog's sounds, but only as long as Animoog is running in the background. If you turn Animoog off, the notes (the MIDI data) will still be in Cubasis and you will see them, but they won't produce the sound of Animoog unless you route back. But you could then set another external synth, or one of Cubasis' MIDI instruments, to play the notes recorded with Animoog. It goes on and on and on... best bet, start simple. MIDI can get so confusing that many apps and VSTis have a MIDI panic button.
Dave Smith
Here is an introduction-to-MIDI PDF from the MMA (MIDI Manufacturers Association):
http://www.midi.org/aboutmidi/intromidi.pdf
Everything I know about MIDI I learned from playing with an app I found called MIDI Designer.
The pro version is pricey, and competes (on some levels) with the functionality of the Lemur app. The free version is fully functional, but limits you to creating 12 total MIDI controllers on iPad. So, it's very useful for playing around, and testing functionality before paying anything.
The website has quite a bit on offer: information, how-to videos, etc. http://www.mididesigner.com
And the dev is one of these top-notch guys who is very responsive, interactive, and constantly pushing out updates.
Give the free version a try
https://itunes.apple.com/us/app/midi-designer-lite/id492292826?mt=8
Sometimes it's nice to be able to try out what you are reading and learning, so I wanted to include this in the discussion as a helpful, free tool.
So is it incorrect to say that, at a very basic level, a MIDI file is just music notation analogous to standard staff-based sheet music?
It's information about all the same things standard Western musical notation describes (frequency, amplitude, duration), just notated in a different way, roughly.
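As an aside, the pitch side of that mapping is pinned down by the note 69 = A440 convention mentioned earlier in the thread. A tiny sketch using the standard equal-temperament formula:

```swift
import Foundation

// MIDI note number to frequency, anchored at note 69 = A 440 Hz,
// with 12 equal-tempered semitones per octave.
func frequency(forNote note: Int) -> Double {
    return 440.0 * pow(2.0, Double(note - 69) / 12.0)
}

frequency(forNote: 69)  // 440.0 Hz (A4)
frequency(forNote: 60)  // ~261.63 Hz (middle C)
```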
Dave Smith... yeah, that's him... he still makes great synths... who the heck is Dave Richards? lol.
MIDI can bog down your creative workflow if you have to think about it for too long, i.e. "wow man, just had the most amazing 48 hours in the studio, listen to the mega blip sound I made by routing my Novotronickaotowobbler back into my gizmatotron pulverouter..."
@nathanbiehl - that's a fair analogy, although midi goes beyond that in that it allows automation of playback and equipment controls, including recalling patches during songs. It is also used to control stage lighting.
Gizmatron Pulverouter... that was my last cat's name
It's all starting to become clearer to me. I understand CCs, but for some reason I always get confused whether to pick MIDI IN or MIDI OUT... idk!!! One thing that doesn't help is that all apps implement different levels of MIDI control, and that can be confusing.
Don't stress it, man. I'm also a guitar player first, accustomed to a four-track cassette recorder, who dove into computer/iPhone recording. Think of MIDI as a guitar cable going from, say, your guitar to your distortion pedal. The controller is basically the guitar, so an app like BeatMaker would be MIDI OUT to the distortion pedal (say, DrumJam), and then on DrumJam you would click on MIDI IN. Since you got Glitchbreaks, you probably wanna mess around with beats, so also learn about MIDI clock, because you can have all your apps sync up to the controller's clock, so to speak. So you'd also have to click on clock out on the head app, and clock in on the apps that will be following the tempo.
Whatever apps/devices you wish to control will receive the control data on the midi in port. Whatever app/device is doing the controlling will send data out its midi out port to the apps/devices being controlled.
In the iOS app world, the main implementations are Core MIDI or virtual MIDI. These are the two routes through which you can communicate with your music apps. Core MIDI usually refers to MIDI data going to your apps via controllers like an external keyboard, drum pads, MIDI guitar, etc. Virtual MIDI refers to iOS apps talking to each other, so e.g. the pop-up keyboard in Cubasis can trigger sounds in Sunrizer and so on. There's also MIDI transmission over a wireless network.
The above was a simple description of the communication options. The rest of MIDI revolves around what kind of data can be transmitted, i.e. note on/off events, volume/dynamic changes, modulation wheel, pitchbend, etc.
MIDI isn't limited to musical instruments. Live shows can use MIDI to control light shows, etc.
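To make the in/out direction concrete, here's a small CoreMIDI sketch (Swift, error handling omitted) that lists the endpoints an app can see: "sources" are other apps'/devices' OUT ports, which you receive from, and "destinations" are their IN ports, which you send to. Virtual ports published by other apps show up in these same lists:

```swift
import CoreMIDI

// Fetch a human-readable name for a MIDI endpoint.
func endpointName(_ endpoint: MIDIEndpointRef) -> String {
    var name: Unmanaged<CFString>?
    let status = MIDIObjectGetStringProperty(endpoint, kMIDIPropertyDisplayName, &name)
    guard status == noErr, let cfName = name?.takeRetainedValue() else { return "(unnamed)" }
    return cfName as String
}

// Sources = endpoints you can RECEIVE from (their MIDI OUT).
for i in 0..<MIDIGetNumberOfSources() {
    print("Source (receive from): \(endpointName(MIDIGetSource(i)))")
}
// Destinations = endpoints you can SEND to (their MIDI IN).
for i in 0..<MIDIGetNumberOfDestinations() {
    print("Destination (send to): \(endpointName(MIDIGetDestination(i)))")
}
```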
That's a problem of inconsistent MIDI implementations across iOS apps. Not a diss, but having 100 different devs implement it 100 different ways is just asking for usability trouble. Hopefully the AB team is still aiming to do something about this.
Presuming the UI isn't impenetrable, imagine you're hooking up physical MIDI devices when you have to make a decision. You want thing A to control thing B. With hardware you'd go from A's midi out port to B's midi in port.
Agreed about inconsistency, hopefully in the future it will be more uniformly implemented
@Mmmwahaha To just get a quick idea of what virtual MIDI does for guitar players on iOS, watch this video and skip ahead to 7:37:
http://www.youtube.com/watch?feature=player_detailpage&v=30vjsdUN_iI#t=458s
Great demo @Blue_Mangoo!
@syrupcore: Nothing AB can do about this. MidiBridge already does all that is possible, and even that can't do anything about it. The problem is getting all the app designers to agree on how MIDI should be handled. And not all app designers even agree on what MIDI is. It ranges from NLogPro and littleMIDI at the done-it-right end, to Korg iPolysix at the done-it-real-wrong end (no choice of who to receive from or what channels, no send). Not counting the ones that didn't do it at all. And nobody seems to have really figured out clock. It's not just clock, start, and stop, guys. It's also continue and song position pointer. And MIDI Time Code. And these are MIDI Real Time messages, not CCs!
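For reference, the real-time and song-position messages listed above look like this on the wire (byte values from the MIDI spec; the Swift names are just for illustration):

```swift
// MIDI System Real Time messages: single status bytes (0xF8-0xFC) with
// no data bytes, allowed to interleave anywhere in the stream.
let clockTick:   UInt8 = 0xF8  // sent 24 times per quarter note
let start:       UInt8 = 0xFA  // play from the beginning
let continueMsg: UInt8 = 0xFB  // resume from the current position
let stop:        UInt8 = 0xFC  // stop playback

// Song Position Pointer (0xF2) is a System Common message: a 14-bit
// position counted in "MIDI beats" (1 beat = 6 clock ticks = one 16th note),
// split across two 7-bit data bytes, LSB first.
func songPositionPointer(midiBeats: UInt16) -> [UInt8] {
    return [0xF2, UInt8(midiBeats & 0x7F), UInt8((midiBeats >> 7) & 0x7F)]
}

// Jump to bar 3 in 4/4: 2 bars * 16 sixteenths = 32 MIDI beats.
let jumpToBar3 = songPositionPointer(midiBeats: 32)
```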
MidiBridge is in a special place of its own. It is a MIDI connection manager and mapper. It is instructive to load up other Apps and through MidiBridge you get to see who got it right and who did not, because only the ones that got it right show up with full functionality.
We had it right 20 years ago. The MMA acted as a standards organization, published the specs, required compliance to call your cable MIDI, did not require you to implement everything but did require you to publish what you implemented
Aaah! accidentally tapped submit. Sorry.
--- and required registration which gave you the right to the MIDI mark, and a unique Manufacturers ID for Sysex so it would be impossible to hook up things that responded to the wrong messages. (We are 00 00 55).
I do not see the iOS ecosystem paying much attention to the MMA at all. Hence the confusion. Yes, the MMA is at the physical cable end, but MIDI is a kind of a trademark, so using it properly should be a given, but isn't. I know somebody at the MMA is watching, but I'm sure with some bemusement.
So I'm not surprised you newcomers are confused. It's almost as if the lore has been lost. And having three different type names for MIDI ports - Network, Core, and Virtual - just makes things worse. How a port becomes available to connect to should have no bearing on how it is implemented or the transmission medium. That's for the programmers who wrote the port to figure out. The whole point of it at the user end of things is you have a patch bay for MIDI, ports just show up with some instrument-assigned recognizable name, and any Out can be connected to any In with predictable and stable results. And if not explicitly connected nothing happens. Which MidiBridge understands. And Korg (e.g.) clearly do not.
** lighting etc - MIDI Show Control is the formal name for the system. I played a small (tiny actually) role in its formulation. High end venues use it.
OK. Got that off my chest. If you read this far, thanks for listening. MIDI done right is so essential to smooth inter-app control, and it is so frustrating to me to see it done wrong, or not at all, in such a widespread manner - almost as an afterthought, left to later updates if we're lucky. It contributes to the "toy" image the Big Boys have of iOS, when we know it is - can be - so much more.
If the individual iOS apps aren't getting it right (MIDI clock), this may be a job for a third-party app developer.
I'm just spitballing below, forgive me if any of you think my idea is silly, not feasible, or just the wrong way to go about it.
Imagine a time code/sync app that would work in between other apps. Your DAW would record a code from this app, either MIDI or SMPTE, on one track; then any time you hit play/stop or advance forward/backward etc., the recorded code would tell all other apps where in the piece it is. This third-party app could distribute the SDK just like Audiobus does now.
Could it work? I don't know, I'm not a programmer.
My 2c as a guitarist and app developer - I agree and disagree that there's nothing AB can do about it.
Agreed that there's nothing that AB can do to improve the state of CoreMIDI / Virtual MIDI as it is. The standard is already out there, iOS exposes APIs that let apps do "MIDI things", and AB can't change that. So those problems won't go away. And there are definitely a ton of problems.
However, AB can (and really should!) create an alternate MIDI transmission system which is simplified (like AB itself is) such that the user gets a clear view of what's going on. It would allow users to create MIDI routings (like they create audio routings in AB today) and then manage sending the MIDI messages from app to app based on the user-created routing. It would require new APIs for developers to support, as part of the AB SDK or a new "MIDIBus" SDK. It could interop with CoreMIDI / Virtual MIDI as well but internally would use a much clearer routing model so apps and users don't make obvious mistakes. In principle this could be very similar to MIDIBridge, but have a much clearer UI, perhaps impose more constraints on connections, and most importantly, use a separate MIDI transmission system rather than just trying to make CoreMIDI and Virtual MIDI slightly better.
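Purely as a thought experiment, a "MIDIBus"-style SDK surface might look something like the sketch below. To be clear: no such SDK exists, and every name here is invented. The point is only that apps would declare ports and let user-created routings do the plumbing:

```swift
// Hypothetical sketch of a "MIDIBus"-style developer API -- none of
// these types exist; they just illustrate the "user creates the routing,
// the SDK does the delivery" idea from the post above.
protocol MIDIBusPort {
    var displayName: String { get }
}

protocol MIDIBusReceiver: MIDIBusPort {
    // Called only for connections the *user* made in the routing UI,
    // so the app never has to guess where data should come from.
    func receive(_ bytes: [UInt8], from source: MIDIBusPort)
}

protocol MIDIBusSender: MIDIBusPort {
    // Sends to whatever the user connected this port to -- the app
    // doesn't enumerate or pick destinations itself.
    func send(_ bytes: [UInt8])
}
```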
If you browse the OMAC Google Group (open to users as well as developers) and Michael's somewhat recent blog post about Audiobus, you'll notice that something very interesting happened around the time Michael tested his first prototype of Audiobus. From what I can observe, it was initially intended to be an open standard that anyone could freely implement, like CoreMIDI / Virtual MIDI. But then Michael + Sebastian realized it would suffer from all of the same problems as iOS MIDI and thus decided to make it a slightly more closed, controlled system. This was a pretty awesome and game-changing decision, and has resulted in a ton of value created for users, developers and the AB team. The same principles can be applied to creating a better MIDI routing system, and will similarly result in value creation.
Quite frankly, anyone could make this - doesn't have to be the AB team. But they're best-positioned to do a great job of it and get it quickly adopted by a wide range of developers as well as users.
Actually, @AkaMarko, I'm kind of pining for a really lightweight purely MIDI manager myself. MidiBridge does the routing, and I believe that that is a properly stand-alone role, and the MIDI system management (time, mapping) is similarly properly a separate stand-alone lightweight system.
I'd also like to see a straightforward tape-recorder analog in MIDI, one without all the overhead of the synths, a need that can now be satisfied from the wealth of such apps out there.
The current crop of DAWs seem to be suffering from the same kind of bloat that affected the desktop publishing business - each one attempts to satisfy every need for your studio, and all of them have some bits they do well and some they don't (or not quite, anyway). AB and MB let us mix and match, but with every app we load having to carry all the deadweight of the bits we want done differently, the iThing quickly runs out of execution resources. I'd rather mix and match just the bits that fit my personal idea of the "right" workflow and components. And I have to say, TableTop, for all its teething issues, may be closer to such a framework than AB+MB. But much more closed.
@dwarman I disagree that AB can't solve it (they've hinted that they are aiming to!) - or at least provide a solution that devs can choose to implement or not. I could go on but @rhism said it all way better than I ever could.
IMO, the MIDI situation is getting better on iOS, and I'm hoping everyone converges on virtual MIDI (with Apple turning the network session interface into a virtual MIDI port). I doubt that a new option would help things -- we'd just have one more partially-supported standard to further confuse the issue.
I also don't think that a new system would get quick adoption. Everyone jumped on board with Audiobus because there was no alternative. Either you're on the bus, or you're left behind. With a MIDI alternative... are you really going to bail on a working solution, in the hopes that every other app developer will bail on a working solution too? The US still uses feet and inches - yes, it's stupid, but there's so much inertia that we're not going to change course.
Virtual MIDI is not any harder to implement than CoreMIDI -- so given a choice, that's what I think developers should go for. In my experience, I've had very little trouble connecting a virtual MIDI controller to a virtual MIDI synth. When it doesn't work, it's because one of the apps is severely broken (and if that's the case, there's no system in the world that would make it work).
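For what it's worth, publishing a virtual MIDI source really is only a few CoreMIDI calls. A minimal sketch (Swift; the app and port names are made up, error handling omitted):

```swift
import CoreMIDI

// Minimal virtual MIDI source: other apps will see "Sketch Out" in their
// list of sources and can choose to receive from it.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("SketchApp" as CFString, &client, nil)

var virtualOut = MIDIEndpointRef()
MIDISourceCreate(client, "Sketch Out" as CFString, &virtualOut)

// Publish a note-on (middle C, channel 1) to everyone listening.
var packetList = MIDIPacketList()
let packet = MIDIPacketListInit(&packetList)
let noteOn: [UInt8] = [0x90, 60, 127]
_ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                      packet, 0, noteOn.count, noteOn)
MIDIReceived(virtualOut, &packetList)
```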
@SecretBaseDesign It's not just about ease of implementation (although there's a ton of room for error there too), but ease of configuration for the user. Right now it's a nightmare. MIDIBridge is the best "MIDIBus"-like thing that exists right now, and it's far from user-friendly. This thread, and many other threads on the AB forums, proves the point that users don't really get it. Folks are discussing MIDI issues more than AB issues on some "XYZ is on the bus" threads. There are fundamental problems with the state of MIDI routing, because of which users have to learn a bunch of secret handshakes to get stuff working.
Apps wouldn't need to bail on any existing solutions, just like AB apps haven't stopped supporting stand-alone usage. But there is definitely a huge vacuum in the "user-friendly MIDI routing configuration" space. The Audiobus SDK is extremely easy to integrate for developers, thanks to Michael and Sebastian's hard work in streamlining the complex stuff. If they did something similar for MIDIBus I would support it in a heartbeat.
I guess I don't see the need for another layer of indirection. With virtual MIDI, you just toggle on the interfaces you want on the controller (or sequencer). At worst, you have to go to the synth, and toggle things so that it will receive (this should be on by default, IMO). There can be some bad user interface design that makes the toggle switches hard to find, but a new connection system won't fix that.
Much of the trouble that people seem to be having is that they have no idea what MIDI is, what it does, or why you'd need it. MIDI is relatively new on iOS, and some of the early implementations in apps are a bit broken. Having three systems (plus DSMI) doesn't help. If everyone supported the currently existing virtual MIDI solution, we'd have the problems sorted out in no time.
One confusing thing with virtual MIDI is that we have both input and output ports, and it's very easy to accidentally create double connections. If all apps only had virtual input ports, plus a list of which other apps' and hardware input ports to send to, everything would be a lot easier. You'd never receive directly from other apps' output ports, only on the app's own input port and hardware input ports.
In Gestrument, I disable sending to any other virtual port if the app's own virtual output port is enabled/selected. That way I avoid users accidentally making double connections. Same thing for the input port: either you receive on your own virtual input port or on some other app's virtual output port, but not both at the same time.
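A sketch of that guard logic as described (hypothetical types and names; this is just the rule from the post, not Gestrument's actual code):

```swift
import CoreMIDI

// Double-connection guard: an app either publishes its own virtual
// output port, or sends directly to other endpoints -- never both,
// so the same data can't arrive twice along two paths.
struct MIDIOutputRouting {
    private(set) var ownVirtualPortEnabled = false
    private(set) var directDestinations: Set<MIDIEndpointRef> = []

    mutating func enableOwnVirtualPort() {
        ownVirtualPortEnabled = true
        directDestinations.removeAll()  // drop direct sends to avoid duplicates
    }

    mutating func addDirectDestination(_ endpoint: MIDIEndpointRef) {
        ownVirtualPortEnabled = false   // switching modes drops the virtual port
        directDestinations.insert(endpoint)
    }
}
```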
@j_liljedahl Yep you touched on one of my pet peeves with virtual midi. This is one of those secret handshake things I was talking about.
I agree that if virtual MIDI were universally adopted, and folks stuck to one direction of connection and exposed all their MIDI settings in a uniform way, then some of the pain would go away. But IMO, to a (non-engineer) user, a MIDI routing management UI would make things a heck of a lot clearer to understand. They already understand the AB model very intuitively, and it naturally lends itself to extending to MIDI. A visualization of apps sitting in boxes, connected to other boxes, with animated arrows showing the direction of flow - these things make a monumental difference. This becomes even more important when you factor in MIDI clock.