
What's all this midi stuff about then?


Comments

  • I disagree that AB folks should do a new API. The existing iOS MIDI API is already sufficient and complete (maybe missing one layer of abstraction - three types into one?); the problem is that many Apps failed to implement it fully. All AB can (and IMHO should) do is provide the visual representation that non-engineers need to understand things. MB is a bit short on the pretty visuals, and there's always been room for competition in that area. But please, please, don't make it yet another API, or we'll get the same fragmentation we already see between AB and Tabletop.

    But for those Apps that implemented Apple's existing API, MB provides a proper routing solution. As in: turn off all ports in all the Apps, so they only get stuff from the default IN port they publish, and do all the connections in MB (or alternates as/when they arrive). No double connections, you visually see who sends to whom, and there's no yet another custom API for programmers to deal with. At most it might be nice to have a little library (need to look at pgmidi here, maybe it exists but has been ignored?) so App writers only have to deal with one port type, but sit it on top of iOS, not another App like AB.

  • I disagree that it's obvious how to configure MIDI stuff on iOS. For example, I have never ever tried this Midi CC stuff (and tbh I don't know exactly what it means) because it looks horribly complicated. Take a look in Thumbjam's menus - there are so many options! What do they all mean? How should they be configured? An Audiobus for MIDI would be a godsend, IMO.

  • OMAC developed a pseudo standard for virtual MIDI and also the concept of app switching. Audiobus came from OMAC, and they have successfully been able to get developer buy-in and commercialise it. Audiobus is the logical place for OMAC and MidiBridge type functionality to go. MB is a brilliant app, though its UI is not exactly attractive, and it is very flexible once you get the hang of it.

    MidiBridge does loads more than MIDI routing (filtering, remapping, velocity splits, keyboard splits to name a few) and the developer told me that the next version will have user defined 'scenes' where you can setup routings and program change settings and then activate them from MidiBridge or via a MIDI program change. The program changes get pushed to the various apps and devices so it would be a step towards being able to recall entire configurations of apps, devices and routings at the touch of a button.

    Seems to me that Audiobus should buy out MidiBridge with their new found wealth (!!), integrate it into Audiobus (and give it the Audiobus UI magic touch and impose rules for MIDI on app devs) and then we would have all audio and standardised MIDI control in the one place and with MIDI controlled app switching (MidiBridge has this but very few apps support it) and audio+MIDI configuration recall. That would be very cool!

  • Totally agree that the problem from this side of the glass is inconsistent setup screens. If AB can normalize that, it'll be a big win. I assumed in these hinted at MIDI in AB conversations that AB would be providing a unified UI and perhaps a services/configuration layer on top of existing iOS MIDI APIs. I've been using midi consistently since 1986 and am still often confused by various apps' MIDI setup screens.

    Or: everyone make your midi config screen like Funkbox'. :)

  • @snoopy - what he said! Also, I implemented that level of MIDI reconfig in my Lone Wolf MidiTap in 1990. If anybody is in Seattle this coming Sat, I'll have it showing this kind of functionality (and why it is needed) in earnest at the MMTA SynthFest, jam room, guest players most welcome. The rig also uses MB's CC mapping feature to turn a nanoKONTROL2 into an 8 channel mixer surface by sending each fader to a different channel number as CC #7.

    @michael_r_grant: I don't think anyone said it is "obvious". That's why a good visualization scheme is needed, and why this discussion is taking place at all. Even for myself, in my own quiet way something of a MIDI wonk, it took a couple of sessions to get my MIDI network mapping for MMTA worked out.

  • @syrupcore - mixture of inconsistent setup screens and inconsistent/incomplete levels of MIDI implementations

  • So one thing I've learnt from ~10 years building developer-facing SDKs is that if you give developers multiple ways to do something, they will do that thing in different ways. If you give them options, they will use them all. If you allow them to get fragmented, they will get fragmented.

    ACP worked well because there was only one way to do it, so everyone did it that one way. AB is good for similar reasons, plus folks have the benefit of Michael / Sebastian's guidance. But MIDI. There are too many ways to do MIDI. Perhaps OMAC can cajole the universe of music app developers to conform to a standard. But I'm not optimistic. There's no 'leader' so convergence is slow at best.

    I want a dictator who understands my needs and tells me exactly what I (and other devs) must do to make everything 'just work'. There's a world of people who are curious about doing interesting things with music. The folks on this forum are generally the 1% in terms of sophisticated understanding of how stuff works.

  • I agree entirely... I only understand 1% of how stuff works...

    ...oh.

  • @Rhism, I'll take a stab at that. Not that I'm a dictator, nor want to be, but I probably have as much networking experience with MIDI as anybody else here, going all the way back to 1983, and all the way back to 1971 for data communications in general including protocol design.

    It naturally follows that I have a bias, and that is the bias of a serial router designer.

    So, here goes, believe it or not briefly, my ideal solution:

    MIDI-using Apps need only publish 0-n IN and 0-n OUT Virtual MIDI ports. Minimally: one IN for a synth, one OUT for a controller, and one of each for a sequencer.

    That's it.

    Just like physical gear. There are synths, and sequencers, and routers. Sometimes synths include sequencers, but I never saw one try to be a serious router as well. Not even the A-80 with its 2 IN and 4 OUT ports.

    They should be selective as to which MIDI channels and system messages they receive and send, although the availability of good routing utilities can make even that more of a nicety than a requirement.

    Then (MidiBridge already services this market, but there is plenty of room for others) we need MIDI routing apps that present some analog of a patch bay paradigm.

    These Apps will take on the task of seeing CoreMIDI, Network MIDI, and Virtual MIDI ports, and present them as the ins and outs to the patchbay, such that all the user has to do is connect an OUT to one or more INs, without them or the other Apps worrying about the type.

    Pros:
    No new API for Apps to use. They already have the Apple API and it is sufficient, and they only need the VM part of it.

    Lots of room for competition and differentiation in the routing demographic.

    There exists at least one routing solution already that is usable for 90% of what I've tried so far, so long as I am loading up only those Apps that support VM properly. That's a problem with the Apps, not the router.

    Cons:
    Users must acquire at least one routing utility.

    This is a high friction point, and there will need to be at least one credible free option, which does not currently exist.

    It would be real nice if Apple could do this, like they do for OS X users, but experience suggests this is unlikely to happen in iOS.

    Note: Audio has a similar problem. AudioBus provides one solution, and it's not free; TableTop another, and it is free. TableTop also covers MIDI routing via learning, but offers no way to visualize the actual connectivity, and its external connections are only Core and Network MIDI. It also feels somehow much more closed than AudioBus. This solution for MIDI, unlike AudioBus and TableTop, can offer alternative and much more complex routing solutions without requiring App code changes (beyond maybe fixing their VM support).

    There is much room for routers to do more: for example, routing down to the channel level, message filter sets per IN and OUT channel, routing RealTime messages differently, routing Sysex messages differently, and associating maps with the connections as well as the ports (this is a subtle one but exceedingly powerful). And then of course the visual renderings, which I foresee getting quite exciting.

    "Just Work" solution:

    I have found it to be a useful simplification to treat the iPad as a single complex instrument satisfied by looking like it has a single MIDI cable each way to the outside world. Here, the current solution of listening to every source that can be found suffices, as long as the number of sources is matched by selectivity in listening.

    To some extent this also works internally, but not completely. Indeed, I can see a router simple enough to be free that does exactly this, as a MIDI Bus that every App can connect to both to send and to listen (although the code has to ensure no App hears itself, or to make that optional with the default being not). Like AudioBus, in fact, and just as topologically limited.

    And just as likely to not play nice with Apps that do not know how to shut up and stop listening to everything promiscuously, or talking when others should be. As is an endemic problem right now, with any routing solution.

    But it breaks down when the sources and destinations available get to be more than a couple, and when the Apps are no longer operating in isolation but are expected to co-operate with each other and the outside world.

    At that level each App must come up with a way of being selective first about which ports it listens to, second which messages it listens to on those ports. This is where it gets tricky, and (your point) where every attempt to solve this is different, yet the user has to mentally model all of them, and visit every one of them, in order to establish and verify the MIDI topology they need for the project in hand. If it is even doable with the components involved.

    The VM + routing App reduces that problem considerably, in that all of the routing is done uniformly within each router, and the topology is immediately visible without resorting to trying to mash together several mutually incompatible mental images of all the players tromping on each others' shoes.

    And then it is quite OK for different routers to use different paradigms, since at any one time only one would be in use (er, hopefully, although only minds would be exploded if one ran more than one:) - that after all is the competitive aspect of this model.

    YMMV

    OK. Done now. Is this reasonable, and if so how do we get there? If not, why not, and how do we get there instead? Or all of the above, though that would just increase the cacophony.

  • @dwarman OMAC is where it started but is a desert now. My understanding is there is a dev-only section on the Audiobus forum; I imagine all the devs have moved there and are probably discussing this right now, so that is likely the forum for this to happen. MMA were engaging with OMAC and I suspect they would have an interest as well.

    Apple could do it, but as said here already, unlikely to be in their roadmap.

    As for a free solution, well ultimately devs need to be paid for their work, so I suspect this would require Audiobus/MidiBridge (assuming they merge/takeover, haha) change business models and charge devs to join the scheme. Probably not going to happen.

  • Yeah, I know, one can hope. I also know that to do a freebie requires the company to have something else they are selling and the freebie acts as a come-on and tool to attract and enhance the main line. We sold our router, but gave away the software that gave a graphical way to manage the system, not required (each box had a hard 16x2 LCD UI) but made it a lot easier. Demonstrating it with that software at winter NAMM 1991 was what made the difference between "that's interesting" and being nominated for an AES Tech Award that year. Shiny things.

  • @dwarman So I wasn't kidding about wanting a dictator. Your proposed model may very well work. But unless someone is making sure everyone follows it, a lot of folks will get it wrong. The experience won't 'just work'. Because of this lack of quality control, I'd be less motivated to do it perfectly myself since I know that I'll still have to deal with support requests because of other apps doing it wrong. Which further compounds the problem.

    E.g. this model only works if the apps themselves do the right things to avoid double connections, i.e. controllers shouldn't send messages to CoreMIDI or other apps' virtual inputs when sending to their own virtual output (and the same with synths and virtual inputs). Unless someone enforces this, devs will get it wrong. It needs to be idiot-proof, for devs as well as users.

    @PaulB it's true that the 1% only knows 1% of MIDI, and the 99% know nothing :) The reason a lot of great apps get MIDI wrong isn't because the devs are dumb. MIDI is a complex beast, and some apps (GeoSynth/Cantor, iFretless Bass, ThumbJam) push it far beyond its initial design (with awesome results), which makes it even more complex. Audiobus made audio routing simple not just for users, but for developers too. ACP did the same with audio transfer.

  • For the most part, I'm with @dwarman -- there's a basic level of functionality that apps need to get to, and "that's it." It's not a particularly high bar, and many apps have been there for a while already.

    Basic MIDI really isn't that hard (for either developers or users). Look at an app like Genome -- a top seller, nicely put together, but only useful if the person using it can hook it to MIDI synths -- and it sells well because lots of people can do that. In my experience, getting a couple of apps to talk over MIDI takes a few toggles in each app. Finding the right page to toggle is probably the hardest part (and that's UI design, not a problem with MIDI). You don't need to be a genius to get MIDI working.

    Virtual MIDI in particular is a good, already existing solution. Not perfect, but there's the adage that you shouldn't let the perfect be the enemy of the good. Most people will want very simple routing (controller app connected to synth app, or sequencer app to synth apps), and setting that up is a breeze. For the folks who want to do some complex routing with crazy keyboard splits and arpeggiators -- by the time you know what it is that you want, you'll also know how to make it happen.

    The power and flexibility that MIDI gives you is so huge, and the effort to get it right so small, that IMO people should demand it. This lets users match the controller they like to the synth they like, and get the best of both worlds.

    There are plenty of free apps that will send and receive MIDI -- the free version of my guitar app sends MIDI, for example. There's also Little MIDI Machine, from the Funkbox guys. For the folks on the fence, grab a MIDI app, toggle on a connection, and find out what you've been missing. Don't be scared off by all the MIDI-is-uber-hard FUD. MIDI won't bite you; jump on in, the water's fine.

  • I think that is a given.

    So somebody needs to write the killer gasket library for them. Open Source it, even Public Domain. Port management only. They can do their own message management.

    Using probably terrible names and no error handling (at this dash-it-off level; details to be added as implementation reveals the gaping holes in the model):

    note: VMPM == Virtual MIDI Port Management

    typedef struct { size_t size; time_t timestamp; uint8_t msg[]; } VMPM_mmsg;

    id = VMPM_CreateMIDIPort(bDirection, sName, fMIDIHandler(mmsg&))
    VMPM_DestroyMIDIPort(id)
    VMPM_SendMIDIData(id, &mmsg)
    VMPM_ReadMIDIData(id, &mmsg, bBlocking)
    VMPM_EnableMIDIPort(id, bRun)
    VMPM_SetMIDIPortMode(id, bPromiscuous)

    Wrap 'em any way you like, but this is the minimum set I have found to be a useful abstraction for port handling. Elaborations for industrial-strength robustness are needed, but keep them to a minimum too. Note that only the App's own ports are here. It owns 'em all, both IN (via callback or direct read) and OUT (via send when ready), creation and destruction. The router side then connects from and to these ports, because the OS will simply make them enumerable in the port space. The App need not do any enumeration.

    But make no mistake, what has to live under those few function calls is not as simple.

    Message filtering is a separate issue. I have a table driven solution for that too that works pretty well.

    Just not in iOS yet. C only. And a Lua wrapper, even more handy. And FORTH, but you probably don't care about that.

    As to devs and dumbness: this stuff is, indeed, hard. It took us several programmer-years to get our soft topology MIDI network routing stack to stage-ready robustness. At one AES meeting back then, somebody in the audience stood up and said "This is not rocket science. In rocket science you know the parameters of the operating environment. No, this is harder. You don't know those things, but it has to work anyway." And since I have also been playing in the Audio DSP stack arena these past few years, I now have that visceral feel for how hard that part of the system is too. My hat's off to all you out there doing things like iMini and Auria and AudioBus. I am nowhere near that level.

  • Probably getting too technical for the forum here -- but the Apple CoreMIDI framework does pretty much what @dwarman sketched out, and there's tons of example code. You don't have to write your own network stack. IMO, getting MIDI working is roughly 10% of the challenge of getting audio working. All of the really gnarly protocol stack hacking, interrupt handling, and so on -- already available to any developer who wants to use it.

  • So here we get into a mystery area for me - the social engineering to get them to want to use it. Trouble is, I usually don't find out about an App's MIDI deficiencies until after I've bought it, because only very rarely does a review or blog comment mention it to warn me off. By which time their incentive to change is gone, 'cause my little voice isn't loud enough on its own.

    Am I the only person doing a 6 synth live jam setup with 2 keyboards, where three of the synths are external hardware and three sit inside on AudioBus? I have about 80 Music Apps, and it was really tough finding the three that both played together nicely on both fronts and had nice knobby CC mapping capabilities. And even then one of them (Sunrizer) published a VM IN port but did not actually listen to it, and another (Magellan) saw and listened to all other OUT ports without the ability to select a subset of them. Neither then permitted the use of MB's mapping abilities, because they also listened to the raw incoming pre-map data. So they are doing a lot of redundant parsing of messages they then throw away.

  • @SecretBaseDesign On the LiveGuitar thread on this forum, users described issues sending LiveGuitar MIDI out to SampleTank, Sunrizer, Alchemy, Music Studio and Cubasis. Not ragging on your app, none of these were LiveGuitar's fault. They were all either user error or a bug in the other app's MIDI code. But these are fairly sophisticated users, and fairly sophisticated apps. So the fact that these sophisticated users and sophisticated apps are having trouble with MIDI is a pretty glaring indication that MIDI is not easy.

    And yes, some of it comes down to UI design. But good, clean UI design is really, really, really hard to do. If it were easy, UI designers would be out of work :)

  • It's all hard. I myself probably create the nerdiest UIs of all. But nice engines underneath, if I say so myself.

  • @Rhism -- there were some hiccups, but AFAIK, all the MIDI issues were solved except for SampleTank (which doesn't even run correctly for me when Audiobus is going -- they've got a conflict with their audio session). All the fixes involved toggling the interface on in the right spot (I'll grant you, UI design is where a lot of apps fall down, but I don't see it as a reason not to embrace the technology). I've got a pile of synth apps that support MIDI; SampleTank is the outlier, and I can probably get to the bottom of what's going on there.

    The free version of Live Guitar has full MIDI support; we want people to know what they're getting into, what the app can do, and we don't want to surprise them with limitations. If they want Audiobus goodness, and a couple of other things, they can grab the full version. If not, then nobody got hurt, and everyone is happy. There's no try-before-you-buy in the iTunes store, so we're doing what we can to be fair to the users.

    I sort of see the proposition as "here's 100 awesome synths, and you can use any of them in any way you want, but two of them will be a pain in the butt to configure" versus "you don't get to use any of the awesome synths, and you're stuck with just the sounds the app produces." I'll take the former, but to each their own, I guess.

  • edited March 2013

    @SecretBaseDesign I don't think anyone is suggesting we shouldn't use Virtual MIDI. Definitely the former!

    Audiobus can likely normalize config/setup screens with some visual finesse. They are also obviously well situated to fill the role MIDI Bridge has been filling.

  • edited March 2013

    @SecretBaseDesign Yeah, to be clear, I'm not saying that developers shouldn't support MIDI. The ability to send MIDI around is awesome, and I've pretty publicly stated that I'll be adding MIDI to guitarism as an IAP. I'm just saying that it's much harder than it needs to be, and an Audiobus-like solution that brings it together will add significant value to the ecosystem.

    Perhaps you have a ton more experience with it than I (or maybe I'm just really dumb), but I've got a long list of things to do just to get basic MIDI working well in guitarism: low latency, good usability, high reliability, easy setup, avoiding double-connections - all the little things that cause failure. If Audiobus took care of all that, I could just integrate it with a bit of code and focus my time on stuff that's more core to my app.
