
MidiBridge updated

Save/Recall "scenes", Program Changer, and... with support for "MidiBus"!
https://itunes.apple.com/ca/app/midibridge/id449160859?mt=8

Comments

  • :O Have Sebastian and Michael been outed here?

  • edited May 2013

    What is MidiBus? The question may be, are Sebastian and Michael hiding something?

  • I bet they're involved, or at least hope so!

  • edited May 2013

    Wondering whether I should throw sticks onto this fire.....hmmm.....ok, maybe related, maybe not....but Michael released this pic on Twitter, and has also been trying to find terminology for a "Master/Slave sync"... sounds like a MIDI description? Possibly an advancement in their plan to connect multiple devices?

    http://img.tapatalk.com/d/13/05/14/by8e5u3y.jpg

  • Midigate

  • I love how now anything that's a scandal gets -gate attached to it...haha

  • Gate-gate!

  • TPM is all over the gates. Love it.

  • edited May 2013

    We are not involved with Audeonic.

  • Oh well 8 hours of midigate anyway. On to another rumor.

  • edited May 2013

    Am I the only one who thinks it's not the nicest move to name it MidiBus when it has nothing to do with the creators of Audiobus? It leads to confusion, as we can see here, and that seems intentional.

  • It's business.

  • Ha...just realised 'BUS-INESS'...unintended pun.

  • I'm with you Crabman. On the off chance that it was not an intentional bite/nod, it's confusing. But it was likely intentional and in that way it's just kind of cheap and misleading. Still, great app and by all accounts a very pleasant and thoughtful developer.

  • ...and now somebody has to force both bus drivers to share one driver's seat in the same bus and bingo :).

    Where will it end with all the helper apps running in the background? That's why I don't often use MidiBridge (and because I don't get it most of the time^^).

  • @crabman, read Nic's page again. His midib.us is no more restrictive or exclusive than the Jack API, and like Jack it can co-exist with AudioBus. Unlike Jack and AudioBus, however, Nic's library does not require a helper app to be useful to the devs.

    Helper apps (and I include MidiBridge here) are useful to users who have problems fixable most conveniently - or only - by those helper apps. And their MIDI features will be just as useful to users of apps that use Nic's library for their internal MIDI management, as they are now, even in the face of the cacophony that is the state of MIDI implementations today. Perhaps especially so.
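
    To make the distinction concrete, here is a minimal sketch in plain CoreMIDI (not the actual midib.us API, whose calls I won't guess at) of an app publishing its own virtual MIDI input, so routers like MidiBridge or other apps can send to it directly, with no background helper app in between. The app and port names are made up.

    ```swift
    import CoreMIDI

    // Publish a virtual MIDI destination other apps can route to directly.
    var client = MIDIClientRef()
    MIDIClientCreateWithBlock("MySynth" as CFString, &client, nil)

    var virtualIn = MIDIEndpointRef()
    MIDIDestinationCreateWithBlock(client, "MySynth In" as CFString, &virtualIn) { packetList, _ in
        for packet in packetList.unsafeSequence() {
            // Copy out the raw MIDI bytes and hand them to the synth engine.
            let length = Int(packet.pointee.length)
            let bytes = Mirror(reflecting: packet.pointee.data).children
                .prefix(length)
                .compactMap { $0.value as? UInt8 }
            print("received MIDI:", bytes)   // e.g. [0x90, 60, 100] = note on, middle C
        }
    }
    ```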

    (warning: long spiel follows)

    As to "not getting 'IT': perhaps an illustration of my everyday studio setup - that relies heavily on MidiBridge - would convince you that there are places where these helpers make things possible. And I do not consider this an extreme example - in fact it is the result of several down-sizing episodes from my original studio, and even that was small compared to a pro setup. Back then my day job was designing MIDI routers for the pro setups. For iOS to really go Pro it is going to have to meet the same requirements. So far, few bits do. MidiBridge probably makes the number of usefully Pro bits larger. As does AudioBus.

    System Components:

    • MacBook running Logic Pro and Reason
    • iPad running whatever it can handle at the time
    • Yamaha DX-7E! (in local off mode so its synth is separate from its keyboard)
    • Roland MKS-50 with PG-300 programmer
    • Roland S-550
    • Emu Proteus 1
    • Roland A-80 master keyboard
    • Yamaha KX-5 keytar
    • Oxygen 8 (for its knobs)
    • Korg nanoKONTROL2
    • U33E control surface
    • Akai Synthstation25 used in USB-MIDI mode, also frequently with an iPhone inserted so it is also an audio source (and the iPhone gets to interact with the rest via Network MIDI).

    Needs:

    1. Just sitting down, powering up, and loading up the softwares of the day, and playing any synth from any keyboard or DAW, solo or all at the same time from the (many) sequencers in the system.
    2. Record it - Audio and/or MIDI - on either or both DAW systems, with playback from either or both.
    3. Minimal boot up mode is just iPad (noodling, ideas time, composing)
    4. Final mixdown and post boot-up is just MacBook.
    5. Tracking and initial mixdown is probably both together.

    The goal is to not have to fiddle with configurations and establishing connections and MIDI channel assignments etc in that last step. Just power up and play.
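
    For the curious, what gets recalled at power-up boils down to a routing table along these lines. This is purely illustrative - my own names and structure, not MidiBridge's actual format:

    ```swift
    // Toy routing table: who feeds whom, with channel remaps and splits.
    struct Route {
        let source: String                 // MIDI input port, e.g. the A-80
        let destination: String            // MIDI output port, e.g. the MKS-50
        let channelMap: [UInt8: UInt8]     // remap incoming channel -> outgoing channel
        let keyRange: ClosedRange<UInt8>?  // optional keyboard split
    }

    let studioSetup: [Route] = [
        Route(source: "A-80",         destination: "Proteus 1",          channelMap: [0: 0],  keyRange: 0...59),
        Route(source: "A-80",         destination: "MKS-50",             channelMap: [0: 5],  keyRange: 60...127),
        Route(source: "KX-5",         destination: "iPad (Network MIDI)", channelMap: [0: 1],  keyRange: nil),
        Route(source: "nanoKONTROL2", destination: "Logic Pro",          channelMap: [0: 15], keyRange: nil),
    ]
    ```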

    MidiBridge made it possible to include the iPad in these scenarios, and the latest updates make it even more so, since I also take various combos of the gear out to jams with friends, or to noodle on during the ferry commute.

    The only place where I really have to limit my choices is in how many and which synths + effects + DAW I can run at the same time on the iPad (new one). CPU power is one limitation, but the other main limitation is all the various MIDI incompatibilities. It is also impossible to simply visualize and verify the MIDI routing and mappings between all this stuff if one has to go to the MIDI setup page (usually buried several menus down) in each and every app and step through their options. And most of their defaults simply do not work unless they are the only synth or DAW running at the time.

    For now - this is rather lengthy already - I'll leave the technicalities of how this is all physically connected for another day and expressed interest. But it is really cool.

    So, how would you manage this mess without a helper such as MidiBridge? Or AudioBus, for that matter?

  • I would add that one of the main things the new MidiBridge adds is the concept of Scenes, which allow you to create MIDI configurations (routing and/or CC mapping, keyboard splits, etc) that are themselves selectable via MIDI messages. This is something I've seen people ask for a bunch on this forum: the ability to save an Audiobus "scene" or "snapshot" that incorporates the apps, routing, mode, parameter values, and every other aspect of a session so that you can come back tomorrow and just hit Go and get it all back. Or store it as metadata with a recorded track. Or email it to somebody (with the same apps) and have the configuration duplicated on their device. But this requires a bunch of infrastructure, because you'd need every musical app to be able to read a configuration stream to duplicate menu selections, much like macro record from ages ago.
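
    To make that concrete, here's a rough sketch of what I mean by a scene - my own made-up structure, not MidiBridge's actual format: a bundle of routing/mapping choices recalled by a Program Change number, and serializable so it could travel with a track or in an email.

    ```swift
    import Foundation

    // Illustrative scene format only.
    struct Scene: Codable {
        let name: String
        let recallProgram: UInt8         // Program Change number that recalls this scene
        let routes: [String: String]     // source port -> destination port
        let ccRemap: [UInt8: UInt8]      // incoming CC -> outgoing CC
    }

    let liveSet = [
        Scene(name: "Verse",  recallProgram: 0, routes: ["A-80": "Magellan"], ccRemap: [1: 74]),
        Scene(name: "Chorus", recallProgram: 1, routes: ["A-80": "Cassini"],  ccRemap: [1: 71]),
    ]

    // On an incoming Program Change, look up and apply the matching scene.
    func handleProgramChange(_ program: UInt8) {
        guard let scene = liveSet.first(where: { $0.recallProgram == program }) else { return }
        print("Recalling scene:", scene.name)
        // ...re-route ports and install the CC remaps here...
    }

    // Because it's Codable, the whole set can be stored as track metadata or emailed.
    let snapshot = try? JSONEncoder().encode(liveSet)
    ```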

    I actually asked the MidiBridge developer to make MidiBridge a MIDI app -- that is, to not just route and filter MIDI commands as it has 'til now, but to respond to MIDI commands to change its various routings and filterings, so that I can map a button on my keyboard to a routing change in MidiBridge. I believe the latest release can do this. Combined with OMAP, this should mean I can have two synths open in Audiobus and not only change the routing of my keyboard from one to the other, but also change which app is displayed on the iPad, all without my fingers leaving my keyboard.
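
    As a sketch of what I'm after (the CC number and synth names are invented for illustration):

    ```swift
    // Hypothetical handler inside a router app: an assignable button on the
    // controller sends CC 80, and the router flips which synth the keyboard feeds.
    let routingToggleCC: UInt8 = 80
    var keyboardTarget = "Synth A"

    func handleControlChange(cc: UInt8, value: UInt8) {
        guard cc == routingToggleCC, value > 0 else { return }
        keyboardTarget = (keyboardTarget == "Synth A") ? "Synth B" : "Synth A"
        print("Keyboard now routed to", keyboardTarget)
        // The router would re-patch its virtual ports here, and could also bring
        // the newly targeted app to the foreground.
    }
    ```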

    I've also asked several developers to map MIDI messages to panel changes or finger taps -- think changing panels in an app like Magellan or Cassini -- so that with the press of a single button on my keyboard, I can both change what panel is displayed on my iPad and what virtual button/slider my physical button/slider is mapped to.

    The Audiobus guys have been awesome about convincing the vast majority of top music developers to incorporate inter-app audio into their apps, and I know that common, uniform MIDI behavior is high on everyone's list. I think getting the scene/snapshot behavior to be common, intuitive, and interchangeable would be a great achievement, but also a bunch of work, since following any sort of standard would be voluntary and driven by user demand.

  • Midi makes me pull my hair out. Anything that makes it better is welcome IMO.
