How should MIDI be implemented in the new Audanika app?
Hey guys, I'm Gabriel, the inventor of SoundPrism. As you might already know, I'm working on Audanika, a new digital musical instrument that is going to become the successor to SoundPrism.
https://forum.audiob.us/discussion/44558/new-app-audanika-the-future-successor-to-soundprism
Audanika is already in the App Store and the Play Store, but be warned: it has only one sound, no connectivity, and no recording. But hey, it has an improved tone system that closes significant gaps in SoundPrism's layout. So get familiar with it today and take off once MIDI is available.
Shortfalls of existing MIDI implementations
With existing DAWs, I often have the following problem: MIDI connections get lost when I switch the environment, e.g., switching from one MIDI interface to another. Additionally, I'm often missing convenient ways to map or filter MIDI channels and controllers.
Thus I'm thinking about implementing the following concept for MIDI Out:
MIDI Environments
I'm planning to allow users to create so-called MIDI Environments. A MIDI environment defines how MIDI generated by the MIDI sources is processed by MIDI Transformers and sent through a MIDI Gate to MIDI destinations. I'm planning to offer the creation of multiple MIDI environments at the same time, so you can define yourself how to split MIDI between multiple environments.
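To make the idea more concrete, here is a minimal sketch of such an environment in Swift. All names and types are hypothetical illustrations, not the actual Audanika API: MIDI from enabled sources runs through a chain of transformers and is handed to a gate that fans it out to destinations.

```swift
// A minimal sketch of the proposed pipeline (all names are illustrative,
// not the actual Audanika API). MIDI from enabled sources runs through a
// chain of transformers and is handed to a gate that fans it out.
struct MIDIEvent {
    var status: UInt8  // status byte; low nibble = channel (e.g. 0x90 = note on, channel 1)
    var data1: UInt8
    var data2: UInt8
}

struct MIDIEnvironment {
    var name: String
    var enabledSources: Set<String>                // e.g. ["BassBar", "PlayingSurface"]
    var transformers: [(MIDIEvent) -> MIDIEvent?]  // nil means the event was filtered out
    var gate: (MIDIEvent) -> Void                  // forwards to the enabled destinations

    func handle(_ event: MIDIEvent, from source: String) {
        guard enabledSources.contains(source) else { return }  // source not used in this environment
        var current = event
        for transform in transformers {
            guard let next = transform(current) else { return }  // event filtered out
            current = next
        }
        gate(current)
    }
}
```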
MIDI Sources
Audanika offers multiple MIDI sources. For each MIDI environment, you can decide which of these sources are used in that environment.
MIDI Transformer
For each of the MIDI sources offered by Audanika, you can define MIDI transformers. These transformers allow you to perform operations like the following (see the sketch after the list):
- Transforming MIDI channels
- Mapping MIDI controllers
- Filtering MIDI channels, etc.
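For illustration, here is a rough sketch of what such transformer operations could look like, assuming a simple closure-based design. The names and types are made up for this example, not the planned implementation:

```swift
// Sketch of the three operations above as simple closures (illustrative only).
struct MIDIEvent {
    var status: UInt8  // low nibble = MIDI channel 0-15
    var data1: UInt8
    var data2: UInt8

    var channel: UInt8 { status & 0x0F }
}

typealias MIDITransform = (MIDIEvent) -> MIDIEvent?

// Transforming MIDI channels: move everything arriving on `from` to `to`.
func remapChannel(from: UInt8, to: UInt8) -> MIDITransform {
    return { event in
        guard event.channel == from else { return event }
        var remapped = event
        remapped.status = (event.status & 0xF0) | (to & 0x0F)
        return remapped
    }
}

// Mapping MIDI controllers: e.g. turn CC 1 (mod wheel) into CC 74 (filter cutoff).
func mapController(_ source: UInt8, to target: UInt8) -> MIDITransform {
    return { event in
        guard event.status & 0xF0 == 0xB0, event.data1 == source else { return event }
        var mapped = event
        mapped.data1 = target
        return mapped
    }
}

// Filtering MIDI channels: drop every event that is not on an allowed channel.
func allowChannels(_ allowed: Set<UInt8>) -> MIDITransform {
    return { event in allowed.contains(event.channel) ? event : nil }
}
```

A per-source transformer chain would then just be an array of these closures applied in order.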
MIDI Gate
Finally, the transformed MIDI is sent to a MIDI gate. With a MIDI gate, you define which MIDI destinations the MIDI is sent to and which ones it is not.
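As a hedged sketch with made-up names, the gate could simply copy each outgoing message to every destination that is currently enabled:

```swift
// Sketch of the gate: copy each outgoing message to every destination that is
// currently enabled (names are illustrative; the send closure could wrap CoreMIDI).
struct MIDIGate {
    var destinations: [String: Bool]      // destination name -> enabled in this environment?
    var send: (String, [UInt8]) -> Void   // actually delivers the bytes to a named destination

    func forward(_ bytes: [UInt8]) {
        for (destination, enabled) in destinations where enabled {
            send(destination, bytes)
        }
    }
}
```

Because the sources only ever talk to the gate, swapping the physical interface behind a destination name never touches the environment itself.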
Resulting Possibilities
- Create a particular MIDI environment for each synthesizer you have
- Send out the same MIDI to different destinations
- Easily replace MIDI Interface A with MIDI Interface B without losing mappings
My Questions
- Is this concept too complicated?
- Does this concept make sense?
- Is there something missing that you need?
- Did I choose the proper terminology? Are there better names for the parts of the concept?
Comments
This issue has recently come up in a discussion about virtual MIDI ports in Drambo, more specifically, in a mega project done by @Gravitas which uses over 100 MIDI port modules to implement MIDI controller LED feedback.
It's not only a problem switching to a different interface - because each port has to be re-adjusted to the correct one individually - but also if you want to make such a project available publicly and there's no "search and replace" to fix it easily.
A possible solution could be:
Prompt the user which source fails to find which destination and let her/him replace it with one of the currently available ones. If there are multiple routings in your app that had used the same interface before, replace them all.
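A rough sketch of that "replace them all" idea, assuming routings are stored by destination name (hypothetical types, nothing from Drambo or Audanika):

```swift
// Sketch of the "replace them all" idea, assuming routings are stored by
// destination name (hypothetical types, nothing from Drambo or Audanika).
struct Routing {
    var source: String
    var destination: String
}

// Every routing whose destination is no longer available gets a substitute
// chosen by the user; the substitute is applied to all affected routings at once.
func repairRoutings(_ routings: inout [Routing],
                    available: Set<String>,
                    askUser: (String, Set<String>) -> String) {
    let missing = Set(routings.map(\.destination)).subtracting(available)
    for lost in missing {
        let replacement = askUser(lost, available)  // prompt once per lost interface
        for index in routings.indices where routings[index].destination == lost {
            routings[index].destination = replacement
        }
    }
}
```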
Other than that, I don't find your concept too complicated if you find a way to show a clear and easy to understand visual representation of the routing and processing blocks.
BTW what will the "MIDI Gate" do? Forward MIDI to one of the destinations controlled by a gate signal that enables/disables it?
To me it seems like the concept of the MIDI Environment makes routing easy, yet powerful (which can be complex).
Sounds like the perfect solution, since midi can be complex for most users
@gatzsche If the "Gate" is just an output routing matrix, it might be better to name it "Output" or "Destination". The term "gate" has strong special meaning for anyone familiar with modular synthesis.
Moving that project from one iPad to another would be laborious for other users.
I've now put together a 16-step and a 32-step drum sequencer using the LP X.
While I was configuring the 32-step drum sequencer, I added another MIDI controller for a basic live looper and all of the MIDI outputs changed.
I have the patience to reconfigure, but an end user most probably would not.
Agreed.
Not sure of the full set of capabilities you envision for the transformers, but you might want to look at this recent thread: https://forum.audiob.us/discussion/45590/generative-melodic-variation-of-existing-midi
@rs2000 Thank you so much for your suggestions!
Here is how I understand you:
When you plug in a new interface, the app should ask you:
a) If you want to add an additional interface
b) If you want to replace one of the previously used interfaces
If a) is selected, the interface will appear in all "MIDI Environments", but it will be disabled by default.
If b) is chosen, you will be asked which interface is to be replaced. After that, the newly connected interface will appear in all "MIDI Environments", but it will only be enabled in the environments where the replaced interface was enabled before.
Is that what you have in mind?
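For what it's worth, here is how I would picture that flow in code, purely as an illustrative sketch with made-up names; the key point is the per-environment enabled state that a replacement inherits:

```swift
// Illustrative sketch of the a)/b) flow with made-up names.
struct Environment {
    var name: String
    var destinationEnabled: [String: Bool]  // interface name -> enabled in this environment?
}

// Case a): the new interface appears everywhere, but disabled by default.
func addInterface(_ newInterface: String, to environments: inout [Environment]) {
    for index in environments.indices {
        environments[index].destinationEnabled[newInterface] = false
    }
}

// Case b): the new interface replaces an old one and inherits where it was enabled.
func replaceInterface(_ old: String, with newInterface: String,
                      in environments: inout [Environment]) {
    for index in environments.indices {
        let wasEnabled = environments[index].destinationEnabled[old] ?? false
        environments[index].destinationEnabled.removeValue(forKey: old)
        environments[index].destinationEnabled[newInterface] = wasEnabled
    }
}
```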
The "MIDI Gate" is like an airlock in a spaceship. Sources in a MIDI environment are not connected directly to virtual or physical MIDI destinations. Much more they are connected to the "MIDI Gate" which will forward the message to the enabled physical devices.
This makes sure, connections do not get lost, when a MIDI interface is replaced. Also switching a MIDI interface should be easy.
Yea, it's more like an output matrix, or better, a MIDI splitter. The MIDI arriving at the "MIDI Gate" is copied to all enabled destinations in the "Gate".
I understand that "Gate" is not a good term. But "Output" or "Destination" is too general for me.
What do you think about naming it "MIDI lock" in the sense of an "Airlock in an airplane"?
Or how about "MIDI Output Proxy" or "MIDI Destination Proxy"?
@gatzsche Your first reply: Yes, exactly.
Your MIDI Gate: OK, sounds like you're planning to introduce another internal set of (named?) virtual MIDI ports that can be connected to "real" MIDI ports at any time?
If so then I think that's a great idea!
What I have in mind with the transformer is:
Generative melodic variation of existing MIDI is new to me. Are you using that in your productions?
Exactly. But I cannot name it a virtual MIDI port, because there is already a concept of "virtual MIDI ports" in iOS.
Meta ports or port types maybe?
How about Midi Sorter or Midi Router, instead of Midi Gate?
Anyway, your ideas seem very logical and would likely work great.
It doesn’t sound overly complicated, and would save a lot of hassle when trying to change Midi destinations.
I wasn’t sure about this app when it was released, but I bought it, based on some users’ feedback on this forum, to give it a try.
I am glad I did. I am loving this new layout (and the default sound).
When your Midi plans get implemented, this app will become incredibly useful.
Repeat post from other Audanika thread.
Hey @gatzsche Happy to have you in the discussion. Really excited for MIDI out. It would be really cool if the MIDI out could send on multiple MIDI channels. For example: the vertical "Bass Bar" on the left hand side could be on MIDI channel 1, and the playing surface chords/notes on a separate MIDI channel, MIDI channel 2. Or whatever, just the ability to send MIDI out on a couple of separate channels, so it can play 2 or even 3 different apps in AUM, AB3, ApeM, etc… it could then also be recorded separately, if one wants to, in Atom 2 or whatever. The ability to send both the Bass Bar and playing surface on the same channel or on separate MIDI channels would be dope. Cheers!
Thank you all for your suggestions. I think I will name it just "MIDI Destinations". Do you see any confusion in naming it this way?
Assuming that multiple MIDI environments are allowed in the app, you could, for example, do the following with the MIDI generated by the bass:
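As an illustrative sketch only (hypothetical names, not the planned implementation): one environment could take just the Bass Bar and force MIDI channel 1, while another takes the playing surface and forces channel 2, so each can drive its own synth in AUM, AB3, etc.

```swift
// Purely illustrative (hypothetical names): one environment takes only the
// Bass Bar and forces channel 1, another takes only the playing surface and
// forces channel 2, so each can drive its own synth.
struct ChannelEnvironment {
    var source: String            // e.g. "BassBar" or "PlayingSurface"
    var channel: UInt8            // 0-based, i.e. 0 = MIDI channel 1
    var send: ([UInt8]) -> Void   // delivers the bytes to this environment's destination

    func handleNoteOn(note: UInt8, velocity: UInt8, from origin: String) {
        guard origin == source else { return }            // ignore the other source
        send([0x90 | (channel & 0x0F), note, velocity])   // note on, forced to this channel
    }
}

let bassEnvironment = ChannelEnvironment(source: "BassBar", channel: 0) { _ in /* send to synth A */ }
let surfaceEnvironment = ChannelEnvironment(source: "PlayingSurface", channel: 1) { _ in /* send to synth B */ }
```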
As a bit of a MIDI numbskull (sometimes I get it, more often it's all a little confusing) this actually makes sense - as someone who might just be a typical user of audanika I appreciate the simplicity of the approach...👍
I love the environment concept, but as a suggestion, please make the GUI user friendly.
The Logic one, for example, pains my eyes.
Maybe your environment could be an IAP with a suite of apps (remapping, filtering, routing, monitoring, etc.) for those who like complex stuff,
and the basic version can be the classic 3-channel model.
Yeah... MIDI Destinations is a good name indeed. It will not be confusing, and the whole concept is easy to grasp 💖
Exposing some parameters for CC would also be great.
I just downloaded this app, the concept is fantastic!
For MIDI, I would really like to be able to open it alongside other apps, like Koala Sampler for example,
as I find SoundPrism very difficult to record on the same device because I have to switch between apps while playing.
As an AUv3 it would kinda work like this, but I prefer to use apps fullscreen, so a side window would be perfect.
Anything, anything at all that makes MIDI easier to use and simpler to implement is a good thing. Well done for thinking about how to tackle this.
What, in regards to MIDI, is so fantastic for you?
As long as it can send MIDI to other iOS apps, I'm happy. And again, split screen capabilities would be awesome for this. Personally, what I would want to do with it is use it to record chords and melodies into NanoStudio 2. If it was an AUv3 and could send MIDI out to any host, I could load it as such inside NS2 and then tell NS2 to send the MIDI to the "parent track". So each track would have an instance of Audanika as a "sibling track", sending the MIDI to the "parent", which would have the sound generator. These are NS2 terms if you're not familiar.
But without all that, just being able to open it in split screen alongside the app it is controlling would be great. And I’m totally fine with it going into a vertical orientation for that if needed
I don't understand the question. I was referring to the melody/chord arrangement, it's just genius.
For MIDI, I'm happy for it to transmit bass, melody and chords on different channels, and as I said, to have it available in a smaller window to be able to use it alongside other apps, just like Koala Sampler.
If I need more complicated MIDI routing, I would manage it directly in AUM with StreamByter if really needed, but I don't see the use case.
Changing the octave would be useful; I find myself playing with only the first three rows of SoundPrism because the octaves are set too high.
I thought your comment referred to the screenshots. 😃
Yea, you meant Audanika, right? Thanks for appreciating it. 💪
I have put it on the feature list. Should I just make the columns smaller in split view, or would you expect the number of columns to decrease?
An adjustable amount maybe? iDevices can be very small or quite large but the finger size doesn't change. And on top, people are different 😄