Generator or Instrument. What's the difference?
I always see the different instances but have never found a good explanation on the difference between the two. Thanks for any info.
Comments
Lol let me know what you find out!
Instrument receives MIDI, Generator does not. However, both produce audio.
IMHO - I think we need a little more context to give a meaningful response. Can you provide specific apps and/or descriptions that use these terms?
Even if we restrict the use of these terms to music-related topics, there are still many possibilities. Also, taking into consideration the Internet and poor translations, the terms may not even be appropriate in the context in which they were used.
You can send midi to a Generator in AUM using "Midi Route".
I think they are very specific terms as related to audiobus and AUM routing.
@ExAsperis99 - I concede that you are correct with respect to Audiobus and AUM routing. Although this is the Audiobus forum, I took the question in more general terms since it wasn't specified that it was in reference to AB. My bad.
They are used across all of iOS but not necessarily in a user-facing way. They exist in the context of both Audio Unit extensions and Inter-App Audio.
Apple defines a generator as something that produces audio with no MIDI input, while an instrument (or music device) both produces audio and receives MIDI.
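For the curious, that distinction is literally encoded in the plug-in's component type. Here's a minimal Swift sketch (my own illustration, not anything from this thread; the helper name is made up) that lists the installed Audio Unit extensions of each kind:

```swift
import AudioToolbox
import AVFoundation

// Apple encodes the difference in the component type:
//   kAudioUnitType_MusicDevice ('aumu') = instrument: renders audio AND accepts MIDI
//   kAudioUnitType_Generator   ('augn') = generator: renders audio, no MIDI input
func listInstalledUnits(ofType type: OSType) {
    // Zeroed fields act as wildcards, so this matches every unit of the given type.
    var description = AudioComponentDescription()
    description.componentType = type

    for component in AVAudioUnitComponentManager.shared().components(matching: description) {
        print("\(component.manufacturerName): \(component.name) [\(component.typeName)]")
    }
}

listInstalledUnits(ofType: kAudioUnitType_MusicDevice) // instruments
listInstalledUnits(ofType: kAudioUnitType_Generator)   // generators
```

The same two roles exist for IAA, just under the "remote" type codes kAudioUnitType_RemoteInstrument and kAudioUnitType_RemoteGenerator.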
No problem at all. I saw that you were entering the realm of nuanced discussion and logic — which clearly has no place in the discussion of iOS midi!
Thanks. I was more specifically referring to how IAA apps are listed in AUM, either generator or instrument or both. In this case I was using Module and sending MIDI to it, but it's listed as a Generator. Both receive MIDI in AUM and seem to provide the same routing functions, so I'm still not sure exactly what the distinction is. I appreciate the input though.
Thank you for this straight to the point explanation
@samu once provided me with the most accurate and concise explanation (which I think is a little more than the above) I have heard.
It made complete sense until about three minutes later when I forgot what he said. Maybe he'll chime in when the time-zones align.
The IAA instrument MIDI port in AUM will always be listed specifically as Inter-App Audio. The developers of IAA apps need to add IAA Generator and/or Instrument support themselves. Some generators can also receive/send MIDI and will be listed as Virtual in AUM's MIDI grid, like Fugue Machine. Depending on your setup, you may need to try the virtual or the IAA MIDI port, or it may work without connecting any of them to the grid in AUM (e.g. Fugue Machine and Addictive Synth).
Usually generators require the midi matrix to route midi. Instruments let you do it in the track settings or midi matrix. Talking AUM here.
Korg Module can receive MIDI, but I think it uses Core MIDI or virtual MIDI. I don't know what's going on underneath---I kind of miss the days of physical cables---but AUM could send it MIDI that way even if Module is a generator with regard to IAA.
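For anyone wondering what that Core MIDI path actually looks like, here's a rough Swift sketch of a host pushing a note to another app's published MIDI destination. It's only an illustration: the client/port names and the "Module" match are placeholders, and it assumes the receiving app has already created a Core MIDI destination.

```swift
import CoreMIDI

// Rough sketch: a host sending a note-on to another app's Core MIDI destination.
// This route works whether the receiving app is an IAA generator or an instrument.
var client = MIDIClientRef()
MIDIClientCreate("MidiSendDemo" as CFString, nil, nil, &client)

var outPort = MIDIPortRef()
MIDIOutputPortCreate(client, "MidiSendDemo Out" as CFString, &outPort)

// Walk the system's MIDI destinations and pick one by display name
// ("Module" is just an example of what a receiving app might publish).
var target: MIDIEndpointRef = 0
for i in 0..<MIDIGetNumberOfDestinations() {
    let endpoint = MIDIGetDestination(i)
    var cfName: Unmanaged<CFString>?
    MIDIObjectGetStringProperty(endpoint, kMIDIPropertyDisplayName, &cfName)
    if let name = cfName?.takeRetainedValue(), (name as String).contains("Module") {
        target = endpoint
        break
    }
}

// Build a one-packet list holding a note-on (channel 1, middle C, velocity 100).
let noteOn: [UInt8] = [0x90, 60, 100]
var packetList = MIDIPacketList()
let packet = MIDIPacketListInit(&packetList)
_ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                      packet, 0, noteOn.count, noteOn)

if target != 0 {
    MIDISend(outPort, target, &packetList)
}
```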
I think that Generators can generate MIDI to the host, i.e. via a keyboard or step sequencer in the Generator, while Instruments only receive MIDI from the host.
Confuses the shite out of me every time though.
Edit: and Generators, if I have this right, both send and receive.
In terms of grooveboxes and drum machines Egoist (an instrument) automatically syncs to an IAA host while Attack (a generator) does not.
So developers should always choose Instrument then? Why settle for Generator?
The app doesn’t use MIDI or the developer is new to iOS and doesn’t know IAA well enough to do instruments?
Yah, I think instruments are simply better than generators if they can pull it off. Really though, I think IAA is on the way out. I imagine AUs will just get better and better over time.
yes
As an example, GarageBand allows only audio generation.
Cubasis lets you choose an app as an instrument and allows both MIDI and audio generation.
IAA Generator = when attached, outputs audio; MIDI needs to be fed separately.
IAA Instrument = when attached, receives MIDI and outputs audio.
In both cases (as far as I know) MIDI input (i.e. to be able to record incoming MIDI from an IAA Instrument) needs to be configured manually in the recording app.
Personally I try to avoid IAA (Instruments & Generators) as much as possible.
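For the curious, this is roughly what the choice looks like on the developer side; as far as I understand, the component type an app publishes is what hosts like AUM read when labelling it Generator or Instrument. A hedged Swift sketch only: the subtype, manufacturer code and node name are placeholders, and a real app also needs a matching AudioComponents entry in its Info.plist.

```swift
import AudioToolbox

// Sketch of the developer side: an IAA app publishes its RemoteIO unit under one
// of two component types, and that type is what hosts read when labelling the app.
func publishIAANode(remoteIOUnit: AudioUnit, asInstrument: Bool) {
    var description = AudioComponentDescription(
        componentType: asInstrument
            ? kAudioUnitType_RemoteInstrument   // 'auri': receives host MIDI, outputs audio
            : kAudioUnitType_RemoteGenerator,   // 'aurg': outputs audio, MIDI fed separately
        componentSubType: 0x64656D6F,           // 'demo' - placeholder subtype
        componentManufacturer: 0x64656D6F,      // placeholder manufacturer code
        componentFlags: 0,
        componentFlagsMask: 0
    )
    let status = AudioOutputUnitPublish(&description,
                                        "Demo IAA Node" as CFString,
                                        1,               // version
                                        remoteIOUnit)
    if status != noErr {
        print("IAA publish failed with status \(status)")
    }
}
```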
It’s all down to history really I think
Once upon a time there were music apps and if you wanted to use the sounds of more than one there was AudioCopy
Then came Audiobus and the short-lived Jack for iOS, and Apple decided these were a security risk in some way.
So Apple produced IAA, and the Audiobus guys rewrote AB so that it used IAA generator technology; as a result, all the AB apps that updated to the new AB SDK got AB generator ability as a freebie. Many of these apps already had Core MIDI features and didn't bother to add IAA MIDI.
And now of course we all want AU
little classification schema:
So please help me out with this setup:
I wanna record from Navichord into Cubasis only the bass note when I play chords. I've set up BS-16i as an AU with a bass preset in Cubasis. But here's the thing: I want Step Poly Arp to sit in the middle between Navichord and BS-16i so I can record the MIDI output of SPA arping the bass AU in Cubasis.
This is madness!
In short it feels like you need to 'chain' some of the apps, i.e. use Step Poly Arp as an 'Input' to Cubasis and set Step Poly Arp to listen to Navichord. (I don't have those apps, but that's the logic behind MIDI chaining.)
I know I'm not helping but have you considered looking into modular/euro? Sounds like it'd be right up your alley
@Samu thanks! The only thing that confuses me is that once I add a MIDI track in Cubasis, say I add SPA in there, I couldn't decide whether to add it as a Generator or an Instrument. So I went with Instrument. But then I couldn't figure out how to send MIDI from that channel, which has SPA, to the channel that has BS-16i as an AU. I did set SPA to receive MIDI from Navichord and set the correct MIDI channel corresponding to the one Navichord is outputting MIDI on. But how can I send to the channel hosting the AU?
@ka010 thanks! But I have no clue what modular/euro is.
Also if say Cubasis is hosting Navichord and SPA as IAA, do I need to make midi settings in the respective apps or just in Cubasis?
No need to host them in Cubasis since you're only inputting midi to Cubasis.
And yes, midi-settings in respective apps.
Thanks Samu!
I was able to record the midi already arped from SPA! It was so cool to see that arped midi for the Bass recorded in Cubasis. I did host Navichord and SPA in Cubasis though as this allowed me to sync start all three apps via the IAA side bar.
For some reason I'm finding that the playback, or should I say the sync, is a bit tighter with IAA? Don't know, it just feels like Navichord Song Mode plays better as the chords play back.
Well, thanks for the advice.