Maybe not specifically for Korg users, but Inter-App Audio in general. Or a better way to put it would be "everything that is not AU or external synth".
The general idea is to use the SynthJacker slicing engine to slice an audio file that was recorded outside of the app. You could generate a MIDI file based on the SJ sequence settings, play it back with your IAA app and record the output, then bring the recording into SJ for slicing and other post-processing.
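To make the MIDI-file half of that concrete, here is a minimal sketch using Apple's AudioToolbox MusicSequence API (which SynthJacker may or may not use internally; the note range, velocity and timings are just stand-ins for whatever the SJ sequence settings specify):

```swift
import AudioToolbox
import Foundation

// Illustrative sequence settings: sample every third semitone from
// MIDI note 36 to 84, holding each note for 2 beats with a 1-beat gap.
let notes = stride(from: UInt8(36), through: 84, by: 3)
let noteDuration: MusicTimeStamp = 2.0
let gap: MusicTimeStamp = 1.0

var sequence: MusicSequence?
NewMusicSequence(&sequence)
var track: MusicTrack?
MusicSequenceNewTrack(sequence!, &track)

var beat: MusicTimeStamp = 0
for note in notes {
    var message = MIDINoteMessage(channel: 0, note: note, velocity: 100,
                                  releaseVelocity: 0,
                                  duration: Float32(noteDuration))
    MusicTrackNewMIDINoteEvent(track!, beat, &message)
    beat += noteDuration + gap
}

// Write a standard MIDI file that any MIDI file player can load
// (resolution 0 means "use the default PPQ").
let url = URL(fileURLWithPath: "autosample.mid")
MusicSequenceFileCreate(sequence!, url as CFURL, .midiType, .eraseFile, 0)
DisposeMusicSequence(sequence!)
```

Play that file back through the target app, record the result, and the recording comes back to SJ already aligned with the sequence timings, which is what makes the slicing step possible.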
@dendy yes, that would work, thank you.
@coniferprod excellent idea. Sounds like an "easy" and general solution. I know developers hate it when people say something is easy.
I will recommend SJ to the Deluge community, as it is becoming an essential tool for me.
Thanks for giving the Thor and the Thor people some love! Really appreciate it!!!
When you think about it, this is all you need to sample IAA apps. Most users have AUM, which can host IAA apps and record the output. So in my opinion there's no need for Audiobus integration, since the solution is so general that it will work with any app that can receive/play MIDI and record audio.
Exactly – and that is why I don’t really see the need for full-on Audiobus support.
@coniferprod: I think supporting audio input via AB (which is all we are talking about) is way more straightforward than you think. It would also free you from needing to host AUs yourself.
I am not sure that more people have AUM than Audiobus, btw.
Off topic: there was a recent poll where AUM dominated. That's normal, since AB3 was introduced much later and its first versions were miles behind.
Back on the subject, we don’t know if creating an AU host is easier than creating an AU plugin. That’s for the dev to decide.
A couple of thoughts
I believe (not as an expert obviously!) that IAA, and therefore AUM compatibility, comes as a side benefit of implementing Audiobus compatibility. So it’s not a question of either/or unless a developer decides to go IAA only.
Here's a real benefit to being able to sample inside Audiobus: I could load an IAA or AUv3 app, insert FX (after a synth to get EQ, for example), and then pipe that signal into SynthJacker. With AUv3 loading I have to accept the output of the AUv3 without any additional controls... some of the AUv3s, by the way, do not even allow me much control of the app itself. Audiobus was, is, and will be the best tool for piping audio and MIDI together on iOS. There are 100+ developers that got the message and coded to Audiobus as a de facto standard. Audiobus is typically required to get a complex setup to start/stop from a single transport button.
So... not the future? No. Trusting Apple to solve all problems is tricky. The fact that @coniferprod got AUv3 hosting to work is a testament to his skills. It's not a trivial exercise given Apple's documentation and what the standard does. He had to code a host to get us this capability, and it's worth every penny. But adding Audiobus support as an FX app (meaning we feed it audio to be processed) would instantly double if not triple its value to me.
But I do still want the longer sample time too. I just wanted to chime in that Audiobus is the essential standard most great apps support. The rush to "AUv3 or nothing" has been oversold here. There are apps that can only be glued together with AB3, like the @LuisMartinez Drummer apps and iBassist for transport controls. SynthJacker could help us make loops from these apps, but only with AB3 support to feed the MIDI and record the signal with some acceptable recording time. SynthJacker now offers the option to just save the whole recording, so multi-minute recordings of anything could be made by just triggering drums and saving the recording. That could be useful too.
Anyway, I love what it does for AudioLayer and NS2 sample sets, and the SFZ output also works for import into Auria Pro's Sampler. It's really great, and dumping enhancement requests is what we do here. We are a swamp of "free advice". Take it for what it's worth.
But you forget that Audiobus is an app. There's no reason the MIDI output from SynthJacker couldn't be sent to Audiobus' virtual MIDI port, and the resulting output recorded with AUM or AudioShare in the output slot. Or... simpler... just sent to AUM the same way you would send it to any other app.
I agree that Audiobus capability is a good thing to have, but I don't see a compelling need for it in a one-time capture tool like SynthJacker, which requires no live performance capability.
Audio input is not enough, because SynthJacker needs to be able to drive the app it is sampling, with MIDI output to the synth. That amounts to hosting, doesn't it? That was where I hit a wall with Audiobus on my first attempt.
Since there probably won't be many new IAA apps, I think it's better for me, with limited resources, to concentrate on the AU support. It also opens the path to adding an effects chain to the instrument AU. That hasn't been requested much, but it's something I would like to get opinions about.
But re: MIDI, your app already sends MIDI to external devices, right? You can just send MIDI to Audiobus' MIDI port. No need for you to figure out hosting plugins if you support AB3 audio input, as that would enable capable AU hosts like AUM, AB3, and maybe apeMatrix to send their audio chains to SynthJacker, since they can send their output to Audiobus outputs.
My understanding from other devs is that that is a lot less effort than creating a robust AU host that can handle effects chains. Some apps that have tried to become AU hosts have really struggled to get it right.
I think I can see the confusion, although I'm not a developer either. For it to work as it currently does, it would need to be an IAA host, but it might work if it were implemented with Audiobus compatibility as both a MIDI sender and an audio receiver. You'd need multiple ports: one to drive the synth, and an Audiobus receiver (which goes in the output slot) to receive the audio.
I'm guessing, but that might be easier than making it a simple IAA host.
This is true. But this just uses SynthJacker to configure a MIDI sequence. On the back end it slices the recording, normalizes the resulting WAV, AIFF or CAF samples, places them in folders, uses naming to simplify loading into AudioLayer and NS2, and creates an SFZ file for loading those sample sets into samplers that accept that format.
So having it trigger the MIDI notes AND be the recording target is why Audiobus support would let us use it with more flexibility on the input side while still using its code to slice, process and label the results. Just configure a run, and 10 minutes later a huge piano instrument is done and ready to load up and play. Sometimes some hand tweaking is required, but every update has solved the issues that required tweaking, and the results are getting better and better. It does (in my experience) become unstable for runs exceeding 13 minutes, but when we give @coniferprod the setup that creates the issues, he seems to add a fix in another update. The active users seem to have slowed down, so he could be looking at new apps to create some income. But I think there are updates that would spur more purchases, with more discussion here.
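Since the SFZ output keeps coming up: SFZ is just a plain-text file of region opcodes, so the mapping step is conceptually simple. A minimal sketch, assuming one sample per captured note and hypothetical file names (SynthJacker's actual naming and mapping logic may differ):

```swift
import Foundation

// One captured slice: the audio file and the MIDI note it was sampled at.
struct Slice {
    let fileName: String
    let rootNote: Int
}

// Build an SFZ mapping where each region covers the keys from its own
// root note up to just below the next sampled root, so un-sampled keys
// are pitched up from the nearest sample below them.
func makeSFZ(slices: [Slice]) -> String {
    let sorted = slices.sorted { $0.rootNote < $1.rootNote }
    var text = ""
    for (i, slice) in sorted.enumerated() {
        let hiKey = i + 1 < sorted.count ? sorted[i + 1].rootNote - 1 : 127
        text += "<region> sample=\(slice.fileName)"
        text += " lokey=\(slice.rootNote) hikey=\(hiKey)"
        text += " pitch_keycenter=\(slice.rootNote)\n"
    }
    return text
}

// Example: three slices a minor third apart.
let sfz = makeSFZ(slices: [
    Slice(fileName: "piano_048.wav", rootNote: 48),
    Slice(fileName: "piano_051.wav", rootNote: 51),
    Slice(fileName: "piano_054.wav", rootNote: 54),
])
try? sfz.write(toFile: "piano.sfz", atomically: true, encoding: .utf8)
```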
I don’t think you understand yet. The developer has said that he’s working on making it so that the app can send midi to apps, then import a file recorded by that app for the slicing and dicing part. So, SynthJacker could send MIDI to Audiobus, AUM, Gadget, or whatever, and then import the resulting wave file (recorded by whatever means) to complete the process.
If I’ve understood that correctly, then I honestly can’t see what else is needed. Send the midi to AUM and record a stem at whatever point in the signal chain you like. Import it and your processed whatever can now be an instrument.
I am looking into purchasing SynthJacker to do just one specific thing.
Sampling just one sound.
I am a bit stupid with technology so please be kind. I had written a song with a sequence on a Korg M1 many years ago. I used a specific Combi Sound "MidiStack1" The M1 died a slow death. I was looking at buying another. Then the iM1 app came out and it sounds fantastic! I got my old sounds back, but no sequencer (I hated that old M1 sequencer!)
I got NanoStudio 2: great sequencer, great sounds! But I cannot use iM1 in NanoStudio 2.
I need that exact MidiStack1 sound for that one song as a sequence.
My question is:
How do I use SynthJacker to faithfully sample the iM1 and then use that sound in NanoStudio 2?
The iM1 uses Inter-App Audio, but SynthJacker does not support IAA directly. But, if you have some audio hardware, it should be possible. This thread explains it better than I can:
https://forum.audiob.us/discussion/32160/using-synthjacker-to-record-any-app-on-your-ipad-or-phone
If you have specific questions about the method, it might be good to ask them in that thread, so that you will get more information in the right context.
Also, to create samples that can be easily imported into Obsidian in Nanostudio 2 you will first need to set a couple of sample naming options in SynthJacker.
Went there; that made no sense to me.
Thanks anyway
Over at the NS2 forum @dendy made a great tutorial on how to do it. Check it out.
@ralis Maybe he meant this:
https://www.blipinteractive.co.uk/community/index.php?p=/discussion/503/synthjacker-auto-sampler-app
@coniferprod - is this summary accurate? I typically have to route midi out to my audio interface and back into the iPad to route midi to anything other than a loaded AUv3 app.
Are you intending to allow the routing of midi to local apps?
Then, are you going to allow importing files for slicing into what I think of as "note" files, which are created by detecting the silence between samples and then slicing at the noise threshold crossing? (Roughly the idea sketched after this comment.)
I've never seen this plan in our discussions. But it would open the door to a lot of new flexible use cases.
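For anyone wondering what that slicing step involves, threshold-based note detection is conceptually just a scan for loud and quiet stretches. A toy sketch, where the threshold and minimum-gap values are illustrative guesses, not SynthJacker's actual parameters:

```swift
// Scan a buffer of mono samples and return the ranges that stay above
// a noise threshold, treating a long enough quiet stretch (here ~0.1 s
// at 44.1 kHz) as the gap between two notes.
func sliceBySilence(samples: [Float],
                    threshold: Float = 0.01,
                    minGapLength: Int = 4410) -> [Range<Int>] {
    var slices: [Range<Int>] = []
    var sliceStart: Int? = nil
    var quietRun = 0

    for (i, sample) in samples.enumerated() {
        if abs(sample) >= threshold {
            if sliceStart == nil { sliceStart = i }  // note onset
            quietRun = 0
        } else if let start = sliceStart {
            quietRun += 1
            if quietRun >= minGapLength {            // gap confirmed: close slice
                slices.append(start..<(i - quietRun + 1))
                sliceStart = nil
                quietRun = 0
            }
        }
    }
    if let start = sliceStart {                      // signal ran to end of buffer
        slices.append(start..<samples.count)
    }
    return slices
}
```

A real implementation would likely measure RMS over a window rather than single samples, and pad the slice boundaries so note releases aren't clipped, but the idea is the same.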
Not internal routing, but instead generating a MIDI file with SynthJacker, then playing it back with the app you want to sample (if it can’t play MIDI files, get a MIDI player and hook it up with AUM – details pending), then record the output. Then import the audio file to SynthJacker for slicing and post-processing.
That scenario is the lowest-hanging fruit. Again, I have nothing against Audiobus, but to use it would mean going all in, and I also have other related scenarios in mind apart from IAA – for example, this method would allow easy autosampling of desktop VST/AU instruments, maybe using iCloud or Dropbox to transfer the files.
I'd be happy as a clam with this option. Simple and effective. Not to mention you could reuse the midi file with all kinds of different setups.
@coniferprod please at least consider adding a virtual MIDI port to the app, so that Synthjacker can send MIDI directly to iOS synths without having to save out a MIDI file. That way we could select Synthjacker's virtual MIDI port as an input in AUM and then record the audio from there. It would save a lot of hassle.
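For what it's worth, publishing a virtual MIDI source with CoreMIDI is a fairly small amount of code. A rough sketch (the client and port names here are placeholders, not SynthJacker's actual code):

```swift
import CoreMIDI

// Create a MIDI client and a virtual source. Other apps (AUM, Audiobus,
// hosted synths) will then see "SynthJacker Out" as a selectable MIDI input.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("SynthJacker" as CFString, &client, nil)
var source = MIDIEndpointRef()
MIDISourceCreate(client, "SynthJacker Out" as CFString, &source)

// Send a note-on (middle C, velocity 100) through the virtual port.
// For a virtual source, outgoing data goes through MIDIReceived.
var packetList = MIDIPacketList()
let packet = MIDIPacketListInit(&packetList)
let noteOn: [UInt8] = [0x90, 60, 100]
_ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                      packet, 0, noteOn.count, noteOn)
MIDIReceived(source, &packetList)
```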
Same here, great idea!
So this means that SynthJacker will, in a sense, have a MIDI out port, right? Since it will be sending the MIDI file playback into AUM, and we just record the output files, which in turn SynthJacker will process. I really like this idea! Thanks so much!
+1
ENHANCEMENT REQUEST LIST:
The group mind has identified an ideal workflow with the least effort, using features 1-3.
Least effort for users, that is, not for the developer.
Option 5 isn’t necessary if option 4 is implemented.
My text was ambiguous (I have edited it). I meant that 1-3 seem to be the "ideal workflow" for many.
I expect adding Audiobus and IAA support to be the heavy lifting for the developer.
The key with SynthJacker, to get any new features, is to show that a lot of new purchases will follow. It appears the developer was working on another product and might have been ready to let SynthJacker just trickle in new sales. But threads like these encourage updates with new use cases supported. Several of us have been prodding him to keep the updates coming, and he's been delivering new features at a steady clip. If 1-3 are easy, he'll probably push them out. 4-5 are probably dead on arrival unless dozens (or maybe more) declare it would be a must-buy.
I keep suggesting features and proposing a price increase, or maybe IAP options, to get some sales for the effort.
On the subject of waiting for updates... I was really hoping @Virsyn would ship SF2 or SFZ loading, since I have gotten all the EXS24 instruments off my Mac and want more sample sets. SynthJacker is my fallback, and NS2 looks like a much better option for avoiding the headaches of AUv3s in Cubasis or any other DAW.
New SynthJacker demo video: how to autosample AUv3 instrument Audio Units on iOS