Comments
Is adding stuff to other layers planned on the roadmap? Tempted to buy the desktop app, but I'd need much more customization to pull the trigger. Also: eventual custom shaders, etc.?
Another experiment
Nice one. Using Vimeo instead of YouTube means there's no massive processing time to wait for.
Thought you had to pay to upload to Vimeo.
You have a total of 5 GB that you can upload on the free version. There may also be a weekly limit. But hey-ho, it works much better for me.
That’s excellent. The eye treatment is quite clever!
@NimboStratus thanks for the detailed explanation and cool resulting video.
Dude! That’s frickin awesome 🤩
So that’s a background picture imported into VS, right?
I’ve spent like a total of 5 minutes with this app so far and my mind is blown. “I like this!” (Said out loud with a really exaggerated southern drawl.)
@sinosoidal maybe I got lost in the discussion here and someone already mentioned it. I would use it with my own stuff and know the tempo, but I see a benefit for those VJing for other people and needing a Tap Tempo in the app.
I thought that was the case... but it seems to behave the same regardless of whether I have it loaded as an instrument OR FX.
Maybe I'm missing something though. I played for a bit last night and I couldn't tell the difference between loaded as FX vs Instrument.
I thought that maybe you could use the LFOs, etc. that you set up within VS to affect other sounds/instruments outside of VS. But I've yet to figure out if you can actually do that @sinosoidal
Yeah, it’s pretty efficient. I’m using it on an Air 1 and it’s still capable if I don’t go too nuts on layers.
VS was updated to 1.0.1. Here is the changelog:
If you like VS, please rate it on the App Store.
https://apps.apple.com/us/app/vs-visual-synthesizer/id1560330289
Thank you @NimboStratus for your directions.
I'm now able to change patterns with different background videos using MIDI notes (as you can see, the videos don't connect seamlessly; the MIDI notes do, and there's some black space in between).
Note: the video quality is good (Twitter has somehow lowered it on the first upload; maybe it will get better).
That's because it is loading a new video every time, and it takes time to start the playback. But it's cool, even with the black space! I will need to find a solution for this requirement.
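One possible direction for that gap (purely a speculative sketch with AVFoundation, not how VS actually handles it) would be to preload the next clip before the MIDI note triggers the switch, so the new file is already buffered:

```swift
import AVFoundation

// Speculative sketch: keep the upcoming background clip preloaded so switching
// on a MIDI note doesn't leave a black gap while the new file spins up.
final class BackgroundVideoSwitcher {
    private let player = AVQueuePlayer()
    private var preloaded: AVPlayerItem?

    // Start buffering the upcoming clip ahead of time.
    func preload(url: URL) {
        let item = AVPlayerItem(url: url)
        item.preferredForwardBufferDuration = 2   // buffer a couple of seconds
        preloaded = item
    }

    // Called when the MIDI note arrives: swap to the already-buffered item.
    func switchNow() {
        guard let item = preloaded else { return }
        player.removeAllItems()
        player.insert(item, after: nil)
        player.play()
        preloaded = nil
    }
}
```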
Agreed! Not meant as a complaint... but of course it would be nice if that were possible.
I'm happy with the ability to use different background videos as it is now.
You need to "connect" the incoming sound to parameters for it to do anything.
The first thing to do is to set the threshold for the envelope trigger. The arrow at the bottom left brings the modulators up. On the right you should see a spectrograph of the incoming signal. Set the threshold so that it's just getting crossed when the sound peaks. The envelope fires whenever the amplitude of the incoming sound crosses the threshold. Just adjust the first of the four envelopes for this example.
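If it helps to see the threshold idea spelled out, here's a rough Swift sketch of how an amplitude-threshold trigger typically behaves (just an illustration with made-up names, not VS's actual code):

```swift
// Hypothetical illustration of an amplitude-threshold envelope trigger.
struct EnvelopeTrigger {
    var threshold: Float     // set this just below the signal's peaks
    var wasAbove = false     // remembers state so we only fire on the upward crossing

    // Call once per audio buffer with its peak amplitude (0.0 ... 1.0).
    // Returns true when the signal crosses the threshold upwards.
    mutating func process(peakAmplitude: Float) -> Bool {
        let isAbove = peakAmplitude >= threshold
        let fired = isAbove && !wasAbove   // fire only on the rising edge
        wasAbove = isAbove
        return fired
    }
}

// With the threshold at 0.6, only the loud peaks fire the envelope.
let peaks: [Float] = [0.2, 0.4, 0.7, 0.5, 0.9]
var trigger = EnvelopeTrigger(threshold: 0.6)
for peak in peaks {
    if trigger.process(peakAmplitude: peak) {
        print("envelope fired at peak \(peak)")   // fires at 0.7 and again at 0.9
    }
}
```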
Now you need to connect the envelope to what you want to modulate. I've found the easiest way to do this is to long-press one of the controls on any of the layers except the base layer, which can't be modulated. You'll get a matrix showing that control and a few others. This is where you can set how much each modulator affects each control. Apply some amount of AM1 (Audio Modulator 1) to some parameters, and you should see the effect on the animation.
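Conceptually, that matrix is just a table of amounts per (modulator, control) pair; here's a tiny hypothetical sketch of the idea (again, not VS's implementation):

```swift
// Hypothetical sketch of a modulation matrix: each (modulator, control) pair
// has an amount, and a control's final value is its base value plus the sum of
// every modulator's output scaled by that amount.
struct ModMatrix {
    // amounts[modulatorIndex][controlIndex], e.g. AM1 -> layer scale = 0.5
    var amounts: [[Float]]

    func value(forControl control: Int, base: Float, modulatorOutputs: [Float]) -> Float {
        var result = base
        for (mod, output) in modulatorOutputs.enumerated() {
            result += output * amounts[mod][control]
        }
        return min(max(result, 0), 1)   // clamp to the control's range
    }
}

// AM1 fully mapped to control 0, nothing else connected.
let matrix = ModMatrix(amounts: [[1.0, 0.0]])
print(matrix.value(forControl: 0, base: 0.3, modulatorOutputs: [0.5]))  // 0.8
```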
Described much better here: https://www.imaginando.pt/products/vs/help/layer-modulations
Yes. Thanks. However, I've already figured out all of this... I've pretty much got a complete handle on how the app works standalone AND as an AU.
I'm just saying that when I load VS as an instrument OR music effect in apeMatrix... it behaves pretty much the same. All of the MIDI and/or sound reactivity is identical regardless of which version I'm using. There's got to be a good reason why there are separate versions, though; I just can't tell the difference between them.
Also... the settings in VS show that it has MIDI out activated. But I haven't figured out how to get anything outside of VS to react to the MIDI it's sending out. Have you?
Another test; sorry, I used the same track as before with a slight update.
I didn't realize you were using Ape Matrix. In AUM, in order to route audio to something, it has to be in an FX slot. Maybe it makes no difference in Ape Matrix. The designation of whether or not something is an FX is just part of the Apple app type identifier; unless the programmer actually writes code to limit the functionality based on the type identifier, the versions would normally act the same. Most hosts depend on the app type ID to decide where an app can go. Maybe Ape Matrix doesn't care.
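For reference, that "app type" is the component type in the AUv3's AudioComponentDescription, and a host can gate its slots on nothing more than that. A minimal sketch of what a host might check (using the standard AudioToolbox constants; the function itself is made up):

```swift
import AudioToolbox

// A host deciding where a plug-in may go, based only on its component type:
// 'aumu' (MusicDevice) = instrument slot, 'aufx'/'aumf' = effect slots.
func allowedInEffectSlot(_ desc: AudioComponentDescription) -> Bool {
    switch desc.componentType {
    case kAudioUnitType_Effect, kAudioUnitType_MusicEffect:
        return true                  // gets audio routed into it
    case kAudioUnitType_MusicDevice:
        return false                 // instrument slot: MIDI in, audio out only
    default:
        return false
    }
}
```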
I don't see any MIDI coming out from either the AUv3 or the standalone. I'm not sure what I'd use it for, but it seems like it could be useful for something if it works.
Wicked.
Loved what you've done with the materials within the eyes.
Yeah.
Awesome.
That’s probably it.
And yes… if MIDI out really does work, that could be interesting to play with.
I'm not used to ApeMatrix, but in AUM you won't get audio input unless you declare it as an audio effect, route all the tracks to a bus, and use the bus as the input of the audio track where VS is inserted as an FX. Maybe ApeMatrix overrides this bureaucracy.
We are not using the MIDI out and we can't imagine a use for it right now. Can anyone foresee a use for it?
Using MIDI out to easily record parameter tweaking (in Xequence, for instance) and be able to play it back? Sort of straightforward automation recording/playback.
A few things that come to mind...
Send the LFO out to control a filter sweep in sync with a video modulation. Trigger volume swells in sync with an audio modulation envelope. Record the modulators to play back into the app so that you can remove the audio input...
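To make that first idea concrete, here's a tiny hypothetical sketch of turning an LFO value into a MIDI CC message that an external synth could follow (the CC number and the whole function are assumptions for illustration; actually sending it would need a virtual MIDI port):

```swift
import Foundation

// Hypothetical: map an LFO in the 0.0...1.0 range to MIDI CC 74 (commonly
// filter cutoff) so an external synth could track VS's modulation.
func ccMessage(lfoValue: Double, cc: UInt8 = 74, channel: UInt8 = 0) -> [UInt8] {
    let value = UInt8(max(0, min(127, lfoValue * 127)))
    return [0xB0 | (channel & 0x0F), cc, value]   // control change status byte
}

// One cycle of a slow sine LFO sampled 8 times.
for step in 0..<8 {
    let lfo = 0.5 + 0.5 * sin(Double(step) / 8.0 * 2.0 * .pi)
    print(ccMessage(lfoValue: lfo))   // would go out a virtual MIDI port
}
```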
But more importantly, if MIDI out isn't used, then the option shouldn't be there, or it leads to wasted time learning that it doesn't do anything, bug reports, confusion, etc.
I guess it's maybe not essential as most (or all?) things can also be achieved otherwise?
Yep. It’s just odd to have an option exist that doesn’t do anything.
This app is awesome. I just tried using the iPad app hooked up to my Mac through IDAM with Ableton Live (linked) sending it MIDI easily and keeping the iPad in full screen. It works great, actually…
Yes, it's somewhat the same in apeMatrix. If I want to route audio into VS, it has to be loaded as a music effect. However, if my input is MIDI... it doesn't matter which version of VS is loaded, i.e. music effect vs. instrument.
My question is... when would I want to load VS as an instrument vs. a music effect?
Regarding MIDI out... first of all, I was just trying to figure out if it worked and what it's for, since it's there.
Secondly, I can see an interesting use for MIDI out. The normal use of VS is to create audio and/or MIDI that will affect various graphic layers within VS. I can see the opposite use, where you are creating graphics and LFOs that affect external sounds.
In other words, right now the audio source is the master and the graphics are the slave. If MIDI out worked, the graphics could be the master and the audio would be the slave.
It could be useful if you just want to feed it MIDI and not audio. Or if you just want to give it a spin very quickly. Taking that option out would make it harder to find in the list.
Generating MIDI from the visuals seems a bit far-fetched. Maybe it is better to take that option out to avoid further confusion.