Comments
That is what I love about you: your ability to latch onto an idea and experiment to find a fit. iOS needs more developers like you. That said, I must also give a nod to Ovo from Holon.ist and Paul the musician. May you all continue to push the boundaries.
Ok. I’m going to focus on the iPhone as a remote sensor with camera. I have an iPhone 11 Pro Max with TrueDepth. And a facile tongue.
I’ll check that out. I need to isolate precise gestures or filter that chatter reliably to use this for page turning or loop triggers, etc.
45 degrees is great for signaling precise events like triggering loops, mapping DAW transport controls, or anything people do with pedals. Sustain and loop recording for a guitarist.
This could be good for live musicians that use big rigs of gear.
Hands-free is the key. The hands are busy, so people use their feet, but face, eye, head, and body gestures on camera will be like magic and so easy to set up. We just need to protect the phone from thieves at the gig. Probably use a clamping holder.
Man, this looks fun! But dang it... I already have so many controllers. There's that one that works off the gyro. AC Sabre, I think?
Still... this is next level. Put it on my sale watch list.
Heck, who knows? Maybe the dev will have a Mother's Day sale?! I know that'd push this sorry muthah off the fence. lol
It's on sale at $10... the regular price is $199. Hurry!
(Just kidding...)
The MIDI programming design is really top notch, but it's less than useful without AUv3, since screen organization is key to productivity. BUT, if you have 2 iOS devices that can run iOS 11 or greater, you can use one for MusiKraken and the other as you do now, and have a great remote MIDI controller that can be used for hands-free and even mic signaling.
Don't wait for a sale... feed the developer and get more features by doing the user testing and inventing new features that could be implemented. It creates a virtuous circle.
I think a purely camera-based app would be a game changer for many if the camera could be used while the app is in the background. Maybe this already works. I just got the app.
Face tracking looks like it could be ideal for me while using a keyboard controller to trigger/signal my AUM rigs of AUv3 apps. Probably "Face Tracking" on my phone while AUM is active on the iPad.
Oh, I'm sure it's worth it. It's just that I'm trying to curb appaholism. I've got more than I need. I was just trying to goad a sale in an attempt to sabotage my current freeze on new app purchases.
I don't blame you... I was waiting too, but I got a generous iTunes gift card, so I'm working off my wish list without concern for the future. I also picked up all the SWAM instruments, which I found to be pricey when it was my money.
This app has great value if you want to do something hands-free... most of the other functionality already exists in AUv3 packages. But the camera integration is in a unique category... the gyro stuff requires one or two hands, which misses the need. My hands are already busy... and my eyes are on the current DAW/Live user interface. I just want to avoid the need for my Bluetooth pedal, and this could be it.
STATUS ON EXPERIMENTS:
I charged my old iPhone 5S and installed MusiKraken. I configured a "Mouth Tracker" input and connected it to a Bluetooth Network with my iPad AUM session as the target.
It works but the latency makes this unusable as a foot switch replacement. No biggie.
The iPhone 11 Pro Max will be my input sensor; the code to interpret the camera details needs a better CPU. It might be useful for touch surface controls, but with 1-second latencies it's hardly a replacement for a BlueBoard.
NEXT STEPS:
Experiment with MusiKraken as the expression controller for SWAM instruments while playing a hardware MIDI keyboard controller.
Experiment with MusiKraken as a music app "page turner" replacement for the BlueBoard, using the camera for 4 distinct events. Also look into guitar stomp controls in TH-U, AmpliTube and ToneStack Pro.
How many are we talking, and what's your PayPal? No one has ever accused me of missing a hint, my friend.
The MIDI In does not allow any values to be converted, since it only contains an orange MIDI out port. This is a shame, since if MIDI In contained both a green and an orange port, then its incoming values could be converted from note/CC etc. That way I could connect a Holon.ist sensor to MusiKraken and have the best of both worlds.
Having the beers is not the problem, but finding time to drink them...
That would require a "MIDI to value converter" which does the opposite of the "Value to MIDI converter", so that you can filter out one aspect (note value, a specific CC and so on) and convert that to a number (= green port). I did consider something like this a while ago, but didn't find it useful for me; that is why I never built it. But your use case makes sense, maybe I should add something like this...
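To make the idea concrete, here is a minimal sketch of what such a "MIDI to value converter" might do, written as hypothetical Swift (the struct and names are illustrative only, not MusiKraken's actual code): filter one CC number out of the incoming MIDI stream and map its 0-127 data byte to a normalized number, i.e. the kind of value a green port carries.

```swift
// Hypothetical sketch: pick one CC out of a raw MIDI stream and
// convert its 0-127 data byte into a normalized 0...1 value.
struct MIDIToValueConverter {
    var ccNumber: UInt8      // the CC to listen for, e.g. 1 = mod wheel
    var value: Double = 0.0  // last converted value, 0...1

    // Feed raw 3-byte MIDI messages in; returns true if the message
    // matched our CC and the output value was updated.
    mutating func process(status: UInt8, data1: UInt8, data2: UInt8) -> Bool {
        let isControlChange = (status & 0xF0) == 0xB0  // CC on any channel
        guard isControlChange, data1 == ccNumber else { return false }
        value = Double(data2) / 127.0
        return true
    }
}

var converter = MIDIToValueConverter(ccNumber: 1)
// e.g. a Holon.ist sensor sending CC 1 with value 96:
if converter.process(status: 0xB0, data1: 1, data2: 96) {
    print(converter.value)  // ~0.756, ready to route into any value input
}
```

The same filtering idea would extend to note numbers or velocities; the point is just that one aspect of the MIDI message is extracted and rescaled.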
That would be brilliant if you could. I do have apps that work with AUM, but they presently will not work with MusiKraken. Such an addition would allow me to use my Movesense sensor and its incoming signals while at the same time using MusiKraken's trackers. I like the fact that with the hand and face trackers you get a visual representation of what you are controlling.
Ah! The priceless commodity that we all give away.
Here I go again. New thought: if MusiKraken could play videos or YouTube clips inside the app, then the hand/face/body trackers could convert that into CCs or notes to play. Just thought it would be a super cool addition to an already wonderful app.
That would be fun! But I doubt that there is a nice (non-hacky) way to access the image data of YouTube videos inside an iOS app. Ok, you could access the videos from the iOS photo gallery, but that would not be the same... But it would be fun to cut together a series of public domain videos and then let them make music as an art project. Maybe one day...
Tube AU allows access to YouTube videos. I use it sometimes to sample from. If MusiKraken could allow access to Tube, then the solution is found.
Hmmm... MusiKraken does allow access to Tube, but it does not recognise it as a camera and merely plays it as an overlay whilst the trackers work in the background. Any workaround you can think of here?
I did a short search, and maybe I could use the new MediaRecorder API from iOS for something like this. But it might be difficult to integrate into MusiKraken. Not a priority, but I will put it on my TODO list to try this out...
I know you have a lot on your plate. Thanks for popping it on your todo list.
I finally finished the file sharing feature of MusiKraken. It took a while to get it working the way I wanted, but I hope it now works in all (or at least most) combinations. It should have hit the store a few minutes ago.
You can now save files on iCloud and synchronise them with other devices. And you can access the internal files using Apple's Files app (or when connected to a Mac) and add / remove / copy files directly in that storage.
And you can also share project files via e-mail or whatever. The files have the extension .mkproj (for MusiKraken Project), and you should also be able to put the file online somewhere, download it in Safari (and hopefully more browsers) on the iOS device, and open it directly in the app. (I had to make the file structure extra complicated so that Safari doesn't detect it as an existing file type and try to display it instead of downloading it. I hope this works in all combinations...)
If your device doesn't have the hardware necessary for one of the modules in the project file, it will have a white border in the editor, so you can replace it with something else.
(And please tell me if something doesn't work on your device; iCloud somehow behaves differently on each device that I have, and supporting your own file types in an app is a bit tricky and badly documented. That is why I wanted to make sure it works perfectly on my devices before releasing it.)
For the one-year anniversary update of MusiKraken, I added 7 new effects that can be combined with all existing modules:
-Beat: Converts fast value changes to short notes. By combining this with the various input modules, you can for example play drums by shaking (Accelerometer) or rotating (Motion Sensor pitch) your device, or by punching in the air (body or hand tracking). You can also connect any MIDI output port to the note input port to change the note (or chord) the beat is played on. This way you can for example use a second value output to control the note height.
-Threshold: This starts playing a note when the connected value goes over a threshold, and stops it again when the value drops back below it. So you can control the note length this way, and it is combinable with any of the input modules, so many new input controls become possible (a minimal sketch of this logic follows after this list). It also outputs two new values while the note is being played: the intensity value is defined by how far the value is from the threshold. And the sideIntensity is defined by the sideValue input from another module. Once the note starts, the sideValue at that moment is stored, and the difference to this stored value is sent to the sideIntensity output. This way you can for example control pitch bend with a second input value.
-Speed: This measures the change speed of one value and sends that out of the valueOut port. So for example if you move the fingers on your touchpad quickly, the value goes up, and if you move slowly, the value goes down. You can also route MIDI note events through this, and each time the direction of your value changes (for example if the fingers on the touchpad change movement direction), the active notes will be retriggered, which is useful for example for rebowing in violin VSTs. You can of course combine this with any value conversion, so for example you can create a setup where shaking the device (Accelerometer) increases the note height.
-Envelope: This uses a simple ADSR (attack, decay, sustain, release) envelope as used in synthesizers, but applies it to an output value (also sketched after the list). So every time you send a MIDI note event into its MIDIIn port, the value goes up to 1 during the attack time, goes down to the sustain value during the decay time, and stays at the sustain value until you stop the note again, at which point it goes down to 0 during the release time. If you route the MIDI events through this module, it will also keep the last note alive until the release time is finished.
-LFO: This creates a Low Frequency Oscillator and sends the computed values out of the value out port (also sketched below). This makes it possible, for example, to control vibrato using one of the input modules. The middle value, the frequency and the amplitude can be controlled by separate inputs. If no port is connected to one of these input ports, then the module uses the default value for that, which is the minimum value in the settings.
-Latch: Very simple module: keeps a note alive once it is started, and stops it again if the same note is played again.
-Arpeggiator: A simple arpeggiator with the usual patterns and selectable velocities for each step. You can also define how many times a note is repeated and whether notes of additional octaves should be filled in. And by connecting anything to the multiplier input port, you can control by how much all velocities are multiplied. By combining multiple arpeggiators (and maybe other effects like the chord splitter, the channel switcher and the transposer), you can create complex setups.
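Since a few of these effects are really small algorithms, a couple of hedged sketches may help picture them. First the Threshold logic, in hypothetical Swift (assuming a normalized 0...1 input value; the names are mine, not MusiKraken's):

```swift
// Hypothetical threshold gate: start a note when the value crosses the
// threshold, stop it when the value falls back, and report "intensity"
// (how far past the threshold we are) plus "sideIntensity" (how much a
// second input has drifted since the note started).
struct ThresholdGate {
    var threshold: Double
    var notePlaying = false
    var storedSideValue = 0.0

    // Call once per control tick with the current input values.
    mutating func process(value: Double, sideValue: Double)
        -> (noteOn: Bool, noteOff: Bool, intensity: Double, sideIntensity: Double)
    {
        var noteOn = false
        var noteOff = false
        if !notePlaying && value >= threshold {
            notePlaying = true
            noteOn = true
            storedSideValue = sideValue  // remember the side input at note start
        } else if notePlaying && value < threshold {
            notePlaying = false
            noteOff = true
        }
        let span = max(1.0 - threshold, 0.0001)  // avoid division by zero
        let intensity = notePlaying ? (value - threshold) / span : 0.0
        let sideIntensity = notePlaying ? sideValue - storedSideValue : 0.0
        return (noteOn, noteOff, intensity, sideIntensity)
    }
}
```

Routing, say, a Motion Sensor value into such a gate gives exactly the pedal-style note on/off triggering discussed earlier in the thread, with pitch bend available from the side input.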
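In the same spirit, a sketch of the Envelope idea: a standard ADSR curve, but written to a control value instead of an audio signal (again hypothetical Swift, simplified so that the release always ramps down from the sustain level):

```swift
// Hypothetical ADSR applied to a control value: call tick(dt:) at a
// fixed control rate; noteOn()/noteOff() drive the stage transitions.
struct ValueADSR {
    var attack = 0.05   // seconds to ramp up to 1.0
    var decay = 0.2     // seconds to fall to the sustain level
    var sustain = 0.7   // level held while the note is down
    var release = 0.5   // seconds to fall back to 0

    enum Stage { case idle, attack, decay, sustain, release }
    var stage = Stage.idle
    var value = 0.0

    mutating func noteOn()  { stage = .attack }
    mutating func noteOff() { stage = .release }

    // Advance the envelope by dt seconds and return the current value.
    mutating func tick(dt: Double) -> Double {
        switch stage {
        case .attack:
            value += dt / attack
            if value >= 1.0 { value = 1.0; stage = .decay }
        case .decay:
            value -= dt * (1.0 - sustain) / decay
            if value <= sustain { value = sustain; stage = .sustain }
        case .sustain:
            break  // hold until noteOff()
        case .release:
            value -= dt * sustain / release
            if value <= 0.0 { value = 0.0; stage = .idle }
        case .idle:
            break
        }
        return value
    }
}
```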
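And the LFO is essentially a sine generator whose parameters are themselves controllable values, for example (hypothetical Swift again):

```swift
import Foundation  // for sin()

// Hypothetical LFO as a value source: middle value, frequency (Hz) and
// amplitude could each be fed from another module's output port.
struct ValueLFO {
    var middle = 0.5
    var frequency = 5.0
    var amplitude = 0.25
    var phase = 0.0

    // Call at the control rate; returns the oscillating value.
    mutating func tick(dt: Double) -> Double {
        phase += 2.0 * .pi * frequency * dt
        phase.formTruncatingRemainder(dividingBy: 2.0 * .pi)
        return middle + amplitude * sin(phase)
    }
}
```

Feeding a tracker output into `amplitude` would give the vibrato-depth control mentioned above.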
Is it possible to create your own scales as well?
I am working on some kind of chord progression editor which also allows you to create note progressions. But it is not yet finished; so far you can only use one of the many scales predefined in the app. I hope I can add the chord progressions to the next update...
Awesome! Thank you for the answer.
Congrats on the latest update, lots of great new features! I've always imagined that this app could be my musical instrument while I'm out walking: an app that would respond to voice commands ("set tempo", "record beat", "record chord progression", etc.) to compose on the go, and then use the position of the iPhone in space in front of me to play. The new Beat feature is perfect! Anyway, another vote for chord progressions.
@Snarp thank you for your app!!
Can MusiKraken work in the background (or how do you do that)?
I mean the following: I want to use the camera (for instance mouth open/close to send MIDI CCs) to control another app on the same iPad. For instance, I want to control the scale of a video in Visual Synth.
But then I have Visual Synth open/on screen and MusiKraken in the background, and it doesn't seem to be sending CCs in reaction to my mouth movements. Nor does it when MusiKraken is 'in front' and Visual Synth is in Slide Over. It does send CCs when only MusiKraken is on screen, but then I cannot see in real time what is happening to the video in Visual Synth. It does work when using two devices: an iPhone running MusiKraken (on screen) and Visual Synth 'receiving' on the iPad.
Hi @janpieter, no, the app cannot access the camera and most other sensors (except the microphone) while the app is in the background or in split view / Slide Over. It would probably be a security issue if accessing the camera from the background were possible, but I'm not sure why it behaves the same way in split screen or Slide Over.
I will probably add the possibility to run MusiKraken in the background soon (currently working on some iOS optimizations), but you will only be able to use the effects and the audio units in the background...
Thank you! Clear!
Hi @Snarp, thanks for providing us with this powerful app! With new features like Threshold and Beat, I'd use it much more as a versatile MIDI router able to process motion-based input than for experimenting with camera-based input.
My project is to build a minimal "busker" setup with only my acoustic guitar, my iPhone, an audio interface and a vibration speaker. No pedal. Then it's about getting the most out of the built-in sensors on the iPhone, and as I'm playing guitar and singing, gestures and facial expressions are not so practical, but moving the device is.
MusiKraken is, to my knowledge, the only scripting-free iOS app that integrates both sensors and routing with sufficient versatility. Reproducing this functionality by combining, say, Holon.ist or TC-Data on the sensor side with Audiobus, MidiFire or MidiFlow on the routing side would require a fair amount of scripting in StreamByter or Mozaic and tens of times the effort. So I'm all for it... if it were not for a couple of critical shortcomings: