Kodelife, new app from Hexler
This looks interesting: a new app from Hexler that lets you edit shaders.
Might be good to create content for K Machine
Comments
Looks brilliant. Now we NEED to know when VS will accept importing of shaders.
That would be dreamy if VS could import GLSL code.
That would make VS a beast in my eyes.
I agree, it'd be a great creative combo.
Wow, cool! Does LumaFusion support shaders?
yes please to VS shader import!!
No, you’d need to import a video recording of the shader.
The code you create with KodeLife would need to be modified to work with K Machine’s internal variables. If you simply paste K Machine code into KodeLife, it won’t run. A template could probably be created for going from KodeLife to K Machine compatible code and vice versa.
A similar process would be needed to write code for the VS app, if and when it allows you to run your own shaders in it.
The need to modify code for each particular app comes from each app having different input and output resources and different ways of handling them. KodeLife's inputs include time, mouse position, and audio split into three bands whose frequency ranges you can specify. The code between the input and output should be relatively similar, although, once again, different apps will support different standards of code.
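To make that concrete, here's a rough sketch of what a fragment shader's inputs look like. The uniform names (time, resolution, mouse, spectrum) and the #version line are assumptions based on KodeLife's default desktop template; other apps expose equivalents under different names, which is exactly why code has to be adapted per app.

#version 150
uniform float time;        // seconds since the shader started
uniform vec2 resolution;   // output size in pixels
uniform vec2 mouse;        // pointer / touch position
uniform vec3 spectrum;     // low / mid / high audio band levels
out vec4 fragColor;

void main() {
    vec2 uv = gl_FragCoord.xy / resolution;    // this pixel's position, 0..1
    float pulse = 0.5 + spectrum.x;             // brightness pulses with the low band
    fragColor = vec4(uv * pulse, 0.5 + 0.5 * sin(time), 1.0);
}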
I hope Hexler posts some example code on their website so we can learn how to use the app more easily. Their instruction manual is limited to talking about the features of the app and currently says nothing about how to code in OpenGL.
That's disappointing. I didn't realise it was more complex than simply saving it as a shader and then opening it in K Machine.
I’ve been able to get a ShaderToy shader to work inside KodeLife using the template included with KodeLife. 🥳
Cool. I've been using VEDA plugin for Atom, but it doesn't run on iPad.
https://atom.io/packages/veda
On the other hand, Veda supports ISF (Interactive Shader Format), which is convenient for audio-reactive shaders:
https://isf.video/
More info on that would be awesome. I'm not very good with this stuff and I guess it takes more than just copy + paste? I got the template open in Kodelife but ... then what?
OT (in the wrong category)
Slow news day.
I’m not just throwing shade. Standards people. Music Forum, eh?
These are great to create MUSIC videos with, eventually .. narrow minded, eh?
@Paulinko I did manage to paste in some code from "the book of shaders" via the Koder app. It's a bit of a pain to copy and paste without an external keyboard. Anyway, ignore my question earlier, thanks.
For the ShaderToy shaders, you need to copy the main code into the last section of the template. The top section will be okay as it is. For example, replace lines 34-38 in the template with the code from the Image tab on the ShaderToy.com site. Some shaders may have image files in iChannel0 through iChannel3 which you'd need to download and assign in KodeLife. Others may have an A and a B tab, and I'm not sure how to handle those ones yet.
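For what it's worth, here's how I'd sketch the glue, assuming the template's top section already declares the uniforms and the fragColor output. The #defines just alias ShaderToy's built-in names onto what I believe are KodeLife's default uniform names; adjust them to whatever your version of the template actually declares.

// Map ShaderToy built-ins onto the template's uniforms (names assumed from the default template)
#define iTime time
#define iResolution vec3(resolution, 1.0)
#define iMouse vec4(mouse, 0.0, 0.0)   // note: units may differ (ShaderToy's iMouse is in pixels)

// --- paste the ShaderToy Image tab code here, e.g.: ---
void mainImage(out vec4 col, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    col = vec4(uv, 0.5 + 0.5 * sin(iTime), 1.0);
}
// -------------------------------------------------------

void main() {
    mainImage(fragColor, gl_FragCoord.xy);   // call ShaderToy's entry point from the template's main()
}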
Thanks man! I did manage to get this one in, now I want to get smart enough to edit in the audio/MIDI options available. I tested one of the KodeLife defaults and it reacts incredibly well to input audio.
Hi. Sorry for the stupid question but in non coder talk, could anyone tell me what this does, what use it would be? Altering music apps in some way. I don’t know what shaders are (except graphics .. things I remember). Sorry again!
Also, if I dumped ShaderToy code into it, would it also respond to an audio file?
@NimboStratus
For me, the whole idea is to have the audio input amplitude drive some parameters in the KodeLife code so the resulting graphic animations are in sync with the music. Ideally it would also be a place where I can develop code to be used in other apps like K Machine and VS, if and when VS supports running your own shader code.
In KodeLife you have three bands of audio you can use to drive the parameters. You use sliders in the KodeLife app to select frequency ranges for the bands and you can also amplify each of the bands to get a better response for your graphics to the music.
In KodeLife you can use the Spectrum variable, an array that holds the change in amplitude for the low, middle, and high frequencies, to drive the graphic animations. You can use shaders others have created on sites such as ShaderToy or create your own. You will need to modify the code so that it uses the Spectrum variable built into KodeLife to animate your videos. Shaders made by others often use mouse coordinates to drive the animations, so a quick way to get audio-driven animation going is to substitute your audio Spectrum variable for the mouse variable in the code.
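As a hedged example of that mouse-to-spectrum swap, the lines below would sit inside main() in a KodeLife fragment pass, assuming the default template's resolution and spectrum uniforms (and the fragColor output) are in scope:

vec2 uv = gl_FragCoord.xy / resolution;

// Pointer-driven version many ShaderToy examples use:
// vec2 drive = iMouse.xy / iResolution.xy;

// Audio-driven version: let the low and mid bands move the effect instead.
vec2 drive = clamp(spectrum.xy * 4.0, 0.0, 1.0);

float d = distance(uv, drive);                 // circle centred on the audio-driven point
float glow = 1.0 - smoothstep(0.0, 0.3, d);    // bright disc that jumps with the music
fragColor = vec4(vec3(glow), 1.0);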
Alternatively, if you want to develop your own shaders, you can go to a site like TheBookOfShaders to learn how to make your own.
For audio input in KodeLife, you can use iOS Audio or if you have a USB audio interface, you can select it in the settings as the source for your audio. You can also select your audio sample rate (e.g. 44100 or 48000) and latency in samples (e.g. 256 samples).
Do you have any example code? Just wondering if it's worth a punt on this app whilst I wait for VS to get support for adding your own shader code (if it ever happens).
You can download the free ShaderToy app to see more examples of shaders. I stopped the screen recording of this shader after about a minute. If you really don’t want to get into editing or creating your own shaders, you could simply find shaders that respond well to your finger moving on the screen and conduct your music video performance. You can see the code for the Demo Volumetric shader by Inigo Quilez.
Yes!
Check out this shader Fractal Land and be sure to move your mouse or finger around on the shader.
If you import the shader into KodeLife and change the resolution to 1280 by 720, you get very crisp lines.
Very impressive stuff, thanks for the tip. Will need to study this.
I haven't seen any demos for Kodelife yet. Do you need to know how to code in OpenGL in order to use this app?
Even stupider here. What’s a shader?
Only if you want to create your own. You can copy/paste (with slight tweaking) shaders from sites like shadertoy.com
It's the thing that graphics "cards" use to draw graphics/effects on your screen. You've watched VS draw bouncy lines and circles that react to music? Shaders are drawing them. They're very low-level, incredibly fast, and run in parallel (think two, three thousand or more at a time), each shader responsible for a single pixel on your screen.
There are (at least) three different languages that can be used to create a shader: OpenGL Shader Language, High Level Shader Language (DirectX), and Metal Shading Language (Apple). Most apps support GLSL but Kodelife claims to support all three (on their website, haven’t verified their iOS app). These languages are compiled into instructions that the graphics cards can execute.
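If it helps, here's a near-minimal GLSL fragment shader (the uniform name and #version line are just assumptions for illustration). The GPU runs this little program once for every pixel, in parallel, and whatever colour it writes out is what that pixel shows:

#version 150
uniform vec2 resolution;   // output size in pixels, supplied by the host app
out vec4 fragColor;

void main() {
    vec2 uv = gl_FragCoord.xy / resolution;   // where this pixel sits, 0..1 on each axis
    fragColor = vec4(uv.x, uv.y, 0.5, 1.0);   // colour depends only on position: a red/green gradient
}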
You may be less stupider now but probably more confuzled. ;-)
Thank u sir! This is an excellent description.
If I understand correctly, it's like a universal language for interacting with graphics cards or generators, sort of like a graphics card API. Looks interesting. Thanks!