Video sequencer

There are many tools to create visuals from music, but are there tools that create music from visuals?

What I'd like to see:

An app or AU where you can load a video. You can then place markers in the video window; those are trigger points that set an action. Every pixel on a video screen contains data, like its RGB value.
I think you can do a lot with that data, because each pixel carries a set of three values. For example, you can convert the values into MIDI data.
While a clip is playing, every video pixel is constantly changing value. That would result in a constant MIDI data stream from every trigger point. Solution: set a threshold on a data value.
Also nice: you can assign an LFO to each trigger point's x/y position, so your trigger points move around the screen.

This would be a cool MIDI sequencer, and very experimental...
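
A minimal sketch of how this pixel-to-MIDI idea might work, in Python. Everything here is an assumption for illustration: the frame is a plain nested list rather than real decoded video, the LFO is a simple sine/cosine sweep, and the MIDI messages are plain tuples instead of output from a real MIDI library.

```python
import math

WIDTH, HEIGHT = 320, 180          # downsized video resolution (assumed)
THRESHOLD = 128                   # brightness gate (0-255) to avoid a constant stream

def brightness(rgb):
    """Average the three channel values into one 0-255 number."""
    r, g, b = rgb
    return (r + g + b) // 3

def lfo_position(base_x, base_y, t, rate=0.5, depth=20):
    """Move a trigger point on a small circle around its base position."""
    x = base_x + depth * math.cos(2 * math.pi * rate * t)
    y = base_y + depth * math.sin(2 * math.pi * rate * t)
    return int(x) % WIDTH, int(y) % HEIGHT

def frame_to_midi(frame, trigger_points, t, gate_state):
    """frame[y][x] is an (r, g, b) tuple; yields (message, note, velocity)."""
    for i, (bx, by, note) in enumerate(trigger_points):
        x, y = lfo_position(bx, by, t)
        level = brightness(frame[y][x])
        if level > THRESHOLD and not gate_state[i]:
            gate_state[i] = True                  # crossed up: fire a note
            yield ("note_on", note, level // 2)   # map brightness to velocity
        elif level <= THRESHOLD and gate_state[i]:
            gate_state[i] = False                 # crossed down: release it
            yield ("note_off", note, 0)

# One bright frame, one trigger point playing middle C.
frame = [[(200, 180, 160)] * WIDTH for _ in range(HEIGHT)]
points = [(160, 90, 60)]                          # (x, y, MIDI note)
state = [False]
print(list(frame_to_midi(frame, points, t=0.0, gate_state=state)))
```

The per-point gate state is what keeps a slowly changing pixel from spamming note-ons: only threshold crossings produce events.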

Comments

  • Neat idea. I think for it to be musical (ish, anyway), the app would have to have a very nice/flexible/intelligent/etc quantizer of sorts. A single frame of 4K video has over 8 million pixels! That would choke any MIDI system before it even started.
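
    (As a rough illustration of the kind of quantizer meant here, a sketch in Python that snaps raw trigger timestamps to a 16th-note grid; the tempo and grid size are assumptions.)

    ```python
    def quantize(time_sec, bpm=120, division=4):
        """Snap a timestamp to the nearest grid step (division=4 -> 16th notes)."""
        step = 60.0 / bpm / division            # one 16th at 120 BPM = 0.125 s
        return round(time_sec / step) * step

    raw_triggers = [0.02, 0.13, 0.31, 0.49]     # seconds, straight off the pixels
    print([quantize(t) for t in raw_triggers])  # [0.0, 0.125, 0.25, 0.5]
    ```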

  • Another type of sequencer:
    A square picture with a black/white pattern.
    It could be any pattern (abstract, hand-drawn, a picture posterised to black and white).
    Now you can place trigger points somewhere on the pattern.
    Let the trigger points move with LFOs.
    Each trigger point can have a note value.
    As soon as a trigger point "hits" the white --> Note On.
    When a trigger point "leaves" the white --> Note Off.

    This would be an advanced version of Rosetta Collider :)
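
    (A rough sketch of this black/white pattern idea in Python, with invented names: a small 0/1 grid, and trigger points that fire Note On when their LFO sweeps them onto white and Note Off when they leave it.)

    ```python
    import math

    PATTERN = [                 # 1 = white, 0 = black (a tiny posterised picture)
        [0, 0, 1, 1],
        [0, 1, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 0, 1],
    ]

    def step(points, t, state):
        events = []
        for i, (bx, by, note) in enumerate(points):
            # The LFO sweeps each point horizontally across the pattern.
            x = int(bx + 1.5 * math.sin(2 * math.pi * 0.25 * t)) % 4
            y = by % 4
            on_white = PATTERN[y][x] == 1
            if on_white and not state[i]:
                events.append(("note_on", note))
            elif not on_white and state[i]:
                events.append(("note_off", note))
            state[i] = on_white
        return events

    points = [(0, 0, 60), (2, 3, 64)]            # (x, y, MIDI note)
    state = [False, False]
    for t in (0.0, 1.0, 2.0, 3.0):               # sample the LFO at four moments
        print(t, step(points, t, state))
    ```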

  • @syrupcore said:
    Neat idea. I think for it to be musical (ish, anyway), the app would have to have a very nice/flexible/intelligent/etc quantizer of sorts. A single frame of 4K video has over 8 million pixels! That would choke any MIDI system before it even started.

    A large video format could be downsized to a smaller one, say a couple hundred pixels in width or height. The goal is to set a couple of trigger points on the video window, so you only sample a few chosen pixels.
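
    (A sketch of that downsizing step in Python, under assumed names: average each block of pixels so a big frame shrinks to something a few hundred pixels wide before sampling the trigger points.)

    ```python
    def downsize(frame, factor):
        """Shrink frame[y][x] = (r, g, b) by averaging factor x factor blocks."""
        h, w = len(frame), len(frame[0])
        small = []
        for y in range(0, h - factor + 1, factor):
            row = []
            for x in range(0, w - factor + 1, factor):
                block = [frame[y + dy][x + dx]
                         for dy in range(factor) for dx in range(factor)]
                n = len(block)
                row.append(tuple(sum(px[i] for px in block) // n for i in range(3)))
            small.append(row)
        return small

    # 4x4 toy frame shrunk to 2x2; a real 3840-pixel-wide frame with factor=16
    # would come out 240 pixels wide.
    frame = [[(x * 10, y * 10, 0) for x in range(4)] for y in range(4)]
    print(downsize(frame, 2))
    ```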

  • Slight aside, anyone know of any video synthesis on iOS? Is something along these lines even possible?

  • @waynerowand said:
    Slight aside, anyone know of any video synthesis on iOS? Is something along these lines even possible?

    I came across Pixel Nodes a year ago. I think it's not very stable, but it's a fun app to play with.
    https://apps.apple.com/us/app/pixel-nodes/id1313351782

  • @Identor said:

    @waynerowand said:
    Slight aside, anyone know of any video synthesis on iOS? Is something along these lines even possible?

    I came across Pixel Nodes a year ago. I think it's not very stable, but it's a fun app to play with.
    https://apps.apple.com/us/app/pixel-nodes/id1313351782

    Thank you! The feedback loop looks really fun.

  • I’m pretty sure some of these could be tried out/prototyped in Max/MSP, maybe even Pure Data, which I think is free/open source... I think you can get a fairly decent-length demo of Max as well, but it’s not that cheap to buy a full licence if you get hooked...

  • Another program for (interactive) visuals is Vuo:
    https://vuo.org

  • There’s sort of the still-frame equivalent: https://warmplace.ru/soft/phonopaper/
    And also https://warmplace.ru/soft/pixivisor/

  • Alexander Zolotov also has an app that generates sound from the live input of your camera.
    The app uses the spectral synthesis algorithm of the Virtual ANS engine, another of his apps.
    It’s an old app; Doug at The Sound Test Room tested it on YouTube in 2014.

    Here’s the link to the App Store:

    https://apps.apple.com/app/nature-oscillator/id906588504
