MIDI Recorder/Visualizer: testers/ideas wanted!

Hey all! I'm working on a little visualizer tool – think Synthesia but in reverse and real-time.

Here's a quick demo of the base functionality:
Bonus points if you can identify the song.

Planned features include:

  • Transport control markers (like measure/bar lines)
  • Chord/scale detection on a user-selected region
  • Full history
  • Note velocity scale modes

Let me know if you'd be interested in testing within the next few weeks or if you have any ideas for cool/useful features.

Comments

  • Dude, this looks cool. I’m definitely interested in testing. I’ll think of some ideas as well.

    Maybe the ability to alter speed to slow it down so you can learn to play it? There are MIDI files of songs all over the Internet; for example, you could teach yourself “True Love Waits” by Radiohead. But BPM is dictated by the host, so I'm not sure how that would work… I guess you could slow the host tempo down to do that.

    Displaying the chord, key and tempo of the incoming MIDI would be cool.

    This would be a great learning tool as well.

    Making the keyboard playable would be really cool too. This way you could play the actual keyboard and play along to the MIDI. This would be awesome for learning piano. You could practice by playing the keyboard on screen, in time. You could then send the MIDI out to another instance of Atom 2 and record it to compare the two versions and see how close you were.

    Allow the use of a MIDI keyboard or BT keyboard to play the visual notes. If you get it right it turns green; if you hit a wrong note or key it turns red. Or something in that vein. Almost like a piano practice game.

  • Looks so cool! It'd be neat if notes were colored by MIDI channel, so it'd be possible to visualize what notes different tracks are playing.

  • I'm interested.
    Have you considered zooming in and recording the incoming MIDI stream?
    Something like a simple Rec, Stop, Play and start point adjustment?

  • edited September 2021

    I'm interested. The Synesthesia reference threw me off track, as did the term Visualizer, though you did get my attention.

    Consider a name with -gram or -graph.

    Speaking of Synesthesia, which supports ISF (interactive shader format), I wonder how difficult it would be to implement this as a custom shader and run it in Synesthesia? Or in VS on iPadOS.

    It would be nice if Mozaic had this display.

    But I digress.

    I think of this in the same category as the volume vs frequency "EQ" visualization for audio, and I think it would be useful to throw one into AUM or apeMatrix so I can see what the MIDI is doing while I'm performing, for monitoring purposes.

  • Personally speaking, I think the reverse Synthesia is a perfect description, and I think it would be a great utility for harmonisation in DAWs that don't allow you to view the notes of other tracks whilst step sequencing. But with that in mind, I think it's important that the visualiser functions well in a standard AUv3 window as well as at larger formats (mini piano keys would help).

  • edited September 2021

    @TonalityApp said:
    Hey all! I'm working on a little visualizer tool – think Synthesia but in reverse and real-time.

    Here's a quick demo of the base functionality:
    Bonus points if you can identify the song.

    Planned features include:

    • Transport control markers (like measure/bar lines)
    • Chord/scale detection on a user-selected region
    • Full history
    • Note velocity scale modes

    Let me know if you'd be interested in testing within the next few weeks or if you have any ideas for cool/useful features.

    I’m doing a lot of work with MIDI at the moment and have 1-2 monitors running all the time. I’d be happy to run this one and keep you updated with feedback if you’re looking for more testers.

    I can tell you in advance that quick and flexible filtering/splitting/combining/layering/visualising of multiple concurrent midi channels would be the thing that will make something like this most valuable to me. Being able to quickly switch between showing different channels in parallel lanes and combining them into one source with visual cues (different colours for different channels etc.) would be very powerful.

    I’ve also developed algorithmic logic in the past for very in depth chord naming (you can see it at https://GitHub.com/OscarSouth/theHarmonicAlgorithm) and I’d be happy to assist with implementing that if that’s useful to you.

  • I was mixing up Synthesia (which I've never heard of) with https://synesthesia.live/

  • edited September 2021

    @Poppadocrock said:
    Maybe the ability to alter speed to slow it down so you can learn to play it? There are MIDI files of songs all over the Internet; for example, you could teach yourself “True Love Waits” by Radiohead. But BPM is dictated by the host, so I'm not sure how that would work… I guess you could slow the host tempo down to do that.

    I do like the idea of slowing down the captured MIDI data (or at least zooming on the time axis).

    Displaying the chord, key and tempo of the incoming MIDI would be cool.

    Definitely.

    This would be a great learning tool as well... Almost like a piano practice game.

    Didn't think of this! I'm not sure about loading pre-made MIDI files yet (you can always route a MIDI file player into the AU) and the gamification is a bit outside my goals for the moment, but we can always talk more once I've completed the essential functionality.

    @auxmux said:
    Looks so cool! It'd be neat if notes were colored by MIDI channel, so it'd be possible to visualize what notes different tracks are playing.

    Yup great idea, seconded by @OscarSouth. Implementing that soon.
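
    Roughly the direction I'm heading for the coloring – just a sketch with placeholder names, nothing final – is to spread the 16 channels evenly around the hue wheel so every channel gets a stable, distinct color:

        import SwiftUI

        // Sketch: map a MIDI channel (0-15) to a stable, distinct color
        // by spreading the channels evenly around the hue wheel.
        func color(forChannel channel: Int) -> Color {
            let hue = Double(channel % 16) / 16.0
            return Color(hue: hue, saturation: 0.7, brightness: 0.9)
        }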

    @rs2000 said:
    I'm interested.
    Have you considered zooming in and recording the incoming MIDI stream?
    Something like a simple Rec, Stop, Play and start point adjustment?

    Zooming yes, recording probably. I want to stay away from making more of a piano roll editor (Atom 2 is great), but bare-bones MIDI file recording/playback is likely on the table.
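
    For what it's worth, the recording I have in mind is nothing fancier than a timestamped note list that the history view can scroll and zoom over later. A rough sketch, with placeholder names rather than the real API:

        // Sketch: keep incoming notes with host timestamps so the
        // history view can scroll/zoom over them afterwards.
        struct RecordedNote {
            let channel: Int
            let pitch: UInt8
            let velocity: UInt8
            let startTime: Double   // host time in beats or seconds
            var endTime: Double?    // filled in when the note-off arrives
        }

        final class NoteHistory {
            private(set) var notes: [RecordedNote] = []

            func noteOn(channel: Int, pitch: UInt8, velocity: UInt8, at time: Double) {
                notes.append(RecordedNote(channel: channel, pitch: pitch,
                                          velocity: velocity, startTime: time, endTime: nil))
            }

            func noteOff(channel: Int, pitch: UInt8, at time: Double) {
                // Close the most recent still-open note on this pitch/channel.
                if let i = notes.lastIndex(where: { $0.channel == channel &&
                                                    $0.pitch == pitch && $0.endTime == nil }) {
                    notes[i].endTime = time
                }
            }
        }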

    @mojozart said:
    Consider a name with -gram or -graph.

    Good idea.

    I think of this in the same category as the volume vs frequency "EQ" visualization for audio, and I think it would be useful to throw one into AUM or apeMatrix so I can see what the MIDI is doing while I'm performing, for monitoring purposes.

    Exactly, like a spectrogram but for MIDI.

    @jonmoore said:
    But with that in mind, I think it's important that the visualiser functions well in a standard AUv3 window as well as at larger formats (mini piano keys would help).

    Definitely. The video above is actually recorded from an iPhone 6s inside AUM, so I don't think the screen size will be too much of an issue. That said, I'm considering allowing the user to set a range of MIDI notes to display instead of the entire available keyboard.

    @OscarSouth said:
    I’m doing a lot of work with MIDI at the moment and have 1-2 monitors running all the time. I’d be happy to run this one and keep you updated with feedback if you’re looking for more testers.

    That would be great!

    I can tell you in advance that quick and flexible filtering/splitting/combining/layering/visualising of multiple concurrent midi channels would be the thing that will make something like this most valuable to me. Being able to quickly switch between showing different channels in parallel lanes and combining them into one source with visual cues (different colours for different channels etc.) would be very powerful.

    Individually-colored channel overlays are the next thing I'm working on! By parallel lanes, do you mean entirely separate keyboard/track columns for each channel, each containing all 88 key lanes?
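
    Either way, the lane splitting itself should be cheap – something along these lines, reusing the placeholder RecordedNote type from the recording sketch above:

        // Sketch: group recorded notes into one lane per selected channel.
        func lanes(from notes: [RecordedNote],
                   showing channels: Set<Int>) -> [Int: [RecordedNote]] {
            Dictionary(grouping: notes.filter { channels.contains($0.channel) },
                       by: { $0.channel })
        }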

    I’ve also developed algorithmic logic in the past for very in depth chord naming (you can see it at https://GitHub.com/OscarSouth/theHarmonicAlgorithm) and I’d be happy to assist with implementing that if that’s useful to you.

    I also have a good amount of experience with chord naming algorithms, but I appreciate the offer and will let you know if I ever need help. Your repo looks amazing – I've always had a soft spot for functional programming and the idea of using Markov chains and overtone filtering is very cool!
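
    Just to sketch the flavor of what I mean by the basics (nowhere near your repo's depth – naive template matching over pitch classes, enharmonics ignored):

        // Toy chord namer: collapse notes to pitch classes and try each
        // possible root against a tiny template table.
        let noteNames = ["C", "C#", "D", "D#", "E", "F",
                         "F#", "G", "G#", "A", "A#", "B"]
        let templates: [String: Set<Int>] = [
            "maj": [0, 4, 7],
            "min": [0, 3, 7],
            "dim": [0, 3, 6],
            "7":   [0, 4, 7, 10],
        ]

        func nameChord(midiNotes: [Int]) -> String? {
            let pitchClasses = Set(midiNotes.map { $0 % 12 })
            for root in 0..<12 {
                let relative = Set(pitchClasses.map { ($0 - root + 12) % 12 })
                if let match = templates.first(where: { $0.value == relative }) {
                    return noteNames[root] + match.key
                }
            }
            return nil
        }

    So nameChord(midiNotes: [60, 64, 67]) comes back as "Cmaj", and anything it doesn't recognize just returns nil.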

    I have a couple of your emails, but @rs2000, @jonmoore, @OscarSouth, could you PM me an email to use with TestFlight? The beta isn't quite ready but I'll add you as soon as it is.

  • @TonalityApp said:

    I can tell you in advance that quick and flexible filtering/splitting/combining/layering/visualising of multiple concurrent midi channels would be the thing that will make something like this most valuable to me. Being able to quickly switch between showing different channels in parallel lanes and combining them into one source with visual cues (different colours for different channels etc.) would be very powerful.

    Individually-colored channel overlays are the next thing I'm working on! By parallel lanes, do you mean entirely separate keyboard/track columns for each channel, each containing all 88 key lanes?

    Yeah, the reason for parallel lanes (that could hopefully be zoomed into a specific region) is that I do a lot of orchestrating with MIDI. When you have a lot of orchestral instruments over 16 channels all doubling, converging, diverging and weaving through homogeneous blocks of texture, there are so many notes existing in the same times and places that are very different in timbre, and a MIDI roll with overlapping notes doesn’t convey any of that textural/timbral information at all, even when individual notes are coloured.

    Many of the instruments play inside pretty small registers at any one time, so you’d only need windows of 5-10-ish semitones most of the time and could probably fit quite a lot of information on the screen. There must be quite a lot of orchestrators out there who could use a tool like this!

    Also for this workflow, having display configuration ‘snapshots’ that you can easily flick between as the instrumentation changes throughout a piece would be insanely powerful.

    I can export some MIDI files of the kind I’m talking about if you want so you can have a play with visualising them.

    I’ve also developed algorithmic logic in the past for very in depth chord naming (you can see it at https://GitHub.com/OscarSouth/theHarmonicAlgorithm) and I’d be happy to assist with implementing that if that’s useful to you.

    I also have a good amount of experience with chord naming algorithms, but I appreciate the offer and will let you know if I ever need help. Your repo looks amazing – I've always had a soft spot for functional programming and the idea of using Markov chains and overtone filtering is very cool!

    Thanks for checking it out :)

    One of the things I did with chord naming was to design a method for choosing the ‘most likely’ simple triad from larger clusters of notes, based on the bass note and the overtonic relationships in the upper structures.

  • @OscarSouth said:
    Yeah, the reason for parallel lanes (that could hopefully be zoomed into a specific region) is that I do a lot of orchestrating with MIDI. When you have a lot of orchestral instruments over 16 channels all doubling, converging, diverging and weaving through homogeneous blocks of texture, there are so many notes existing in the same times and places that are very different in timbre, and a MIDI roll with overlapping notes doesn’t convey any of that textural/timbral information at all, even when individual notes are coloured.

    Many of the instruments play inside pretty small registers at any one time, so you’d only need windows of 5-10-ish semitones most of the time and could probably fit quite a lot of information on the screen. There must be quite a lot of orchestrators out there who could use a tool like this!

    Also for this workflow, having display configuration ‘snapshots’ that you can easily flick between as the instrumentation changes throughout a piece would be insanely powerful.

    I can export some MIDI files of the kind I’m talking about if you want so you can have a play with visualising them.

    Gotcha - makes a lot of sense. That would be great!

    Thanks for checking it out :)

    One of the things I did with chord naming was to design a method for choosing the ‘most likely’ simple triad from larger clusters of notes, based on the bass note and the overtonic relationships in the upper structures.

    Very cool! I'm actually doing something similar on a different project but from a DSP perspective – using overtone relations and likely root notes to filter out relevant data from time-domain information.

  • Thanks for checking it out :)

    One of the things I did with chord naming was to design a method for choosing the ‘most likely’ simple triad from larger clusters of notes, based on the bass note and the overtonic relationships in the upper structures.

    Very cool! I'm actually doing something similar on a different project but from a DSP perspective – using overtone relations and likely root notes to filter out relevant data from time-domain information.

    I’d love to hear more about this! Let me know if there’s any more to read about.

    I actually don’t do that much/anything in terms of DSP myself as I’m primarily a performer, so by the time the sound has passed into the physical domain it’s already on the way to your ears! MIDI data is an ‘ideological’ step in the analysis and sound creation process to me so I spend more contemplative time there compared to the action, violence and solace of sound.

  • edited September 2021

    I just now also had a mental image of a semitransparent 2.5D isometric piano roll visualiser that could display different MIDI channels (timbres) on the Z plane and highlight/focus on different layers at a time!

    I think I’m exiting the conversation into imagination here though, haha.

  • @OscarSouth said:
    I’d love to hear more about this! Let me know if there’s any more to read about.

    I actually don’t do that much/anything in terms of DSP myself as I’m primarily a performer, so by the time the sound has passed into the physical domain it’s already on the way to your ears! MIDI data is an ‘ideological’ step in the analysis and sound creation process to me so I spend more contemplative time there compared to the action, violence and solace of sound.

    Lots of cool research in the area! Anything related to autocorrelation, Fourier analysis, wavelet decomposition, and even some machine learning on time series will give you an idea.
    https://www.sciencedirect.com/science/article/abs/pii/S0165168405002124
    https://www.sciencedirect.com/science/article/pii/S2405844020300888

    I'm mostly just experimenting at the moment, though maybe I'll make a write-up at some point.
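
    Just to illustrate the autocorrelation end of it – a toy version; a real one needs windowing, normalization and probably vDSP:

        // Toy autocorrelation pitch estimate: the lag with the strongest
        // self-similarity roughly corresponds to the fundamental period.
        func estimateFundamental(samples: [Float], sampleRate: Float,
                                 minHz: Float = 50, maxHz: Float = 1000) -> Float? {
            let minLag = Int(sampleRate / maxHz)
            let maxLag = min(Int(sampleRate / minHz), samples.count - 1)
            guard maxLag > minLag, minLag > 0 else { return nil }

            var bestLag = minLag
            var bestScore = -Float.infinity
            for lag in minLag...maxLag {
                var score: Float = 0
                for i in 0..<(samples.count - lag) {
                    score += samples[i] * samples[i + lag]
                }
                if score > bestScore { bestScore = score; bestLag = lag }
            }
            return sampleRate / Float(bestLag)
        }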

    @OscarSouth said:
    I just now also had a mental image of a semitransparent 2.5D isometric piano roll visualiser that could display different MIDI channels (timbres) on the Z plane and highlight/focus on different layers at a time!

    I think I’m exiting the conversation into imagination here though, haha.

    Yeah would be cool but maybe a bit out of scope for the moment :D

  • edited September 2021

    I would definitely be interested in testing this. Since you mentioned MIDI recording, it would be cool to have the ability to quickly rewind through a session. Even without editing, an AU that tracks the history of note inputs would be useful.

  • @Skyblazer said:
    I would definitely be interested in testing this.

    Great! Can you PM me your email?

    Since you mentioned MIDI recording, it would be cool to have the ability to quickly rewind through a session. Even without editing, an AU that tracks the history of note inputs would be useful.

    Scrolling through the entire history is already implemented!

  • @TonalityApp Yes, just recording, no editing.

  • @rs2000 said:
    @TonalityApp Yes, just recording, no editing.

    Very doable. I'm also thinking about allowing you to select a region for export as a MIDI file – would there be any other export options you'd want?

  • @TonalityApp said:

    @rs2000 said:
    @TonalityApp Yes, just recording, no editing.

    Very doable. I'm also thinking about allowing you to select a region for export as a MIDI file – would there be any other export options you'd want?

    Sounds good!
    Since one MIDI channel should be enough, the only additional option I could imagine is to export either as a type 0 or as a type 1 MIDI file for better compatibility.

  • @rs2000 said:
    Since one MIDI channel should be enough

    I was planning to export with multiple channels if multiple channels are input – maybe I'm misinterpreting?

    to export either as a type 0 or as a type 1 MIDI file for better compatibility.

    Noted.
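
    For reference, the difference between the two is mostly the format word in the header chunk and how the tracks are chunked: type 0 packs every channel into a single track, while type 1 uses multiple simultaneous tracks (with the tempo map conventionally in the first one). A rough sketch of the header write, big-endian per the SMF spec:

        import Foundation

        // Sketch: Standard MIDI File header chunk. The leading 0, 6 pair is
        // the 32-bit chunk length (always 6) split into two 16-bit words;
        // then come format (0 or 1), track count and ticks per quarter note.
        func smfHeader(format: UInt16, trackCount: UInt16, ticksPerQuarter: UInt16) -> Data {
            var data = Data("MThd".utf8)
            let words: [UInt16] = [0, 6, format, trackCount, ticksPerQuarter]
            for word in words {
                data.append(UInt8(word >> 8))       // big-endian byte order
                data.append(UInt8(word & 0xFF))
            }
            return data
        }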

  • I like the visualization aspect. It's like a piano roll you create on the fly instead of watching it play by itself. This kind of view may not be very conducive to actual note editing because it compresses the available viewing space, but it definitely gives one a sense of the note patterns.

    Possible app names:
    -Keyroller
    -Noteroller
    -Viewtunes
    -Rollout
    -The Musician's Visualizer

    Just a few off the top of my head. Feel free to use any of these.

  • edited October 2021

    @NeuM said:
    Possible app names:
    -Keyroller
    -Noteroller
    -Viewtunes
    -Rollout
    -The Musician's Visualizer

    Just a few off the top of my head. Feel free to use any of these.

    Thanks! I'm tentatively calling it Pianogram at the moment, but still considering alternatives.


    I like the visualization aspect. It's like a piano roll you create on the fly instead of watching it play by itself. This kind of view may not be very conducive to actual note editing because it compresses the available viewing space, but it definitely gives one a sense of the note patterns.

    Right now the view is actually fully scrollable and zoomable (it could be near-infinitely zoomed if someone wanted that, but I capped it at what I think is a reasonable limit for now). The photos above both show the same events generated by dragging my finger across the AUM keyboard as fast as possible. The second photo is zoomed on the first few notes. My reasoning is that even if I don't allow editing, I want to provide detailed analysis options.
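
    The cap itself is nothing clever – just clamping the horizontal scale to a range that still renders sensibly (placeholder numbers):

        // Sketch: clamp the horizontal zoom (points per beat) to a sane range.
        let minPointsPerBeat = 4.0      // placeholder limits, not final
        let maxPointsPerBeat = 400.0
        func clampedZoom(_ requested: Double) -> Double {
            min(max(requested, minPointsPerBeat), maxPointsPerBeat)
        }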

  • This is a really cool starting point. I’ve been running orchestrations into it and it’s really nice to see them visualised! More informative than the stream of midi messages in list form that I was looking at before!

  • @OscarSouth said:
    This is a really cool starting point. I’ve been running orchestrations into it and it’s really nice to see them visualised! More informative than the stream of midi messages in list form that I was looking at before!

    Nice! Glad it's of some use even without channel filtering or any of the other features still to come.

  • @TonalityApp said:

    @NeuM said:
    Possible app names:
    -Keyroller
    -Noteroller
    -Viewtunes
    -Rollout
    -The Musician's Visualizer

    Just a few off the top of my head. Feel free to use any of these.

    Thanks! I'm tentatively calling it Pianogram at the moment, but still considering alternatives.

    I like the visualization aspect. It's like a piano roll you create on the fly instead of watching it play by itself. This kind of view may not be very conducive to actual note editing because it compresses the available viewing space, but it definitely gives one a sense of the note patterns.

    Right now the view is actually fully scrollable and zoomable (it could be near-infinitely zoomed if someone wanted that, but I capped it at what I think is a reasonable limit for now). The photos above both show the same events generated by dragging my finger across the AUM keyboard as fast as possible. The second photo is zoomed on the first few notes. My reasoning is that even if I don't allow editing, I want to provide detailed analysis options.

    I like Pianogram!

  • @NeuM said:

    @TonalityApp said:

    @NeuM said:
    Possible app names:
    -Keyroller
    -Noteroller
    -Viewtunes
    -Rollout
    -The Musician's Visualizer

    Just a few off the top of my head. Feel free to use any of these.

    Thanks! I'm tentatively calling it Pianogram at the moment, but still considering alternatives.

    I like the visualization aspect. It's like a piano roll you create on the fly instead of watching it play by itself. This kind of view may not be very conducive to actual note editing because it compresses the available viewing space, but it definitely gives one a sense of the note patterns.

    Right now the view is actually fully scrollable and zoomable (it could be near-infinitely zoomed if someone wanted that, but I capped it at what I think is a reasonable limit for now). The photos above both show the same events generated by dragging my finger across the AUM keyboard as fast as possible. The second photo is zoomed on the first few notes. My reasoning is that even if I don't allow editing, I want to provide detailed analysis options.

    I like Pianogram!

    Ditto, great name.

  • “Knock, knock”
    “Who’s there?”
    “Pianogram”

  • Don’t call it Musualizer. Or Vidistream.

    I can’t be any more help than that at the moment.

  • @Liquidmantis said:
    “Knock, knock”
    “Who’s there?”
    “Pianogram”

    Pianogram who?

    @gusgranite said:
    Don’t call it Musualizer. Or Vidistream.

    I can’t be any more help than that at the moment.

    Thanks :D

  • @TonalityApp said:

    @rs2000 said:
    Since one MIDI channel should be enough

    I was planning to export with multiple channels if multiple channels are input – maybe I'm misinterpreting?

    Of course, recording multiple channels would be even better but I wasn't sure if you'd be willing to implement some kind of MIDI channel filter for displaying.
    I highly appreciate your efforts.

  • @rs2000 said:
    Of course, recording multiple channels would be even better but I wasn't sure if you'd be willing to implement some kind of MIDI channel filter for displaying.
    I highly appreciate your efforts.

    Ah gotcha, thanks! I think channel-related features will make it more useful in general so they're fairly high on my list.
