
Algorithmic Orchestration! (with video demonstration/introduction)

Hi all! I don't post here nearly enough these days but I do have some cool art to share, so I'm dropping in!

This is first and foremost a tech demo. For demonstration purposes I've included an example piece of music -- a transcription of Igor Stravinsky's 'Suite No. 1 for Small Orchestra, I. Andante'.

“My music of today is so much based on the new musical technology. We use the technology as a material for our musical art”
Igor Stravinsky, 1957

In terms of creative application, I see three immediate directions of approach:

1: Transcription of existing/traditional orchestral scores into algorithmic representation. This is primarily an analytic process which provides a deep insight into the structural properties of an existing work.

2: Transcription of algorithmically created works onto playable orchestral scores for performance by a contemporary orchestra. This represents a personal compositional process/methodology alongside existing/traditional cognitive 'technologies' of composition. Close attention to principles of practical orchestration will be necessary throughout the algorithmic and notational phases of composing.

3: Direct performance of the orchestral work via an interface of pure text ('Live Coding'). As code is typed and executed in real time, the musical composition builds up and develops. Material could be memorised and rehearsed, or improvised. Principles of practical orchestration could be adhered to or ignored under this approach (a rough sketch of what this might look like follows below the list).

Approaches 2 & 3 logically present a fourth application:

4: A combination of 2 & 3 in performance at the same time.
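
To give a rough idea of what approach 3 might look like in practice, here's a minimal TidalCycles-style sketch. The sound names ("strings", "flute") are placeholders rather than anything from theHarmonicAlgorithm, and it assumes a running SuperDirt instance:

    -- two layers, built up one line at a time during a live-coded performance
    d1 $ slow 2 $ n "c4 e4 g4 b4" # s "strings" # gain 0.9
    d2 $ every 4 rev $ n "e5 ~ g5 a5" # s "flute" # legato 1.2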

I'm planning to create future demonstrations showcasing all of these approaches.

I'd love to hear any questions or opinions that anyone has about this concept! I've been working hard on it and I have a lot of fascinating and exciting (to me at least!) additions in development to further enrich the scope of the project.

Oscar South

GitHub for my own music experimentation codebase:
https://github.com/OscarSouth/theHarmonicAlgorithm

Comments

  • Really astonishing. Thanks so much.

  • Could you send midi from a keyboard into the program and have it orchestrated?

  • @Wrlds2ndBstGeoshredr said:
    Could you send midi from a keyboard into the program and have it orchestrated?

    Orchestration (the organisation of timbre through time) is just as nuanced as composition (the organisation of pitch through time). It adds another dimension of consideration and experience to the music and can't really be reduced to an automatic procedure any more than composition itself can (i.e. it kind of can, but not really).

    (... so to answer your question -- no.)

    It's not really a 'program' as such -- just code that's executed on the fly, starting with a blank page. It's also not anything along the lines of an 'auto music generator' or any of these fancy 'deep learning' data in -> data out things that are going around. Well, actually, it kind of has elements of that stuff, but also not really, and there's a much more human, 'performative' touch to the process.

    Not that easy to summarise! I need to make more examples with music -- art seems much simpler when existing in its finished form!

  • Just for extra context, this is the external Alesis NanoSynth sound module Oscar is targeting with the MIDI events generated by his software.

    It sounds pretty convincing for a product released in 1997, I think. Only 8MB of 16-bit 48 kHz ROM sounds for its oscillators to use.

    Looking at the forum records, it appears this work has spanned 3-4 years of effort.
    For the programmers out there, Haskell is a highly regarded language, but tricky to learn.
    So, if you like a challenge, you should download the code and check it out.

    Some of the algorithmic ideas exposed here could also be coded for note and timing MIDI events on iOS using the very accessible Mozaic scripting app.
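
    For the curious, a very rough sketch of how TidalCycles patterns could be pointed at an external MIDI module like this (not necessarily how Oscar's own setup does it; it assumes a SuperDirt MIDI output already registered under the placeholder name "nano" on the SuperCollider side):

        -- Tidal side only; "nano" is a placeholder SuperDirt MIDI target
        d1 $ n "c4 e4 g4 c5" # s "nano" # midichan 0
        d2 $ slow 2 $ n "c3 g3" # s "nano" # midichan 1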

  • @OscarSouth said:

    @Wrlds2ndBstGeoshredr said:
    Could you send midi from a keyboard into the program and have it orchestrated?

    Orchestration (the organisation of timbre through time) is just as nuanced as composition (the organisation of pitch through time). It adds another dimension of consideration and experience to the music and can't really be reduced to an automatic procedure any more than composition itself can (i.e. it kind of can, but not really).

    (... so to answer your question -- no.)

    It's not really a 'program' as such -- just code that's executed on the fly, starting with a blank page. It's also not anything along the lines of an 'auto music generator' or any of these fancy 'deep learning' data in -> data out things that are going around. Well, actually, it kind of has elements of that stuff, but also not really, and there's a much more human, 'performative' touch to the process.

    Not that easy to summarise! I need to make more examples with music -- art seems much simpler when existing in its finished form!

    I guess I don’t understand what it is then, but interested to see what you make with it.

  • I might not have read or listened to your description carefully enough but I’m not understanding how in your example Stravinsky’s score is translated to your algorithmic code. How are the notes and expressive markings translated from sheet music to computer in the first place?

  • @Wrlds2ndBstGeoshredr said:

    @OscarSouth said:

    @Wrlds2ndBstGeoshredr said:
    Could you send midi from a keyboard into the program and have it orchestrated?

    Orchestration (the organisation of timbre through time) is just as nuanced as composition (the organisation of pitch through time). It adds another dimension of consideration and experience to the music and can't really be reduced to an automatic procedure any more than composition itself can (i.e. it kind of can, but not really).

    (... so to answer your question -- no.)

    It's not really a 'program' as such -- just code that's executed on the fly, starting with a blank page. It's also not anything along the lines of an 'auto music generator' or any of these fancy 'deep learning' data in -> data out things that are going around. Well, actually, it kind of has elements of that stuff, but also not really, and there's a much more human, 'performative' touch to the process.

    Not that easy to summarise! I need to make more examples with music -- art seems much simpler when existing in its finished form!

    I guess I don’t understand what it is then, but interested to see what you make with it.

    I don't think I completely do myself -- it's really a variety of different things that I've worked on or with over a number of years that seem to overlap and combine to do cool things. I'll share the cool things that I manage to make with it!

    @Stochastically said:
    I might not have read or listened to your description carefully enough but I’m not understanding how in your example Stravinsky’s score is translated to your algorithmic code. How are the notes and expressive markings translated from sheet music to computer in the first place?

    I can definitely try to explain that!

    A: I'm pretty handy at score-reading. Been a music theory geek since before I was even an instrumentalist so I can pick out patterns and structure in the notes relatively quickly.

    B: I'm also quite good with the Haskell programming language and very fluent with the TidalCycles syntax for musical pattern definition -- I've done a good few live performances with it by now, with lots of practising in between.

    A+B=C: I look at the score, conceptualise in my head how that score could be represented inside the methodology for coding orchestral music that I've built on top of the base TidalCycles syntax, then type out that code.
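
    To make that concrete, here's a simplified, purely illustrative example: a short notated phrase might end up encoded as a Tidal pattern something like the one below, where the square brackets carry the rhythmic grouping and '~' marks a rest (the sound name "strings" is just a placeholder):

        -- hypothetical short phrase transcribed by eye from a score
        d1 $ n "[c4 d4 e4] [g4 ~] [e4 d4] c4" # s "strings" # legato 1.1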

    This is a very personal/human process. The codebases involved are all open source so anyone could build and use the tools, but how or why each individual uses them and what they create with them would be unique to each individual.

    It really does all come down to Stravinsky's incredibly prophetic quote:
    “My music of today is so much based on the new musical technology. We use the technology as a material for our musical art”
    Igor Stravinsky, 1957

    (I actually only discovered this quote this morning! Mind blowing!)

  • @McD said:
    Just for extra context, this is the external Alesis NanoSynth sound module Oscar is targeting with the MIDI events generated by his software.

    It sounds pretty convincing for a product released in 1997, I think. Only 8MB of 16-bit 48 kHz ROM sounds for its oscillators to use.

    Looking at the forum records, it appears this work has spanned 3-4 years of effort.
    For the programmers out there, Haskell is a highly regarded language, but tricky to learn.
    So, if you like a challenge, you should download the code and check it out.

    Some of the algorithmic ideas exposed here could also be coded for note and timing MIDI events on iOS using the very accessible Mozaic scripting app.

    Thanks very much for elaborating on the details that I skipped over and adding richness/insight in the process. Especially, thanks for sharing the NanoSynth details -- the fact that this old thing is the heart and soul of a sound that feels soooo human and natural to me by the time it hits my ears constantly blows my mind. I love it.
