
Can a competent person make an AUv3 midi plugin with no coding experience?

Because I am that person and I want to. Seems like it shouldn't be terribly difficult if I understand the logic of what I'm trying to do. And what I want to do is replicate the Yamaha RS7000/RM1X MIDI effects (octave, harmonizer, MIDI delay, velocity, gate, clock shift, transpose, and beat stretch) in a single plugin with an analogous UI. If I'm not a dummy and have a rudimentary understanding of coding principles, how long can I expect to spend learning and executing?

(Also, experienced developers, feel free to steal this idea and make this. It'd save me the hassle.)
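
For scale, the simplest of those effects (transpose/octave) seems, as far as I can tell, to boil down to very little logic. A purely illustrative Swift sketch, with made-up names and none of the AUv3 plumbing:

    // Purely illustrative, hypothetical names; plain Swift, no AUv3 plumbing.
    struct NoteEvent {
        var note: UInt8      // MIDI note number 0-127
        var velocity: UInt8  // 0-127
    }

    // Transpose by a number of semitones (octave shift = +/-12), clamped to the MIDI range.
    func transpose(_ event: NoteEvent, by semitones: Int) -> NoteEvent {
        var shifted = event
        shifted.note = UInt8(min(max(Int(event.note) + semitones, 0), 127))
        return shifted
    }

    let played = NoteEvent(note: 60, velocity: 100)  // middle C
    let octaveUp = transpose(played, by: 12)         // note 72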


Comments

  • edited September 2018

    I'm not a programmer but I'll start off: I would imagine you need to have a crack at learning a basic programming language first. Give it a few months and see where you are at? There must be many online communities working on this same goal. I'm sure someone more qualified will add to the thread :-) Good luck man! Anything is possible if the willpower to study is there, and I'd love it if you came back with an AUv3 MIDI plugin you created yourself :-)

  • edited September 2018

    My personal opinion: Realistically, no way. :) (sorry!) Lots of experience in programming and converting problems to computer-solvable logic required, plus creating an AUv3 plugin is "extra-difficult" in programming land, plus the extra difficulty of understanding an old, heavily optimized protocol like MIDI... but I'm known to be a conservative, pessimist glass-half-empty-pointing-outer, so I'm sure others will be much more cheerful! :)

  • Thanks for the input. I will say that I have a pretty deep understanding of the midi protocol, and the functionality I’m looking for is pretty basic (mostly just note-shifting and clock shifting ... except the beat stretch which I’m not sure is even possible in plugin form). Sounds like this is something beyond my “tinkering” aspirations and experience coding-wise tho...

  • @legsmechanical said:
    Thanks for the input. I will say that I have a pretty deep understanding of the midi protocol, and the functionality I’m looking for is pretty basic (mostly just note-shifting and clock shifting ... except the beat stretch which I’m not sure is even possible in plugin form). Sounds like this is something beyond my “tinkering” aspirations and experience coding-wise tho...

    You may find some kind of AUv3 "template" where you can drop your logic in, but just saying that it won't be as easy as you expect, probably. I'm a totally different "coding person" in that regard anyway as I write every single line of code from scratch (no libraries used whatsoever, ever), so I probably take longer than most programmers for my projects... but at least if something goes wrong, I can fix it! :)

  • I assume it's about the Midi functionality of that Yamaha gear, not to communicate with the original hardware, which would be an even bigger bugger ;)
    Expect to spend 80% of your time on the user interface compared to coding your MIDI procedures.
    I'm a long-time developer of business apps, but with no hands-on iOS experience.
    If I were to start that project, my estimation would be at least 1 year before usable results show up. o:)

  • @Telefunky said:
    I assume it's about the Midi functionality of that Yamaha gear, not to communicate with the original hardware, which would be an even bigger bugger ;)
    Expect to spend 80% of your time on the user interface compared to coding your MIDI procedures.
    I'm a long-time developer of business apps, but with no hands-on iOS experience.
    If I were to start that project, my estimation would be at least 1 year before usable results show up. o:)

    Right.. replicating the midi functionality completely independent of the gear. GUYS.... you're not making me feel very optimistic about this! :)

  • you can find a free template for this easily enough, but there are always challenges with the UI (custom controls), UX (getting the right controls exposed), and DSP (MIDI timing: sample time to host time/beat, or Link timing). All of these need coding experience (C, Obj-C & Swift).
    I've an AUv3 MIDI delay app ready for release, but being able to capture input & render it as output took some knowledge of lists and data structures.
    As for your other suggestions, they are in progress too; I just need to finish the AUv3 MIDI recorder.
    Contact me if you need some pointers - can never have enough midi fx!
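
    (Purely as an illustration of the kind of data structure involved, and not the actual app code: a queue of pending events stamped with the sample time at which they should go out, plus a beats-to-samples conversion from the host tempo. Made-up names throughout:)

    // Illustrative only: pending-event queue for a MIDI delay, hypothetical names.
    struct PendingEvent {
        var sampleTime: Double   // host sample time at which to emit the event
        var bytes: [UInt8]       // raw MIDI bytes (e.g. a note on or note off)
    }

    // Convert a delay expressed in beats into samples at the current tempo.
    func delayInSamples(beats: Double, tempoBPM: Double, sampleRate: Double) -> Double {
        let seconds = beats * 60.0 / tempoBPM
        return seconds * sampleRate
    }

    var queue: [PendingEvent] = []   // kept sorted by sampleTime

    // Each render cycle, emit everything that falls inside the current buffer.
    func eventsDue(now: Double, frameCount: Double) -> [PendingEvent] {
        let due = queue.filter { $0.sampleTime < now + frameCount }
        queue.removeAll { $0.sampleTime < now + frameCount }
        return due
    }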

  • @midiSequencer said:
    Contact me if you need some pointers

    He'll definitely &need **some[0] ;) :D

  • edited September 2018

    Nothing bad about that... we don't know how much time you have at your disposal, or how engaged you'll be once you start.
    Xcode (I've never peeked inside) is fairly straightforward, as long as you deal with common stuff for which there are examples to learn from and copy.
    (guessing from the sheer amount of apps)
    For your specific subject that will hardly be the case, so you're almost entirely on your own.
    (but I don't know about existing toolkits you could build upon - there's a lot of open source stuff)

    On the other hand, you could start with Lemur right away as preparation.
    If you succeed in a reasonable time, your chances of success increase significantly, as you'll have made it through a serious test.

  • @midiSequencer said:
    you can find a free template for this easily enough, but there are always challenges with the UI (custom controls), UX (getting the right controls exposed), and DSP (MIDI timing: sample time to host time/beat, or Link timing). All of these need coding experience (C, Obj-C & Swift).
    I've an AUv3 MIDI delay app ready for release, but being able to capture input & render it as output took some knowledge of lists and data structures.
    As for your other suggestions, they are in progress too; I just need to finish the AUv3 MIDI recorder.
    Contact me if you need some pointers - can never have enough midi fx!

    Music to my ears! :) Midi delay in quantum is so close to the yammy implementation that I always assumed you had used one of those machines (maybe I'm wrong?), so I'm happy to hear about this and the other forthcoming plugins (as much as I love playing with quantum, I'm just not much of a step sequencer guy when it comes to composing). And thanks for offering to share your knowledge.

  • Although it won't help with AUv3,
    Swift Playgrounds is great for trying out your ability as a Swift programmer:
    Swift Playgrounds by Apple
    Then there is all of AudioKit from @analog_matt, which is an amazing resource.
    Then Apple's Xcode app.
    It's all free, all excellent: loads of resources to make an iOS app, but not an iOS AUv3, which is different apparently :'(

    I gave up and went back to Arduino: much more fun, less frustrating, but again no AUv3

  • edited September 2018

    The AudioKit blog has a getting-started tutorial on building AUv3 MIDI Plug-ins here:

    https://audiokitpro.com/auv3-midi-tutorial-part1/

    We'll have more robust AUv3 code examples in the coming months...

  • If MobMuPlat were AUv3 already, you could do this with limited effort in PureData, including the UI elements. For now it's only IAA, but maybe someone knows of an iOS PureData runtime environment that can act as an AUv3?

  • edited September 2018

    Another idea: Couldn't one write a MidiFire/StreamByter AUv3 script that reads MIDI controller values from knobs, buttons and faders and uses these to adjust script parameters?

  • @rs2000 said:
    Another idea: Couldn't one write a MidiFire/StreamByter AUv3 script that reads MIDI controller values from knobs, buttons and faders and uses these to adjust script parameters?

    I would also suggest looking into StreamByter/MidiFire.
    Also, Audulus 3 has AUv3 support coming; it's in beta, I hear.
    These are both challenging, but WAY easier than coding from scratch or whatever.
    But both are quite powerful, and could potentially be enough to do those midi tricks.
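
    (At its core that approach is just mapping an incoming controller value, 0-127, onto whatever range the effect parameter needs. A purely illustrative Swift sketch of the mapping idea, with made-up names; this is not StreamByter syntax:)

    // Map a 7-bit CC value (0-127) onto an arbitrary parameter range.
    func mapCC(_ value: UInt8, toRange range: ClosedRange<Double>) -> Double {
        let normalized = Double(value) / 127.0
        return range.lowerBound + normalized * (range.upperBound - range.lowerBound)
    }

    let delayBeats = mapCC(64, toRange: 0.0...2.0)        // roughly 1 beat of delay
    let transposeSemis = mapCC(96, toRange: -12.0...12.0) // roughly +6 semitones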

  • @legsmechanical

    As a musician/music theorist (if that's a thing) who's recently done something comparable (I used the Haskell language to build the 'working machinery' of an in-depth music theory/analysis thesis I wrote into a practically useful command line app), I say you can do it, as long as you can keep the enthusiasm and passion to create it burning for long enough to see the project to its conclusion.

    If your experience is like mine, I think you’ll need to go down some deep tangents of learning and be both consistently enthusiastic and methodical in order to absorb all the necessary theory along the way. I spent about 2 months reading books, papers and articles on Haskell & Category Theory before I started writing code. I think a lot of the theory you’ll need is a lot less abstract/mathematical than what I was pursuing for Haskell, but no less of an endeavour to master to a practical level.

    No idea of the timescales for your own goals, but mine (which included writing modules providing advanced pitch-class analysis functions and tailored machine learning algorithms from base libraries) took about 2 months of research and probably a month and a half of intense work to bring everything to a functional state.

    So I say just go for it and let your enthusiasm carry you. Worked for me and I got the results I was after (I’m thinking of rewriting my Haskell app in Swift sometime too). Your idea also fills a currently unfilled niche on iOS.

    If you’re interested, can see my own (open source) project mentioned above here:

    https://github.com/OscarSouth/theHarmonicAlgorithm

  • Don't forget how much time you'll have to spend on the Audiobus forums during coding and after release to answer annoying questions and deal with picky... er... discerning customers! >:) :p

  • @vitocorleone123 said:
    Don't forget how much time you'll have to spend on the Audiobus forums during coding and after release to answer annoying questions and deal with picky... er... discerning customers! >:) :p

    :)

    Thanks everyone. This is all really helpful info. StreamByter definitely seems up to the task for making discrete versions of each of the effects, but the fact that it focuses so much on fixed values and doesn't seem to deal with MIDI clock (there's no way to offset things based on clock as far as I can tell) makes it too limited. Maybe I'm missing something there though...

  • @analog_matt said:
    The AudioKit blog has a getting-started tutorial on building AUv3 MIDI Plug-ins here:

    I read through this earlier this month, and it was super helpful as far as orienting myself goes -- it kind of spurred my interest in moving forward with this idea. Thanks for providing such an awesome resource.

  • @SevenSystems said:

    @midiSequencer said:
    Contact me if you need some pointers

    He'll definitely &need **some[0] ;) :D

    I still think Swift is strange, but I'm forcing myself to learn it for all the UI:

    let closeAction = UIContextualAction(style: .normal, title: "Share", handler: { (ac: UIContextualAction, view: UIView, success: (Bool) -> Void) in
        success(true)   // ... do the work here, then tell the table view whether it succeeded
    })

  • @legsmechanical said:

    Music to my ears! :) Midi delay in quantum is so close to the yammy implementation that I always assumed you had used one of those machines (maybe I'm wrong?), so I'm happy to hear about this and the other forthcoming plugins (as much as I love playing with quantum, I'm just not much of a step sequencer guy when it comes to composing). And thanks for offering to share your knowledge.

    Midi echo for me is my most used fx, especially with regular rhythms. When used with a synth ARP it's wonderful (to my ears).

    This midi delay is however slightly different to the Quantum version, as it has no concept of a step (which fixed the note duration). So it delays both the midi note on & midi note off, allowing you to echo in realtime.
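
    (Roughly, and only as an illustration rather than the actual code: each incoming note on/off pair is simply re-emitted later, so the played duration survives. Hypothetical names:)

    // Echo a note by re-emitting its on/off pair later, preserving the played duration.
    struct TimedNote {
        var note: UInt8
        var velocity: UInt8
        var onTime: Double    // when the note started (seconds or beats)
        var offTime: Double   // when it was released
    }

    func echo(_ n: TimedNote, delay: Double, velocityScale: Double) -> TimedNote {
        TimedNote(note: n.note,
                  velocity: UInt8(min(127.0, max(1.0, Double(n.velocity) * velocityScale))),
                  onTime: n.onTime + delay,
                  offTime: n.offTime + delay)   // same length as the original note
    }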

  • Don't forget that Audulus AUv3 is coming soon, so building it in that will probably be a lot easier :)

  • @analog_matt said:
    The AudioKit blog has a getting-started tutorial on building AUv3 MIDI Plug-ins here:

    https://audiokitpro.com/auv3-midi-tutorial-part1/

    We'll have more robust AUv3 code examples in the coming months...

    Yes, this is the AUv3 MIDI source I was alluding to. Gene's publication gave me the starting point to build my AUv3 MIDI apps by filling in the gaps on musical context, internalRender, AU params, presets etc.

    For the UI, AudioKit has some nice control examples (all in Swift, so I had to learn it!) that are very usable, if a bit styled. Indeed AudioKit/Synth One etc. is a wonderful place to learn (if you know or want to learn Swift).

  • @midiSequencer said:

    Midi echo for me is my most used fx, especially with regular rhythms. When used with a synth ARP it's wonderful (to my ears).

    This midi delay is however slightly different to the Quantum version, as it has no concept of a step (which fixed the note duration). So it delays both the midi note on & midi note off, allowing you to echo in realtime.

    Midi echo/delay is such an infinitely useful effect that I'm really surprised it's not a standard feature of all sequencers. It can come super close to replacing a dsp delay for basic purposes with the added advantage of pitch, velocity, gate modification as well. And like you said, when used in conjunction with an arp..... my god. I load anemic tracks made elsewhere into my rs7000 solely to use the midi delay and nearly always end up with a much-improved jam (or end up having a ton of fun with a track before it hits the cutting room floor). I could go on and on and on..... :)
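
    (Conceptually it's just a list of taps, each one a bit later, a bit quieter, and optionally shifted in pitch. A rough Swift sketch of that idea, purely illustrative, with made-up names:)

    import Foundation  // for pow

    // Each echo tap lands later, quieter, and optionally transposed.
    struct EchoTap {
        var delayBeats: Double
        var velocityScale: Double
        var transpose: Int
    }

    func makeTaps(count: Int, spacingBeats: Double, decay: Double, transposePerTap: Int) -> [EchoTap] {
        (1...count).map { i in
            EchoTap(delayBeats: Double(i) * spacingBeats,
                    velocityScale: pow(decay, Double(i)),
                    transpose: i * transposePerTap)
        }
    }

    // Four echoes, an eighth note apart, each 30% quieter and an octave higher than the last.
    let taps = makeTaps(count: 4, spacingBeats: 0.5, decay: 0.7, transposePerTap: 12)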

  • I am an HVAC tech and not a programmer, but I was able to write some very nice software for Android using a program called Basic4android. It has a visual designer where you can do layouts with buttons, edit texts, etc. and code in a very easy Visual Basic-like language, and then it compiles to native byte code for Android. I mention this because they also have something like it for iOS called B4i, and I may port my apps to iOS if I decide to stay. They also offer a free Basic for Java to play with too. The company is called Anywhere Software. I may catch some heat from programmers for mentioning this, but I think it's worth a look and it did well for me.

  • What about us incompetent people?

  • @vitocorleone123 said:
    Don't forget how much time you'll have to spend on the Audiobus forums during coding and after release to answer annoying questions and deal with picky... er... discerning customers! >:) :p

    Quite true. We are finickity customers. We know what we likes, and we wants value for our pennies, dagnubbit! So, with that in mind...

    IS IT READY YET?!? OK... WHEN?!? ARE THERE ANY BUGS? WHEN’S IT GOING ON SALE?

    :D :p

  • @legsmechanical said:

    @midiSequencer said:

    Midi echo for me is my most used fx, especially with regular rhythms. When used with a synth ARP it's wonderful (to my ears).

    This midi delay is however slightly different to the Quantum version, as it has no concept of a step (which fixed the note duration). So it delays both the midi note on & midi note off, allowing you to echo in realtime.

    Midi echo/delay is such an infinitely useful effect that I'm really surprised it's not a standard feature of all sequencers. It can come super close to replacing a dsp delay for basic purposes with the added advantage of pitch, velocity, gate modification as well. And like you said, when used in conjunction with an arp..... my god. I load anemic tracks made elsewhere into my rs7000 solely to use the midi delay and nearly always end up with a much-improved jam (or end up having a ton of fun with a track before it hits the cutting room floor). I could go on and on and on..... :)

    Quantum to Model D with arp, together with Bram's excellent Kosmonaut for sound-on-sound & tapped delay, does it for me.

  • I got lucky! I made a thorough nuisance of myself describing what I envisioned in every appropriate thread, while I dived into learning what I'd need to develop it on my own if necessary. Before I had to get into the really hard stuff, @midiSequencer picked up the ball and is close to releasing what would probably have taken me a year to accomplish, only 1000% better. I couldn't be happier! B)

  • edited September 2018

    Coding is not really your main problem here. Unless you have a good mathematics and physics background, the complexity of the algorithms involved in audio effects would be overwhelming. It's not rocket science, but it's still not your basic task when learning a coding language either.

    Swift in itself, the new language for Apple apps, is not that difficult though. It's basically some homebrewed JavaScript. You can still have a try for fun.
