Why do MIDI slides sound so unnatural?
Hey guys,
Something just popped into my head this morning. I'm sure there are a tonne of quite deep scientific explanations on Wikipedia somewhere, but I wanted to ask the more technical-minded musicians who I think frequent this forum.
So... as the title says, MIDI synthesis of real instruments can sound so incredible, but slides are often the giveaway, aren't they?
I mean, iFretless Brass, Sax and Bass are all incredible; I think they sound immense and very useable indeed, but in application you have to be tactical about how you use them.
As we all know, some MIDI instruments will sound 'just perfect' when playing in short bursts, whereas some sustained notes might sound less so.
I sometimes employ masking tricks, such as adding a second instrument (iFretless Brass, for example) and changing the blending mode. It can have the effect of somewhat blurring the two instruments, to the point where the bend/slide is less 'comical'.
I hate using that word; it sounds harsh! I am sure many scientists around the world have worked very hard to get things where they are today.
But back to the point, just why does a slide sound so artificial?
I was reading a book about the standard acoustic guitar which described the fluctuating/pulsing frequencies that move along the strings: from the point where you pluck/strike it, outward along the string in both directions, all the way along the neck and back again, until the vibrations eventually die off. 'Exciting the strings', as they put it.
Regardless of all that, when you fret a note in iFretless and then slide up a tone, the final (second) note sounds great (as good as the first note), so there's no degradation or adverse effect; it's just the slide itself that sounds artificial.
Can anyone offer some insight or explanation as to why this happens? I'm intrigued.
Thanks for reading.
Comments
It's related to why microtonal pitch is so alien to midi (alien as in not the same nature, not alien as in John Hurt, or Paul, or Dark Star) (or Star Trek) (or Lost In Space) (or Close Encounters of the Thir… where was I? What was I talking about? I've forgot).
I'm not certain, but I want to say that the resolution of MIDI data isn't very fine, so the results of a MIDI slide will sound steppy. Know what I mean? The resolution of OSC is higher; I'm kind of surprised OSC hasn't caught on more. But I speak of recording slides from hardware.
Couple of thoughts -
First is understanding that MIDI is a control language and has nothing to do with the timbre of the sound you hear. It sounds like you are using fretboard-style controllers like iFretless: when you "slide", you are generating pitch bend MIDI messages, and every synth responds to those based on its own code.
Secondly, you mention synths that emulate real-world instruments. These are usually "sample based" to some degree, and therein lies the challenge. I believe someone here mentioned recently, from an article they had read, that samples need to be no more than a minor third apart before the illusion of the actual instrument falls apart. So a good grand piano patch has every third note sampled and only "stretches" that sound by 3 semitones.
However, with pitch bend it's the original sample that's getting stretched, and hence comical. Try using an "actual" synthesizer (non-sample-based) like GeoShred, Moog, etc., and you will find no comical-ness, as the pitch bend command is talking to oscillators rather than a single sample.
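To make that distinction concrete, here's a minimal sketch (the function names are my own, not any real synth's API) of how the same 14-bit pitch bend message lands differently: a synth simply retunes its oscillator, while a sampler can only change the playback rate of a single recording, dragging everything else in the sound along with it.

```python
# Sketch: pitch bend on an oscillator vs. on a sampler (illustrative only).

BEND_CENTER = 8192          # 14-bit pitch bend: values 0..16383, 8192 = no bend
BEND_RANGE_SEMITONES = 2.0  # a common default bend range

def bend_to_semitones(bend_value):
    """Convert a raw 14-bit pitch bend value to a signed semitone offset."""
    return (bend_value - BEND_CENTER) / BEND_CENTER * BEND_RANGE_SEMITONES

def oscillator_freq(midi_note, bend_value):
    """An 'actual' synth: the oscillator is simply retuned to the new pitch."""
    semis = midi_note - 69 + bend_to_semitones(bend_value)
    return 440.0 * 2 ** (semis / 12)

def sampler_rate(bend_value):
    """A sampler: the whole recording plays back faster/slower by this ratio,
    so body resonances and formants shift along with the fundamental."""
    return 2 ** (bend_to_semitones(bend_value) / 12)

# Full bend up from A4 (MIDI note 69) with a 2-semitone range:
print(oscillator_freq(69, 16383))  # ~493.9 Hz, close to B4
print(sampler_rate(16383))         # ~1.122, sample plays ~12% faster
```

The oscillator just lands on a new frequency; the sampler version is the "stretching" described above.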
I do agree the iFretless team has done a great job handling the limitations of sample-based bends. But in general, most sample-based synths aren't going to do well bending more than a third or so, which in reality, unless you're a whammy bar king (Belew, Vai, etc.) or playing a fretless bass, is pretty much the limit.
As always, I may be totally wrong but I'm a .....
I'm no scientist but I've thought about this a lot and read some Sound on Sound over the years:
Sample-based instruments have difficulties changing pitch without also changing body resonance (eg an acoustic guitar has specific resonances that don't change when you bend notes; most samplers alter the pitch of the sample for slides, thus changing the resonances as well). Even in a multi-sampled instrument, upward slides will chipmunk a bit. Also, even in a clever library, the second note will usually play out as if it had been plucked; in real life, your finger will dampen the strings during the slide, affecting the decay.
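The resonance problem above comes down to simple arithmetic: repitching a sample scales every frequency in the recording by the same ratio, including body resonances that on the real instrument would stay fixed. A quick illustration (the 110 Hz figure is made up for the example):

```python
# Illustrative only: how a fixed body resonance moves when a sample is repitched.

def shifted(freq_hz, semitones):
    """Every frequency in a repitched sample scales by 2^(semitones/12)."""
    return freq_hz * 2 ** (semitones / 12)

body_resonance = 110.0  # e.g. a guitar-body resonance, in Hz (hypothetical)
for semis in (0, 2, 4):
    print(semis, round(shifted(body_resonance, semis), 1))
# On a real guitar that resonance stays at 110 Hz while you slide;
# on a repitched sample it climbs along with the note - the chipmunk effect.
```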
Modelled instruments usually have trouble with an instrument's natural quantization (e.g. when you slide on an acoustic guitar, you're subtly altering the tension of the string as you slide AND you're crossing frets; that gives you a smooth-ish slide as well as distinct steps. It's easy to program one or the other, but they interact differently depending on the speed and distance of the slide).
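A toy sketch of those two behaviours (my own construction, not taken from any modelling synth): a perfectly smooth glide, the same glide snapped to frets, and a blend of the two, which is roughly what a real slide does.

```python
# Illustrative only: smooth vs. fretted vs. blended slide pitch curves.

def smooth_slide(t, start_semis, end_semis):
    """t in 0..1: continuous pitch, like string tension changing smoothly."""
    return start_semis + (end_semis - start_semis) * t

def fretted_slide(t, start_semis, end_semis):
    """The same glide, snapped to the nearest fret (semitone)."""
    return round(smooth_slide(t, start_semis, end_semis))

def blended_slide(t, start_semis, end_semis, fret_weight=0.7):
    """A real slide sits somewhere in between: mostly stepped, a little smooth."""
    s = smooth_slide(t, start_semis, end_semis)
    f = fretted_slide(t, start_semis, end_semis)
    return fret_weight * f + (1 - fret_weight) * s

# Slide from 0 to 3 semitones, sampled at five points:
for i in range(5):
    t = i / 4
    print(round(smooth_slide(t, 0, 3), 2), fretted_slide(t, 0, 3))
```

As the comment above says, the hard part isn't generating either curve; it's how the two interact at different slide speeds and distances.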
Then you also have issues like fret/neck buzzing on stringed instruments and major-but-brief timbral changes on wind instruments due to changes in embouchure. Not to mention whether or not the new note plays out at the same volume and brightness... All of these are small concerns overall, but there are so many of them that they wind up having a big effect.
It's the same with real legato and portamento recordings: they sound so much better than the "faked" stuff done via pitch bending. But you need a good scripting engine to get that right.
Also, real recorded string vibrato, for example, mostly sounds much better than vibrato generated by an LFO.
In a good physical modelling synth it might sound better, as there would be no stretched sample (as mentioned); it would emulate the real behaviour of the instrument.
But there are also combinations of real samples and physical modelling.
Hmm, normally pitch bend is always 14-bit MIDI, which means 16384 steps (8192 up and 8192 down). As mentioned, it should sound very smooth with synths (if the synth lets you). Stretching a sample will always sound a bit unnatural compared to a real acoustic instrument, at least until someone comes up with a genius algorithm or whatever to change that.
There are also huge differences between time-stretching tools, for example; the IRCAM stuff is the best, I think.
Good question. @Blipsford_Baubie MIDI pitch bend is 14-bit, which amounts to 128 times the resolution of normal MIDI CC controls (which, at 7-bit, can be too coarse when trying to control something the ear is sensitive to), so the resolution should be quite good/smooth.
Slides on virtual electric bass, at least, sound sillier than the real thing, because they are too smooth, like a slide whistle. A real, fretted bass will step through the notes chromatically, with a little buzz as you go over each fret.
Perhaps they simply sound silly because they are used in a silly way?
I very much doubt it is the resolution of the slide; it has everything to do with the chipmunk effect of changing a sample's pitch. As pointed out earlier, this changes the resonances as well as the fundamental tone, and I think that changing it dynamically emphasises that it is unnatural: it is like you are changing the size of the guitar body when you bend a string. This is pretty much an inherent issue with using samples, and probably the only way around it is good physical modelling, which models the resonances separately from the string vibrations. The effect is not really noticeable with pure synth tones, as they are artificial to start with, so there is no expectation.
Great answers.
I think technique has a lot to do with it. The way a player slides on a physical instrument is going to be different for each player/occasion (harder attack to start the slide? more fret pressure at the start? linear slide? slower at the end? running out of breath on a sax?...). Learning how to recreate those subtle differences with a non-traditional interface is going to be a challenge at best. Then you have all of the problems with samples mentioned above on the receiving end.
Also, just to pick up where @Cib left off: 8192 steps of pitch bend (in each direction) seems like plenty of resolution, but you may have to take the 'bend range' setting in the target app into account.
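A quick sketch of that point: the 8192 steps are spread across whatever bend range the target app is set to, so the size of one step in cents depends on the range. (The ~5-cent figure in the comment is a rough, commonly quoted just-noticeable difference for pitch, not a hard number.)

```python
# Illustrative only: how fine one pitch bend step is at different bend ranges.

STEPS_PER_DIRECTION = 8192  # 14-bit pitch bend: 8192 steps up, 8192 down

def cents_per_step(bend_range_semitones):
    """Size of one pitch bend increment in cents (100 cents = 1 semitone)."""
    return bend_range_semitones * 100 / STEPS_PER_DIRECTION

for rng in (2, 12, 48):
    print(rng, round(cents_per_step(rng), 4))
# Even at a 48-semitone range each step is under 0.6 cents, well below the
# roughly 5 cents most listeners can distinguish; so resolution isn't the culprit.
```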
Yes, of course the more semitones/octaves you use for your slides, the steppier it will get.
But synths like Model 15 still sound great over 4 or more octaves if you want that.
With samples, though, you will never get that, and the higher or lower you pitch them, the more unnatural they will sound.
Physical modelling is still the only way around this. It's not perfect yet, but it offers some great things, like changing material/timbre in real time. It's so hard to create usable instruments over several octaves with Logic's Sculpture, for example, but it sounds magical when you play a custom-made guitar whose strings morph from nylon to glass.
For me, physical modelling is the future of synthesis; at the least, I wish more developers would try to achieve new heights here.
FingerFiddle sounds great using modeling.
I don't know what magic lies within, but I'm always impressed with the slide guitar sounds coming out of Yonac's Steel Guitar app. Is it sample-based?
I think so. Although sample-based synths have problems sounding realistic while pitching up/down, some professional libraries actually include pre-recorded samples for the slides themselves.