Comparisons of Geo and Naada instruments in GeoShred with SWAM instruments

Comments

  • @Luxthor said:

    @mojozart said:
    The Naada Instruments are documented here:

    https://www.moforte.com/geoShredAssets6000/help/effects/naada.html

    I saw this page; it was confusing for me and still is. It's better than nothing. Gavinski's screenshots helped me differentiate the collections. Thank you anyway! 🫶

    This page is from the reference manual and covers what parameters are on each instrument. It's not marketing; it's targeted at preset creators who want to know what the various parameters do.

  • edited November 2023

    @moForte said:

    @wingwizard said:

    @moForte said:

    • The GeoShred Keyboard is much more fluid than a physical controller simply because the touch sample rate is higher than the sensor sample rate on controllers. That being said it is not tactile beyond being smooth glass. It is a visual instrument that you look at when playing.

    Hi, I very much like GeoShred, but please implement (or consider implementing) velocity, as at the moment I always use Velocity Keyboard due to the large increase in playability. GeoShred uses the Y axis as velocity, so velocity isn't velocity but a representation of it. Velocity Keyboard has a superb implementation of tap velocity: you can adjust the sensitivity and the velocity range expressed to tailor it to your own preference. I would recommend implementing this to get a more real and playable experience in GeoShred. It's really important and completely transforms the experience. Thanks.

    I would love it if you would send me a video of how you adjust the range. We have tested Velocity keyboard for many years, and have only been able to get a narrow range of velocities; validated with MIDI monitor. We have had a few customers use it and say that they can't reach the high velocity impulsive expression of the Cello. I'm open to what you have learned. You can reach me directly at [email protected].

    Thanks for the response, and for engaging; it's really cool. I will try to do this and experiment a bit with the settings. I'm wondering, though, whether, despite there being a lower velocity range (I would not describe it as narrow compared to Y-axis velocity in practice, due to the physical smallness of that axis), the absolute increase in velocity playability is, for me, the larger factor. I mean that velocity via the Y axis isn't velocity at all, and playing with it doesn't engage touch or feel. It actually pulls you out of the physicality of the playing experience, which is very counter to such beautifully playable modelled instruments as Naada or SWAM. And I do think you and Audio Modeling are my favourite two instrument devs, by a long, long way. You play Y-axis velocity visually, and because it's not a continuum like pressure, you are effectively playing blind. For me, as a touch musician - which I think is exactly why your instruments are so great for me - it really removes an entire aspect of playability. I would rather have 10 velocity levels triggered by touch than 127 by Y axis. The dynamics of what is played are so heavily affected, and you end up with a lot of strange, off dynamics produced by guessing volumes by eye. It also causes cognition to come into play when you really don't want that while playing and improvising. I think any lack of range in touch is offset by having to employ guesswork with the Y axis, with the result that the more expressive parts tend just to be a bunch of hits around the top end of the axis, overestimated, and the same with the quieter parts. It's particularly a problem with sparse melodies, which end up with this exaggerated dynamic.

    Anyway, I hope you don't mind me going into my feelings on the specifics of that dynamic. The detail is not criticism but an attempt to express what I mean and why. :)

    It might be the case that others with a different musical background don't feel that as being significant, but for me it's everything. I'm actually surprised that people feel Y-axis velocity is velocity and generally say this or that app has velocity when it really doesn't. But I've noticed this is coming from a majority of people with a strong synth rather than acoustic instrument background. I've seen positive comments on other acoustic-instrument app sounds which I personally find unplayable, so there is a big divide there imo.

    At least as an option it would be great, I feel.

  • @wingwizard said:

    At least as an option it would be great i feel.

    That would make sense to me to have it as an option, and I think you make some good points. Mind you, for me, dealing with a screen is always going to be slightly cerebral compared to playing a traditional instrument, as, no matter what, you really have to use your eyes. Unless maybe you practise a lot, but certainly it will be much more challenging, having no frets etc to help you feel your way around with eyes closed.

    I would be wary of deciding what people’s backgrounds are, btw, in terms of synth vs acoustic! I played acoustic guitar (rhythm, not lead), sang, wrote songs etc for many years before ever touching a synth, for example, even though I also DJed and listened to a lot of electronic stuff back in my guitar playing days. So my background is far more acoustic, even though I rarely pick up a guitar these days. In my acoustic days I had absolutely zero interest in any FX etc other than a bit of reverb, I really loved just a pure acoustic tone.

  • edited November 2023

    @Gavinski said:

    That would make sense to me to have it as an option, and I think you make some good points. Mind you, for me, dealing with a screen is always going to be slightly cerebral compared to playing a traditional instrument, as, no matter what, you really have to use your eyes. Unless maybe you practise a lot, but certainly it will be much more challenging, having no frets etc to help you feel your way around with eyes closed.

    I would be wary of deciding what people’s backgrounds are, btw, in terms of synth vs acoustic! I played acoustic guitar (rhythm, not lead), sang, wrote songs etc for many years before ever touching a synth, for example, even though I also DJed and listened to a lot of electronic stuff back in my guitar playing days. So my background is far more acoustic, even though I rarely pick up a guitar these days. In my acoustic days I had absolutely zero interest in any FX etc other than a bit of reverb, I really loved just a pure acoustic tone.

    Yeah, I didn't mean the comment re people's musical backgrounds in a judgemental way, or, I don't know, a way that was intended to categorise; apologies if it came across like that. It's just I've noticed it a few times, perhaps with things like sequencing vs playing and just a very different mindset musically. But of course you're right, and as for me, my background is overwhelmingly singer-songwriter, piano/guitar, but I get bored very easily and only really played synths for a decade. For a long time everyone thought I was really into guitar bands when I just couldn't listen to that, and there was the dread of books or TV shows or little Christmas presents related to bands after I stopped being able to watch or read musicians talking about anything, or really anything related to music culture, a long long time ago. I think though you may not be a hugely representative example, or me, or perhaps I'm wrong about that. I mean… I know I'm not, ever, in any respect, and am fine with it lol.

    I'm not sure I have the same feelings exactly about the screen, but I definitely take your point. The reason that's my take is that I was so blown away, despite it not being perfect, with the Roli Blocks (hate Seaboards), that I was shocked by how close Velocity KB is to that experience when you get it set up well with an instrument. It's so playable for me. It's what stopped me getting that Erae or Eros touch thing. My main issue with it is that apps are not optimised for it and you often run into not being able to pitch bend individually. Which is terrible.

    Actually, I can go back to Mugician by Rob something-or-other as the first very early experience for me of playing a grid arrangement on a touch screen, and that was probably the thing that pulled me away from acoustic instruments for a bit. It just made a lot of sense to me and I found it really natural to play. So there's that layout thing too.

    How visual playing is, is an interesting thing; it can be physical… not referential or relational. Or a mix. That's a battle though, if I'm honest, and my good days are where I'm at my dumbest and least cognitive. Otherwise I'm probably just not a good enough musician.

    But humans expect that when they hit something harder, it will produce a louder racket. And by simulating those expectations, more of ourselves can be given to creating and expressing stuff with what we produce, I think. It's actually one of the reasons I like AI, or anything that takes linguistics out of communication. But on the flip side, having to learn techniques really gives you a fine feel for, and allows you to get to know, the qualities those techniques are playing with, regardless of whether the techniques are products of technological inefficiencies. I hope we develop new ones with advancing tech rather than using it just to make the older ones easier.

  • @wingwizard said:

    Yeah, i didnt mean the comment re people’s musical backgrounds in a judgemental or i dont know, a way that was intended to categorise, apologies if it came across like that. It’s just ive noticed it a few times. Perhaps with things like sequencing vs playing and just a very different mindset musically. But of course you’re right and as for me - my background is overwhelmingly singer-songwriter, piano/guitar, but i get bored very easily and only really played synths for a decade. For a long time everyone thought i was really into guitar bands when i just couldnt listen to that and the dread of books or tv shows or little Christmas presents related to bands when i stopped being able to watch or read musicians talking about anything or really anything related to music culture a long long time ago. I think though you may not be a hugely representative example, or me, or perhaps im wrong about that. I mean… i know im not, ever, in any respect, and am fine with it lol.

    I didn't think you were being judgemental, don't worry, just thought I'd mention that I personally didn't fit that, but you're probably right that we're outliers.

    And you did make me pull Velocity Keyboard out again, hadn't used it for a while, and I played around with it with the GeoShred Pipa. I really enjoyed it, definitely more intuitive for me personally than the GeoShred keyboard; I will probably go back to using VK as my main controller for GeoShred haha. I did notice one thing, sorry to be a bit off topic, but I couldn't see that the pitch correction speed slider in VK is doing anything at all to do with pitch correction. Instead it seemed to be affecting velocity sensitivity. Weird. Is it like that for you too? It doesn't seem to be in the in-app manual either; must have been added after they wrote that, or just overlooked.

  • @moForte said:
    The santoor, as with all zither-like instruments, is challenging. The physics modeling algorithms are compute-intensive. That's why we and Audio Modeling have initially focused on monophonic instruments and low string-number chordophones. We say that the algorithms are not quite quantum mechanics, but it's close. I'm actually not kidding about that ;-) Quantum mechanics is how I got into simulating audio as an undergrad physics student. Ask me sometime how an impulse is like a 1-D particle and how the Heisenberg uncertainty principle applies to DSP. For laughs, look at my string and membrane simulations from 1982/83, done in the days when computer graphics was done on a storage oscilloscope.

    So back to the Santoor. At the moment we are not doing high string-number instruments. However, as GeoShred migrates to desktops (Mac first), there will be enough compute power available to do these types of instruments. You will note that Audio Modeling is starting to head in this direction as well with their String Sections.

    So, it's not just about an exciter, simple Karplus-Strong, and a model with a bunch of filters in between, haha. If you want to simulate nature, then you practically have to simulate everything in existence. Enthusiastic musicians like me should stay away from this matter.

    I checked all the demos I could find, and four of the instruments resonated pleasantly and inspired me in a different way than any other. Those are: Pan Flute, Erhu, Gaohu, and Sarangi. But everything else is phenomenal and authentic, at least to my ears.

    I'm glad to hear about the plans around the Santoor; exciting times are ahead of us. I will be the first in line to buy this instrument.

    Ok, one more digression. Back in the day at Stanford/CCRMA (1994-1997), when we were doing physical modeling research, we built an 8-blade overclocked 56K DSP engine called the Frankenstein. Bill Putnam, now of UA, worked out how to interface it with a NeXT machine. We did whole ensembles in real time, all physical models, no samples. There are some sound samples here that you can listen to. You can hear demos of the guitar and flute here ... from 1995:

    https://scandalis.com/Jarrah/PhysicalModels/index.html#StanfordCCRMA

    We are at the point now where desktop multiprocessor machines can run fully modeled ensembles, and we are within a stone's throw of being able to do whole orchestras. I believe that Audio Modeling (our friends and partners) will reach that goal soon.

    You are aware of the fact that our pocket mobile phones are 19-billion-transistor “supercomputers” with a 2147 GFLOPS GPU. I need to remind myself of this from time to time. There were times when a 4 MHz, 8-bit CPU was ultra-fast for me.

    Thank you very much for your thorough answer. You gave me lots of starting points to feed my constant hunger for more information. 🤩
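
    For anyone curious, the Karplus-Strong idea mentioned above is simple enough to sketch in a few lines. This is a deliberately minimal toy illustration in Swift (my own sketch, nothing like the full physical models moForte or Audio Modeling actually ship): a burst of noise circulates in a delay line whose length sets the pitch, and a two-point average in the feedback path acts as the lowpass that makes the "string" ring and decay.

        import Foundation

        // Toy Karplus-Strong plucked string.
        func pluckedString(frequency: Double, sampleRate: Double = 44_100,
                           seconds: Double = 1.0, decay: Double = 0.996) -> [Double] {
            let period = Int(sampleRate / frequency)          // delay-line length ~ one pitch period
            var delay = (0..<period).map { _ in Double.random(in: -1...1) }  // excitation: noise burst
            var output: [Double] = []
            output.reserveCapacity(Int(sampleRate * seconds))
            var i = 0
            for _ in 0..<Int(sampleRate * seconds) {
                let next = (i + 1) % period
                output.append(delay[i])
                // Averaging adjacent samples is a gentle lowpass, so high
                // partials die away faster, as on a real string.
                delay[i] = decay * 0.5 * (delay[i] + delay[next])
                i = next
            }
            return output
        }

        let samples = pluckedString(frequency: 220)   // one second of a ~220 Hz pluck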

  • @Gavinski said:

    I didn't think you were being judgemental, don't worry, just thought I'd mention that I personally didn't fit that, but you're probably right that we're outliers.

    And you did make me pull Velocity Keyboard out again, hadn't used it for a while, and I played around with it with the GeoShred Pipa. I really enjoyed it, definitely more intuitive for me personally than the GeoShred keyboard; I will probably go back to using VK as my main controller for GeoShred haha. I did notice one thing, sorry to be a bit off topic, but I couldn't see that the pitch correction speed slider in VK is doing anything at all to do with pitch correction. Instead it seemed to be affecting velocity sensitivity. Weird. Is it like that for you too? It doesn't seem to be in the in-app manual either; must have been added after they wrote that, or just overlooked.

    Haha, glad you're finding it cool to play as well and I'm not the only one. I just tried the pitch correction slider and I can't make sense of it in Pipa either. I just tried it with SWAM flute, switching between the lowest and highest setting… I'm not sure if I'm imagining it or not, but it maybe feels like it lags and follows a tiny bit on one setting, and moves with my finger on the other… I'm really unsure if I'm imagining this. Like, I played a bit faster and thought I could feel one snap more and the other be more of a gradient. Honestly though, I could be going mental. Almost like the laggy one has a bit of a glide feeling.

  • Fascinating to see your journey in the development of modeled sound and virtual instruments, @moForte / Pat! And seriously looking forward to GeoShred on macOS!

  • @wingwizard , I've been struggling to understand what it is that the y-axis velocity-sensitive nature of GS does that Velocity Keyboard does differently (I'm not a user of VK btw). Given that using an iPad almost certainly needs visual input (I'm not sure I've seen any GS musician not looking at the screen), I'm not sure what the difference is between your brain deciding where to tap vertically on the y-axis and deciding how hard to tap in VK (does it measure the size of your finger area?).. genuine question 😊

  • @GeoTony said:
    @wingwizard , I’ve struggling to understand what it is that the y-axis velocity sensitive nature of GS does that velocity keyboard does differently (I’m not a user of VK btw) . Given that using an iPad almost certainly needs visual input (I’m not sure I’ve seen any GS musician not looking at the screen) then I’m not sure what difference there is between your brain deciding where to tap vertically on the y-axis compared with how hard to tap on VK (does it measure the size of your figure area?).. genuine question 😊

    Hi. Velocity KB uses the strength of your tap on the screen, so it actually has velocity, whereas GeoShred doesn't - in terms of playing rather than as a parameter. How hard you tap the screen in VK changes the velocity of the note played. I don't know how it works beyond that, whether it's using the accelerometer or how granular it is, but that's how it works. You have sliders and can adjust how sensitive it is and the range of velocities that it triggers, to sculpt how you want it to react. It's really, really nice - not perfect, but in my opinion infinitely better for impact-type instruments particularly.

    With the Y axis there is no velocity element at all to your playing; it's just manually choosing a position that, by guesswork, represents what you think will trigger a certain velocity. There is no actual playing or feel; it's a switch, with regard to the velocity parameter itself. If you play a piano, say, going further right or left across the octaves changes the pitch of the note you play, and the harder you hit a note the higher the velocity - the harder the string is impacted by the hammer inside. You don't have to think or engage your brain; it's physics.

    The visual input thing is a bit of a red herring in our discussion, in that it's related only, in my opinion, in that you NEED to engage it with Y-axis velocity because there is no actual velocity. I can close my eyes and play velocity on any real instrument, or in VK, because it has played velocity. That makes a world of difference. We kind of drifted into how visual playing is a part of iPad playing or instrument playing, but that was the reason it came up. Imagine if, playing a piano, there were no longer any difference in how loud it was no matter how hard you played; instead you had to press a bit higher up on the keys. It would completely decimate how expressive the instrument is.

    I got really into this area of playability, and why MPE came in over plain MIDI, and MPE instruments, so VK was very exciting for me. It's not perfect. But it does turn your iPad into an actual MPE instrument. I know that's not absolute, but to me the gulf is really great. It's like losing a sense when I go back to playing instruments without touch velocity.

    Sorry, my brain is a bit frazzled today, so that was pretty long-winded lol.

    VK does also have a touch radius option you can use to change any parameter, including velocity, but I've not found it to be as useful, and have heard the iPad is less sensitive to it.
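
    To make the contrast above concrete, here is a rough hypothetical sketch in Swift of a Y-position-to-velocity mapping - purely an illustration of the idea, not GeoShred's actual code. The point is that the number comes from where you land on the key; a tap-strength scheme like VK's has to estimate how hard you hit by indirect means, since iPad screens report no pressure.

        import UIKit

        // Hypothetical Y-axis velocity: top of the key is loud, bottom is soft.
        func velocityFromY(_ touch: UITouch, in keyView: UIView) -> UInt8 {
            let y = touch.location(in: keyView).y
            let fromTop = 1 - min(max(y / keyView.bounds.height, 0), 1)  // 1 at the top edge, 0 at the bottom
            return UInt8(fromTop * 126) + 1                              // map onto MIDI 1...127
        }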

    I should add to this that pressure was the other missing MPE parameter in all iPad apps. But I think it's less jarring or distant when simulated via the Y axis, in that it is on a continuum and you cannot access points of pressure without first going through every interceding point. It's also possible to simply tilt your iPad a bit when playing so that pressing down harder causes you to change Y-axis position a bit, and it feels pretty natural; my brain makes any adjustments. I do not have to look at the screen.

    I look at the screen for a rough approximation or visual feel of where I am tonally. That's all. I don't look at the screen for velocity or pressure; I don't engage my eyes for the touch and feel elements of playing on VK, any more than I do for the syncopation or rhythm of what I'm playing.

  • @wingwizard seconding your assessment of VK. It's not perfect by any means but very, very good. I use it quite a bit. There are a handful of other apps that use a similar method to gauge velocity and how hard you hit the screen, but I can't think of them right off hand.

  • edited November 2023

    Velocity kb uses the strength of your tap on the screen - so it actually has velocity whereas geoshred doesnt

    On iPhone perhaps?

    No iPad has a pressure sensitive screen, right?

    Is there a way to convert the velocity to a CC that GeoShred understands?
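
    On the velocity-to-CC question: one hypothetical way to do it (a middleware sketch of my own, not a feature of either app) would be to copy each note-on's velocity into a CC message sent just before the note, so an instrument that can learn or map that CC responds to tap strength even if it ignores note-on velocity. CC 11 below is only an assumed example.

        // Hypothetical MIDI transform: mirror note-on velocity into a CC.
        struct MIDIMessage { var status: UInt8; var data1: UInt8; var data2: UInt8 }

        func withVelocityAsCC(_ msg: MIDIMessage, cc: UInt8 = 11) -> [MIDIMessage] {
            let isNoteOn = (msg.status & 0xF0) == 0x90 && msg.data2 > 0
            guard isNoteOn else { return [msg] }
            let channel = msg.status & 0x0F
            let ccCopy = MIDIMessage(status: 0xB0 | channel, data1: cc, data2: msg.data2)
            return [ccCopy, msg]    // send the CC, then the note-on itself
        }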

  • @mojozart said:

    Velocity kb uses the strength of your tap on the screen - so it actually has velocity whereas geoshred doesnt

    On iPhone perhaps?

    No iPad has a pressure sensitive screen, right?

    Is there a way to convert the velocity to a CC that GeoShred understands?

    I think it uses how much of your finger is touching the screen to convert that to how much pressure you theoretically should be using. Something like that. I remember reading about it a while back and I think that’s what was said.

  • edited November 2023

    Thank you very much for your thorough answer. You gave me lots of starting points to feed my constant hunger for more information. 🤩

    We have lots of informative decks here: https://www.moforte.com/blog/

    Here is our most current deck about the history of physical modeling. Lots of interesting examples:

    https://www.moforte.com/aes-sf-5-6-2020/

  • @NeuM said:
    Fascinating to see your journey in the development of modeled sound and virtual instruments, @moForte / Pat! And seriously looking forward to GeoShred on macOS!

    We will start beta testing in a couple months. Send me your email address and I will add you to the list of alpha testers. [email protected]

  • edited November 2023

    @GeoTony said:
    @wingwizard , I’ve struggling to understand what it is that the y-axis velocity sensitive nature of GS does that velocity keyboard does differently (I’m not a user of VK btw) . Given that using an iPad almost certainly needs visual input (I’m not sure I’ve seen any GS musician not looking at the screen) then I’m not sure what difference there is between your brain deciding where to tap vertically on the y-axis compared with how hard to tap on VK (does it measure the size of your figure area?).. genuine question 😊

    Look, I think that the developer of VKB has a great idea. It may work well with samplers, but with physical models it's challenging. Every time I test it, I find that it delivers an unpredictable, narrow range of velocities. When I want to get the GeoCello to respond with a hard string/body impulse response, an emotional expression, I tap at the top of the key and get it every time. Same for overblow on the flutes. I'm unable to do that reliably with Velocity Keyboard. I've spent hours looking at Velocity Keyboard in MIDI Monitor side by side with GeoShred's "KeyY Touch" (and KeyZ touch aka 3D Touch). I'm not able to get either the full range of velocities or reliably reproducible velocities from VKB.

    If someone has figured out how to get this to work, I'm open to looking at it and considering figuring out how to do something similar. I'm open to looking at videos showing how to tune VKB and get good wide range, reliable results. Send me a video at [email protected]

    For me playing positionally on the keys for velocity has become quite natural. As a cue, I tap softly on the bottom of the key and hard at the top of the key, and I have the genuine sensation that I'm hitting the bow hard on the string or overblowing the flute.
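
    As a rough illustration of the kind of range check described above (a hypothetical sketch, not moForte's actual MIDI Monitor setup), you could collect the note-on velocities a controller produces during a test run and report the span it actually reaches:

        // Summarise the velocity range seen from a controller under test.
        func velocityRange(of velocities: [UInt8]) -> (min: UInt8, max: UInt8, distinct: Int)? {
            let nonZero = velocities.filter { $0 > 0 }   // ignore note-offs sent as velocity 0
            guard let lo = nonZero.min(), let hi = nonZero.max() else { return nil }
            return (lo, hi, Set(nonZero).count)
        }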

  • edited November 2023

    @HotStrange said:

    I think it uses how much of your finger is touching the screen to convert that to how much pressure you theoretically should be using. Something like that. I remember reading about it a while back and I think that’s what was said.

    I'm pretty sure that it's using the accelerometer. I get varying results depending on how hard the surface is that the iPad is sitting on.

    And also there is a range of iPhones from 7,8,9,X that support 3D touch, so you have direct Velocity, KeyZ (Pressure), KeyY and KeyX.

    Marco Parisi's GeoCello and GeoTenorSax demos were done on an iPhone X. There are users who buy them used, just for GeoShred. Personally, I bought a wholesale lot of 5 used iPhone X to use for testing.

    Marco Parisi (love this guy) GeoCello. If you know this piece you know it's from one of the greatest movie scenes ever. Marco nails it, not just technically but emotionally, which is what musical expression is all about.

    Marco Parisi GeoTenor Sax
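
    As a speculative sketch of the accelerometer guess above (an assumption about how such a scheme could work, not VKB's actual implementation), Core Motion can watch for a jolt around the moment a touch lands and map its size to a velocity:

        import CoreMotion

        // Speculative accelerometer-based tap-strength estimate.
        final class TapStrengthEstimator {
            private let motion = CMMotionManager()
            private var recentPeak = 0.0

            func start() {
                motion.accelerometerUpdateInterval = 1.0 / 200.0   // sample fast enough to catch the impact
                motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
                    guard let a = data?.acceleration else { return }
                    // Size of the jolt, with gravity's ~1 g baseline removed.
                    let jolt = abs((a.x * a.x + a.y * a.y + a.z * a.z).squareRoot() - 1.0)
                    self?.recentPeak = max(self?.recentPeak ?? 0, jolt)
                }
            }

            // Call when a touch begins: convert the recent peak jolt into a MIDI velocity.
            func velocityForTap(fullScaleJolt: Double = 0.5) -> UInt8 {
                let normalized = min(recentPeak / fullScaleJolt, 1.0)
                recentPeak = 0
                return UInt8(max(1, (normalized * 127).rounded()))
            }
        }

    That framing would also fit the observation above that results vary with the surface the iPad sits on: a softer surface lets the device recoil more, so the measured spike is bigger.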

  • @moForte said:

    I'm pretty sure that it's using the accelerometer. I get varying results depending on how hard the surface is that the iPad is sitting on.

    And also there is a range of iPhones from 7,8,9,X that support 3D touch, so you have direct Velocity, KeyZ (Pressure), KeyY and KeyX.

    Marco Parisi's GeoCello and GeoTenorSax demos were done on an iPhone X. There are users who buy them used, just for GeoShred. Personally, I bought a wholesale lot of 5 used iPhone X to use for testing.

    Marco Parisi (love this guy) GeoCello. If you know this piece you know its from one of the greatest movie scenes ever. Marco nails it, not just technically but emotionally; which is what musical expression is all about.

    Marco Parisi GeoTenor Sax

    Oh that’s excellent. Beautiful playing.

    You could be right about that. I know it does vary depending on the surface you’re using.

    The App Store description is a bit vague. Overall it works fairly well for me but there’s no real 100% way to go toe to toe with a hardware MPE controller unfortunately.

    Would be awesome to see 3D touch come back just for that. I don’t even know exactly why they took it away in the first place.

  • @wingwizard said:
    I should add to this that pressure was the other missing mpe parameter in all ipad apps. But i think it’s less jarring or distant in being simulated via y axis in that it is on a continuum and you cannot access points of pressure without first going trhough every interceding point. It’s also possible to simply tilt your ipad a bit when playing so that pressing down harder causes you to change y axis position a bit and it feels pretty natural, my brain makes any adjustments. I do not have to look at the screen.

    I look at the screen for a rough approximation or visual feel of where I am tonally. That's all. I don't look at the screen for velocity or pressure; I don't engage my eyes for the touch and feel elements of playing on VK, any more than I do for the syncopation or rhythm of what I'm playing.

    Hi,
    Just to say you are not alone here.
    I fully agree with you, but right now I'm not motivated enough to take up the pen and explain why… 😅

    In a few words: VKB lets you trigger around 5 velocity zones quite accurately. For me that's quite enough, considering that the channel/poly aftertouch of an expressive instrument like SWAM/Naada can take over to convey all the required expressiveness. All I want is to be able to start my sound in the soft, medium or hard attack zone.
    What I miss in VKB is an XY pad, very convenient for performable modulation (as opposed to the knobs).

    VKB uses the accelerometers to estimate the kinetic energy transferred to the screen. For that, you need to allow some recoil by positioning your iPad on a soft surface.

    Another alternative solution would be using the iPad mic to listen constantly: when you hit the screen with your finger, a sound is produced and captured by the mic. The harder you hit, the louder the sound, the higher the velocity. Drawbacks: you must find a way to keep the mic active whatever is plugged into the iPad, and it's not applicable in loud environments (i.e. with headphones only).

    The iPad touch-radius API, I think, returns only up to 5 different radius values. Practically, this is not very usable, as you need to "spread" your finger contact surface on the screen in an unnatural way. But it is still an additional feature of VKB.

    If the radius resolution were much finer, we could also measure the radius at impact to simulate the velocity: the larger the radius, the higher the velocity. This is not done in VKB currently, and I think it could be "quite" easily done/added in GeoShred.

    Well, the best solution still remains for Apple to release "pro" iPads with 3D Touch, as they did until the iPhone X.
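
    The radius idea is easy to picture. A hypothetical sketch (not something VKB or GeoShred currently does, per the post above) using UITouch's majorRadius, which reports the estimated finger-contact radius in points - flatter, heavier contact tends to report a larger radius:

        import UIKit

        // Hypothetical radius-to-velocity mapping.
        func velocityFromRadius(_ touch: UITouch,
                                minRadius: CGFloat = 5, maxRadius: CGFloat = 40) -> UInt8 {
            let clamped = min(max(touch.majorRadius, minRadius), maxRadius)
            let normalized = (clamped - minRadius) / (maxRadius - minRadius)
            return UInt8(max(1, (normalized * 127).rounded()))
        }

    As noted above, the coarse resolution of the reported radius is what limits this in practice.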

  • Personally I don't have a problem with the GS way of working for velocity. It's second nature to me now and I like how it allows a gradual increase/decrease in volume, which reflects how real instruments can be played. I appreciate how it might not suit everybody though.
    @Paulo164 you can do some of what you mention, i.e. soft, medium and hard attack zones, by modifying the Key Y Velocity Curve as shown below.
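
    As a hypothetical illustration of those attack zones (made-up values, not GeoShred's actual Key Y Velocity Curve editor or the screenshot referred to above), a stepped curve simply collapses the key's Y axis into a few broad levels instead of a smooth ramp:

        // Illustrative stepped velocity curve: three attack zones along the key.
        func steppedVelocity(normalizedY: Double) -> UInt8 {
            switch normalizedY {        // 0 = bottom of the key, 1 = top
            case ..<0.33: return 40     // soft attack zone
            case ..<0.66: return 85     // medium attack zone
            default:      return 120    // hard attack zone
            }
        }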

  • @GeoTony said:
    Personally I don’t have a problem with the GS away of working for velocity. It’s second nature to me now and I like how it allows a gradual increase / decrease in volume which reflects how real instruments can be played. I appreciate how it might not suite everybody though.
    @Paulo164 you can do some of what you mention I.e. soft , medium hard attack zones by modifying the Key Y Velocity Curve as shown below.

    Hi @GeoTony ,
    Thank you, but I know this. My remark about being able to trigger just 3 velocity zones applies only if I can do it by intuitive touch pressure. I would keep the 127 levels if I had to use a visual Y axis.
    But for me, having to rely on hitting a precise visual target instead of just pressing more or less hard is, I admit, a pleasure-killer. So in GS, I far prefer to override velocity and assign it to the XY pad, generally along with the expression CC. This works well for me and puts the velocity question back in the "intuitive" zone.

  • @Paulo164 said:

    iPad radius size API I think returns up to 5 difference radius. Practically, this is not very usable as you need to “spread” your finger contact surface on the screen in a non natural way. But this is still an additional feature of VKB.

    If the radius feature resolution was much finer, we could also use measuring the radius at impact to simulate the velocity : the larger the radius, the higher the velocity. This is not done in VKB currently and I think it could be “quite” easily done/added in GeoShred.

    Well, the best solution still remains that Apple releases “pro” iPads with 3D touch, as they did until iPhone X.

    Thanks :) I remember reading about the screen tech and Apple saying that (I can’t remember which is capacitive and the other now, I’m tired, I’m a musician haha) the one which has pressure sensitivity is is poor in other ways that create a bugger issues-perhaps response time, and with teh art and pencil applications that being a vital factor - iLOVE the pencil and wouldn’t lose it for anything.

    I spent a long time wasting money on the somewhat affordable MPE/expressive controllers. It was wasted because I didn’t enjoy playing any of them (Eigenharps, Sensel, etc.) except the ROLI Block. The Seaboard is a joke for me; I have no idea why anyone would create a keyboard design with a material that completely compromises it as a playable instrument. It’s so much at odds with itself, and the material, as I’ve mentioned before, is awful: it’s sticky, when sliding and gliding are the focus.
    Even with the Block the sensitivity was poor, with very bad QC and lots of variability. I just mean I had to hit much harder than I should have, even after adjusting. It also died after a few months of non-use, so I don’t buy ROLI any more.

    I had a K-Board too. It’s not playable as a piano with the vibrato features, as you can’t stay in tune, and on the affordable ones vibrato is vertical; that’s a non-starter for me.

    I think I’m wittering on just to say that the MPE physical controllers I tried are all flawed as well. I ended up in a place where the iPad with a proper app is the best option, and there’s no way I could justify further expenditure. I just wanted as similar an experience as possible to the one I had playing the Block, which was the best musical experience I’ve ever had, along with learning real instruments. And I get enough of that from VK, and the main reason for that is the tap sensitivity.

    Like a lot of things, the theory is so different from the reality. I sat around working out what I thought I wanted and needed and which thing was best, but I didn’t factor in stuff like: can I be arsed having another instrument, getting it out, turning it on, getting it set up; can I do all that on the iPad; does it even feel that nice… compared to just having it all on the iPad. That more than made up the difference for me. There’s also the fact that my main instrument is piano. MPE is really for little parts here and there, mainly where I need something more expressive or organic. It’s hard to justify a massive outlay for that when something comes close enough in an app.

    And there’s the simple fact that I really enjoy playing Velocity KB and can mess about in it happily for hours, like I do on real instruments. I can’t do that (so much) in an app without the accelerometer velocity; it’s not as much fun.

    I didn’t realise it was quite as low-res, which I think demonstrates how significant that mode is, in that it still feels so good.

  • edited November 2023

    @GeoTony said:
    Personally I don’t have a problem with the GS way of working for velocity. It’s second nature to me now, and I like how it allows a gradual increase/decrease in volume, which reflects how real instruments can be played. I appreciate that it might not suit everybody though.
    @Paulo164 you can do some of what you mention, i.e. soft, medium and hard attack zones, by modifying the Key Y Velocity Curve as shown below.

    This isn’t what he was saying, as it’s still the y axis. I think it’s worthwhile to play both and see how you feel. :)

    @moForte said:

    @GeoTony said:
    @wingwizard , I’ve been struggling to understand what it is that the y-axis velocity-sensitive nature of GS does that Velocity Keyboard does differently (I’m not a user of VK btw). Given that using an iPad almost certainly needs visual input (I’m not sure I’ve seen any GS musician not looking at the screen), I’m not sure what difference there is between your brain deciding where to tap vertically on the y axis compared with how hard to tap on VK (does it measure the size of your finger contact area?).. genuine question 😊

    Look, I think that the developer of VKB has a great idea. It may work well with samplers, but with physical models it’s challenging. Every time I test it, I find that it delivers an unpredictable, narrow range of velocities. When I want the GeoCello to respond with a hard string/body impulse response, an emotional expression, I tap at the top of the key and get it every time. Same for overblow on the flutes. I’m unable to do that reliably with Velocity Keyboard. I’ve spent hours looking at Velocity Keyboard in MIDI Monitor side by side with GeoShred’s “KeyY Touch” (and KeyZ Touch, aka 3D Touch). I’m not able to get either the full range of velocities or reliably reproducible velocities from VKB.

    If someone has figured out how to get this to work, I’m open to looking at it and considering how to do something similar. I’m open to looking at videos showing how to tune VKB to get good, wide-range, reliable results. Send me a video at [email protected]

    For me playing positionally on the keys for velocity has become quite natural. As a cue, I tap softly on the bottom of the key and hard at the top of the key, and I have the genuine sensation that I'm hitting the bow hard on the string or overblowing the flute.
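For anyone wanting to picture the key-Y approach concretely, here is a rough Swift sketch; it is an illustration only, not GeoShred’s code, and the power curve is just one plausible stand-in for a “Key Y Velocity Curve”.

```swift
import UIKit

// Rough sketch: map where a tap lands within a key (bottom = soft, top = hard)
// to a MIDI velocity, optionally shaped by a curve.
func keyYVelocity(touchY: CGFloat, keyTop: CGFloat, keyHeight: CGFloat,
                  curve: Double = 1.0) -> UInt8 {
    // UIKit's y axis grows downward, so the top edge of the key has the smaller y.
    let normalized = 1.0 - min(max(Double((touchY - keyTop) / keyHeight), 0), 1)
    let shaped = pow(normalized, curve)   // curve > 1 softens low taps, curve < 1 boosts them
    return UInt8(max(1, Int(shaped * 127)))
}
```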

    I love the Parisi brothers (have I gone mental, or are there two of them? lol) too. They really are amazing MPE musicians; I remember a lot of their ROLI demos.

    Just want to say that I don’t mean to have been annoying with any of this. I just wanted to put across my own experience and perspective as someone who is really, really into MPE playing and modelled instruments; that’s my main focus with electronic music and synthesis. And also, trying to be clear that I love GeoShred and really appreciate your work. And I LOVE Naada. I mean, I’m the one who has been on here boring everyone about how much I like it, shouting at clouds about the current sale, etc.

    I also press harder with higher y-axis presses… but it’s really about trying to forget the axis control and get into it, which has limited effectiveness for me. The instinct feels a bit thwarted.

    I do understand your points completely. For me, adding the accelerometer thing, or even thinking about ways to fine-tune it, would transform the app. Seriously, it’s a MAJOR issue with iPad instruments for me; it dictates my thinking and use much more than other factors. With every controller app I look straight for that, and if it doesn’t have it I don’t buy it; there was one up here just recently. I’ve never used the other early MPE keyboard app after I found out it only had slide, glide etc.; it was of no use to me.

    I get that you have become quite used to the y axis. I’m not suggesting this is the case here, but I do think that when you are much closer to and working within something, factors like tap velocity only having five or so layers of resolution (if that’s right, as someone said), and seeing that on a MIDI meter, seem really damning. But the user experience isn’t related to a MIDI meter. I evidently find 5 layers of tap velocity much more expressive than 127 (or however many) via the y axis, because I’m SIMPLY not playing the y axis; it’s not possible. It has to be a trigger, no matter how familiar you are with it. And I know I’ve been calling every conceptual witness I can for the defence here lol, but I think that makes it worth considering, as it would add another hugely expressive aspect, or selectable control method, to your already brilliant app.

    I do want to say that I definitely don’t share the experience of the velocity range being unpredictable; if anything, I find the y axis unpredictable. With the y axis in GeoShred I will always get roughly into the area of the velocity I wanted, but “roughly in the area” translates to nowhere near it in the reality of musical expression. If I played roughly in the area of a C on the piano I’d be out of tune, and I feel out of tune dynamically without touch velocity. I use Velocity KB every day and have done since it came out, and I don’t think I’ve ever had an issue with unpredictability. The range may be limited, and from what I’ve just said you’d think it would suffer similarly because of that, but it’s consistent and I don’t have that problem, because it’s connected to my playing; it is my playing. My head’s too tired to analyse why, and analysis is bad for me; I have to speak to my cat for a bit after too much of it, she sorts me out. It may be down to knowing how to place the iPad: people say a soft surface, and I use it in a stand as well.

    The MusiKraken app is the one that most excites me as a controller. I think it looks, on paper, to be the most developed specifically-MPE controller app on iOS by a distance, but I found it difficult to understand immediately so haven’t got further yet. I mean, he added camera and body-tracking control to it after speaking with a customer not long ago (I’m not saying that for any reason!) and is just really into what he’s doing.

    I’d really, really love someone to investigate further what works best for tap velocity and pressure, with a mixture of algorithms, accelerometer, radius, whatever. It may not be perfect, but I’m sure someone interested enough could come up with something really amazing as a side project.

    If someone who knows about this reads it, could you tell me if this idea would work or if it’s stupid:

    • A while ago, somewhat insanely, I wondered if you could lay a transparent layer of silicone over the iPad screen and have an app recognise pressure etc. I thought of this while reading about how pressure-sensitive touchscreens work by detecting the distance between the finger and the, er, sensor. I thought at the time, probably wrongly, that the only difference between the screen technologies is physical rather than electrical. And I had read about people building pressure-sensitive touchscreens with cheap sensors and silicone, so I wondered if the iPad could serve as the sensor base, with the silicone laid over it changing the distance between finger and sensor… except now, as I’m typing and a few years older, I’m thinking: how could that work? Wouldn’t you need one sensor to sense the point of impact and the sensor underneath as a second trigger, to get the distance between them?

    I mean, if that’s the case, could you then theoretically build a silicone sheet with a sensor array, connect it to the iPad as a MIDI instrument, and have an app that estimates the distance between the sensors, maybe from the time it takes for the screen sensor to be triggered after the silicone-sheet one? (There’s a rough sketch of that timing idea below.)

    Yeah yeah mad professor stuff
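Purely for fun, a back-of-the-envelope sketch of that timing idea; the hardware, the sheet thickness and the 2 m/s “hardest hit” ceiling are all hypothetical.

```swift
import Foundation

// Rough sketch: if the sheet's own sensor fires at t0 and the iPad screen
// registers the touch at t1, the finger crossed the sheet's thickness in
// (t1 - t0), so a shorter gap means a faster (harder) hit.
func velocityFromOverlay(sheetThicknessMetres thickness: Double,
                         sheetTriggerTime t0: TimeInterval,
                         screenTriggerTime t1: TimeInterval) -> UInt8 {
    let dt = max(t1 - t0, 0.0005)            // clamp to avoid division by zero
    let speed = thickness / dt               // metres per second through the sheet
    let normalized = min(speed / 2.0, 1.0)   // assume 2 m/s = hardest hit
    return UInt8(max(1, Int(normalized * 127)))
}
```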

  • All good @Paulo164 and @wingwizard 😊 Healthy debate, new ideas, shared knowledge… keep them coming 🙏

  • I’ve also been thinking: why is it that in grid controllers it’s kind of the standard (I mean, I was reading about the LinnStrument initially, so I’m not referring to any specific apps, just in general) for notes to bend on the x axis rather than trigger the next note, which is great, but for the y axis to trigger a new note on the row above rather than further expression of a parameter? This makes no sense to me. I’m sure someone will correct me, but I can’t think of a single musical use for that. Whenever it happens to me it’s annoying and a mistake. I guess it’s useful if you’re really desperate to play horrendous sweep-type things lol. I think I remember there being a setting to turn off new-note triggering in something.

  • A comparison between GeoViolin and the Naada Carnatic Violin, both of which are IAPs (In-App Purchases) within GeoShred.
    The available parameters for both are shown towards the end of the video.
    The main differences, besides the sound they produce, are…
    Naada has double stopping, i.e. two notes played at once.
    Naada allows pizzicato playing.
    Naada allows ‘ensemble’ playing, i.e. mimicking more than one violin playing in unison (not shown).
    Geo provides 3 types of harmonics.
    The video also shows a trick whereby if you hold a note down and close the window, the note continues to play, in effect a sustain facility. This is useful for adding a simple drone to a piece of music.

  • @wingwizard said:
    I’ve also been thinking: why is it that in grid controllers it’s kind of the standard (I mean, I was reading about the LinnStrument initially, so I’m not referring to any specific apps, just in general) for notes to bend on the x axis rather than trigger the next note, which is great, but for the y axis to trigger a new note on the row above rather than further expression of a parameter? This makes no sense to me. I’m sure someone will correct me, but I can’t think of a single musical use for that. Whenever it happens to me it’s annoying and a mistake. I guess it’s useful if you’re really desperate to play horrendous sweep-type things lol. I think I remember there being a setting to turn off new-note triggering in something.

    Haha! I put this exact same idea to Pat a few years ago. The suggestion was quite well received, but I don’t think it made it past the initial test into the backlog 😆
    I think we have the same vision of how a good MPE controller app should be. Maybe we should join up and create a company 😆
    Kidding.
    And, just to make things clear, I also love GeoShred. But the more I love something, the more I would like it to be exactly the way I want it, a big flaw of mine…

  • @moForte said:

    @GeoTony said:
    @wingwizard , I’ve been struggling to understand what it is that the y-axis velocity-sensitive nature of GS does that Velocity Keyboard does differently (I’m not a user of VK btw). Given that using an iPad almost certainly needs visual input (I’m not sure I’ve seen any GS musician not looking at the screen), I’m not sure what difference there is between your brain deciding where to tap vertically on the y axis compared with how hard to tap on VK (does it measure the size of your finger contact area?).. genuine question 😊

    Look, I think that the developer of VKB has a great idea. It may work well with samplers, but with physical models it’s challenging. Every time I test it, I find that it delivers an unpredictable, narrow range of velocities. When I want the GeoCello to respond with a hard string/body impulse response, an emotional expression, I tap at the top of the key and get it every time. Same for overblow on the flutes. I’m unable to do that reliably with Velocity Keyboard. I’ve spent hours looking at Velocity Keyboard in MIDI Monitor side by side with GeoShred’s “KeyY Touch” (and KeyZ Touch, aka 3D Touch). I’m not able to get either the full range of velocities or reliably reproducible velocities from VKB.

    If someone has figured out how to get this to work, I’m open to looking at it and considering how to do something similar. I’m open to looking at videos showing how to tune VKB to get good, wide-range, reliable results. Send me a video at [email protected]

    For me playing positionally on the keys for velocity has become quite natural. As a cue, I tap softly on the bottom of the key and hard at the top of the key, and I have the genuine sensation that I'm hitting the bow hard on the string or overblowing the flute.

    I had the same experience with VKB and was unable to gain any performance benefit by using it. I'm glad it works for others, but it has no advantage for me compared to GeoShred's playing surface layout.

  • @Paulo164 said:

    @wingwizard said:
    I’ve also been thinking: why is it that in grid controllers it’s kind of the standard (I mean, I was reading about the LinnStrument initially, so I’m not referring to any specific apps, just in general) for notes to bend on the x axis rather than trigger the next note, which is great, but for the y axis to trigger a new note on the row above rather than further expression of a parameter? This makes no sense to me. I’m sure someone will correct me, but I can’t think of a single musical use for that. Whenever it happens to me it’s annoying and a mistake. I guess it’s useful if you’re really desperate to play horrendous sweep-type things lol. I think I remember there being a setting to turn off new-note triggering in something.

    Haha! I put this exact same idea to Pat a few years ago. The suggestion was quite well received, but I don’t think it made it past the initial test into the backlog 😆
    I think we have the same vision of how a good MPE controller app should be. Maybe we should join up and create a company 😆
    Kidding.
    And, just to make things clear, I also love GeoShred. But the more I love something, the more I would like it to be exactly the way I want it, a big flaw of mine…

    I would seriously love to work in a company on these kinds of concepts. I’ve spent a lot of time doing just that. 😂
