
Want to be a MUSIC/SOUND pro in 2025? THINK TWICE


Comments

  • @AudioGus said:

    @Gavinski said:
    Yes, technology, including synths and drum machines did indeed lead to plenty of job losses among professional musicians. And yes, the threat posed by AI is (if, if, if it continues to develop at the speed it has in recent years) on a whole other level compared to drum machines etc.

    Pre-1970s, the music industry relied heavily on live musicians for studio work and for live performance. When my parents were young, you didn't go out to see a DJ, you went out to see a whole band. Even in small villages, there was regular work for bands. Go back a generation or two before that and people were going out to dance to big bands or even entire orchestras. Drummers and string sections, especially, were badly hit by trends towards electronic dance music over the last 50-odd years.

    Were there upsides too? Yes. But that's not really the point here. The point is that technology can and does often lead to the same 'work' being done by fewer hands, and music tech is no exception.

    I am kind of amazed talking to graphics (cough/art) people in the games industry: when I use music industry/tech metaphors they just stare at me blankly and have no idea what I am talking about. I keep telling them that what happened between the 80s and the early 2000s in music is what is happening now with visuals, but over a much, much smaller span of time. They just scrunch their faces and say something stupid like "But there wasn't AI music then...". Sigh

    It's the paradox of this age we're in Gus. More and more info, fewer and fewer well informed people.

  • @Gavinski said:

    It's the paradox of this age we're in Gus. More and more info, fewer and fewer well informed people.

    Ahh tru dat. I keep getting blindsided by accountants, lawyers, CEOs, doctors etc. I am my own kind of idiot.

  • @UrbanNinja said:
    It truly is,

    Though as to the rest, if you will look back I agree with others rather often here, especially when I think their remarks are grounded in sound reasoning. Most of the folk here seem pretty smart including most folk I have ever disagreed with on some matter.

    Sure you do, you're a clever guy, no doubt. And you spit your coffee when admins joke, hilarious! (Now I'm joking.)
    I made a comment, within parentheses no less, about the energy consumption of Bitcoin, and therefore I need a lecture on other things that consume energy - I did spit my coffee, but not with laughter. Let's leave it, if you don't mind.

  • edited May 29

    All those remarks weren't directed at your brief remark but at stuff that has come up in the thread more generally; sorry if it seemed otherwise to you. If that was not apparent I imagine the fault is mostly mine.

    "Let's leave it"

    No problem; get a towel and wipe up your coffee and things will be better in the morning.

    BTW I don't think your original remark was wrong. It is factually correct and unimpeachable as far as I can tell.

  • @wim said:
    I want an AI assistant that listens all the time to my wife talking to the cat, asking herself questions, answering herself, talking to Siri, dictating texts, cursing her idiotic Apple watch, reading junk emails out loud, reacting to what she's watching on YouTube, comforting her stuffed animals, conversing with her plants, God, Jesus, the Holy Spirit ... and sometimes to me, while I'm working.

    I need it to sort through all that and to decide if something requires a response. I need it to pop up an alert on my screen and ideally show the text of only those comments that require a conversational response other than a grunt.

    It will need to be a somewhat intelligent bot because she uses the same tone, emotion, and inflection with Siri and everything else that she does with me. "I don't know" is generally the safest response, and chat bots suck at that.

    This would boost my productivity more than any other AI assistant I can imagine. My wife will be happier too because she simply can't understand why I often don't answer the first time she says something to me. My filters just aren't advanced enough. They miss stuff and come up with false positives all the time.

    That's the funniest internet post I've read in years!

  • @Michael_R_Grant said:

    @Robin2 said:

    @UrbanNinja said:

    @Michael_R_Grant said:
    AI slop is never going to replace talented animators, writers and voice actors.

    You may be a little behind the times. It has been a while now since they stopped hiring rooms full of animators drawing mice on mylar sheets. Manipulations of digitized recordings of the voices of people living and dead have been used for years. Disney/Hollywood etc. have been using computer animation for years and are using AI extensively now. The only difference is that more people have access to tools of similar quality (better quality than Hollywood had ten years ago) for less money.

    Doubtless there will continue to be a human element, but if anyone thinks human jobs will not be lost at exponentially greater rates than ever before with this revolution, I think their head is firmly embedded in the sand (no disrespect intended - just saying).

    To what extent the number of human jobs will diminish in the industry is something history will decide, not the Loopy Pro forum. You are free to disagree that this will happen, but I and many others think the writing is already on the wall:

    Mene, Mene, Tekel, Upharsin.

    Completely agree (unfortunately). The view that 'AI can never replace human creativity' might, debatably, be true in a sense - it will always, in its current form at least, be a pastiche, however brilliant, created to fulfill what the AI model thinks we want rather than achieving its own goals and expression. However, taking that view is also sleepwalking into the future, when the possibilities and dangers of AI should be faced head on rather than just assuming everything will be okay.

    AI is getting better at what it can do at astonishing, terrifying speed.

    Tesla's robots are coming soon and may genuinely cause enormous unemployment around the world - one human employee in your factory that you have to pay year in, year out, or a one-off purchase of a Tesla robot for $30,000? Plenty of employers without a conscience will go for that, unfortunately, I imagine.

    The possibilities are very worrying and should be taken seriously rather than just assuming it can never replace human beings - such complacency is why it happens.

    Tesla's robots are terrible. There is absolutely no way that the company will be producing a workable one that replaces any skilled jobs for $30k anytime soon. This whole debate is full of really ill-informed speculation about potential, rather than the facts on the ground:

    https://freedium.cfd/https://wlockett.medium.com/teslas-robot-is-utterly-pathetic-365874848eb6

    I'm plugged pretty hard into folk from the creative industries, including people who work in music, videogames, movies, TV and visual FX. I could count on one hand the number of them doing the work who think that AI could ever 'replace' what they do to a similar or better standard, or even a workable one. It's a house of cards where many companies are betting the farm and are going to crash hard.... probably far too late to prevent the consequences of laying off all the people they'll find they actually need. We're going to be drowning in AI slop from people who think they're creative but aren't, while actual creatives who are professionals in these industries will now be having to fix the mistakes, hallucinations and general bad output of AI.

    And for non-creative industries such as financial services, the number of hallucinations and sheer wrong answers is ridiculous and makes AI a non-starter as a replacement for human labour. When you have to get someone to check everything the AI is spitting out because otherwise you risk it being wrong and costing you loads of money, it has little value.

    And then we have the IT industry: https://old.reddit.com/r/ExperiencedDevs/comments/1krttqo/my_new_hobby_watching_ai_slowly_drive_microsoft/

    In the next 5 years, AI is likely going to decimate the worker economy when execs get dollar signs in their eyes. Lay off workers, use AI to replace them, spend way less money, make way more money - it's perfect, right?! Unfortunately this gargantuan bet on AI will also decimate the actual economy when the Emperor's New Clothes of it all becomes clear and the scales fall from people's eyes. For example, the progress made with each new LLM is smaller despite each one being trained on more and more data. The ratio between advancement and the resources necessary to advance gets ever worse. And the models are even going to run out of new data to be trained on before long! Unless someone comes up with a whole new way of doing things, progress is going to stall much more quickly than you might think on the present path.

    Oh, and then there's the environmental impact of all this which is being conveniently ignored: https://mashable.com/article/energy-ai-worse-than-we-thought

    I used to be a big advocate of AI. I thought it was going to be incredible. But the more I see, the more obvious its flaws are, and the more obvious it is that it isn't going to be the saviour of everything.

    Thanks for that! It was strangely comforting.

  • @timfromtheborder said:
    I'm not talking about preventing anything. I just think AI isn't that amazing and I have yet to see any evidence that it can, for instance, create music that sounds remotely good. I'm not telling Henry Ford that his car sucks, I'm telling Leonardo DaVinci that I don't want to fly his airplane.

    Very nicely done.
    I am in awe.

  • @UrbanNinja said:

    @Robin2 said:

    @UrbanNinja said:
    Haven't seen a thread bump on the evils of using drum machines and how lifeless they are compared to a real drummer here for a while. A genuinely hot topic back in the day, before most everybody gave up on it with the advent of Disco ('not real music'). A MIDI loop and sampler will never replace real music played by musicians playing real, physical, three-dimensional instruments. Most of the music production discussed here at Loopy Pro is of that sort.

    If we have real Luddites on this forum, why stop at trying to fight the AI revolution? Probably because they all use drum machines and samplers. Amusing when you think about it.

    "While he is waiting for Julia, he recognizes a song that a prole woman below the window is singing, which is a popular song written by the versificator, which is a machine that writes songs with no human input." -George Orwell Nineteen Eighty Four (1949)

    Maybe the Luddite Revolution in this thread will provide a spark that will halt this emerging Dystopian Age of robot-produced elevator music predicted by Orwell, and God forbid artificial characters in a future Veggie Tales version 86. Not sure Vegas will offer good odds of it, though.

    The suggestion that AI is just another technology, no different from previous technological innovations - the computer, film, sound recording, photography, the printing press, etc. - just another tool, just another progression, is naive about its scope and potential (as I've said before, I hope I'm wrong about that). And anyway, plenty of previous technological innovations have proved to be things which we'd, at the very least debatably, be better off without.

    Just because it’s inevitable now doesn’t mean there’s no point in voicing concerns about it, does it?

    Well, like I said repeatedly, I personally share many of the typical clear-headed concerns about AI that are expressed these days and have voiced such concerns myself. What I do not share is the view that no manner of expressing that concern or arguing against AI is ever silly.

    If you want to analyze something someone said, make sure they said it first.

    @Robin2 said:
    I think you're reaching with your drum machine comparison. It only really works if the user only uses beats that the machine generates itself and doesn't program in their own. Quantized beats may indeed be lifeless compared to a real drummer (depending on one's opinion, not mine as it happens), but that's still a pertinent criticism, isn't it - not an opinion consigned to history's dustbin as ridiculous?

    Well, I don't know, Robin, it depends on what you think the drum machine comparison was arguing against. I don't think a drum beat produced in Gadget with a simple pattern of Kick Snare Kick Snare Kick Snare + Hat Hat Hat Hat Hat Hat, or much of what was produced in the agonizing years of Disco, actually sounds more human than an identical pattern produced by AI. If folk here assure me they can hear some qualitative difference in something produced by a human finger on a glass screen, I have a hard time believing it - or believing that some folk are not more than a little hypocritical if they buy or use every algorithm-driven generative app that comes out, like Piano Motif or Logic Session Drummer, while railing about how AI doesn't sound human.

    I raised the examples from the days of yore, when people were losing their minds about the advent of drum machines, sampling, and MIDI, because they offered up the same arguments that some are using in this thread as critiques of AI - without realizing that, if they were consistent and the critique holds, they should consider equally whether they should rail against things like synthesized drums, sampling, backing tracks, and apps like Piano Motifs, Logic Session Drummer, etc. that they use, or why they do not. I didn't raise it to deny that there are particularities about the rise of AI that the human race has not faced hitherto.

    I don't think simply saying AI lacks the human touch we know and love works as well as some seem to assume at the end of the day, since AI is coded/tweaked/operated/evaluated by humans in the first place - or as if using drum machines with a finger on a glass screen does not often result in a vibe of diminished human touch that no honest ear could claim oozes human creativity in a way the same exact pattern produced by AI could never hope to do. You are totally right to say the argument can be made that quantized beats sound more lifeless (less so using humanize functions), though with the caveat that you don't share that sentiment (neither do I). But to make a similar argument against AI as proof it does not sound human, without applying that argument to the same degree to quantized beats made on a TR-909 in a hip hop hit, is logically inconsistent and seems hypocritical, or at least strained, if one produces beats on a similar device, I think.

    And again, I'm not saying the concerns raised here are wrong - I share many of them. But one thing I find as questionable as the advent of AI is fallacious, strained, possibly hypocritical or hallucinatory argumentation, which humans are still just as good at producing as machines. Consider the paradox that some of the worst flaws of AI may turn out to exist precisely because AI is a creative endeavor produced by human beings.

    @Robin2 said:
    just because it’s happening, we don’t have to take an ‘everything is for the best in this best of all possible worlds’ approach to something we consider has numerous potential dangers and downsides.

    Yup; I totally agree.

    Personally - I realise of course that your reply wasn't just referring to me but to others who have commented as well - I don't use generative apps like Piano Motifs or Logic's Session Drummer, because such things don't sit well with me; they just feel like cheating.

    I'm not going to pretend that I don't use AI at all though - that would be disingenuous, and I guess that makes me a hypocrite if you want to take a hardline 'if you believe one thing, you have to subscribe to your belief absolutely otherwise it has no validity' approach? I don't use generative AI, but I have used machine learning within graphics apps to help quickly select an object or to enlarge images without degradation. There must be countless software features which involve some sort of machine learning without me realizing it. AI can be very helpful, of course it can, but I just think there are huge potential downsides coming down the road, and on balance I'm not sure the benefits will outweigh the negatives in the long run.

    I do use drum machine and synth apps and also sample-based machines, but I simply don't agree that they are akin to AI such that if you criticize one you should therefore logically criticize the others too. Drum machines/synths/samplers are very like traditional instruments in a sense: if you touch them in the right way they'll make a musical sound. I think that is fundamentally different from simply asking for a finished track or clip to be generated for you by AI. Plus, AI encompasses and will affect most areas of life in a way that drum machines don't!

    It's funny that you pulled me up by saying that I should 'make sure they said it first'. If you'd read my posts here, you would realise that I've not been saying that AI lacks the human touch and therefore dismissing it. It may currently lack the human touch in its results, but I have consistently been banging on in my posts that people should not judge it, and the threats it poses, based on its current abilities, but rather should be considering what happens if/when it really can flawlessly mimic the human touch (in this example).

    Of course the worst flaws of AI exist because it’s been created by human beings, how could it be anything else?

  • @Robin2 said:

    Of course the worst flaws of AI exist because it’s been created by human beings, how could it be anything else?

    Your last sentence sums it all up nicely. A.I. was created by people, so it's only natural that these systems will exhibit characteristics which we might describe as "almost human."

  • wimwim
    edited May 29

    @Robin2 said:
    Of course the worst flaws of AI exist because it’s been created by human beings, how could it be anything else?

    Humm ... more and more AI is being used in AI development. As the human element fades could it not diverge from that "imperfection"? Could it not all spin away from us? Or, if we're imperfect by nature, would that mean it would become less and less like us? Or is success defined as perfectly recreating those imperfections? In that case, it certainly could succeed.

    Those are mostly flippant thoughts, but ... I confess this has got me thinking.

  • @wim said:

    @Robin2 said:
    Of course the worst flaws of AI exist because it’s been created by human beings, how could it be anything else?

    Humm ... more and more AI is being used in AI development. As the human element fades could it not diverge from that "imperfection"? Could it not all spin away from us? Or, if we're imperfect by nature, would that mean it would become less and less like us? Or is success defined as perfectly recreating those imperfections? In that case, it certainly could succeed.

    Those are mostly flippant thoughts, but ... I confess this has got me thinking.

    On the imperfections route, AI does seem to be following along:

    • Chatbots overly confident in stating wrong opinions/facts on forums while barely providing sourcing ✅
    • Has a side hobby of creating art by stealing someone else's copyright ✅
    • Then proceeds to also promote their cheaper services than the other guy ✅
    • All the while being horrible for the environment ✅

    Maybe this is the real Turing test 🤷🏻‍♂️

  • ^ post of the day! 😂

  • edited May 29

    @offbrands said:

    Maybe this is the real Turing test 🤷

    Score

    What will all this look like in 250 years?

  • @UrbanNinja said:

    What will all this look like in 250 years?

    Not that a fictional novel should be the barometer.

    In Dune, humanity destroys all the tech which could think like a human and leads to all kinds of chicanery.. to put it super lightly.

    But arguably most notably, an omnipresent worm god

    (I should read those last few books again)

  • @wim said:

    @Robin2 said:
    Of course the worst flaws of AI exist because it’s been created by human beings, how could it be anything else?

    Humm ... more and more AI is being used in AI development. As the human element fades could it not diverge from that "imperfection"? Could it not all spin away from us? Or, if we're imperfect by nature, would that mean it would become less and less like us? Or is success defined as perfectly recreating those imperfections? In that case, it certainly could succeed.

    Those are mostly flippant thoughts, but ... I confess this has got me thinking.

    Well, the achievement of artificial general intelligence, if it ever happens, would suggest that the software could come to its own conclusions, and then only it will be able to define what success is and what to do about human flaws - both its and ours!

  • edited May 29

    @offbrands said:

    Not that a fictional novel should be the barometer.

    In Dune, humanity destroys all the tech which could think like a human and leads to all kinds of chicanery.. to put it super lightly.

    But arguably most notably, an omnipresent worm god

    (I should read those last few books again)

    Fiction is often a very good barometer. I recently re-read Orwell's 1984; it kicks you in the teeth.

    "While he is waiting for Julia, he recognizes a song that a prole woman below the window is singing, which is a popular song written by the versificator, which is a machine that writes songs with no human input." -George Orwell Nineteen Eighty Four (1949)

  • @UrbanNinja said:

    Fiction is often a very good barometer. I recently re-read Orwell's 1984; it kicks you in the teeth.

    "While he is waiting for Julia, he recognizes a song that a prole woman below the window is singing, which is a popular song written by the versificator, which is a machine that writes songs with no human input." -George Orwell Nineteen Eighty Four (1949)

    I haven’t read that one in years, probably worth it. That is quite the thought.

    Animal Farm is always my go-to of his.

    The ending knocked all my teeth out and I've recovered; it's led to a morbid curiosity that power almost always reveals.

  • edited May 29

    Maybe trying to make AI more human is the worst possible mistake.

    For me, Turing 2.0 arrives when AI murders someone for its own gain - if that is ever possible.

  • @offbrands said:

    Not that a fictional novel should be the barometer.

    In Dune, humanity destroys all the tech which could think like a human and leads to all kinds of chicanery.. to put it super lightly.

    But arguably most notably, an omnipresent worm god

    (I should read those last few books again)

    Or just wait for the Dune Messiah movie, coming to theaters in 2026.

  • edited May 29

    @offbrands said:

    @wim said:

    @Robin2 said:
    Of course the worst flaws of AI exist because it’s been created by human beings, how could it be anything else?

    Humm ... more and more AI is being used in AI development. As the human element fades could it not diverge from that "imperfection"? Could it not all spin away from us? Or, if we're imperfect by nature, would that mean it would become less and less like us? Or is success defined as perfectly recreating those imperfections? In that case, it certainly could succeed.

    Those are mostly flippant thoughts, but ... I confess this has got me thinking.

    On the imperfections route, AI does seem to be following along:

    • Chatbots overly confident in stating wrong opinions/facts on forums while barely providing sourcing ✅
    • Has a side hobby of creating art by stealing someone else's copyright ✅
    • Then proceeds to also promote their cheaper services than the other guy ✅
    • All the while being horrible for the environment ✅

    Maybe this is the real Turing test 🤷🏻‍♂️

    Machine learning systems don't independently "feed" themselves with sounds, images and information. They rely on people, who are all flawed and have individual preferences and biases, to train them.

    And I still don't see any validity to this "horrible for the environment" claim. If anything, these systems reduce pollution by leapfrogging real-world human activity which would be required to gather the equivalent amount of information and result in an identical amount of productivity. For example, movie (or TV productions) which involve hundreds or thousands of individuals, transportation, energy, waste production, etc. are essentially being eliminated now that generative video exists.

  • edited May 29

    @NeuM said:

    Machine learning systems don't independently "feed" themselves with sounds, images and information. They rely on people, who are all flawed and have individual preferences and biases, to train them.

    And I still don't see any validity to this "horrible for the environment" claim. If anything, these systems reduce pollution by leapfrogging real-world human activity which would be required to gather the equivalent amount of information and productivity.

    I just stumbled into this article via Google News. Talks about AI systems being fed with their own output as it happens.

    https://www.theregister.com/2025/05/27/opinion_column_ai_model_collapse/

    Link to UNEP article on the environmental impacts of AI.

    https://www.unep.org/news-and-stories/story/ai-has-environmental-problem-heres-what-world-can-do-about

  • wimwim
    edited May 29

    @NeuM said:
    Machine learning systems don't independently "feed" themselves with sounds, images and information. They rely on people, who are all flawed and have individual preferences and biases, to train them.

    I don't think it's at all safe to assume that will always be the case, or even that it's the case for unknown actors right now. There is no technological barrier to cross and no legal barriers yet either. The only supposed limits are non-compulsory adherence to common sense practices, and those are wildly subjective.

  • @NeuM said:

    Or just wait for the Dune Messiah movie, coming to theaters in 2026.

    Messiah will cover book 2 maybe some of 3 (doubt) - worm god emperor is book 4.

  • @Robin2 said:

    I just stumbled into this article via Google News. Talks about AI systems being fed with their own output as it happens.

    https://www.theregister.com/2025/05/27/opinion_column_ai_model_collapse/

    Link to UNEP article on the environmental impacts of AI.

    https://www.unep.org/news-and-stories/story/ai-has-environmental-problem-heres-what-world-can-do-about

    🙏🏽🤘🏽

    (Thank you for the sources)

  • @NeuM said:

    Machine learning systems don't independently "feed" themselves with sounds, images and information. They rely on people, who are all flawed and have individual preferences and biases, to train them.

    And I still don't see any validity to this "horrible for the environment" claim. If anything, these systems reduce pollution by leapfrogging real-world human activity which would be required to gather the equivalent amount of information and result in an identical amount of productivity.

    There is no basis in fact here, and it seemingly dismisses the future of human activity and potential discoveries in favor of AI because it's more productive?

    You're inherently just boiling down the human experience of living, working, existing to a 1v1 of humans against AI where productivity is the only metric.

    Often it seems like you do this, as if your vested interest is to defend the interests of the AI, and only AI.

    "For example, movie (or TV productions) which involve hundreds or thousands of individuals, transportation, energy, waste production, etc. are essentially being eliminated now that generative video exists."

    Yeah… this isn’t a good thing. This is a bad thing. All of these things help economic growth in the pursuit of lifting creatives out of poverty into prosperity.

    You often have real Mr. Smith takes; not sure why - it doesn't seem like it's in the spirit of debate or even devil's advocate.

    Why is that? Why do you seem to minimally question yet accept what you’re told about AI from these companies?

    Why is the metric of the lived experience boiled down to productivity being the only form of progress?

  • @Robin2 said:

    I just stumbled into this article via Google News. Talks about AI systems being fed with their own output as it happens.

    https://www.theregister.com/2025/05/27/opinion_column_ai_model_collapse/

    Link to UNEP article on the environmental impacts of AI.

    https://www.unep.org/news-and-stories/story/ai-has-environmental-problem-heres-what-world-can-do-about

    Did a little research.

    The UNEP Emissions Gap Report 2024 notes that global emissions, including China's, are not on track to meet the Paris Agreement's 1.5°C target, with projections suggesting a temperature rise of 2.6-3.1°C if current policies persist. Additionally, reports of illegal trade in ODS, such as HCFC-22 seizures in 2023, indicate enforcement issues, with actions like fines and confiscations documented (China Ozone Profile).

    The two largest polluting countries on earth are China and India. And believe you me that neither of them will give up their best attempts to gain preeminence in AI to satisfy UNEP recommendations. Just looking at the issue with a sense of reality with regard to what’s at stake.

  • edited May 29

    @offbrands said:

    Why is the metric of the lived experience boiled down to productivity being the only form of progress?

    Simplifying things a bit for the sake of discussion, you can have full employment, little reduction in pollution and energy consumption or you can have less employment, less human activity and reduced energy consumption. You cannot have human activity and not have pollution. Both are permanently linked.

    I’ll take the future where things can be managed and I have no illusion that the entire planet is going to die if there are more computer servers and fewer Hollywood productions.

  • @NeuM said:

    The two largest polluting countries on earth are China and India. And believe you me that neither of them will give up their best attempts to gain preeminence in AI to satisfy UNEP recommendations. Just looking at the issue with a sense of reality with regard to what’s at stake.

    Okay… the original links were because you said you questioned the validity of how bad it is for the environment, and then, when provided with sources, you've done a little research, pointed at India and China, said hey look, they aren't gonna stop, and that you're just looking at the issue with a sense of reality.

    It's whataboutism - while alluding to what, exactly?

  • @NeuM said:

    Simplifying things a bit for the sake of discussion, you can have full employment, little reduction in pollution and energy consumption or you can have less employment, less human activity and reduced energy consumption. You cannot have human activity and not have pollution. Both are permanently linked.

    I’ll take the future where things can be managed and I have no illusion that the entire planet is going to die if there are more computer servers and fewer Hollywood productions.

    If it's worse for the environment because humans can have employment and prosper in a capitalism-based society… that's still a better thing for humans.

    Nobody here except you compared the environmental impact of human beings existing vs. AI. You brought up that comparison. I merely mentioned that AI is bad for the environment because there is no need for generative AI.

    Generative AI is being used by corporate entities for taking jobs from creatives while continuing the trend of the rich getting richer and the poor getting poorer.

    Again, I feel like you just take the side of AI. Like you want to believe these biased C-suite suits and CEOs, and that we need to believe them.

    I'm only responding because I thought maybe you actually would answer my questions without whataboutism, and maybe there'd be a human-forward understanding, with some empathetic reasoning for humans and creatives not being pushed into poverty… it's deeply concerning how you're unwilling to show a shred of empathy.

    Assuming you're not, like, a paid entity for those tech companies, or don't have some deeply vested interest in taking that one side and only that one side. It's just so off.
