Differences between AI and Regular Computing

Yes, AI might be overhyped. Yes, a lot of algorithmic software (regular computing software) claims to be AI when it's not. But I don't think the AI paradigm is fully understood yet. If you're interested, I found this concise article that summarizes the differences between AI and regular computing:
https://www.rigb.org/explore-science/explore/blog/whats-difference-between-ai-and-regular-computing
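
To make the distinction concrete, here is a toy Python sketch (my own illustration, not from the article). In regular computing, the programmer writes the rule; in machine learning, the program infers an approximate rule from example data:

    # "Regular computing": the programmer supplies the rule explicitly.
    def c_to_f_rule(celsius: float) -> float:
        return celsius * 9 / 5 + 32

    # Machine learning: the program infers the rule from examples.
    # Fit f(c) = w*c + b to (celsius, fahrenheit) pairs by gradient descent.
    examples = [(0.0, 32.0), (10.0, 50.0), (20.0, 68.0), (30.0, 86.0), (40.0, 104.0)]

    w, b = 0.0, 0.0            # start knowing nothing about the rule
    lr = 0.0005                # learning rate
    for _ in range(100_000):   # repeatedly nudge w and b to shrink the error
        for c, f in examples:
            err = (w * c + b) - f
            w -= lr * err * c
            b -= lr * err

    print(c_to_f_rule(25.0))       # 77.0 (exact, by construction)
    print(round(w * 25.0 + b, 2))  # ~77.0 (learned from data, approximate)

Both produce roughly the same answer here, but only the second approach works the way the article describes: feed it different examples and it learns a different rule, with no reprogramming.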

Comments

  • When people say “AI” they are referring to LLMs and machine learning systems. There is no such thing as AGI today.

  • @NeuM, agreed there isn't any Artificial General Intelligence (AGI) today. The article is just summarizing the differences between AI and regular computing software (traditional or classical computing that uses algorithms to perform specific tasks).

  • The confusion comes from the term "AI": it has been used incorrectly so often that nowadays we have to say "general AI"/AGI, which doesn't exist yet and may never exist.

  • @Danny_Mammy said:
    The confusion comes from the term "AI": it has been used incorrectly so often that nowadays we have to say "general AI", which doesn't exist yet and may never exist.

    FWIW, most people in AI have used (for half a century now) "AI" to mean a whole family of programming methodologies, almost none of which are related to AGI. Few people in the field think of AI as meaning primarily AGI.

    So, it isn’t wrong to refer to things as AI that aren’t AGI.

  • @espiegel123 said:

    @Danny_Mammy said:
    The confusion comes from the term "AI": it has been used incorrectly so often that nowadays we have to say "general AI", which doesn't exist yet and may never exist.

    FWIW, most people in AI have used (for half a century now) "AI" to mean a whole family of programming methodologies, almost none of which are related to AGI. Few people in the field think of AI as meaning primarily AGI.

    So, it isn’t wrong to refer to things as AI that aren’t AGI.

    According to Ben Goertzel, the first person to use the term "artificial general intelligence" (in an article related to artificial intelligence) was probably Mark Avrum Gubrud, in the 1997 article Nanotechnology and International Security.

    My point was that we started to use "AGI" because of the confusion: people think AI means AGI.

  • @caminante said:
    Yes, AI might be overhyped. Yes, a lot of algorithmic software (regular computing software) claims to be AI when it's not. But I don't think the AI paradigm is fully understood yet. If you're interested, I found this concise article that summarizes the differences between AI and regular computing:
    https://www.rigb.org/explore-science/explore/blog/whats-difference-between-ai-and-regular-computing

    I understand what you and that article are saying, but AI is still all computing and algorithms. ;)

  • heshes

    @NeuM said:
    When people say “AI” they are referring to LLMs and machine learning systems. There is no such thing as AGI today.

    Well, I guess it depends on what you mean by AGI. Is there a specific, well-understood meaning for that?

    The fact remains that whatever "intelligence" an LLM like ChatGPT has, it is certainly "generalized".

    Previous approaches to artificial intelligence focused on one thing and had no intelligence beyond that specific narrow task. E.g., IBM's Deep Blue became, in 1997, the first chess computer to defeat a reigning world champion, Garry Kasparov, in a match. It could do only one thing: play chess. "Deep Blue's victory is considered a milestone in the history of artificial intelligence." https://en.wikipedia.org/wiki/Deep_Blue_(chess_computer) (For a flavor of how single-purpose that kind of program is, see the toy search sketch after this comment.)

    ChatGPT can play chess, sort of, depending on whom you ask. It's nowhere near as good as Deep Blue was back in 1997. But ChatGPT can also do many other things: write a story, solve a math problem, help write and debug code, help plan a trip, answer questions about nearly any topic. It isn't necessarily good at all of these things, but the intelligence it has is generalized; it can be applied to many different subject matters and situations.

    From Wikipedia entry on History of Artificial Intelligence:

    =============================================
    AI Era, artificial general intelligence (2020–present)
    =============================================

    Main articles: AI boom and AI era

    The AI era starts with the initial development of key architectures and algorithms such as the transformer architecture in 2017, leading to the scaling and development of large language models exhibiting human-like traits of reasoning, cognition, attention and creativity. The start of the AI era has been said to begin around 2022–2024 with the development of scaled large language models such as ChatGPT.

    https://en.wikipedia.org/wiki/History_of_artificial_intelligence#AI_Era,artificial_general_intelligence(2020%E2%80%93present)
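
    To illustrate how narrow that older, Deep Blue-style approach is, here's a toy minimax search in Python (tic-tac-toe instead of chess, and purely a hypothetical sketch, not Deep Blue's actual code). It plays one game perfectly by brute-force search and can do nothing else:

        # A toy "narrow AI" in the Deep Blue tradition: exhaustive minimax search.
        # It plays perfect tic-tac-toe and nothing else -- no generality at all.
        WIN_LINES = [(0,1,2), (3,4,5), (6,7,8), (0,3,6), (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

        def winner(board):
            for a, b, c in WIN_LINES:
                if board[a] != " " and board[a] == board[b] == board[c]:
                    return board[a]
            return None

        def minimax(board, player):
            # Return (score, move) for `player`; "X" maximizes, "O" minimizes.
            w = winner(board)
            if w:
                return (1 if w == "X" else -1), None
            moves = [i for i, cell in enumerate(board) if cell == " "]
            if not moves:
                return 0, None  # board full: draw
            best = None
            for m in moves:
                board[m] = player                      # try the move...
                score, _ = minimax(board, "O" if player == "X" else "X")
                board[m] = " "                         # ...then undo it
                if best is None or (player == "X") == (score > best[0]):
                    best = (score, m)
            return best

        board = list("XO  X O  ")      # X at 0 and 4, O at 1 and 6; X to move
        print(minimax(board, "X"))     # (1, 3): square 3 forks 3-4-5 and 0-4-8

    Everything the program "knows" is hard-coded in WIN_LINES and the search loop; hand it any other problem and it is helpless, which is exactly the contrast with an LLM's generalized (if shallow) competence.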

  • @Danny_Mammy: my point is that it isn’t confusion. As computer science developed, it was clear that there was a family of techniques and approaches that made sense for computer scientists to call AI and that had nothing to do with AGI, so at some point it made sense to expand the vocabulary. Even in the ’70s, there were computer scientists calling things AI that weren’t related to what people now call AGI.

  • wimwim

    I will chip in to this thread when a suitably annoying comment that for some reason I think is funny occurs to me. I sense it percolating just below the surface but coaxing it out has so far eluded me.

  • @wim said:
    I will chip in to this thread when a suitably annoying comment that for some reason I think is funny occurs to me. I sense it percolating just below the surface but coaxing it out has so far eluded me.

    The farce is strong within you.

  • @wim said:
    I will chip in to this thread when a suitably annoying comment that for some reason I think is funny occurs to me. I sense it percolating just below the surface but coaxing it out has so far eluded me.

    “I will chip in” ✅
    “annoying comment” ✅
    “is funny” ✅
    “coaxing it out” ✅

    You’re done! 😅

  • @Luxthor said:

    @wim said:
    I will chip in to this thread when a suitably annoying comment that for some reason I think is funny occurs to me. I sense it percolating just below the surface but coaxing it out has so far eluded me.

    “I will chip in” ✅
    “annoying comment” ✅
    “is funny” ✅
    “coaxing it out” ✅

    You’re done! 😅

    Agreed, the statement satisfied itself recursively. :)
