
Eye Tracking is coming to iOS/iPadOS as an accessibility feature!

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.

https://www.apple.com/newsroom/2024/05/apple-announces-new-accessibility-features-including-eye-tracking/
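
Apple hasn't published a developer API for this in the announcement, but since Eye Tracking "navigates through the elements of an app," it presumably targets whatever an app exposes to the accessibility system, the same element tree VoiceOver and Switch Control use. That's my assumption, not Apple's statement. Here's a minimal UIKit sketch of a knob control that would be reachable that way (the class name and step size are illustrative):

```swift
import UIKit

// Hypothetical one-knob control exposed to the accessibility system.
// Assumption (mine, not Apple's): Eye Tracking and Dwell Control target
// the same element tree that VoiceOver and Switch Control use, so a
// custom view becomes reachable once it is marked as an accessibility
// element with the .adjustable trait.
final class FilterKnob: UIView {
    var value: Float = 0.5 {
        didSet { accessibilityValue = "\(Int(value * 100)) percent" }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Filter cutoff"  // the name the system can announce
        accessibilityTraits = .adjustable     // advertises increment/decrement
        accessibilityValue = "50 percent"
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // Adjustable-trait actions; a dwell-triggered increment or decrement
    // would land in these overrides.
    override func accessibilityIncrement() { value = min(1.0, value + 0.05) }
    override func accessibilityDecrement() { value = max(0.0, value - 0.05) }
}
```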

This could be a game changer for controlling virtual gear when the thing you look at is selected for input. Potentially great for one-knob MIDI controllers and similar!
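
For illustration, here's how that "one knob, many targets" pattern might look in code: a selection event (gaze, dwell, or anything else) picks the target, and a single hardware CC edits whatever is currently selected. This is a hypothetical sketch; every name in it is made up, and the MIDI plumbing is omitted:

```swift
import Foundation

// Hypothetical sketch of the "one knob, many parameters" idea: a
// selection event (gaze, dwell, tap, whatever) picks the target, and a
// single hardware encoder edits whichever parameter is selected.
// Every name below is illustrative; none of this is a real API.
final class GazeTargetedKnob {
    private var parameters: [String: Float] = [  // normalized 0...1
        "cutoff": 0.5,
        "resonance": 0.2,
        "drive": 0.0,
    ]
    private var focusedParameter: String?

    // Call when selection lands on a control (e.g. via dwell).
    func select(_ name: String) {
        focusedParameter = name
    }

    // Call for each control-change value (0-127) from the single knob.
    func handleControlChange(_ ccValue: UInt8) {
        guard let name = focusedParameter else { return }
        parameters[name] = Float(ccValue) / 127.0
        print("\(name) -> \(parameters[name]!)")
    }
}

// Usage: look at "cutoff", then turn the knob.
let knob = GazeTargetedKnob()
knob.select("cutoff")
knob.handleControlChange(96)  // cutoff -> ~0.76
```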

Comments

  • Huh, will you look at that, amazing.

  • @kirmesteggno said:

    This could be a game changer for controlling virtual gear when the thing you look at is selected for input. Potentially great for one-knob MIDI controllers and similar!

    Instead of rolling the dials, we will roll our eyes!

  • @HolyMoses said:

    Instead of rolling the dials, we will roll our eyes!

    I already do that anyway 🤣

  • edited May 15

    @reasOne said:

    I already do that anyway 🤣

    Haha, yeah, you won't move knobs or faders with your eyes, just select stuff. Vision Pro users seem to like it, and if it works, it shouldn't stress the eyes beyond normal screen use.

  • This is no doubt the beginning of Minority Report-style flat screens from Apple.

  • @realdawei said:
    This is no doubt the beginning of Minority Report-style flat screens from Apple.

    Or we’ll all end up being Stephen Hawking.
