How are you integrating your iDevices with a desktop DAW?

Aside from Logic Remote and playing instruments over IDAM?

Most of the time it makes more sense to me to print to audio on iOS and then Airdrop rather than try and sync with Ableton or with MIDI clock.

Comments

  • Ableton export, or iConnect.

    But mostly what you say — print and AirDrop or iCloud Drive. The only iOS app I play live into Ableton is ThumbJam.

  • Well, it depends. If you mean a way that considers latency then yes, I would agree that your method is foolproof. If you're happy to lean into it a little bit then give TouchOSC a look. There's no wrong answer, imo.

  • edited May 2021

    Yes, mostly "printing" audio and transferring it over WiFi.
    My most used option is automated (scripted) WebDAV file transfer from Drambo and NanoStudio 2, but sometimes I enjoy using SonoBus as well (for live streaming audio and recording on the desktop).
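
    Roughly speaking, that kind of scripted transfer is just plain HTTP against the app's built-in WebDAV server: list the export folder, then GET anything new. A minimal Python sketch of the idea (the address, port and folder names are placeholders; use whatever the app's WebDAV/sharing screen shows on your network):

    # Minimal WebDAV pull: list a folder and download any new .wav files.
    import os
    import xml.etree.ElementTree as ET
    import requests

    BASE = "http://192.168.1.20:8080"   # iPad WebDAV server (placeholder address)
    FOLDER = "/Mixdowns/"               # export folder inside the app (placeholder)
    DEST = os.path.expanduser("~/Music/ipad-bounces")
    os.makedirs(DEST, exist_ok=True)

    # A WebDAV directory listing is a PROPFIND request returning multistatus XML.
    resp = requests.request("PROPFIND", BASE + FOLDER, headers={"Depth": "1"}, timeout=10)
    resp.raise_for_status()

    ns = {"d": "DAV:"}
    for href in ET.fromstring(resp.content).iterfind(".//d:response/d:href", ns):
        path = href.text
        if not path.lower().endswith(".wav"):
            continue
        target = os.path.join(DEST, os.path.basename(path))
        if os.path.exists(target):
            continue  # already transferred on a previous run
        audio = requests.get(BASE + path, timeout=60)
        audio.raise_for_status()
        with open(target, "wb") as f:
            f.write(audio.content)
        print("pulled", os.path.basename(path))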

  • @BroCoast said:
    Aside from Logic Remote and playing instruments over IDAM?

    Most of the time it makes more sense to me to print to audio on iOS and then Airdrop rather than try and sync with Ableton or with MIDI clock.

    I was literally just thinking about starting a thread on this today. I'm using IDAM after realising that the iPad is basically a brilliant and affordable synth (or several) thanks to so many amazing devs. My main question is how best to work with/transfer ideas generated on the iPad into Logic on the desktop. I am principally using NanoStudio to quickly get musical sketches down, as I am unable to match its fluidity for such a task. I'm just wondering if there is a better approach than just exporting tracks out of NanoStudio for import into Logic.

  • Connecting into an audio interface seems to work best for me.
    The other options I’ve tried are:
    1. Using Koala Sampler to store the audio files
    2. Using Zenbeats as a DAW to transfer files back and forth
    3. Saving it into cloud storage

    Here’s the general idea in video format

  • Just occasionally sampling iOS stuff into Maschine; yesterday it was Elastic Drums, as I love the start/end loop automation and FX.

  • Finally got the Studiomux 5 beta up and running... it gets better performance than my iConnectivity 2+.

  • As others have said, printing and airdropping. I sometimes just use the iPad as an instrument and go headphone adapter out into my interface and record a performance. I do this with SWAM stuff, Samplr, etc.

  • edited May 2021

    iConnectivity audio4c (the new one). Can now record 18 mono or 9 stereo channels concurrently from AUM on iPad to Logic on Mac! No file drops / imports etc. Makes my workflow so much faster, more seamless and enjoyable.

  • @NimboStratus said:
    iConnectivity audio4c (the new one). Can now record 18 mono or 9 stereo channels concurrently from AUM on iPad to Logic on Mac! No file drops / imports etc. Makes my workflow so much faster, more seamless and enjoyable.

    Do you think AUM to another AUM could be separate channels, rather than just a stereo input?

  • @NimboStratus said:
    iConnectivity audio4c (the new one). Can now record 18 mono or 9 stereo channels concurrently from AUM on iPad to Logic on Mac! No file drops / imports etc. Makes my workflow so much faster, more seamless and enjoyable.

    What about latency?
    iConnectivity didn’t work for me (because of latency). So my iPad got its own audio interface. Perhaps the new iPad Pro will be a game changer. Would be nice to stream multichannel audio via Thunderbolt, which should be possible!?

  • @Ploe said:

    @NimboStratus said:
    iConnectivity audio4c (the new one). Can now record 18 mono or 9 stereo channels concurrently from AUM on iPad to Logic on Mac! No file drops / imports etc. Makes my workflow so much faster, more seamless and enjoyable.

    What about latency?
    iConnectivity didn’t work for me (because of latency). So my iPad got its own audio interface. Perhaps the new iPad Pro will be a game changer. Would be nice to stream multichannel audio via Thunderbolt, which should be possible!?

    No issues, but then my workflow is: do the main construction / composing in AUM, dump via audio4c into separate channels in Logic, then add additional adornments direct from AUM using the same pipeline. For guitar stuff I use an 18i20, but that’s solo guitar work, so I’m not really doing the kinda overdubs where latency can cause ball ache.

  • edited May 2021

    @Cambler said:

    @BroCoast said:
    Aside from Logic Remote and playing instruments over IDAM?

    Most of the time it makes more sense to me to print to audio on iOS and then Airdrop rather than try and sync with Ableton or with MIDI clock.

    I was literally just thinking about starting a thread on this today. I'm using IDAM after realising that the iPad is basically a brilliant and affordable synth (or several) thanks to so many amazing devs. My main question is how best to work with/transfer ideas generated on the iPad into Logic on the desktop. I am principally using NanoStudio to quickly get musical sketches down, as I am unable to match its fluidity for such a task. I'm just wondering if there is a better approach than just exporting tracks out of NanoStudio for import into Logic.

    I just export stems.

    Apart from being the easiest way it also has further benefits for me:

    1. It makes me commit and move on. I’ve discovered I’m much (much!) more likely to finish a track when the main parts are fixed.
    2. It means I don’t have to spend any time trying to rebuild the setup to get it working again.
    3. I have a library of loops to feed into BlocsWave, which is a great way to come up with new stuff by using loops from different projects together and for jamming out arrangement ideas in the quickest way possible.

    I do sometimes use IDAM but generally I find exporting audio the easiest and quickest way that works 100%. Every. Single. Time.

  • edited May 2021

    I’ve used MTS on iOS to MTS on Windows; the projects transfer directly. Right now I’m limiting myself to FLSM on iOS (on the phone and iPad), and am learning how projects on iOS fit into FL on Windows. We shall see.

  • Thanks for all the replies. <3

    Hopefully IDAM will evolve into something a bit more useful when it comes to audio.

  • @klownshed said:

    @Cambler said:

    @BroCoast said:
    Aside from Logic Remote and playing instruments over IDAM?

    Most of the time it makes more sense to me to print to audio on iOS and then Airdrop rather than try and sync with Ableton or with MIDI clock.

    I was literally just thinking about starting a thread on this today. I'm using IDAM after realising that the iPad is basically a brilliant and affordable synth (or several) thanks to so many amazing devs. My main question is how best to work with/transfer ideas generated on the iPad into Logic on the desktop. I am principally using NanoStudio to quickly get musical sketches down, as I am unable to match its fluidity for such a task. I'm just wondering if there is a better approach than just exporting tracks out of NanoStudio for import into Logic.

    I just export stems.

    Apart from being the easiest way it also has further benefits for me:

    1. It makes me commit and move on. I’ve discovered I’m much (much!) more likely to finish a track when the main parts are fixed.
    2. It means I don’t have to spend any time trying to rebuild the setup to get it working again.
    3. I have a library of loops to feed into BlocsWave, which is a great way to come up with new stuff by using loops from different projects together and for jamming out arrangement ideas in the quickest way possible.

    I do sometimes use IDAM but generally I find exporting audio the easiest and quickest way that works 100%. Every. Single. Time.

    I totally hear you on the committing and moving on aspect of stem export. Anything that adds to the probability of finishing something is good! I'm only using IDAM to play iPad synths into Logic, plus virtual MIDI for using the iPad as a MIDI controller.

  • As I have an older Mac Mini (2012), I use NanoStudio 2 and WebDAV so I can record my audio and mix/master with Pro Tools or Reaper.

  • I really want to use my iPhone with MusiKraken in Logic to add some extra MIDI modulation. Like acting as a nice XY pad for my Korg minilogue, etc. I sort of got it working but need to test more.

  • I use Zenbeats/AUM to MIDI out to Logic which essentially acts as a desktop synth host. Once I have the MIDI written in Zenbeats I export/record it into Xequence 2 to clean up the arrangement and/or export the MIDI directly to Logic. Then I use either Logic Remote or MIDI Designer 2 to do automation. If there are certain sounds that I can (easily) only get using iPad synths then I’ll just export the audio.

    My main rule is to not use iPad synths with desktop synths while jamming due to the latency. MIDI 2.0 is supposed to solve the latency issue but I dunno if it will ever be released.
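
    If you want to put a number on that latency before trusting a jam to it, one rough check is to loop MIDI back from the iPad (e.g. a MIDI-thru routing in AUM) and time the round trip from the desktop. A small sketch using Python's mido (port names are placeholders; mido.get_output_names() will show what IDAM or a network session actually exposes, and it assumes mido plus the python-rtmidi backend are installed):

    import time
    import mido

    OUT_PORT = "iPad"   # placeholder: the IDAM / network MIDI port as named on the desktop
    IN_PORT = "iPad"    # placeholder: the same port coming back from the iPad

    with mido.open_output(OUT_PORT) as out, mido.open_input(IN_PORT) as inp:
        times_ms = []
        for _ in range(20):
            t0 = time.perf_counter()
            out.send(mido.Message("note_on", note=60, velocity=1))
            # Wait for the echoed note_on; ignore anything else (clock, note_offs, ...).
            while True:
                msg = inp.receive()
                if msg.type == "note_on" and msg.note == 60:
                    break
            times_ms.append((time.perf_counter() - t0) * 1000.0)
            out.send(mido.Message("note_off", note=60))
            time.sleep(0.05)
        times_ms.sort()
        print(f"median round trip: {times_ms[len(times_ms) // 2]:.1f} ms")

    Halving that gives a rough one-way figure, which is the part you actually feel when playing.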

  • @ralis said:
    As I have an older Mac Mini (2012), I use NanoStudio 2 and WebDAV so I can record my audio and mix/master with Pro Tools or Reaper.

    Thanks for mentioning WebDAV - I need to set that up in NanoStudio.

  • @AudioGus said:
    Just occasionally sampling iOS stuff into Maschine; yesterday it was Elastic Drums, as I love the start/end loop automation and FX.

    How are you doing this Captain? I do stare at my new iPad and Maschine sat on the desk next to each other and know I should be connecting them in some way, beyond just exporting/printing loops etc....

  • @JohnnyGoodyear said:

    @AudioGus said:
    Just occasionally sampling iOS stuff into Maschine; yesterday it was Elastic Drums, as I love the start/end loop automation and FX.

    How are you doing this Captain? I do stare at my new iPad and Maschine sat on the desk next to each other and know I should be connecting them in some way, beyond just exporting/printing loops etc....

    Super basic line in, love em and leave em sampling. I don’t even care about syncing much really although sometimes Link happens. There may be fancier approaches etc but for the most part I am just dropping old commute bits in the mix that I think of while using Maschine. More of a time travel novelty than any kind of sweet workflow. I may at some point use Fugue Machine or Chordflow etc for sequencing Massive etc but just living in Maschine alone is more than fine by me.

  • edited May 2021

    My recent approach is using Expert Sleepers ES-8 and ES-9. One is connected to the PC and the other to the iPad. This way I can have my Eurorack interfaced in the middle of both, and send CV/MIDI/audio between all three. The PC is running VCV Rack, so I can host VSTs and FX. The VCV Host plugin recently got MIDI out, which is great for routing MIDI from Maschine to the modular and controlling VCV modules / the iPad.

    Here, Maschine is sequencing modular and the tracks from modular are recorded in AUM with various FX. Maschine is also sending drum tracks into AUM.

    iConnectAudio4+ has also worked well for just PC and iPad.

  • @AudioGus said:

    @JohnnyGoodyear said:

    @AudioGus said:
    Just occasionally sampling iOS stuff into Maschine; yesterday it was Elastic Drums, as I love the start/end loop automation and FX.

    How are you doing this Captain? I do stare at my new iPad and Maschine sat on the desk next to each other and know I should be connecting them in some way, beyond just exporting/printing loops etc....

    Super basic line in, love em and leave em sampling. I don’t even care about syncing much really although sometimes Link happens. There may be fancier approaches etc but for the most part I am just dropping old commute bits in the mix that I think of while using Maschine. More of a time travel novelty than any kind of sweet workflow. I may at some point use Fugue Machine or Chordflow etc for sequencing Massive etc but just living in Maschine alone is more than fine by me.

    Thank you Mister. Pushes me a step closer to trying :)
