Where is iOS Music going? Freeform / Modular / DAW-less
I've tried to capture my thoughts about the current state and direction of iOS musicking in a little essay. Perhaps some of you find it interesting or have thoughts/reactions/comments about my line of thinking.
https://medium.com/@brambos/freeform-modular-daw-less-d4c89ef5b9c0
Comments
Yes I totally agree with all that
That's what's appealing about iOS to me: a modular, DAWless workflow which you can design yourself, and apps like AB/AUM/apeMatrix are making this possible.
Whenever I want to use a DAW I prefer a bigger screen (or two), and so I go to Ableton Live, where iPads can't compete.
That's not to say that iOS DAWs aren't useful at different times and in different situations. But the DAWless experience is where the iPad shines for me and feels less like work.
So it's exciting to see new AU midi tools come out.
I would love to see Max for Live on iOS, as I think that's more suitable than Live itself. That would be a playground that could work on this platform.
Great read. Totally agree. Although I think there’s more to be said about mobile lite DAWs (Gadget, Auxy, etc.) in addition to the modular-style apps you describe. Those seem to aim for a very particular sweet spot in terms of workflow: they don’t try to do everything, but they do JUST enough, often with clever interfaces.
The thing with mobile jamming as you discussed it is that it’s only one element of the mobile musical landscape. There are many producers who compose with song structure and are looking for a mobile solution for capturing their compositions, as opposed to free-flowing jams. That’s where incredible apps like Cubasis and BeatMaker 3 come into play. They are cut-down versions of desktop DAWs; however, the mind boggles as to where these two apps can go in the future, given how powerful they already are right now.
Agree that those whose workflow is suited to/served by iOS no longer need a computer. But some kind of DAW is still required to bring it all together.
Repulse the Monkey are a good example of how iOS music making can empower people with no shortage of sonic imagination, but far fewer ‘real world’ instrument skills than has been traditional. Apps cannot write a song, but they can help to circumvent technicalities.
In the past, so much more knowledge was required to make music, and vastly more time was involved in learning an instrument. Nowadays, virtually anyone can knock up a standard EDM-style tune in a few minutes. With more effort, real-world instrument skills can be successfully simulated. At least up to a point.
Getting beyond that, exploring more demanding genres, and actually creating something interesting, innovative and with its own identity still takes more time, effort and a certain amount of ability. As is right and proper for any worthwhile creative endeavour.
iOS music making reminds me of an art professor who posted a story on YouTube. The fellow would regularly present his students with what he told them was an example of Jackson Pollock’s work, and ask them to explain why it was any good. Only later would he reveal to them that the ‘work’ was actually just a close-up of paint spatters on his apron.
The point being, anyone can make a splatter pattern, and such a thing can simulate art in the eye of the beholder, but it takes an artist to go further and make something that is art. iOS music apps help people to find that art within themselves and, if they are willing to put in the effort, express it well.
The DAWless thing is exactly what doesn't work on iOS for me.
But we're all different, of course.
Audiobus sessions still don't always work for me without a crash or some trouble, and connecting several apps and MIDI together is still such a pain. Working inside one powerful environment plus a few AUv3 apps feels much better to me. Or using single apps as controllers, or recording them like a hardware synth, is still more fun.
The modular-style thing is what every major DAW can do in general, but on iOS we still have to search, scroll and try to fit things together... at least I do.
+1
I’m one of MIDI/AUM/AU jamming’s biggest fans. Ever since AUM was released I’ve hardly touched Auria, or even ‘all-in-ones’ such as Gadget. And I’ve amassed a huge collection of live jam sessions.
They’re pretty useless on their own though, so I found myself patching the best bits together in a DAW. And since my Air 2 has started struggling, this has been done on the desktop, recently in my new favorite rediscovery, Reason.
Mobile music making still has a way to go before it replaces desktop/laptop platforms. But certainly the devs behind AUM, Rozeta, and Apesoft are helping push this forward. Or in my case keeping it a useable platform.
Nice to read. Jamming and creating structures live is something I already sought in Ableton Live; there are a lot of possibilities on iOS too. The fact is that, as said, there are a lot of musicians with different artistic approaches, and all are good. Sometimes I do everything live; sometimes I need to do some precise timeline construction. As a saxophonist, I also always work with audio. I can do everything in AUM, with various apps and Loopy for real-time recording of sections, but if I want to record multiple solo parts I will need a DAW. It’s not often, but I have to master more different kinds of workflows than on desktop. I adapt myself to iOS and it leads me to find new ways of making music. But I also need my music tools to adapt to me. The iOS modular platform allows you to create your own tools, which is very important too. I often try to make my own tools with various apps, like this one I made recently: https://forum.audiob.us/discussion/25970/tutorial-saxloops-1-how-to-jam-with-iphone-and-ipad-apps-like-with-ableton-live
I’m sure that some electronic musicians find their way more easily on the iOS platform, as it’s MIDI-centric, as you said. But all acoustic musicians will still wish for the current state of iOS music to evolve to meet their needs. I admit 100%, though, that going DAWless is really refreshing and inspiring, and I do a lot of my sax stuff improvised and recorded just with AUM and other real-time tools. This also leads to improved instrumental skills, and that is a good thing.
Indeed, it’s not the only viable workflow on iOS. 100% fair point (and perhaps not emphasized enough in the essay). However, it’s the one that has a definite iOS signature to it, as it’s evolving right here, and uniquely so.
If AUM's sample players were able to record audio and play it back directly after recording, with more timeline position options, I would say bye-bye to DAWs.
I agree that sequencers can be separate tools, like AU plugins. What I like about that AUM modular approach is that I have almost no graphical representation of my music; I mostly use my ears.
Interesting read and there is a fair bit of preaching to the choir that will accumulate here on this forum. I suspect we will also see the usual "tension" between those who make music as a creative outlet, those who are interested in music as a product (art), and those who are hoping iOS supports their live performances.
The discussion around the need for a DAW really centers around an industry that is intent on replicating the same mental model that has existed around the distribution of "art". The interesting question to me is whether mobile devices--and the proliferation of social media--opens the door to a direct relationship between artist and consumers. Who needs a DAW if an artist opens an app and the audience can open one on their end and consume synchronously or asynchronously?
> Who needs a DAW if an artist opens an app and the audience can open one on their end and consume synchronously or asynchronously?
Anyone who is into virtual studio production, as opposed to live freeform performance.
I mean, if that whole thing were more stable for me, I would maybe still do 50% of my music creation and/or sound design on iOS.
If the P900 and a full Alchemy were on iOS, I would go out right now and buy a new iPad Pro.
Indeed, I'm thinking of just buying a new iPad and seeing whether it will work for me or not.
I'd be undecided whether to go for the 12.9" or the 10.5".
The big one looks great, but I think iOS, and being really more mobile, screams for the 10.5", since it's maybe the best in terms of form factor and power.
If it doesn't turn out well, I could sell it in a few months without much loss anyway.
I still have so many unused iPad apps in my account, since I've been iPhone-only for some years now, and I might have fun again on a somewhat larger multi-touch screen (though 3D Touch is an amazing feature on iPhones).
What I would like to see is more non-chaotic MIDI and such.
For example, I like to play 5, 6 or more instruments at the same time in Logic.
I use the keyboard as MIDI input and, say, play some chords, then hold one and sustain it.
Then I can jump to another instrument and play an arp, put a MIDI LFO on some knobs, and so on.
I can sustain that one too, then jump to the next one, etc.
On iOS I often can only sustain everything or nothing, or some things work and some don't.
It's also hard to find the right virtual keyboard for these things, since you have to switch around a lot, which can give me clicks and dropouts on iOS, and/or the virtual keys don't let me, for example, hold some notes, switch octaves and push some new notes while still holding the first note, and then sustain them all.
Things like sostenuto mostly don't even exist on iOS. It's handy for synths as well.
While iOS offers some amazing tools and GUIs for live tweaking, playing and interacting with sounds with a very direct feel, it also surprises me that so much is missing that works much better for me even with a normal computer keyboard... that maybe shouldn't be.
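(Editor's note: the sustain-versus-sostenuto distinction above maps onto two standard MIDI Control Change numbers, CC 64 and CC 66. As a minimal illustration, here is a plain-Python sketch of the raw three-byte messages a host or virtual keyboard would send; the helper function and variable names are hypothetical, not from any particular app.)

```python
# Sketch of the MIDI Control Change messages behind sustain vs. sostenuto.
# CC 64 = sustain (damper) pedal: holds every note sounding or
#         subsequently played while the pedal is down.
# CC 66 = sostenuto pedal: holds only the notes already sounding when
#         the pedal goes down; notes played afterwards are unaffected.

SUSTAIN_CC = 64
SOSTENUTO_CC = 66

def control_change(channel: int, controller: int, value: int) -> bytes:
    """Build a raw 3-byte MIDI CC message (channel 0-15, value 0-127)."""
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("out-of-range MIDI field")
    # Status byte 0xB0-0xBF = Control Change on channels 1-16
    return bytes([0xB0 | channel, controller, value])

# Values >= 64 mean "pedal down", values < 64 mean "pedal up"
sustain_on   = control_change(0, SUSTAIN_CC, 127)
sustain_off  = control_change(0, SUSTAIN_CC, 0)
sostenuto_on = control_change(0, SOSTENUTO_CC, 127)
```

A host supporting the jump-between-instruments workflow described above would send `sustain_on` to one destination, then route subsequent key input to the next instrument, rather than applying one pedal state globally.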
iOS is good (Apple could really stand to put some oomph into the code around music related functions) and I love the touch aspect. I like that there's the no-DAW-jam aspect as well as the structured DAW (I still find it easiest and fastest to actually make complete songs in Gadget - but I loathe the fact I can't use 3rd party effects even more than 3rd party synths, though those'd be great, too). And I love, love, love not having to sit in front of a computer to make music I both enjoy making and enjoy listening to.
I think the screen size and file handling are major issues for replacing computers. And, to an extent that gets reduced slowly each year, the cpu/ram/storage connection and flow. And then there's the input/output limitations, especially after they dropped the headphone jack.
I export each track from Gadget as a .wav (every hat, every sound on an individual track) and then load those into something far, far more powerful than anything on iOS (for now!) to mix and master them. I use Harrison Mixbus on a 27" monitor, and I wish I had a second 27" monitor to go with it. There are also still plugins that are quite nice and more powerful than anything on iOS, because iOS doesn't have the system resources, whether Kontakt, or Acoustica-Audio, etc.
The gap is narrowing. But it's there, and to some extent may always be there due to physical characteristics, or at least until iOS devices can seamlessly blend into an ecosystem so the user can just flow between hardware.
ISTM that your workflow uses technology and techniques that The Beatles could only dream of having at their disposal. Yet, respectfully, I would submit that while whatever you make may have technical superiority, the Fab Four (and George Martin) achieved more than you - or any of us here - will ever do, in just eight tracks.
Similarly, Phil Spector worked wonders with musicians crammed into small rooms with half a dozen microphones recording in mono.
The point being, a massive number of tracks, and every sound controlled to the ultimate degree, do not automatically mean great music. iOS has everything that is needed in terms of tools, and the rest is up to us.
To be fair though, they were recording in a studio with amazing acoustics that had evolved over a 30-year time span, with some of the best audio technicians & engineers in the world, with incredible microphones & mic-positioning techniques, and with state-of-the-art recording equipment. (& they had George Martin!!!!!!)
This is true.
Yet do we not hold in our hands apps, with recording techniques and virtual positioning programmed in by those who learned from what was done in the past? Also virtual studios that George Martin could have worked miracles with, from his breakfast table!
We have so much more now, built from the knowledge of those pioneers, filtered through app developers. It is up to us how well we use it.
By your own rationale, you don't even need iOS and any hosts or plugins, because the world has everything you need to make music.
Of course it's entirely possible to make music on just about anything, including iOS. And my hobbyist workflow is my workflow and my (lack of) talent is my (lack of) talent that needs more "crutches" to result in something that isn't pure crap. Heh.
BUT, I'm fairly confident that to shift an industry, and for said industry to consider the iOS platform as relevant and useful, much of what I said applies in terms of platform adoption.
(Mostly) Seamless integration and evolution of existing workflows. It's highly unlikely to change overnight. I don't know, Apple sort of needs an Apple to come along and shake things up and "do it right". I think the tech is still limited, though, more than the vision (Microsoft, by the way, is actually pretty great at tech vision and how it could fit in our lives - just not so great at mass market execution and marketing). But it's getting there.
Currently, the disparate pieces of tech floating in iOS connected by virtual wires are still too sandboxed, in large part due to iOS. It should be possible for each app to print the audio in one place (Files!), be it a single synth making a sound, or a DAW like Gadget hosted in AUM talking to Audiobus. And have tools that output MIDI also print at the same time. And then have those files seamlessly and "instantly" available on a PC or laptop or other mobile device for consumption and manipulation. And that's even just talking about files, let alone going beyond that to the seamless integration of the software itself with software on other platforms, so that files - if needed - don't need to be "exported" from one device to another, but can be available on any device, or not, since the software is available to use.
I need more caffeine.
Word. It has everything required to make great music, and has even more things that are not required.
This was a great essay Brambos. I am going to read all of your essays now that I see them on that site.
Do you think this Metal situation will stall new development for a period?
@brambos
I agree with the modular direction and even like it over a full blown (or overblown) DAW approach.
A few things are limiting my enjoyment of this approach at this time:
Saving projects across the board can still be problematic at the moment. I however see this diminishing over time as small AU Hosts become the glue to hold it all together. Hosts such as AUM, AudioBus 3 and the new Matrix thingie are showing the way and hopefully as less IAA use is needed, things will gel more and still keep the modular nature.
The lack of AU MIDI controller surfaces really needs addressing. I want to use the benefits of touch-screen ‘keyboards’, drum pads and the other MIDI input designs that I love from many iOS apps, with whatever sound sources I choose. At this time, AU MIDI input devices are pretty much limited to sequencer use. I want keyboards with a much more touch-friendly design; sticking to the piano-style keyboard just does not make sense IMO.
The connection between the modular environment and recording. At this time, there are few small MIDI or audio recording options beyond full-DAW-like apps. We need AU audio and MIDI recorders that can be placed where we want them and then MIDI-controlled from the mini hosts. At the moment it’s easy to record stereo, but setting up multiple audio or MIDI recordings is a pain at best!
The great thing about using apps like AUM, AudioBus and Matrix is that they promote creativity and work well for live jamming (or just creating sound!). The problem is then shifting something to work with one of the DAWs. All that recreating of what you’ve just designed, when all we really want are ways to get what we already have connected either to mini recording devices or to another smaller-footprint DAW. Almost looking at an AU multitrack recorder and an AU MIDI recorder, all synced up to the mini host.
I think the days of iOS DAWs are numbered, to be honest. Modular with a mini host is the way forward IMO.
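(Editor's note: the "AU multitrack recorder, all synced up" idea above amounts to capturing each channel into its own file against a shared clock. A toy sketch, plain Python with only the standard library; the sine-tone "channels" and file names are purely hypothetical stand-ins for live sources, not any real AU recorder:)

```python
import math
import struct
import wave

SAMPLE_RATE = 44100
N_FRAMES = SAMPLE_RATE  # one second of audio per stem

def render_sine(freq_hz: float) -> list:
    """Stand-in for one live channel: a quiet 16-bit mono sine tone."""
    return [int(32767 * 0.5 * math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE))
            for n in range(N_FRAMES)]

def write_stem(path: str, samples: list) -> None:
    """Write one channel to its own WAV. Every stem shares the same
    sample rate and frame count, so the files stay sample-aligned
    ("synced") when re-imported into a DAW."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)               # 16-bit PCM
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

# Two hypothetical channels captured as separate, equal-length stems
write_stem("stem_bass.wav", render_sine(110.0))
write_stem("stem_lead.wav", render_sine(440.0))
```

The design point is simply that per-channel capture plus a common start time and length is all the "sync" a downstream DAW needs.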
For me, I consider AUM with a bunch of things hosted in it essentially a DAW. I would never call an AUM-based jam DAWless. It’s certainly a different approach than, say, a desktop, but it’s definitely still just using a different kind of computer. It’s a Digital computing device, it’s Audio, and it’s definitely capable/flexible enough to be called a Workstation.
Now, I think the article is excellent just pointing out that I’m not sure iOS can be called DAWless unless it’s used in a single purpose way like it is a single synthesizer etc. My experience with desktop is pretty limited though. I’ve really just been using iOS. Sometimes I do these computerless jams like with Volcas and a Circuit but I’m certainly no purist about it. In fact I bought the Circuit because I wanted to sequence the iPad with it. I ended up mostly using it by itself though. Anyway, I think the article is excellent. I just cringe when I hear DAWless on my iPad. It’s kind of like going vegan with cheese on top to me.
I'm open to a better description. "Plugin host without a conventional timeline, piano rolls or clip launchers" just doesn't roll off the tongue very smoothly. And I still stand by the observation that what I'm describing is the virtual equivalent of a DAWless hardware setup.
In fact, I'm from the era when DAW referred to the entire computer (which was equipped with special hard drives, sound cards, MIDI interfaces, clock generators, software, etc.), so in my book everybody is using the word DAW completely wrong anyway.
Yeah, that name doesn’t roll off the tongue. I also don’t have a better term.
As for the vegan-with-cheese-on-top comment, I do that all the time. I call it eating. I agree that the approaches are similar and it kind of is a virtual equivalent, but "DAWless on my iPad" still makes me cringe. The best thing about doing it all in AUM is not having 50 cables, adapters and splitters just to do a jam. Also not carrying 6 AAs per component of the jam if doing it on the go... 3 Volcas, a Circuit, and a Reface? I have them, but don’t have enough batteries to take them all.
The Medium article states " Audiobus came up with the brilliant idea to repurpose an obscure networking protocol for routing audio between apps." I'm curious regarding what networking protocol was used. Any clues appreciated... a quick Google search didn't slip me a clue.
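(Editor's note: the thread doesn't identify which protocol Audiobus actually repurposed, and nothing here should be read as an answer to that question. Purely as a generic illustration of the underlying idea, shipping audio buffers between separate processes over a network-style transport, here is a minimal Python loopback sketch; the frame count and packet format are invented for the example:)

```python
import socket
import struct

FRAMES = 256  # samples per audio buffer in this toy format

def pack_buffer(samples: list) -> bytes:
    """Serialise one float32 audio buffer with a length prefix."""
    payload = struct.pack("<%df" % len(samples), *samples)
    return struct.pack("<I", len(samples)) + payload

def unpack_buffer(data: bytes) -> list:
    """Inverse of pack_buffer."""
    (n,) = struct.unpack_from("<I", data, 0)
    return list(struct.unpack_from("<%df" % n, data, 4))

# Loopback demo: "sender app" and "receiver app" as two UDP sockets
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))                # let the OS pick a free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

buf = [0.25] * FRAMES                    # dummy constant-level buffer
tx.sendto(pack_buffer(buf), rx.getsockname())
received = unpack_buffer(rx.recv(65536))
tx.close()
rx.close()
```

Real inter-app audio needs far more than this (clocking, low-latency shared memory, format negotiation), but the sketch shows why a transport originally meant for networking can carry audio at all: a buffer is just bytes.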
It's funny that while this article is true, it's mostly thanks to @brambos. That's not to slight it; I think momentum is building in this direction and it's awesome, but Rozeta and the Makers are so good and straightforward that they're paving the way. Thanks to AUM for making it possible though!
In general you don't need anything but one device for all this. Depending on the size you want for mobility and the kind of tools you need, you can choose between iPhones, iPads, notebooks and whatever.
Maybe a Windows 10 multi-touch device would be the most powerful way of integrating all kinds of tools in one very powerful environment.
If it weren't for Logic and my favorite synth, which is Mac-only, I would go that route.
But then, even if I had to buy the latest and greatest iPad tools now, it would cost way more than Logic, which just has everything I could ever imagine for every genre, and then some. So it's actually even the cheaper way for me to just stay with it.
Also, combining multi-touch controls via apps with a trackpad, and having shortcuts and the keyboard as MIDI input as well, is very, very fast.
I also still find there is much room for better FX in general, and for FX and other things like saturation within synths.
Then, for sample libraries, there isn't really much that's good, so I guess we're mostly talking about synth sounds with some General MIDI-like samples on top, maybe.
But sometimes I guess many people are thinking about DAWs like in the '90s, or computers from 10 years ago which needed minutes to boot up. Those times are over as well.
All the tools and devices are great, but indeed there isn't really much new in iOS that you won't find anywhere else as well, in a slightly different form.
Some of the more unique tools on iOS are still Animoog, Mitosynth and things like Borderlands Granular for me.
A multi-touch screen should be about real-time performance, morphing sounds, etc. Someone please make something great like the Alchemy snapshot morphings, of that quality.
Even if the old Alchemy app was very limited in terms of editing, I really loved performing with it on my iPhone and iPad. It was, and would still be, the king for just about any genre when you want to bring life into your sound easily and fast.
Half of the iOS apps I tested in the last months would be easier to use with a trackpad or mouse, since they aren't really made for touch.
So what happens to the old and often much better performers? Now too many developers seem to be trying to fit desktop-like apps onto iOS.
So in a few years there won't be much difference between an iPad and a 2-in-1 hybrid, Surface-like workflow... just that iOS might still lack the connections for external SSDs and such.
(Hopefully I don't repeat any previously-made points. Just don't have the time to read the thread, just the article, lol.) It's a very interesting and well-thought-out read, but live jamming isn't the only thing iOS is capable of. However, it IS something it does very well, especially if you're into experimental/generative music like I am. This is where your Rozeta plugins come in handy like crazy. Sending random MIDI to my Gadget synths is a godsend.
However, also being a pop/EDM producer, I tried many ways to "find my groove" in iOS. I finally hit my stride with composing in Gadget, bouncing the stems to Audioshare, which get loaded into AUM for the mixdown (which, thanks to @FredAntonCorvest making EnVolver and Transient, as well as Klevgrand's plugins, is EXACTLY like it was in FL Studio), and mastering either in Grand Finale for a quickie or Auria for a larger project. Of course Wizibel, Lumafusion, ProCreate, and my recently-acquired Core Animator are absolute godsends for generating video content. The basic Notes app is perfect for lyric writing, as well as for scanning in my sheet music to be used in Pia Score for piano gigs.
So, as said, live jamming isn't the only thing iOS is capable of. I can build an entire music career just using an iPad Pro: producing music and video content, reading sheet music, writing lyrics, notating music in Symphony Pro, and using my iPhone with unlimited data to stream my piano performances via Facebook. The possibilities are endless.