What are your general thoughts/critique on iOS music production/performance?
This thread is about how we use the tools in production or performance, separate from the technical 'does it work / how does it work' aspect of this area of music.
We're getting established with more serious workflow possibilities these days. There's loads of iOS-produced music floating around (which is brilliant) and I enjoy listening to people's clips and shared tracks every day. Performance is rarer to catch, but it is well documented on YouTube.
So now that we're here and there is a body of work around us, let's critique each other and learn how we can further improve our creations!
My criticism of iOS music production is that a lot of tracks feel very 'unmixed' to me. The individual components are sculpted to digital perfection, but there is not always a consistent blend or 'mix cohesion', and things tend to feel somewhat flat and disconnected. With great multitrack recording functionality available now, I think that taking the audio out of the creation process and going through independent mixing and mastering stages will be very beneficial to our process. I've also started noticing the same 'favourite' Thumbjam samples appearing all over the place, so I've been making sure to process these further whenever I use them, to avoid repetition.
In performance my opinion is very different: I find that people tend to be very attentive to volume levels and blend. My guess is that the live process and intuitive touch interface make this a natural thing. Performing with iOS instruments seems to be a confidently blossoming area, well suited to live performance. The thing I'm still finding the 'right spot' with is the interface when processing real instruments. AB+ABr works great for patch changes and the like, but intuitive performative changes are more of a challenge, and I don't think that a floor-button-board is the answer. This technology is an effective upgrade to 'pedal boards', so there's no need to limit ourselves to '70s technology in our control methods! FLUX:FX has made some great developments in making the interface work intuitively and is well suited to integrating with external controls, but it also desperately needs an update to its MIDI functionality.
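To illustrate the kind of performative control I mean, here's a rough sketch in Swift, assuming an expression pedal sending MIDI CC 11 into whatever app hosts the effect. The `setDelayFeedback` hook is a made-up stand-in for any sweepable parameter, not FLUX:FX's (or anyone's) actual API:

```swift
// Hypothetical effect hook -- a stand-in for whatever parameter you'd sweep live.
func setDelayFeedback(_ amount: Double) {
    print("delay feedback:", amount)
}

// Map a raw 3-byte MIDI Control Change message to a normalised parameter value.
// CC messages carry status 0xB0 | channel, then controller number and value.
func handleMIDI(_ bytes: [UInt8]) {
    guard bytes.count >= 3, bytes[0] & 0xF0 == 0xB0 else { return }
    let controller = bytes[1]
    let value = bytes[2]
    if controller == 11 {                         // CC 11: expression pedal
        setDelayFeedback(Double(value) / 127.0)   // scale 0-127 to 0.0-1.0
    }
}

handleMIDI([0xB0, 11, 96])   // pedal at ~75% of travel -> "delay feedback: 0.755..."
```

The point being: one smooth controller per expressive gesture, rather than a row of on/off foot buttons.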
I'm interested to hear other points of view!
Oscar
Comments
Thought-provoking post, Oscar.... I'll give it some thought.
I've noticed that things I post on SoundCloud are much boomier than I intended. I know the reason (and keep repeating the mistake): I need to listen to the composition on other systems, since trusting the iPad speakers is what leads to this. I guess the convenience aspect can be taken too far.
Very interesting about the iPad speakers. I have long suspected that the lack of mix cohesion comes from listening on cranked-up earbuds, but it didn't occur to me that people are working on the device's own speaker.
O
Yeah.... I know better, but I'm too lazy to break out the cans or throw it on my monitors.
Perhaps, in a general sense, it's also the ease of sharing a 'work in progress' built into our tools that impulsively puts our unmixed arrangements into the world. My colleagues working in professional music production vigilantly protect early and unfinished versions of tracks and actively remind their clients not to share unfinished or unmastered mixes. It's obvious why: what they put into the world directly represents the level of quality of their craft, so they only want the finest examples to exist publicly.
We're in a different space to that. It's really a wonderful thing to have so many ideas and creations shared so actively, but it also shows the natural immaturity of our market niche from a 'user perspective' (it goes nicely with the immaturity of our toolkits, though ;) ). All in all, I don't know if it's a good or bad thing. Maybe just a 'different' thing that we can learn from.
O
That's an interesting take on it @OscarSouth . Makes sense, but yeah....not sure whether that has a downside.
For noodling about on the couch I also like to use the onboard sound.
Looking forward to the iPad Pro. The sound on the old iPads is really shit.
But that's just for me, anything that leaves the house and is for other people to listen to is done on Sony cans.
I strongly agree with mixing being the biggest iOS issue. Most of my work is done on Apple earbuds, which I'm quite happy with. I occasionally employ my ATM50 cans, but they're just not as convenient or comfy. Checking mixes in the car and on the living room stereo helps tremendously, but I've not been consistent about doing so.
I am hearing very good production by people using iOS technology.
Unfortunately, live performance still has quite a ways to go. Too often it still just looks like someone checking his/her email. I do hold out hope for change, though.
This is my biggest gripe with iOS live, because just about all of the other issues/gripes I've had in the past have been worked out. When some app finally offers the ability to move my setup (Audiobus presets, sessions in looping apps, etc.) from one song/session/project to another down a setlist with a MIDI program change (available at my foot), iOS will have arrived. I have high hopes the Audiobus team will solve this eventually (since they solved the routing issue so well previously).
Agreed that this is a shortfall of iOS, but to me it's more of a specific complaint about the technical nature of our tools than a general issue reflecting how we're representing ourselves (opinion, of course). I think this limitation is policing itself in a way: it stops people from thinking in a traditional manner when approaching live use, and those who aren't getting past that point aren't going out and performing. I say this because I've not really seen anyone out performing like this, and the successful iOS performers I've watched have all found a concise and innovative workflow that works for them, focusing on interface and ease of control over traditional thinking. I've also found my own live iOS experience to be, by necessity of the toolkit, highly focused on elegant use of minimalist setups and intelligent signal routing (THANKS JONATAN LILJEDAHL!!!).
O
I agree with all your points. I've had to drastically alter my approach to live performance in order to make it work, and I've done it because the upside is substantial. Namely, I can play with the sounds I hear in my head rather than having to make the concessions I've made previously. I work around the current limitations, but I don't think that means that I can't voice my opinion about the "better mousetrap".
To be clear, I'm not asking for scaffolding levels of complexity--instead, I'm looking for a relatively simple solution: I create a setlist of Audiobus (or AUM or whatever) presets, and I can use a foot pedal to program-change my way through the set. This is neither a "traditional" nor a "non-traditional" way of approaching things--everyone from DJs to songwriters to EDM producers has a concept of "different things" that they move between in a live performance set.
Ableton Live solved this years ago. Why has it taken so long on iOS?
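To pin down exactly what I'm asking for, here's a rough Swift sketch of the mapping (the `Setlist` type and preset names are made-up illustrations, not any existing app's API): a MIDI Program Change message (status 0xC0 | channel, one data byte) indexes into an ordered list of saved setups.

```swift
// Sketch of the setlist idea: an ordered list of saved setups, selected by
// incoming MIDI Program Change messages from a foot pedal.
struct Setlist {
    let presets: [String]   // e.g. Audiobus or AUM session names, in set order

    // Returns the preset a Program Change message selects, or nil otherwise.
    func preset(for bytes: [UInt8]) -> String? {
        guard bytes.count >= 2,
              bytes[0] & 0xF0 == 0xC0,            // Program Change, any channel
              Int(bytes[1]) < presets.count else { return nil }
        return presets[Int(bytes[1])]
    }
}

let gig = Setlist(presets: ["Opener", "Ballad", "Looper Jam"])
print(gig.preset(for: [0xC1, 2]) ?? "ignored")    // foot pedal sends PC 2 -> "Looper Jam"
```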
I am with you in that loading multi-app scenes should be more reliable. Right now it's vaguely doable, but we're just 'more or less' making do with what we have.
However, Ableton Live works inside its own single-app ecosystem, while we're running networks of apps together, so I don't think we're ever going to have an 'Ableton Live on an iPad' scenario. Something great and flexible, for sure, but also different in its nature.
O
Yep, good point about Ableton's walled garden.
I still have hope that someone will propose a standard that the iOS community can get behind.
I tend to believe that any lack of mix cohesion an iOS track may have is just symptomatic of the digital recording revolution in general.
I began my recording odyssey in the relative Stone Age of 1993, but at 17 years old and crazy about music & recording I was thrilled to have my new Tascam 424 Cassette PortaStudio to make demos with. I moved on to the eight track version, the 488 mkII a couple years later, then got the experience of working around pro analog 24 track/2" tape as an intern at a local studio.
The shortcomings of analog made things more challenging and at times frustrating, but there was a commitment to an idea. There was the feel of sculpting with marble, as opposed to the liquid and indecisive experimentation that can happen with the undo-laden digital sculpture of today.
Sgt. Pepper being done by bouncing between two Studer J-37 four-track machines is often mentioned in the analog vs. digital discussion, an example of the "necessity is the mother of invention" cliché in action. Part of me likes to think that Martin & Emerick and the Beatles still would have made the same album with a Pro Tools rig instead of the Studer machines and EMI mixer they had at Abbey Road. But a greater part knows that more options breed indecision, and they may very well have suffered from the same problems we do in getting clarity and cohesion in our music today.
I feel that's mainly where the issue is with digital recording, and in turn our experiences here in the iOS music community. We'll massage a track to death, adding 3 different plug-ins and editing it to perfection while listening to it on solo for 45 minutes, AND THEN wonder why it isn't gelling with the mix. I know this as I type it, but am very often guilty of it. It is very difficult not to experiment with or manipulate a track when we have so many options and toys to work with. We almost second-guess ourselves and don't believe that the well recorded, well played guitar track will be just FINE as it is. The temptation is always there to "wait, let's try this" it, for better or worse.
The great part of iOS recording that differs from the computer based DAW, mouse & keyboard style experience is that the touch interface an iPad brings is very immediate, just like working with an analog mixer or piece of gear years ago. There's a simplicity in the iOS realm that for me at least allows for the technology to meld into the background and the music to spring forward.
The limitations of processor & RAM, although it's getting close to being a moot point with the power these tablets have today, can make us think twice before slathering a track with every plug-in we love. iOS apps are also usually more streamlined, allowing for a quick workflow. The experience recalls those early days making demos on the 424 PortaStudio, just more immediate and, bottom line, fun.
You should never ask me an essay question, by the way; I do go on. And I could go on here, but I hope I've made my point. If the mixes you're making aren't gelling, and you're monitoring on quality gear and have a decent, musical set of ears, then the approach of surgically getting everything perfect may be robbing the track of its life... it can't 'breathe', so to speak, because it has been nipped and tucked to death.
I have run into it plenty and lately I try to trust my instincts more and am trying to embrace my decisions and stick with them. Do the bass part, make sure it sounds good with the tracks already recorded, move on. If all the tracks are dealt with like that, kind of mixing as we go, then hopefully the mix will come together better and all will be good.
Great topic, it really affects every iOS producer... be cool.
If Sgt. Pepper's were recorded today, would John be singing through Bitwiz on "Lucy," utilizing the mysterious variable i? Would John be a ghost, or would we be on a different timeline entirely?
Using the internal speakers is a pitfall, for sure. Really any system that skews the bass response is trouble for mixing. The mix ends up sounding like the opposite of the system's response, like, if there isn't enough bass response you will compensate by making the mix too boomy.
Repetition and endless looping are something to watch out for in electronic music as well. I'm trying to hit stop whenever I can, so as not to fatigue the ears too quickly on a good loop.
This is yet another super informative thread topic here on the bus. Thank you @OscarSouth and all the others so far, I am re-thinking my process with each post, seriously.
I think there could be a better-designed conduit from raw tracks that sit together in a DAW to a mixing/mastering-type destination. In Audiobus and AUM we have input/effect/output setups, which may lend themselves to overlooking proper balancing/mastering/refinement of a song. I guess if there were a stage after output that was as effective and easy to use as our other apps, tracks might become more polished and uniform across iOS.
Hm, also working on iOS is very different from what I do on a laptop.
On iOS I just throw together some apps in Audiobus, play around a little and then hit record at some point. It's live and spontaneous.
In a DAW on a laptop I am rearranging, mixing, resequencing, recording and deleting forever. So that is very, very different. It's not spontaneous at all.
So on iOS it's more like: let's see what I can come up with today.
I wish more apps would forget about traditional interfaces and go with something completely new that plays to the strengths of a touch screen. Like, why are we still turning virtual knobs on synths? There's still this tendency to put so much visual information all over these interfaces that sometimes I feel like my ears are not the ones making decisions. I feel like iOS music is in a great spot to push the medium forward even further. Am I the only one who could do without seeing another piano roll or waveforms in a timeline for the rest of my life?
Take Patterning, for example. If someone had asked people, "hey, how'd you like a drum sequencer whose main interface is a bunch of concentric circles?", I feel like the gut reaction would be "huh? That sounds a little weird." We need more ballsy interfaces in that vein.
Timeline-based stuff sucks; it's so retro.
I want to play around with ideas and not record "the opera" I have already fixed in my head.
100% agree. But humans love mental models they've accumulated...
They also don't know they want a thing until you show them that thing. Case in point: Patterning, and almost all Apple products. Anyway, we are in agreement.
This is a great point that's often overlooked. If we were told to make a song in our DAWs as normal, EXCEPT we couldn't switch to the edit view, I believe it would startle all of us how often we depend on visual representation in our audio work.
But that's how it was for over 50 years in recording studios. Plus, for anyone in the early digital age with ADATs and digital hard disk recorders in home project studios, it was the same.
I try not to rely upon the visual too much, especially with levels, but I am totally in there with the zoom, editing up bits and pieces way too often. It's just hard not to... because we can.
I don't have kids but have 4 nephews and a niece. One of the boys last year, 10 years old, said to my wife and I and his Mom, "How did you guys live without Google and the internet?" with genuine amazement. We all laughed and told him that we did, we managed just fine. It totally made me think though of how commonplace the technology we use is and our dependence on it...even when before those technological advances came, we somehow managed.
I believe the 'cohesion' issue stems from a lack of focus. The iPad is a single touch-sensitive screen, but the majority of the focus is geared towards connecting what you have on the screen to a multitude of other apps behind what you see and interact with. I feel like this is counterproductive and makes it difficult to get a cohesive sound when your whole process is the polar opposite of that. There is a Lego mentality that surrounds much of the development, but the iPad is not a Lego, and Legos do not create an efficient workflow for instrumentation; by their very nature they are tedious. IMHO devs should be thinking about how to fit everything that is needed in the compositional process onto that one screen. Having to connect with other apps is not a feature, it's an unfortunate necessity, because so many apps take advantage of the current culture of development in this regard, building in a reliance on other devs to make an app complete.
Just to make a loop you've got to load up a sample editor, then a sequencer, then an FX module, then something to pan with and prepare your audio tracks with, not to forget a multitude of imports and exports between those apps, before finally exporting to the app you're going to arrange with, etc... The makers of Audiobus and AudioShare have been the saviors in this regard, making composition on this platform a noble quest, but they wouldn't have had to be saviors if things were on the right track in the first place. So the environment is not conducive to cohesive output.
Remember when Ableton Live came on the scene: the main feature wasn't the time stretching, it was that everything was right there on one screen, which made working in Logic seem like a completely unmusical task. iOS apps that get it, in my opinion, are apps like Samplr (minus the lack of panning), iKaossilator, SoundScaper (dense but functional), etc. It just seems sometimes like we focus so much on connecting many disparate apps together that we forget about connecting ourselves with an app/instrument for a complete experience within itself, and I think that disconnect translates into the final output.
@kobamoto Very well said! I agree wholeheartedly. I've had in my head for a while an interface that would put on one screen "everything that is needed for the compositional process". But I'm just a dude with no coding experience; no one wants to listen to me.
You and me both, db. I just submit requests to the devs I like and hope for the best. Some of these devs are so freakin' talented, all you can do is hope they lend you an ear.
Yeah I've had some good experiences with a couple of the smaller one man team devs. Those guys appreciate all the input they can get. Sadly, even their resources are limited.
RMS is ok ...