Comments
In the US, one of the main buyers of outdated Apple gear (Gazelle) has announced it will no longer buy this stuff, so trading it in for a discount on new Apple hardware is the only sensible option if you don't want to deal with selling the hardware to another person yourself.
I'd much rather sell it myself, but I'll probably keep it as a secondary device anyway when I upgrade.
I have some minor cosmetic damage—a very small screen crack from leaving the iPad on the floor one night when it was freezing. (Protip: Don’t do that.)
Oh, no! Apple investors would be really, really, very, very angry if they did offer those features, because AirPods sales would tank!
That should also be enough to cover the cost of a headphones dongle. Gonna have to wait until next year’s budget to get a case though!
I doubt it. For most people AirPods are about convenience and coolness. I doubt that those of us who can't use 'em for music production are even a tiny blip on the sales radar.
I find myself wondering what percentage of iPad owners have ever plugged anything into the port besides a charging cable, or maybe a Pencil before they went to induction charging. I wouldn’t be surprised if it’s a single-digit percentage.
(Surface is also a tablet).
I was referring to a futuristic MacBook with a detachable keyboard and touch screen (a "MacPad") that would work as a laptop or as a tablet. It would have all the raw power and memory of macOS and desktop apps. And yes, it would run iPad apps too, thanks to the M1 architecture. It would also offer a stable desktop O/S and apps, as opposed to iPadOS/iOS and apps that keep breaking and crackling with battery issues on every new iOS/iPadOS release, which makes them an unstable, unreliable, non-viable option for serious music production. Why would you not want a MacPad??
Just look at what they did to the iMac - a bulky desktop transformed into a wafer-thin "screen" that looks like a giant iPad - only it has no touch. With thinner profiles and the M1 running iPadOS/iOS apps on iMacs/MacBooks (reminds me of Roland ZenCore running on multiple keyboards), Apple is THIS close to making the keyboard detachable with touch on MacBooks (and maybe touch on iMacs too).
For the same reason Apple would not put an FM radio inside the iPhone: Apple Music sales would tank.
The Surface is the example of why you wouldn’t want that: because operating systems designed for use with a keyboard and mouse are horrible for touch operation.
Macs have had touch input for years. It’s on the input device - the mouse or trackpad - where it belongs.
I’m inclined to doubt that - streaming music on demand can easily substitute for radio, but the reverse is not obviously true, especially now that we’ve had years to become accustomed to getting exactly the music we want when we want it.
@celtic_elk said:
O/S and apps are software that can be updated - not hardware that has already shipped. They can be made touch-friendly. If anyone can do it, it is Apple.
That's what we have been trained to believe for ages - what belongs where. It is just a matter of updating the O/S and apps. Steve Jobs mocked using a stylus on tablets for years (he was mistaken) - no need to mention what the Apple Pencil is doing now.
I invite you to consider the possibility that Apple has done their research, considered the trade-offs inherent to building a single OS for both touch and keyboard-and-mouse environments, and concluded that it can’t be done without unacceptably compromising the user experience. That would be a very Apple way to proceed, don’t you think?
Jobs mocked devices that required a stylus for operation. That’s kind of an important distinction. Pre-iPhone devices had styli because the touch interfaces were so poorly designed that you couldn’t feasibly navigate the device with just your fingers. The Apple Pencil, in contrast, exists because a pencil-like input is the most natural way to perform specific tasks on a tablet, like handwriting or fine drawing. Not everyone will want to perform those tasks. I’ve owned an iPad since the first one was released - I pre-ordered, in fact - and the only time I’ve ever been tempted to use a stylus is to manipulate Sugar Bytes’ interfaces, precisely because they couldn’t be bothered to redesign their desktop (read: mouse-and-keyboard-centric) interfaces for a touch environment.
@xor said:
Ah, yes. But could they fit more memory on that SoC+memory package? Maybe, I don't know. Or maybe, as a first generation, it just wasn't designed to support more than 16 GB. I'm just speculating here; I think those who want more memory will have to wait for the M2.
I grew up listening to terrestrial radio, and I have all kinds of radios - AM, FM, HD, satellite (with a free lifetime subscription), internet - and I still use them. I would prefer listening to random radio stations all day, any time, over on-demand music. It is unpredictable.
FM radio tuner chips are unlocked for users in Android/Samsung mobile devices -
https://www.google.com/search?q=android+phone+radio
Apple removed the FM chip from the iPhone 7 and up. Even in the earlier models, they never unlocked it.
Why would the user experience be compromised? Users who want a keyboard/mouse could still use them - they would still be available and work the same way. Users who want touch could still use touch (especially when the keyboard is detached), or they could use a combination of mouse/keyboard and touch for more power. Touch would only give us more power - it is one more way of doing things, whichever is convenient in context. What is the "difference" between a laptop and a tablet when the keyboard is detached? Just the O/S and apps - which can be updated and made touch-friendly.
The Pencil could have existed much earlier, with the same apps that existed in the pre-Pencil era. Third-party pencils and styluses came into existence well before the Apple Pencil. Jobs was the reason the Pencil was not launched for a long time; it was launched only after he died.
You can make streaming music random, at variable levels of randomness - see Spotify’s algorithmic playlists, for example, or artist-curated playlists, or even streams of actual radio stations. You can’t make over-the-air radio on-demand, apart from phone-bombing a DJ to demand that she play a specific song.
That is called streaming music - not broadcast/terrestrial radio.
Terrestrial radios run on AA/AAA/C/D... batteries for over six months. They are also GOOD for PUBLIC SAFETY in situations like hurricanes, cyclones, tsunamis, earthquakes. They are FREE to use too (just like GPS satellites).
Streaming radios/apps drain a fully charged high-capacity battery in under a day because of heavy processing, conversion...
That's like comparing a quartz wall clock that runs on a single AA battery for years with an Apple Watch that doesn't even last 8 hours on a full charge.
I think for larger computing surfaces Apple Glasses will eventually replace the larger screens, even if a touch surface of some kind remains. For a brief state-of-the-art look at what might be possible, see this video:
Maybe I think differently than most people about these issues because I have an information degree, which means that I went to school with a bunch of people training to be UI designers and other kinds of human-computer interface specialists, or because I’m married to an instructional designer. Or maybe because I’m strange. Regardless, it seems apparent to me that you design a UI very differently when your primary form of input is touch (which is relatively imprecise with respect to exact placement, and which has relatively few modes of interacting with the input device, but which can contact the surface in multiple places simultaneously, and can be continuously varied in pressure if the appropriate sensors are available) and when your primary input is a pointer, like a mouse (which is much more precise in locating specific targets, and which can be combined with additional button presses to add context to the input, but which is inherently a single point of contact, and which requires some different neural engagement, because you’re manipulating something onscreen using a device that’s offscreen). Imagine trying to consistently distinguish between the three window-manipulation buttons in macOS (close/maximize/minimize) with your finger, for example. That’s a bad user experience, which is anathema to Apple’s brand.
Apple took a lot of heat for building "a big iPhone" when the iPad came out, but from an interaction-design perspective, a tablet just is a big smartphone - the interaction affordances are exactly the same, except that the screen is bigger, so you have more opportunities for complex multi-finger interaction (like the onscreen iPad keyboard, which I’m using to write this analysis). macOS, in contrast, was (at the time the iPad was released) the product of a quarter-century of refinement for the keyboard-and-mouse input paradigm. You simply can’t translate that to a touch-input paradigm and get a seamless user experience. Microsoft tried it with Windows, and it was a disaster - go back and read some of the early reviews of the Surface tablets, if you weren’t following tech media when it was released. The current Surface is basically a laptop with a touchscreen that you can kind of use as a tablet, for extremely simple tablet functions like scrolling a webpage or watching video - because those functions involve relatively coarse interactions with the interface, and can therefore be managed with a finger, which is a poor substitute for a mouse pointer. (A mouse pointer, on the other hand, is a very good substitute for a finger for many single-point-of-touch purposes, which is a big part of the reason Apple permits the M1 Macs to run iPad apps. There are some obvious exceptions - you can’t play an onscreen keyboard with a mouse in real time, for example, although you could manage step input for sequencing.)
Correlation is still not causation. Personal computers have had stylus-input devices for decades; I’m old enough to remember the KoalaPad, which was a resistive-touch tablet with a stylus meant for drawing with a personal computer back in the 1980s. Jobs would certainly have been aware of the utility of styli for specialized tasks - this is the guy who helped bankroll Pixar, remember. An alternative hypothesis is that a stylus was a relatively low priority for Apple, and it took them a while to get the technology developed to the point that it met their interaction standards. Remember, Apple is rarely the first mover on a particular technology, because superior interaction design is part of their brand, and that stuff takes time to develop. If you’ve got sources that explicitly say that Apple delayed releasing the Pencil until after Jobs died, that would be a different story.
All of these points are inapplicable to a discussion about Apple’s decision to remove access to FM radio on the iPhone, because even if it’s receiving an FM radio signal, it’s still an iPhone.
But what is the high-level difference between a laptop and a tablet when the keyboard is detached? Just the touch-friendly screen, O/S, and apps (I don't know the inner workings of the hardware). If the screen is touch-friendly before it ships, the O/S and apps can be updated and perfected over time and made touch-friendly, which would be a bonus on top of keyboard/mouse. If screens can be made to respond to touch, touch can also be incorporated into machines that have a keyboard/mouse, giving users more power. That hybrid interface should only enhance the user experience - just as the invention/perfection of the mouse or the game controller (in addition to just the keyboard) made a huge difference in computing and gaming. Even character-based applications support the mouse, and it offers an enhanced experience -
https://www.google.com/search?q=character+based+application+mouse+support
Microsoft also had a crappy tablet in the 2002 era (bigger than an iPad, with a stylus), which was a huge flop. The touch experience on the Surface may be mediocre, which speaks to their competence in developing hardware/software. As you are aware, Apple perfected the tablet, made it fluid, and re-introduced it - they did not invent it.
You are right.
Apple "removed" the FM Tuner and did not act in the best interests of public safety - when competitors did. What could be the reason?
Even when they had the FM Tuner in older models (iPhone 6s and below), they never unlocked it in the best interests of public safety - when competitors did. What could be the reason?
Apple "removed" the headphone jack - when simply retaining it would have offered enhanced experience like it did before. What could be the reason?
I think you’re missing my point. I’m not arguing that you can’t design an OS that natively supports both keyboard-and-mouse and touch. Microsoft made Windows do it, after all. I’m arguing that any attempt to do so is inherently flawed - you can’t design a single OS that does both of those things well, because the two paradigms are at odds in terms of what makes for a good user experience. To put it bluntly, Microsoft has decided that a half-assed Windows touch experience is good enough, and so you have the Surface; Apple wants you to have a superior user experience on all of its devices, and so you have a separate OS for the Mac and for the touch devices (and in recent years, a separate OS for the iPad distinct from the iPhone, as it moves more in the direction of a general-purpose computing device within the touch-input paradigm).
Geeky side note: I tend to think of the Apple/Microsoft approaches here as akin to the Marvel/DC approaches to superhero shared universes. Marvel did the work over multiple solo movies to prepare the ground for Avengers; DC saw their success and tried to replicate it in a hurry with what they had on hand (a new Superman film, which was already in production when Avengers dropped) and ended up with two mediocre films that they've been quickly backing away from.
If scrolling web pages and watching videos is all that Surface does, then not many people would need Surface.
StaffPad was introduced on the Surface first.
StaffPad came to iPadOS years later - because of the Pencil.
Time will tell - let's wait and see.
I saw a video on YouTube where somebody tests the Surface with some audio software (a DAW) and VSTs. The pinch-to-zoom function was so horrible and jittery that I said right there: no way could I take that after the iPad experience.
And yes, it looks like right now it is not possible to have a great touch experience and then attach a keyboard and mouse and get something like the full macOS experience. This is why I lean towards a Logic for iPad that's different from what we have on macOS. It's just not possible now. Or maybe Apple will surprise us with iPadOS 15...
I didn’t say it was all that Surface does. I said that it was likely what the vast majority of Surface users use its touch features to do. There’s a reason that Surface advertising almost invariably shows it with the keyboard, and that recent Surface reviews devote little or no space to addressing its touch responsiveness: touch is simply not an important mode of interaction with the device for most users. It’s a laptop that happens to have a touchscreen.
StaffPad users are not a majority of Surface users, and a single example of a Surface-first app is not an indication that there’s a strong demand for Windows apps with rich touch features. I would, in fact, argue that the absence of a rich ecosystem of touch-native Windows apps is a pretty good indication that the Surface’s touch features are secondary at best for most users.
Time always tells, but what’s the fun in waiting for that? 😉
They offered me 355 USD when I typed it in. I think it would have been closer to 50 or 70 if I had indicated there was minor damage beyond cosmetic. This is for a 2017 iPad Pro 12.9 with 512 GB.