Comments
The thing some people are worried about is that the system can be used for other images, not just those from the CSAM database. That's a valid concern, but again, this system is limited to exact matches, so that does limit its abilities. This is already much less than what the likes of Google have been doing for the best part of 15 years.
Apple have already stated they won’t agree to this anyhow. You can choose to trust them or not.
If you don’t want your photos to be matched to those in a known database, turning off iCloud Photo Library is all you need to do to defeat it.
Apple clearly aren’t doing this to aggressively hunt down nonces. They’re just preventing certain images from being stored on their servers.
@tja time to stop resisting Linux, both on your desktop and on your phone (sorry, had to! The truth is, though, Linux on phones still kind of sucks, both GNOME and Plasma).
I look at things like this.
If you don't pay your taxes, government services will find you faster than you can blink.
It's not that they don't know what's going on.
It's because they don't care except for the coin.
Power can definitely corrupt, and the moment you use their devices, agree to their terms of service, etc., you have given them power over you. So if you do not want them to have power over you, or to assist them in having power over others (by buying apps, devices, etc.), you simply need to stop using Apple and send me your iPads. Preferably 1-2TB M1, pls, thx.
Nah.
Don't send them to AudioGus...
Send them to me.
Thanks.
Gah! Foiled again!
“Last year, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.” — NYT
Apple are the only tech giant that has made privacy a selling point. If they were to betray their customers' trust, they'd be screwed.
All of the other big tech companies scan far more aggressively for content. Both for law enforcement and for selling your profiles to anybody willing to pay.
Mistrust Apple by all means, but they’re by any metric the least invasive of any of the big players.
The least bad, if you like. None of them are in this for you and me; they're money-making machines.
But Apple have put a lot of eggs in the privacy basket. It's not in their interest, at least for now, to screw that up.
Will they in the future? Probably, but by then Apple will be the least of our troubles.
If you don’t like it, the only safe way is to go fully analogue. Apparently the German military intelligence services went back to using typewriters for certain tasks.
The only way to ensure that you are not tracked and profiled online is to stop using digital devices.
You have to draw the line somewhere. I’m ok with Apple. For now at least. I’m not a fan of Facebook and their practices so I don’t use Facebook anymore.
Vote with your feet.
And don't judge whether I have an imagination or not on a couple of forum posts. Casting aspersions says more about you than it does about me.
If they send two or more I’ll share them with you.
I’m not selfish.
Reading between the lines, I think they were forced to. The number of reported cases compared to Facebook's last year says a lot about each company: 265 to 20 million. Yikes.
Likewise.
Well said, exactly how I have been thinking about it.
You are clearly paranoid and overthinking, bordering on trolling. If you are so concerned that Apple would misuse or abuse your content, you should probably not use iCloud or even Apple devices. Apple never forced anybody to use their platform or ecosystem; it was the users' choice all the way. People also took photos, shot videos and recorded music 25+ or 50+ years ago on film, tape and disc, when there were no digital devices, and used pagers, landlines, walkie-talkies, Walkmans, radios, camcorders and VCRs. There are many other (not-as-great) alternatives on the market.
Exactly!
Some people don't understand how technology and law work. They overthink even when there is evidence that Apple never abuses or misuses user data.
The CORPORATE SECURITY TRAINING says:
"Sending a FAX is safer than sending Email"
Apple obviously doesn't have the actual CSAM pictures; they are using hash values for the CSAM pictures, and a local scan takes place on your device to see if there are any matches. But regardless, once things are uploaded to iCloud, they're encrypted. There is supposed to be only one set of keys able to decrypt iCloud data. If Apple created a master key, every single thing Tim Cook said in that FBI vs Apple case in 2015 didn't mean anything, and the doors are now wide open.
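For anyone wondering what "matching hashes, not pictures" looks like in practice, here's a toy Python sketch of the general idea. To be clear, this is just set membership with an ordinary cryptographic hash; Apple's real system uses a perceptual hash (NeuralHash) plus a private set intersection protocol, so every name and value below is made up for illustration:

```python
import hashlib

# Toy sketch of hash-set matching. NOT Apple's actual system: the real
# one uses a perceptual hash so resized/recompressed copies still
# match, and a PSI protocol so the device never sees the blocklist.

# Hypothetical blocklist of known-bad hashes (hex digests only).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(image_bytes: bytes) -> str:
    """Hash the raw bytes; the comparison only ever sees this digest."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_blocklist(image_bytes: bytes) -> bool:
    # Neither side needs the blocklisted pictures themselves, only hashes.
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_blocklist(b"holiday photo"))  # False
print(matches_blocklist(b"test"))           # True: that digest is sha256 of b"test"
```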
I wonder, wouldn't it be possible, using a brute-force algorithm, to create a visually completely different image that gives the same hash value as one of the sesame, errr, CSAM images, then distribute that image to unsuspecting people and thus completely bust the system?
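That's essentially asking for a second preimage, and whether it's feasible depends entirely on the effective size of the hash: hopeless against a full cryptographic digest, trivial against anything short or low-entropy, and perceptual hashes carry far fewer bits than SHA-256. A toy Python demo, brute-forcing a deliberately truncated 16-bit hash (all inputs invented for illustration):

```python
import hashlib
import itertools

def short_hash(data: bytes, bits: int = 16) -> int:
    """Deliberately truncated hash: keep only the top `bits` bits."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest[:4], "big") >> (32 - bits)

target = short_hash(b"original image")

# Try unrelated candidates until one collides with the target.
for counter in itertools.count():
    candidate = f"completely different image {counter}".encode()
    if short_hash(candidate) == target:
        print(f"collision after {counter + 1} tries: {candidate!r}")
        break

# A 16-bit hash falls in ~2**16 = 65,536 tries on average; a full
# 256-bit hash would take ~2**256, which is why the practicality of
# the attack hinges on how many effective bits the hash really has.
```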
It is encrypted, but not end to end encrypted. A list of what’s encrypted E2E is on Apple’s website.
If it’s not E2E, Apple have the keys to decrypt.
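To make the key-holding point concrete, here's a minimal sketch assuming a generic symmetric scheme (the Python cryptography library's Fernet); none of this reflects Apple's actual iCloud key management, it's just the trust difference in miniature:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Server-side encryption: the provider generates and stores the key,
# so the provider can decrypt your data at will (or when compelled).
provider_key = Fernet.generate_key()              # lives on the server
token = Fernet(provider_key).encrypt(b"my photo")
print(Fernet(provider_key).decrypt(token))        # provider reads it back

# End-to-end encryption: the key never leaves your devices, so the
# server only ever stores ciphertext it cannot read.
device_key = Fernet.generate_key()                # exists only on your device
e2e_token = Fernet(device_key).encrypt(b"my photo")
# Without device_key, the server holds opaque bytes and nothing more.
```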
Edit: wrote way too much.
Long and short: this new announcement changes nothing. All it means is that Apple have publicly announced that photos matching known CSEM images will be flagged. That's it.
Apple can already scan all your photos and look for whatever they want. Any worries you have about the new fingerprinting already apply to every digital photo you’ve ever stored. The new system is far less capable than what Apple can already achieve with their current ML photo processing. The slippery slope horse has long since bolted.
And your photos are already decrypt-able by Apple. They are not E2E encrypted, so Apple already has the keys. There is literally nothing stopping them fingerprinting already, either on device, setting flags, or server-side.
So your worries are valid, but they apply whether or not Apple implement a fingerprint match to the CSEM database. That bit doesn't suddenly give Apple new ways of spying on you.
In any case, bad actors have better ways of incriminating you.
You’re right. It seems they reversed a decision to make everything end-to-end encrypted somewhere around 2017, after meetings with the US intelligence community. And the lists you linked to kinda go hand in hand with that.
One can choose to believe the theoretical ramblings of @tja...
OR a PhD from Oxford University, a cryptographer from Caltech who holds a PhD from MIT, and a cryptography and privacy researcher from Bar-Ilan University's Dept of Computer Science.
These are the encryption experts who provided several papers on the technical assessment of CSAM detection and on the security protocol and analysis of the Apple PSI System, all available on Apple's website.
Apple have released an FAQ about the child protection features.
I do agree that it's a slippery slope, and I'd trust any large corporation as far as I could throw it, but haven't they made it abundantly clear that they're checking hashes against a particular database of known child porn images, and NOT doing image recognition of nether regions and informing family members if kids are taking nude selfies? Let's talk facts here. It is alarming, and it does open the door to other uses, but we need to deal with what Apple are actually doing, rather than some imaginary version of it, and then assess the possibilities, which are indeed rife.
“Well-meaning” censorship usually negatively affects those who don’t get to call the shots. Crackdowns on pornography inevitably affect those who are distributing vital sexual health education material, sexual minorities, dumb kids fooling around, etc.
Meanwhile, the biggest worry I have is that we must trust Apple to refuse any other nefarious uses requested by various governments. They've got an okay track record of refusing to unlock iPhones for the FBI, BUT the FBI doesn't control one of their major markets the way that, say, the Chinese government does.
(Apologies, I intended to post hours earlier, before the conversation evolved!)
You have clearly not been in the places where these people exist. I have worked around that drain. Things happen to children that are not reported in the press because they are simply too awful. If someone can be stopped, then we must put up with the ways that are used to do that.
"Apple is scanning my data! Read all about it on my facebook page!"