
Apple scanning images on device and in iCloud


Comments

  • edited August 2021

    @tja said:

    @ashh said:

    @tja said
    The proportion of iCloud photo uploads being child porn (or "CSAM", one more stupid abbreviation to remember) is probably in the same ballpark as bombs in underwear at supermarket entrances.

    You have clearly not been in the places where these people exist. I have worked around that drain. Things happen to children that are not reported in the press because they are simply too awful. If someone can be stopped then we must put up with the ways that are used to do that.

    You quoted that as if I had written it!
    That is not true!

    That text was from @SevenSystems

    Check above and change, please.

    They couldn’t be more wrong.

    My wife works in safeguarding. It’s a frighteningly huge problem.

    The fact that Facebook reports more than 50,000 matches a freaking day should tell you something.

  • edited August 2021

    @SevenSystems said:
    I wonder, wouldn't it be possible, using a brute force algorithm, to create a visually completely different image that gives the same hash value as one of the sesame errr, CSAM images, then distribute that image to unsuspecting people and thus completely busting the system?

    DING DING DING

    This is worse than a collision. It's a preimage attack.

    https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1

    Apple, sorry, but the CSAM is a fu***** sh**
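    To see why perceptual hashes invite this class of attack at all, here is a toy sketch of a simple "average hash" in Python. This is an illustrative stand-in, not Apple's NeuralHash, and all the names and pixel values are made up for the example:

    ```python
    # Toy illustration of why a perceptual hash (unlike a cryptographic hash)
    # is vulnerable to collisions and preimages. An "average hash" turns each
    # pixel into one bit: 1 if it is brighter than the image's mean, else 0.
    # Many visually different images therefore share a hash by design.

    def average_hash(pixels):
        """pixels: flat list of grayscale values (0-255). Returns a bit string."""
        mean = sum(pixels) / len(pixels)
        return "".join("1" if p > mean else "0" for p in pixels)

    def preimage(target_bits):
        """Construct pixels hashing to target_bits (works for non-uniform targets)."""
        return [255 if b == "1" else 0 for b in target_bits]

    # Two visually different 2x2 "images" collide:
    img_a = [200, 10, 10, 200]   # soft contrast
    img_b = [255, 0, 0, 255]     # harsh contrast
    assert average_hash(img_a) == average_hash(img_b) == "1001"

    # Preimage: given only a target hash, build a matching image:
    assert average_hash(preimage("1001")) == "1001"
    ```

    Real perceptual hashes are far harder to invert than this toy, but the structural point stands: similarity-preserving hashes are designed so that different inputs can share a hash, which is the opposite of what a cryptographic hash guarantees.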

  • The user and all related content has been deleted.
  • edited August 2021

    @MobileMusic said:
    Instead of jumping to amateur conclusions, let's just agree that the thousands of highly qualified and experienced engineers, attorneys, C-level executives, sales and marketing people at Apple know the law and technology and how to do business better than any of us. Corporations are ruled by attorneys - why would a trillion-dollar company do anything illegal? It is just machines scanning photos for some patterns - not real people peeking into our photos. We are ok with security cameras scanning our luggage, our "bodies" and getting touched/searched by security people at airports, offices...

    When was the last time Apple was found to misuse or abuse user data? They've always stated they are not in the business of collecting and harvesting user data and demographics and selling it. Security and privacy are some of the best features of iOS -

    https://www.google.com/search?q=apple+refuse+to+unlock+iphone+for+fbi
    https://www.google.com/search?q=ios+facebook+privacy
    https://www.macrumors.com/2021/04/26/tim-cook-mark-zucerkberg-2019-private-meeting

    Criticizing Apple is the job of trolls who cannot afford Apple devices - not passionate iOS users who have better things to do. When did you see an Apple user trolling on copycat products? So, let's leave that to trolls and let them have some fun for a few minutes - because copycat vendors will soon follow Apple, anyway!

    EDITED

    I am a passionate iOS user, but if you think that corporations have your best interests at heart then unfortunately you need a rethink. Amazon should have paid 2.4 billion in corporation tax in 2002, yet using legal loopholes they paid 6 million; got to think of the shareholders. Governments can equally breathe down our necks. In the 70s I could have withdrawn any book from the library and nobody had the remotest interest in what I was reading; now I read a book and then find the robots know what I have read and spam me with adverts. Just imagine in the 70s you read a book and, when you have left the library, billboard posters display your personal choice of book.

  • I'm trying to think of the last time I went to the library...

    Oh yeah...

  • edited August 2021

    The less people understand about this subject the more vocally opposed they are.

  • @BitterGums said:
    The less people understand about this subject the more vocally opposed they are.

    That means the vast majority of people really understand this? Or really not that bothered? Or completely unaware-ignorant?

  • I'm conflicted about the privacy, slippery-slope, potential for abuse by corporations and government, etc. vs. the (assumed) good intentions. From what I understand of the technical implementation, it seems not as invasive as the hysterics around it would suggest, to me at least.

    Regardless, I respect the right of a private entity to regulate and govern the usage of a service that it provides and that it does not force anyone to use. If they choose to take measures to protect themselves from being held liable for abetting such image trafficking, and they disclose thoroughly what they're doing, that is their right.

    What I'm reasonably certain of, however, is that there is great potential for people to figure out exploits to use this against others. It's relatively trivial to construct innocuous-seeming images that match a flagged hash. It would be easy to craft a payload of such images and get an unsuspecting party to save them to their photos. If iCloud storage is enabled by default, someone could find themselves flagged and investigated without ever having knowingly done a thing.

    You might say "That couldn't happen because the database is protected." Yep, and so were 40 million T-Mobile accounts, 191 million US voter records, health records of 25 million US Veterans, 21.5 million US government employees, a whole trove of NSA hacking tools ... I feel sorry for anyone that thinks that obtaining that database is impossible or even highly unlikely.

    You might say "But, if referred to authorities, the images would be un-encrypted and the person cleared." I dunno. All it takes is a tiny whiff of suspicion getting out to ruin someone's life, even if they are cleared. And, I don't know about you, but I've seen and experienced more than enough of the government jumping to conclusions and being reluctant to back down.

    Potential for malice, blackmail, etc. is just too, too high, if you ask me.

  • @tja said:

    @OnfraySin said:

    @SevenSystems said:
    I wonder, wouldn't it be possible, using a brute force algorithm, to create a visually completely different image that gives the same hash value as one of the sesame errr, CSAM images, then distribute that image to unsuspecting people and thus completely busting the system?

    DING DING DING

    This is worse than a collision. It's a preimage attack.

    https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1

    Apple, sorry, but the CSAM is a fu***** sh**

    Yes.

    And CSAM itself is not the problem; the problem is the whole dynamic system of classifying images based on "some source", for "some reason", and in the interest of "some group(s)", able to upload any content, decryptable by others, to iCloud - or wherever.

    This is the most terrifying thing ever.

    Wow I wish I lived in your world where that really was the most terrifying thing ever.

  • The user and all related content has been deleted.
  • The user and all related content has been deleted.
  • I'm all for it. The more pedos they catch the better.

  • edited August 2021
    The user and all related content has been deleted.
  • edited August 2021

    @cyberheater said:
    I'm all for it. The more pedos they catch the better.

    It is not about catching pedos or acting as law enforcement - that's not their business.

    They just don't want any shit stored on their servers - to comply with the regulations. That's all.

  • @MobileMusic said:

    @cyberheater said:
    I'm all for it. The more pedos they catch the better.

    It is not about catching pedos or acting as law enforcement - that's not their business.

    They just don't want any shit stored on their servers - to comply with the regulations. That's all.

    What a shame

  • The hysteria over this feature is completely misdirected.

    Apple's announcement changes absolutely nothing. Everything anybody has said that this system would enable can already be achieved.

    Anybody who thinks this system will make it easier for China to find dissidents, or for one of Putin's cronies to incriminate an enemy, is living in cloud cuckoo land.

    Does anybody truly believe that China is not already scanning every single photo stored on the iCloud servers they host, and that they actually need this feature to spy on their subjects? And how many people shouting foul already upload their photos to Facebook and Google? They already scan for CSAM images and for f**k knows what else and have been doing so for many years.

    This is an example of Apple being beaten with the hubris stick. They thought they'd come up with a better, more privacy-orientated system than scanning on-server, but people have jumped on the bandwagon because Apple said they'll be scanning on-device. Ah yes, people say, but governments can step in and make them scan for other things now too!

    As if they couldn't before.

    Apple could, at any time, be forced to do whatever any government wants. That's how governments work. Apple have not invented a system that is easier for governments to abuse, despite the claims. By making the system that they have, any image scanned has to match an existing image. So a new image of Putin doesn't count. For that you'll have to ask Google and Facebook -- they are able to match it on their servers already.

    Governments can and will ask for more and more. Apple haven't suddenly made it easier for them to do so. People like the EFF are using the hysteria to get their message across as it's in the news. Apple are big news. Nobody cares what the EFF say usually. It's manna from heaven for them. It's good that they're getting people to understand what governments can and will do. And how much many governments care about our privacy -- They're just glossing over the fact that all of this is already possible today to get their point across.

    I will happily change my opinion once somebody goes to prison for accidentally uploading a few dozen photos of cats that look like Hitler, reverse-engineered to have CSAM-matching hashes.

  • edited August 2021
    The user and all related content has been deleted.
  • The user and all related content has been deleted.
  • wim
    edited August 2021

    @tja said:
    It seems as pointless as a political discussion between members of two opposing parties.

    Hi @tja. I know you don't want to discuss it further, but if you don't mind, can you help me understand how this is a political issue?

    Apple is a non-government entity. It is providing a cloud storage service for photos, and it can be argued, could be held legally liable for content stored there. Call me a cynic, but I somewhat doubt that their real motivation is to protect children. I suspect it's more about avoiding lawsuits and the horrible publicity that would result from defending themselves against them.

    Regardless of motivation, I'm having trouble grasping how any activity that is undertaken by a private entity to regulate how services it provides are used can be construed as political. To me it would become political only if forced on them, or outlawed, by a government.

    I know some would say that developing the technology opens the door for just that. But I don't buy that. Governments can (and have) demanded that technology be developed to implement the controls they require. Just because it pre-exists doesn't make any difference. Will China demand Apple scan for other kinds of images? Maybe. Would they demand that even if the technology wasn't already there? Yes, I believe so. Would Apple comply either way? Sadly, I believe they would.

    If a government demanded Apple to do anything like this, then yes, it would be a political issue. But short of that, I don't see it.

    If it were used for anything other than detecting verifiable illegal activity it would clearly be a privacy issue. But, I argue that there can be nothing private about illegal activity that harms others. Let's say I have a security camera in front of my house and it captures a murder on my street. Am I to withhold that imagery from the police to protect the perpetrator's privacy? Or, if I withhold it and later it's discovered I had such info, could I be forced to defend myself as an abettor? Am I invading people's privacy by having the camera in the first place?

    Anyway, I've gone on too long. To simplify my conundrum on this: This seems like a private entity decision. I'm not really seeing how it is a political issue, or even a privacy issue (other than that I absolutely believe it will be hacked and misused) as long as it is possible to NOT use the service if one is concerned about it. I'm interested in your perspective on this, but understand if you don't want to weigh in because of the shit storm of argument it might trigger.

  • @tja said:
    I just cannot stand your stance and opinions in this, @klownshed

    Thanks.

    But I have no interest to discuss this further.

    You don't discuss. You just rant.

    I've made it clear that I think there is a big problem with privacy, just that Apple's new scanning thing is the wrong target for vitriol. But it's falling on deaf ears. You either want somebody to agree 100% or you're 100% against them.

    All I've tried to get across is that complaining about this new scanner thing Apple have announced is like having a go at the council for being environmentally unfriendly after they announce that from now on they'll be cutting the verge next to the motorway with petrol-powered strimmers... whilst thousands of cars belch diesel into the atmosphere 10 feet away.

    Nothing Apple have announced for iOS 15 can't already easily be done now by Apple. And probably is (and is definitely being done by Google, Facebook, etc.); otherwise how would Apple know how bad the problem with child abuse images on their servers already is?

    I'm glad that this is the end of the discussion, although expect that you can't bear not having the last word.

    Feel free... It's all yours.

  • edited August 2021

    For people who never worked in regulated environments (healthcare, finance, education, etc), it may be hard to understand what Apple and other companies are doing to comply with regulations and governance. Some users never even heard of basic regulations like COPPA. I did not get critical when I saw the news as I knew it was just another regulatory thing as an architect who designs systems complying with heavy regulations.

    Regulations are there to comply with and Apple is not an exception - only they are doing it in a much more graceful manner (on-device) than any other company. If you don't like/trust Apple, don't use iCloud - choose other alternatives that are invasive and let them sell your info. Without regulations, the world would be unreliable, unpredictable chaos and a shit hole. If you don't like regulations, just go analog. Mankind existed even when there was no technology or devices and people were happy too.

    Surprisingly, no one ever talked about the invasive practices that Google, Facebook, etc. have engaged in for ages: profiling users and building demographics to sell on their marketing platforms, making billions, which is how they make money. Is it because they are free? It's not free if they are selling our info (Yahoo Mail and Gmail are not free), and most users don't even know their info is sold. Well, iCloud is free too, up to 5 GB.

    How many people have seen or even heard about the Google/YouTube Ads, Facebook Marketing, Twitter Marketing, and Quora Marketing "platforms" (behind the scenes), which are visible only to paying clients and not to normal users? The ultimate purpose of these businesses is to build a huge user base, gather their demographics, and sell it on their "targeted" marketing platforms to get maximum ROI for their clients with minimum spending - to beat the competition. Clients are able to target their marketing campaigns to a specific region, country, state, city, zip code, language, gender, age range, hobbies, marital status, time of day, etc. on these platforms. A single click/view/impression on their ads can cost clients anywhere from a few pennies to hundreds of dollars and quickly snowball into a hefty bill. A client could stretch a $1,000 budget over a month, or burn through it in just a few minutes, on a campaign. If Facebook acquired WhatsApp for $22 billion, their sole intention was to acquire WhatsApp's huge user base and sell their info later on - even though WhatsApp was running at a loss at that time.

    https://www.google.com/search?q=facebook+whatsapp+acquisition

    The one crucial thing that differentiates Apple from all these companies is - Apple doesn't build our demographics or sell our info - even though nothing could stop them from doing that and add more billions to their bottom line because they have the user base, iCloud, AppStore, Safari, ecosystem... (any other company would sell such user base info). We are using Apple because we trust them. Apple is not making it mandatory for anyone to use iCloud.

  • edited August 2021
    The user and all related content has been deleted.
  • edited August 2021
    The user and all related content has been deleted.
  • edited August 2021
    The user and all related content has been deleted.
  • @tja said:
    Hello @wim

    Hi @tja, thanks for taking the time to answer. I know it's not a fun topic for you. Your answers did help me understand where you're coming from.

    I agree that this is a bad move from Apple, though for mostly different reasons. I disagree that they have any kind of monopoly in this case. I disagree that storing photos or anything private in nature in iCloud, or even taking photos with one's phone, is an essential service that needs to be regulated. I disagree that government should become involved. I'll just leave it at that. Our basic differences are rooted in what rises to the level that more laws need to be instituted. Indeed, that is a fundamental political difference where people almost never change their minds. So no point in arguing. ;) ✌🏼

  • @tja said:
    Hello @MobileMusic

    But like it or not, the systems that I mentioned above are NEW and more invasive than ever before!

    No, they're not. Do you have any better, brilliant ideas to comply with the stipulated regulations?

    They add LOCAL scanning (for whatever goals, with some easy changes).
    They add a way to DECRYPT remote content for others than ME (for whatever goals, with some easy changes).

    Local scanning is for our privacy.

    Ability to decrypt at the other end is for human verification and accuracy before acting on the content - a fool-proof system, without which users won't have peace of mind about any false positives. Do you have any brilliant ideas to help with human verification of "encrypted" images at the other end - when needed? It's not that users know the server password - it's a cloud architecture. Those systems were designed by highly experienced architects and engineers - considering all things. What qualifies you to criticize it? You have not offered any solutions more graceful than Apple's, yet.

    You may not believe that Apple would do something like adding goals, targets or other interests, but we now have such a new instrument!

    Apple has much better things to do than gathering user demographics, adding analytics and building a marketing platform to sell our data to clients to make more billions. They have repeatedly stated in their documented keynotes that they are not in the business of gathering user data, profiling or selling it.

    And as history has shown, such an instrument WILL be used!

    What history? San Bernardino incident where Apple refused to cooperate with the law and stood their ground? Used for what?

    It WILL be used for other goals and interests!

    Such as...? To make GarageBand better or resolve AU issues in iPadOS 14?

    But as I wrote, such a discussion is pointless, when you don't believe in this.

    This has nothing to do with what I/you believe or don't. Every business entity in the US (at least) should comply with the regulations specific to their industry - just as individuals abide by the Law. No exceptions. Would you not mind your physician not complying with HIPAA and sharing your health records with others without your consent or knowledge? Would you not mind if a bank approves a loan for a client with poor credit even if they do not qualify but they decline your application even if you qualify 100%?? Those are some benefits of regulations.

    And breaking those rules should involve a judge and a criminal act, but not by default as a mass surveillance.

    You mean a crime should be committed and victims should be brutalized before acting on it, rather than it being prevented from happening? That's RIDICULOUS. Do you want to be allowed to board a flight without going through security scanning - even if you state under oath you are not going to commit any crime? There were expensive lessons learnt from 9/11 and that is never going to change. Are you prepared to sit next to another passenger who is not wearing a mask?

    And this by a private company too! They behave like they are Government, Law Enforcement, Courts, Judges and Correctional Institutes too. This is just not right.

    No, incorrect - you are mistaken. NONE of those are their businesses, or ever will be, and they are not behaving like law enforcement, courts, blah blah...

    They just don't want any shit uploaded on to their servers - to comply with regulations. Nothing more, nothing less!

    This is right. They couldn't care less what you do in your personal life outside iCloud. You wanna use iCloud - don't upload any shit. Period. Or just don't use iCloud. Period.

    If Apple is implementing features to comply with regulations, there is no point in criticizing Apple because no amount of criticism is going to change the regulations that Apple is required to comply with. Law-makers are the right people to approach for grievances.

    If you are not uploading objectionable content to iCloud, you should not be worried - at all - because NONE of your photos will be flagged or decrypted - ever - anyway.

  • The user and all related content has been deleted.
  • wim
    edited August 2021

    @tja said:
    Thanks, @wim ;-)

    Just one thing:

    I am not a fan of more and more laws!
    To the contrary, I think that we have mostly too many of them already.

    But in the case of regulating large international companies, I strongly feel that we need MORE laws!
    Don't control the people, control the companies. Just to clarify :-)

    ugh. I soooooo want to reply, but nope not gonna do it. 😂 😀
    Have a great day @tja. 😎👍🏼

  • The user and all related content has been deleted.