
Apple scanning images on device and in iCloud

24 Comments

  • edited August 2021
    The user and all related content has been deleted.
  • What you describe is almost like some sort of angled surface exhibiting a higher than normal viscosity.

  • ...you won't catch/stop "the big fish" in iCloud! ...but yes, it's a small step towards it!

    What about special encrypted networks/servers (on the dark web) built specially for this kind of business?
    What about crypto payments for all sorts of stuff?

  • The user and all related content has been deleted.
  • @tja I'm with you here. Apple's arguments remind me of Google and Amazon desperately looking for positive uses of the privacy-intruding technology to keep people calm.

    I wonder what role the now improved efficiency of GPS reception and microphone recording plays.

  • edited August 2021

    Instead of jumping to amateur conclusions, let's just agree that the thousands of highly qualified and experienced engineers, attorneys, C-level executives, sales and marketing people at Apple know the law and technology and how to do business better than any of us. Corporations are ruled by attorneys - why would a trillion-dollar company do anything illegal? It is just machines scanning photos for some patterns - not real people peeking into our photos. We are ok with security cameras scanning our luggage, our "bodies" and getting touched/searched by security people at airports, offices...

    When was the last time Apple was found to misuse or abuse user data? They've always stated they are not in the business of collecting and harvesting user data and demographics and selling it. Security and privacy are some of the best features of iOS -

    https://www.google.com/search?q=apple+refuse+to+unlock+iphone+for+fbi
    https://www.google.com/search?q=ios+facebook+privacy
    https://www.macrumors.com/2021/04/26/tim-cook-mark-zucerkberg-2019-private-meeting

    Criticizing Apple is the job of trolls who cannot afford Apple devices - not passionate iOS users who have better things to do. When did you see an Apple user trolling on copycat products? So, let's leave that to trolls and let them have some fun for a few minutes - because copycat vendors will soon follow Apple, anyway!



  • I heard about this on the radio in the car tonight.

  • edited August 2021

    child porn is one of the most horrible things people do... sick people doing this should be publicly tortured and executed... we need to fight this plague whatever it takes...

    in case it helps to catch just a few of those sick bastards, I'm completely OK with some AI scanning my iCloud photos.

    btw, wondering how many catholic priests are now deleting the contents of their phones in panic 😂

  • Let’s ban virus scanners, malware detection and spam filters…

  • edited August 2021

    @realdawei said:
    Let’s ban virus scanners, malware detection and spam filters…

    and also firewalls and VACCINES !!

  • Read with 'paranoid hat on'.

    What's next? The camera app analyzes what it sees when we're about to take a picture, and if the content is suspicious or includes wanted persons or other criminal activity, the authorities will be contacted; and if the person behind the camera has no track record of such activities, they will be rewarded for being part of the 'good guys' :sunglasses:

    Oh wait, it already 'sees' that, since VoiceOver can be used by those with bad sight to tell them what is in front of them, i.e. the camera does scene analysis, and it can detect how far away the person in front of you is using LiDAR...

  • edited August 2021

    Apple will scan photos on device as they are uploaded to the iCloud photo library. Only if a photo matches a known image will it be flagged. Once a certain number of these images have been uploaded, they will be decrypted for law enforcement.

    The system won’t try to use AI to detect abuse images. It only compares against known images, even ones that have been edited to try to fool the system.

    There’s nothing to worry about with regards to privacy and Apple scanning your photos. It already scans your photos to enable things like searching for dogs on a beach, all on device. This info is not sent to Apple.

    I’d be more worried if they did nothing.

    More info here : https://www.apple.com/child-safety/pdf/Technical_Assessment_of_CSAM_Detection_Benny_Pinkas.pdf

    Via Daring Fireball

    EDIT:

    Apple will scan your photos on device and give the photo a fingerprint ID number.

    It will compare the number with known images.

    Only if the fingerprint matches and an upload threshold is exceeded will Apple be notified. The likelihood of a false positive is virtually zero.

    Your images won’t be decrypted for law enforcement unless the fingerprints match, in which case you’re bang to rights and deserve what’s coming to you. No unencrypted images without matching fingerprints will ever get sent to Apple for analysis by law enforcement.

    I can’t see how anybody can complain about this. There is no violation of privacy and all the fingerprinting is done on device along with all the current photo analysis that takes place on device.

    And only on images uploaded to iCloud.

    I think Apple have a responsibility to ensure their servers don’t host those images.
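
    To make the fingerprint-and-threshold mechanism described above concrete, here is a rough Python sketch. Everything in it is a hypothetical stand-in: the toy average-hash is nothing like Apple's actual NeuralHash, and MATCH_THRESHOLD is an invented number, since Apple doesn't disclose the real one.

        # Toy sketch of threshold-based perceptual-hash matching.
        # The hash, database and threshold are hypothetical stand-ins,
        # not Apple's actual system.
        from typing import List, Set

        MATCH_THRESHOLD = 3  # invented value; the real threshold is undisclosed

        def average_hash(pixels: List[List[int]]) -> int:
            """Toy perceptual hash over an 8x8 grayscale grid (0-255):
            one bit per pixel, set if brighter than the mean."""
            flat = [p for row in pixels for p in row]
            mean = sum(flat) / len(flat)
            bits = 0
            for p in flat:
                bits = (bits << 1) | (1 if p > mean else 0)
            return bits

        def matches_known(h: int, known: Set[int], max_distance: int = 0) -> bool:
            # Exact (or near) match against the on-device list of known fingerprints.
            return any(bin(h ^ k).count("1") <= max_distance for k in known)

        def over_threshold(photo_hashes: List[int], known: Set[int]) -> bool:
            # Nothing is reported until enough individual photos match.
            return sum(1 for h in photo_hashes if matches_known(h, known)) >= MATCH_THRESHOLD

    The point of the sketch is that the device only ever compares numbers, and no single match triggers anything on its own.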

  • @tja said:

    This is NOT OK!

    It’s not only OK, it’s the only responsible thing to do.

  • edited August 2021

    @realdawei said:
    Let’s ban virus scanners, malware detection and spam filters…

    @MobileMusic said:
    and also firewalls and VACCINES !!

    Going by that, YouTube should stop scanning uploaded videos, stop creating a Content ID fingerprint for each video and comparing it against their Content ID system, stop protecting copyright holders, and get sued for hosting such videos. Facebook/Twitter/SoundCloud should do the same.

  • edited August 2021

    @klownshed said:
    I think Apple have a responsibility to ensure their servers don’t host those images.

    Every website / storage server has a legal responsibility not to host such images, or copyright-protected media not owned by the uploader.

  • Holy sheets. What's next? Apple scanning iPads to see whether people have uninstalled the apps they've asked for a refund for. Lol
    I'm fine with being scanned.

  • I’m fine with you being scanned too :)
    Never used the cloud, but it looks like my iStuff will soon go offline for as long as they last, and then it's hardware only.

  • edited August 2021

    Stupid video.

    Read the bloody details of what Apple are actually doing before shouting that the sky is falling in.

    I miss real journalism.

  • Who made Apple the f**king police!

  • edited August 2021

    Google have been scanning everybody's photos and emails on their services for exactly the same reasons since 2008. Google it. They’ll tell you.

    Apple’s method of scanning for known images maintains user privacy. Unless you’re a nonce.

    Only nonces that upload to iCloud photos need worry.

  • edited August 2021

    Ooo tricky topic.

    I suppose as a private company it's within their rights. iCloud is a private, for-profit service, not a right or a public service. You don't have to use it if you don't want to.

    That being said, if you are a person of power, a celebrity, or someone of political significance, this can be used to absolutely destroy someone via nefarious means.

    Imagine you are a politician ready to lay down some new privacy laws, or to tax Apple and other billion-dollar corps more heavily. Who is to stop Apple from planting compromising photos in your iCloud, catching them with their scan, then creating a scandal that maybe will blow over, or maybe will ruin a politician's campaign, or someone's entire career? I've seen politicians buried for less here. The FBI might clear the person of wrongdoing, but now the opponent can hammer down the scandal angle, and that shit unfortunately works over here in the U.S.

    Every time you hear about a "leak" that nobody can pinpoint the source of, a celebrity sex tape, or other secret news, it could be from stuff like this. It could be completely fabricated. Don't forget Microsoft is contracted by the NSA, Amazon hired former members of the NSA, and Apple is not far behind. They are no strangers to foreign or domestic cyber espionage.

    But... Apple has better sound engine architecture than Android does. So I guess they got that going for them. Probably no need to upload anything to iCloud that you wouldn't want anyone else to see.

  • @klownshed said:
    Google have been scanning everybody's photos and emails on their services for exactly the same reasons since 2008. Google it. They’ll tell you.

    Apple’s method of scanning for known images maintains user privacy. Unless you’re a nonce.

    Only nonces that upload to iCloud photos need worry.

    The lawful judicial system should decide how this should be enforced, not by diktat.

  • How are Apple able to decrypt data stored in a person’s iCloud account? Up until this point, the only one able to decrypt that data was the person with the key, the user/owner of the account. I get that they perform scans locally on your device and when uploading to iCloud it attaches a happy smiley or an angry smiley, and after a certain number of angry smileys, Apple will decrypt the data to verify whether the pictures are CSAM or not.

    It’s that very last part that doesn’t make any sense. How are they able to decrypt the data?…

  • @knewspeak said:

    @klownshed said:
    Google have been scanning everybody's photos and emails on their services for exactly the same reasons since 2008. Google it. They’ll tell you.

    Apple’s method of scanning for known images maintains user privacy. Unless you’re a nonce.

    Only nonces that upload to iCloud photos need worry.

    The lawful judicial system should decide how this should be enforced, not by diktat.

    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities.

  • The user and all related content has been deleted.
  • edited August 2021

    @ChrisG said:
    How are Apple able to decrypt data stored in a person’s iCloud account? Up until this point, the only one able to decrypt that data was the person with the key, the user/owner of the account. I get that they perform scans locally on your device and when uploading to iCloud it attaches a happy smiley or an angry smiley, and after a certain number of angry smileys, Apple will decrypt the data to verify whether the pictures are CSAM or not.

    It’s that very last part that doesn’t make any sense. How are they able to decrypt the data?…

    That’s not how it works. The CSAM database of fingerprint keys is stored on the phone. It’s just a list of numbers effectively. When you put a photo into your photos library and iCloud photos is turned on, the photos app will fingerprint the photo. This is not a scan. Saying “scan” is very misleading.

    The fingerprint returns a number for that photo. If that fingerprint number matches a number in the database then it is flagged. The images have to be the exact same image, not just similar. If enough photos are flagged (Apple obviously won’t say what the threshold is set to) then Apple are notified.

    At no time are photos scanned or decrypted. In any case, iCloud photo libraries aren’t currently end-to-end encrypted anyway; Apple have the keys. This system would allow them to end-to-end encrypt and still comply with the law by flagging images from the CSAM database.
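
    As for how decryption past a threshold can work at all: a standard building block is threshold secret sharing, where each matching photo contributes one share of a decryption key, and the key only becomes recoverable once enough shares exist. Below is a generic Python illustration of that technique (Shamir's scheme, with toy parameters), not Apple's actual safety-voucher construction; the Pinkas PDF linked earlier in the thread describes the real design.

        # Generic Shamir threshold secret sharing, as an illustration only.
        # The prime and the (n, t) parameters are arbitrary toy values.
        import random

        PRIME = 2**127 - 1  # toy field size (a Mersenne prime)

        def make_shares(secret: int, n: int, t: int) -> list:
            """Split `secret` into n shares; any t of them reconstruct it."""
            coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
            def f(x):
                return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
            return [(x, f(x)) for x in range(1, n + 1)]

        def reconstruct(shares: list) -> int:
            """Lagrange interpolation at x = 0 over the prime field."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * -xj % PRIME
                        den = den * (xi - xj) % PRIME
                secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
            return secret

        # Below t shares the secret is information-theoretically hidden;
        # with t or more it comes back exactly.
        key = random.randrange(PRIME)
        shares = make_shares(key, n=10, t=5)
        assert reconstruct(shares[:5]) == key
        assert reconstruct(shares[:4]) != key  # fails with overwhelming probability

    In this picture, "enough angry smileys" literally means "enough shares to rebuild the key", which is why nothing is decryptable below the threshold.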

  • The user and all related content has been deleted.