Apple scanning images on device and in iCloud

edited August 2021 in Other
The user and all related content has been deleted.

Comments

  • edited August 2021

    Given the multiple layers of user data encryption and privacy protection they built into this, and the terrible things they are trying to prevent by implementing it, I’m totally fine with it.

  • edited August 2021

    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

  • @Hmtx said:
    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

    The problem is with indiscriminate searching of anything for anything at all.

    What would you think if at every supermarket entrance, you were asked to pull down your underwear so you could be searched for a potential bomb? "But it's just for everyone's safety"...

    The proportion of iCloud photo uploads being child porn (or "CSAM", one more stupid abbreviation to remember) is probably in the same ballpark as bombs in underwear at supermarket entrances.

  • edited August 2021
    The user and all related content has been deleted.
  • @tja said:
    "We try to do good, and for this we need to look into your pockets and smell your underwear".

    😄 you literally took the words out of my mouth.

  • edited August 2021
    The user and all related content has been deleted.
  • @Tarekith @Hmtx That is exactly how your 4th Amendment rights were abolished after 9/11.

    And of course there’s the issue of false positives. “Oh, that was a photo of your daughter at the beach? Sorry for imprisoning you and ruining your life.”

  • edited August 2021
    The user and all related content has been deleted.
  • The user and all related content has been deleted.
  • Nah, you guys are wrong.

    iCloud Photo is an online service to store images. They have a legal obligation to do what they can to ensure they aren’t storing anything illegal. If you expect that company not to review/scan what they are storing for you… well, you are going to have to pay for a service with a much higher price than what Apple charges.

  • edited August 2021
    The user and all related content has been deleted.
  • @tja said:

    @Hmtx said:
    Nah, you guys are wrong.

    iCloud Photo is an online service to store images. They have a legal obligation to do what they can to ensure they aren’t storing anything illegal. If you expect that company not to review/scan what they are storing for you… well, you are going to have to pay for a service with a much higher price than what Apple charges.

    No, that is not their job and it should even be illegal!

    The situation is different when you are sharing content!
    In this case, they may legally be required to make sure that you are not offering illegal content.

    You can share an album in iOS.

  • edited August 2021

    I disagree @tja . Possession of child porn can and should be prosecuted. It’s not about whether you share it.

    And I am not arguing from the “good people have nothing to hide” trope. I have plenty to hide… I will just never put it on the internet or into the hands of another company.

    “That is not their job and it should even be illegal!” … hmm, Apple can do whatever they please with their services. If they declare to their customers how they are handling photos stored on their servers, what could be considered illegal here? If you don’t like the service, go buy another one.

    [EDIT: and I’m bowing out here, sorry I don’t have the brain space to keep arguing an issue many of us clearly have strong opinions about.]

  • The user and all related content has been deleted.
  • @tja said:
    But while I too don't produce or consume child porn, don't plan terrorist attacks or want to build bombs, I still don't want others to check what I am doing!

    The thing is, IF someone actually wants to produce and distribute child porn, they surely won't be stupid enough to use iCloud!

  • edited August 2021

    The article says "new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."

    What this suggests to me is that there is probably a big old government database somewhere of commonly traded images that are already out there. These images are probably known to often be in the possession of your typical pedo trader folk. So it sounds like Apple will scan its database for these 'known CSAM images'. It doesn't seem to imply to me that any new picture taken will be looked at by someone at Apple to see if your kid's beachside nipple slip is accidental or not. Seems more like a machine going 'does it match any of these pedo pics making the rounds?' sort of thing. Also sounds like a certain threshold of positives needs to be hit first. But yah, maybe next they will scan for movies, comic books, nuke plans, etc., who knows? Maybe they already do.
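
    To make that matching idea concrete, here is a minimal sketch of "hash every photo, compare against a list of known hashes, act only past a threshold". Everything in it (KNOWN_HASHES, MATCH_THRESHOLD, the plain SHA-256 file hash) is made up for illustration; Apple's actual system reportedly uses a perceptual "NeuralHash" plus cryptographic protections around the match step, none of which this implements.

    ```python
    # Illustrative sketch only: exact SHA-256 matching, not Apple's
    # perceptual NeuralHash, and no cryptography around the match step.
    import hashlib
    from pathlib import Path

    # Hypothetical database of hashes of already-known images.
    KNOWN_HASHES = {
        hashlib.sha256(b"example-known-image-bytes").hexdigest(),
    }

    MATCH_THRESHOLD = 30  # made-up number of matches before anything happens

    def file_hash(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def count_matches(photo_dir: Path) -> int:
        # New or edited photos produce different hashes, so only exact
        # copies of already-known images would ever count here.
        return sum(1 for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES)

    if __name__ == "__main__":
        matches = count_matches(Path("./photos"))
        if matches >= MATCH_THRESHOLD:
            print("Threshold exceeded; flag for human review.")
        else:
            print(f"{matches} match(es); below threshold, nothing is reported.")
    ```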

  • edited August 2021

    @SevenSystems said:

    @tja said:
    But while I too don't produce or consume child porn, don't plan terrorist attacks or want to build bombs, I still don't want others to check what I am doing!

    The thing is, IF someone actually wants to produce and distribute child porn, they surely won't be stupid enough to use iCloud!

    Oh, of course there are stupid sick people... (that is part of the problem). I mean there literally are medically sick people. If the gov is trying to dismantle networks of people (which is how these things likely operate) there will be a whole strata of connections, and if one of them is wiggy, off their rocker and storing stuff stupidly, then there is a crack to exploit in bringing down the operation.

  • @AudioGus said:
    The article says "new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."

    What this suggests to me is that there is probably a big old government database somewhere of commonly traded images that are already out there. These images are probably known to often be in the possession of your typical pedo trader folk. So it sounds like Apple will scan its database for these 'known CSAM images'.

    That's exactly what Apple has said they are doing, scanning for known images.

  • @Tarekith said:

    @AudioGus said:
    The article says "new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."

    What this suggests to me is that there is probably a big old government database somewhere of commonly traded images that are already out there. These images are probably known to often be in the possession of your typical pedo trader folk. So it sounds like Apple will scan its database for these 'known CSAM images'.

    That's exactly what Apple has said they are doing, scanning for known images.

    Yah just thought I would extract that bit for those who don't read articles before jumping in. ;) I'm feeling smug having read the actual article this time.

  • I kind of see it as Apple’s way of trying to get ahead of having to provide back doors to allow authorities to check for child porn. Once they provide a back door, it’s just a matter of people finding the key.

  • @BiancaNeve said:
    I kind of see it as Apple’s way of trying to get ahead of having to provide back doors to allow authorities to check for child porn. Once they provide a back door, it’s just a matter of people finding the key.

    Good point, had not thought of that. Plus the way they describe it sounds like 'dusting for prints'. But at the same time Apple also claim to be selling 'Pro' tablets... stay vigilant folks!

  • https://9to5mac.com/2021/08/05/apple-announces-new-protections-for-child-safety-imessage-safety-icloud-photo-scanning-more/

    "Apple says the feature will come first to the United States but it hopes to expand elsewhere eventually."

    We'll have to wait and see how the EU responds to this at some point in time.
    But then again, we already have more than a few 'nutcases' in the EU Parliament...

  • @Samu said:
    https://9to5mac.com/2021/08/05/apple-announces-new-protections-for-child-safety-imessage-safety-icloud-photo-scanning-more/

    "Apple says the feature will come first to the United States but it hopes to expand elsewhere eventually."

    We'll have to wait and see how the EU responds to this at some point in time.
    But then again, we already have more than a few 'nutcases' in the EU Parliament...

    Good tip! Scan them first, Apple.

  • Yeah I'm all for privacy but at the same time I'd like to see who they manage to flush out.

  • @Hmtx said:
    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

    This conversation would be more productive if we ignore the people who hijack it with the “child porn is bad, therefore literally anything done to prevent it is justified” argument.

    You’ll never convince those folks that privacy is important, or that it needs protection in order to keep it. Trying to do so will just frustrate you.

    I’m most surprised that the matching appears to be done on your phone, and “safe” images are tagged as such. Technically it makes sense—the on-device image recognition in iOS 15 beta is incredible. It recognizes animal and plant species pretty much instantly.

    I’m uncomfortable with my photos getting scanned like this, but Apple is probably still the least-bad company for online photos. I’ve read that it’s obligatory for photo storage services in the US to run these scans.

    It also seems like this is a way for the authorities to get access to photos when Apple turns on proper end-to-end encryption for iCloud photo library. By having the device run the scan, and then reporting positives to Apple, the entire library, online too, could be encrypted (a rough sketch of that flow follows at the end of this comment). It’s a clever workaround, but it’s a chilling precedent.

    I will wait to see if this flies in Europe, but I’m considering ditching iCloud Photos. The problem is, it’s a really great service otherwise.
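
    For what it's worth, the flow described above (the device does the matching, the server only ever learns about matches, everything else stays encrypted) could be sketched roughly as below. All names here (UploadRecord, REPORT_THRESHOLD, encrypt_stub) are hypothetical, the "encryption" is a toy stand-in, and in Apple's published design even the per-photo match bit is supposed to stay hidden until the threshold is crossed, which this simplification does not capture.

    ```python
    # Hypothetical sketch of "scan on device, upload encrypted, report only matches".
    # Not Apple's actual safety-voucher / threshold secret-sharing protocol.
    import hashlib
    import os
    from dataclasses import dataclass

    KNOWN_HASHES = {hashlib.sha256(b"example-known-image").hexdigest()}
    REPORT_THRESHOLD = 30  # made-up number

    @dataclass
    class UploadRecord:
        ciphertext: bytes     # only the device holds the decryption key
        match_voucher: bool   # the one bit the server learns per photo (simplified)

    def encrypt_stub(data: bytes, key: bytes) -> bytes:
        # Toy stand-in for real end-to-end encryption (e.g. AES-GCM).
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def prepare_upload(image_bytes: bytes, device_key: bytes) -> UploadRecord:
        digest = hashlib.sha256(image_bytes).hexdigest()
        return UploadRecord(
            ciphertext=encrypt_stub(image_bytes, device_key),
            match_voucher=digest in KNOWN_HASHES,
        )

    def server_should_escalate(records: list[UploadRecord]) -> bool:
        # The server never decrypts anything; it only tallies match vouchers.
        return sum(r.match_voucher for r in records) >= REPORT_THRESHOLD

    if __name__ == "__main__":
        key = os.urandom(32)
        uploads = [prepare_upload(os.urandom(1024), key) for _ in range(10)]
        print(server_should_escalate(uploads))  # False: nothing matched
    ```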

  • edited August 2021

    Catching the worst of the worst among us is a laudable goal; however, I see this as opening the door for countries like China or Russia (or even the US) to demand that Apple monitor for political dissidents or anyone opposed to the regime in power.

    Frankly, if they wanted to solve problems involving child abuse, they should start with the majority of people who work in Hollywood and the entertainment industry and those who wield unchecked political power (remember Epstein, who was connected to nearly everyone in politics?)

  • How about if we just ignore the people who say privacy is everything and to hell with the rights of children instead?

  • @NeuM said:
    Catching the worst of the worst among us is a laudable goal; however, I see this as opening the door for countries like China or Russia (or even the US) to demand that Apple monitor for political dissidents or anyone opposed to the regime in power.

    Agreed, except I do not see this in regard to China or Russia.
    I see it happening in regard to the US and the U.K.
    It's a well-known fact that we have more surveillance cameras
    than any country in the world except China.
    The only exception to this is the cameras that used to surround
    the Houses of Parliament; those were removed when
    it was attacked by a lone person a few years ago and all the footage
    was conveniently 'misplaced'.

    Frankly, if they wanted to solve problems involving child abuse, they should start with the majority of people who work in Hollywood and the entertainment industry and those who wield unchecked political power (remember Epstein, who was connected to nearly everyone in politics?)

    Agreed.

    I'm sticking my neck out here.
    I know for a fact that there is abuse within the entertainment industry
    and some of it is being enabled by law enforcement members here in the U.K.
    I walked away from the industry because of it.
    That was two decades ago.
    Two years ago I provided my audio engineering services for
    an ex-police officer/whistleblower who confirmed that police officers
    were involved and complicit.

    The moment politicians and law enforcement agencies have
    to conform to Apple's new implementation, it would be worth the hassle.

    Our government here in the U.K. has had a series of articles written
    about it in major media outlets, going back at least a decade, stating
    that major politicians, right up to the former Prime Minister Theresa May,
    were complicit in child porn, abuse, et al.
    Apparently there are trials pending, but so far nothing has been done.

    When Theresa May was P.M., her political party introduced what is
    called the 'Snoopers' Charter'; here's an article from Computerworld
    that describes it.

    https://www.computerworld.com/article/3427019/the-snoopers-charter-everything-you-need-to-know-about-the-investigatory-powers-act.html

    There were amendments included in the Charter stating
    that all official data records regarding politicians
    from before the Charter could be wiped clean.

    As far as I'm concerned, since the Charter was implemented,
    all of our communication in the U.K. is monitored.

    There are other, more unsavoury things happening data-wise, so
    Apple permitting law enforcement to search through our data
    doesn't bother me; I'm more concerned about the U.K. government.

  • @Gravitas said:

    @NeuM said:
    Catching the worst of the worst among us is a laudable goal; however, I see this as opening the door for countries like China or Russia (or even the US) to demand that Apple monitor for political dissidents or anyone opposed to the regime in power.

    Agreed, except I do not see this in regard to China or Russia.
    I see it happening in regard to the US and the U.K.
    It's a well-known fact that we have more surveillance cameras
    than any country in the world except China.
    The only exception to this is the cameras that used to surround
    the Houses of Parliament; those were removed when
    it was attacked by a lone person a few years ago and all the footage
    was conveniently 'misplaced'.

    Frankly, if they wanted to solve problems involving child abuse, they should start with the majority of people who work in Hollywood and the entertainment industry and those who wield unchecked political power (remember Epstein, who was connected to nearly everyone in politics?)

    Agreed.

    I'm sticking my neck out here.
    I know for a fact that there is abuse within the entertainment industry
    and some of it is being enabled by law enforcement members here in the U.K.
    I walked away from the industry because of it.
    That was two decades ago.
    Two years ago I provided my audio engineering services for
    an ex-police officer/whistleblower who confirmed that police officers
    were involved and complicit.

    The moment politicians and law enforcement agencies have
    to conform to Apple's new implementation, it would be worth the hassle.

    Our government here in the U.K. has had a series of articles written
    about it in major media outlets, going back at least a decade, stating
    that major politicians, right up to the former Prime Minister Theresa May,
    were complicit in child porn, abuse, et al.
    Apparently there are trials pending, but so far nothing has been done.

    When Theresa May was P.M., her political party introduced what is
    called the 'Snoopers' Charter'; here's an article from Computerworld
    that describes it.

    https://www.computerworld.com/article/3427019/the-snoopers-charter-everything-you-need-to-know-about-the-investigatory-powers-act.html

    There were amendments included in the Charter stating
    that all official data records regarding politicians
    from before the Charter could be wiped clean.

    As far as I'm concerned, since the Charter was implemented,
    all of our communication in the U.K. is monitored.

    There are other, more unsavoury things happening data-wise, so
    Apple permitting law enforcement to search through our data
    doesn't bother me; I'm more concerned about the U.K. government.

    100% in agreement with you. Our government is sinister and aches to become as totalitarian as it can get away with. The past 18 months have given them carte blanche to do as they like.

  • @SevenSystems said:

    @Hmtx said:
    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

    The problem is with indiscriminate searching of anything for anything at all.

    What would you think if at every supermarket entrance, you were asked to pull down your underwear so you could be searched for a potential bomb? "But it's just for everyone's safety"...

    The proportion of iCloud photo uploads being child porn (or "CSAM", one more stupid abbreviation to remember) is probably in the same ballpark as bombs in underwear at supermarket entrances.

    I literally stopped shopping because of this and now use freshdirect.
