• Cory_t_@lemmy.world · 1 year ago

    My thoughts are that there is no ethical way to use facial recognition in public spaces. I’m having a hard time thinking of a single ethical way to use facial recognition anywhere on the planet.

    • SkyNTP@lemmy.ml · 1 year ago

      Phone unlock. Is unlocking a phone unethical? Categorically no.

      Facial recognition is a tool, and like any other tool there are always ways it can be used for good and for bad. I can’t think of a single tool, guns and nuclear bombs included, that doesn’t have some potential use for good in addition to bad. You might even say that the very definition of a tool is that it has a desirable application, and a good use is merely a desirable application where the collateral damage of its use is contained or offset by the benefit.

      Perhaps what you mean to say is that it’s corruptible? That is, that use of the tool tends to devolve into other unethical uses and consequences? I might be in agreement with you on that one.

      • In the U.S., you do not want to use face unlock on your cellphone. It’s not protected by the Fifth Amendment the way a PIN or password is, so law enforcement can get into your phone without a warrant.

        Use a PIN or password instead.

        • hikaru755@feddit.de · 1 year ago

          For those not willing to give up on convenience: on Android there’s Lockdown mode, which temporarily disables biometric access and forces the use of your PIN/password to get into your device. Not sure about other brands, but on Pixel you can enable it by long-pressing the power button and tapping “Lockdown”.

          • Yes, but I’m not taking the risk of not having my hands on my phone if shit ever hits the fan. And in a hectic situation people might not remember “Lockdown”. It’s definitely not as simple as you think.

    • cynar@lemmy.world · 1 year ago

      I have a mild variant of face blindness. I can see faces, but my brain won’t store them properly, so I struggle to put names to faces.

      An AR device with real time face recognition would be a godsend for me.

  • 1984@lemmy.today · 1 year ago

    Horrible, and the start of a real dystopia.

    I’m glad I’m almost 50 now, because this place is gonna suck so much soon.

    • The Hobbyist@lemmy.zip · 1 year ago

      It is illegal in some countries to fully cover your face in public. If covering up became a way to bypass surveillance, such a ban could be made into law wherever it isn’t already :(

      Edit: for those wondering, Switzerland is one of them. Though they don’t have a large number of public cameras (yet?).

  • pinkdrunkenelephants@lemmy.cafe · 1 year ago

    Facial recognition doesn’t work and is racist. Like, it actually is far less accurate on dark-skinned people.

  • queermunist she/her@lemmy.ml · 1 year ago

    Hypothetically, a just system using this technology would be capable of a lot of good!

    The system is unjust, so any good use is vastly outweighed by the horror and evil it will be used for - like how they use this to track uppity Palestinians and arrest/kill them.

  • bbbhltz@beehaw.org · 1 year ago

    Is it opt-in or opt-out by default? I think I would be against it. But then again I am not a rule breaker so it is hard to imagine.

    Given the choice, it would be a hard no for me. It isn’t 100% perfect yet; mistaken identities and discrimination are good reasons not to bother with it. Beyond that, if insurance companies had access to it, it would be a disaster.

    It’s Friday night, you go out for just a drink, or perhaps the camera catches you smoking. Now your life insurance policy is messed up.

    Obviously, that is an exaggeration the likes of which only happen in Black Mirror. Power and greed have never pushed anyone to any single unethical thing ever.

  • poVoq@slrpnk.net · 1 year ago

    I imagine that if I had some AR glasses that could recognize faces from my personal photo collection, it would be quite useful in a business context to be reminded of names and other important information (a rough sketch of the idea is at the end of this comment).

    It could also be used as a tool for service workers to recognize people involved in decision-making that is hostile to worker rights, and to refuse them any service as a kind of low-profile strike/protest.
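
    As a minimal sketch of how the personal-photo-collection idea could work entirely on-device (this assumes the open-source Python face_recognition library; the folder layout, file names, and threshold are purely illustrative, not anything from the thread):

    ```python
    # Minimal sketch: match faces in a captured frame against a personal,
    # locally stored photo collection. Nothing leaves the device.
    # Assumes a folder of photos named after the person in them,
    # e.g. photos/Alice_Smith.jpg (hypothetical names).
    from pathlib import Path

    import face_recognition

    PHOTO_DIR = Path("photos")   # hypothetical personal photo collection
    TOLERANCE = 0.6              # the library's default matching threshold

    # Build an in-memory gallery: one face encoding per labelled photo.
    known_names, known_encodings = [], []
    for photo in sorted(PHOTO_DIR.glob("*.jpg")):
        image = face_recognition.load_image_file(photo)
        encodings = face_recognition.face_encodings(image)
        if encodings:  # skip photos where no face was detected
            known_names.append(photo.stem.replace("_", " "))
            known_encodings.append(encodings[0])

    def who_is_this(frame_path: str) -> list[str]:
        """Return names of known people detected in a captured camera frame."""
        frame = face_recognition.load_image_file(frame_path)
        matches = []
        for encoding in face_recognition.face_encodings(frame):
            distances = face_recognition.face_distance(known_encodings, encoding)
            if len(distances) and distances.min() <= TOLERANCE:
                matches.append(known_names[int(distances.argmin())])
        return matches

    # Example: identify whoever is in the latest frame from the glasses' camera.
    print(who_is_this("camera_frame.jpg"))   # e.g. ['Alice Smith']
    ```

    A real AR assistant would run that matching step on each video frame and overlay the name, but the recognition core is about this small, and nothing needs to be uploaded anywhere.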

    • Possibly linux@lemmy.zip · 1 year ago

      It’s kind of wild to imagine a future where you no longer need to remember faces or names. The headset will do everything for you.

      In that future world it will be a little wild when someone’s headset dies, as they will just stumble around aimlessly with no purpose.

  • andruid@lemmy.ml · 1 year ago

    I’m in the camp that it could be a great thing, but the systems around it just don’t currently have the level of trust needed to really leverage it.

    If it were local AI running on systems with absolutely minimal storage, with good peer-to-peer permission models, and with laws in place to minimize abuse by the state and corporations, it would be different.

    For example, the opposite of what Ring represents would be good in my eyes.

  • Steve@communick.news · 1 year ago

    You can’t legally control whether or not someone recognizes you on the street. There is no right to privacy in public spaces. It’s the principle that protects people filming the police, or any important public event, really.

    • CrypticCoffee@lemmy.world · 1 year ago

      I don’t see why not. There’s a difference between the off chance of someone noticing you and cameras recognising your face with high accuracy and being able to track the places you visited and who you were with, for every minute of every day.

      • Steve@communick.news · 1 year ago

        “the places you visited and who you were with, for every minute of every day”

        Well now that’s a different proposition.

        Now you aren’t simply in a public place being photographed and identified by AI. Now you’re actively being monitored and tracked. That’s more like a person stalking you, which may be unethical depending on who’s tracking you and why. Basically, unless it’s law enforcement of some kind with a specific warrant to track your location, it wouldn’t be ethical.

        • CrypticCoffee@lemmy.world · 1 year ago

          The thing is, most Western governments are pushing towards facial recognition and monitoring without the need for a warrant. Most countries are already stacked to the eyeballs with CCTV (the UK, for example), and hooking that in with facial recognition is dangerous. First they start off with it being for terrorists, then paedophiles, then other criminals, but ultimately it’s monitoring everyone to track down a few. Once you have that infrastructure in place without sufficient oversight, it can soon be tweaked towards activist groups, then opposition groups, etc.

          You have to challenge it before the infrastructure goes in, because after it’s in, it’s already too late.

          • Steve@communick.news · 1 year ago

            While that is all true, it’s effectively saying nothing more than “the misuse of a technology is unethical”, which I think we can all agree on. Many people are pointing out obvious examples of abuse as arguments against the tech itself.

            The original question was only about the technology itself, which is only an interesting ethical question if we assume it’s being used appropriately.