So, I’m self-hosting Immich. The issue is we tend to take a lot of pictures of the same scene/thing to later pick the best, so we can end up with 5–10 photos that are basically duplicates, but not quite.
Some duplicate-finding programs rate those images at 95% or more similarity.

I’m wondering if there’s any way, probably at the filesystem level, for these near-identical images to be compressed together.
Maybe deduplication?
Have any of you guys handled a similar situation?
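
Block-level deduplication probably won’t help here: the photos differ at almost every byte, so the filesystem sees no identical blocks to merge. Tools that report “95% similarity” typically work on perceptual hashes instead. Below is a minimal sketch of that idea, assuming the Pillow and imagehash libraries; the directory path and the distance threshold are illustrative, not anything Immich actually uses:

```python
# Minimal perceptual-hash grouping sketch (pip install Pillow imagehash).
# THRESHOLD and "./photos" are assumed values for illustration only.
from pathlib import Path

import imagehash
from PIL import Image

THRESHOLD = 6  # max Hamming distance (in bits) to call two photos near-duplicates


def group_near_duplicates(photo_dir: str):
    """Group photos whose perceptual hashes are within THRESHOLD bits."""
    hashes = []  # (path, hash) pairs
    for path in Path(photo_dir).glob("*.jpg"):
        hashes.append((path, imagehash.phash(Image.open(path))))

    groups = []
    for path, h in hashes:
        for group in groups:
            # Compare against the first member of each existing group;
            # subtracting two ImageHash values yields their Hamming distance.
            if h - group[0][1] <= THRESHOLD:
                group.append((path, h))
                break
        else:
            groups.append([(path, h)])
    return groups


if __name__ == "__main__":
    for group in group_near_duplicates("./photos"):
        if len(group) > 1:
            print("Near-duplicate set:", [str(p) for p, _ in group])
```

This only finds the near-duplicate sets; deciding what to do with them (keep one, archive the rest) would still be a manual or separate step.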

  • just_another_person@lemmy.world · 3 months ago

    Everything you just described is instruction. Everything from an input path and desired result can be tracked and followed to a conclusory instruction. That is not decision-making.

    Again. Computers do not make decisions.

    • simplymath@lemmy.world · 3 months ago

      Agree to disagree. Something makes a decision about how to classify the images, and it’s certainly not the person writing 10 lines of code. I’d be interested in having a good-faith discussion, but repeating a personal opinion isn’t really that. I suspect this is more of a metaphysics argument than anything, and I don’t really care to spend more time on it.

      I hope you have a wonderful day, even if we disagree.