Sorry Python but it is what it is.

    • barsoap@lemm.ee · 1 year ago

      cached copies of crates that you downloaded

      Meh, what else is it supposed to do, delete sources all the time? Then people with slow connections will complain.

      Also, size-wise that’s actually not much (though they could take the care to compress it); what actually takes up space with Rust is compile artifacts, per workspace. Have you heard of kondo?
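
      A rough illustration of where the space actually goes (the paths are just placeholders for wherever your projects live):

      # downloaded crate sources, shared by every project
      du -sh ~/.cargo/registry
      # per-project compile artifacts, usually the real space hog
      du -sh ~/projects/*/target
      # kondo scans directories for build artifacts like these and offers to clean them
      kondo ~/projects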

      • someacnt@sopuli.xyz · 1 year ago

        Idk, maybe you can share the common packages across projects. (That can never go wrong, right? /s)

        • barsoap@lemm.ee · 1 year ago

          Sources are shared; compile-time artefacts are only shared within a workspace.
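
          A minimal sketch of what that means — app and lib are hypothetical member crates; everything listed under [workspace] compiles into one shared target/ directory at the workspace root. In the root Cargo.toml:

          [workspace]
          members = ["app", "lib"]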

            • Anafabula@discuss.tchncs.de · 1 year ago

              You can globally share compile artifacts by setting a global target directory in the global Cargo config.

              In $HOME/.cargo/config.toml:

              [build]
              target-dir = "/path/to/dir"
              

              The only problems I had when I did it were some cargo plugins and some dependencies with build.rs files that expected the target folder in its usual location.
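
              If you only want to try it out without touching the config file, the CARGO_TARGET_DIR environment variable should do the same thing per invocation (the path is again just a placeholder):

              CARGO_TARGET_DIR=/path/to/dir cargo build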

    • hatchet@sh.itjust.works · 1 year ago

      I actually vastly prefer this behavior. It allows me to easily jump to the (readable) source of library code in my editor, experiment with different package versions without having to redownload them, and (sort of) work offline too. I guess I don’t really know what it would do otherwise. I think Rust requires you to have the complete library source code for everything you’re using regardless.

      I suppose it could act like NPM, and keep a separate copy of every library for every single project on my system, but that’s even less efficient. Yes, I think NPM only downloads the “built” files (if the package uses a build system & is properly configured), but it’s still just minified JS source code most of the time.

      • Espi@lemmy.world · 1 year ago

        With Python and virtualenv you can also keep the entire source of your libraries in your project.
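
        Roughly like this (myproject and requests are just placeholder names):

        cd myproject
        virtualenv .venv              # or: python -m venv .venv
        source .venv/bin/activate
        pip install requests          # lands in ./.venv, not in the system site-packages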

    • Pipoca@lemmy.world · 1 year ago

      Python virtual environments feel really archaic. It’s by far the worst user experience I’ve had with any kind of modern build system.

      Even a decade ago in Haskell, you only had to type cabal sandbox init once, rather than source virtualenv/bin/activate every time you cd to the project dir.

      I’m not really a Python guy, but having to start touching a Python project at work was a really unpleasant surprise.