I grew up during the dial-up era of the internet and remember how insane it was each time the technology improved: broadband, DSL, fiber, etc.

I wouldn’t expect pages to load instantly, but I have to imagine all the data farming is causing sites to be extremely bogged down.

  • shnurr@fludiblu.xyz

    Well, if you just try to load a news website with and without an ad blocker, you will usually notice a huge difference. So yes.

    But also, technology has become much more complex compared to the beginning of the internet. So every piece of software is more bloated than it used to be, sometimes for a good reason, sometimes less so.

    • Khalmoon@lemm.eeOP

      Thank you, I wasn’t sure if I was just getting impatient with websites and not appreciating how far we’ve come since DSL. It made sense in my head, but it always felt like a mildly dumb question.

  • Izzy@lemmy.world

    Web pages of today have so much added-on nonsense. It’s not necessarily data farming; it’s also the frameworks used to develop the websites themselves. A modern website is basically an entire application running in the browser, even when all it does is show a simple, seemingly static page. The purpose of these frameworks is to make complex things simpler for developers to build, but people end up using them in situations that don’t call for it. I think there is a general belief that since computers keep getting more powerful, it’s fine to keep making software bigger and less efficient.
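
    To make that concrete, here’s a minimal sketch of what a framework-style “static” page often does (the element id and endpoint are made up): the HTML ships an empty shell, and the actual text only appears after a script bundle downloads, runs, and fetches the content.

    ```typescript
    // Hypothetical client-rendered article page: nothing is readable until this runs.
    async function renderArticle(): Promise<void> {
      const root = document.getElementById("root"); // the empty shell in the HTML
      if (!root) return;

      // Even a plain article is often fetched as JSON and rendered in the browser.
      const res = await fetch("/api/article/123"); // made-up endpoint
      const article: { title: string; body: string } = await res.json();

      root.innerHTML = `<h1>${article.title}</h1><p>${article.body}</p>`;
    }

    renderArticle();
    ```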

  • Treczoks@lemmy.world

    Let’s put it this way: a typical news page pulls a few megabytes of HTML, CSS, its own images, web framework scripts, advertising, etc., all to show about 500-1000 bytes of actual text.
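
    You can sanity-check that ratio yourself with a rough console sketch; this only compares the markup to the visible text, and ignores images, scripts, and CSS, which make the gap far bigger:

    ```typescript
    // Paste into the browser console on a news article (rough estimate only).
    const htmlBytes = new Blob([document.documentElement.outerHTML]).size;
    const textBytes = new Blob([document.body.innerText]).size;
    console.log(`markup: ${htmlBytes} bytes, visible text: ${textBytes} bytes`);
    console.log(`ratio: ${(htmlBytes / textBytes).toFixed(1)}x`);
    ```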

    • qupada@kbin.social

      Worse still, a lot of “modern” designs don’t even bother including that trivial amount of content in the page, so if you’ve got a bad connection you get a page with some of the style and layout loaded, but nothing actually in it.

      I’m not really sure how we arrived at this point; the use of lazy-loading seems to universally make things worse (rough sketch at the end of this comment), but it’s becoming more and more common.

      I’ve always vaguely assumed it’s just a symptom of people having never tested in anything but their “perfect” local development environment; no low-throughput or high-latency connections, no packet loss, no nothing. When you’re out here in the real world, on a marginal 4G connection - or frankly even just connecting to a server in another country - things get pretty grim.

      Somewhere along the way, it feels like someone just decided that pages often not loading at all was more acceptable than looking at a loading progress bar for even a second or two longer (but being largely guaranteed to have the whole page once you get there).
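
      Here’s the rough sketch I mentioned, assuming a page built from empty placeholder divs with made-up data-src endpoints: content is only fetched when a placeholder scrolls into view, and if that fetch fails on a bad connection, the box simply stays empty.

      ```typescript
      // Hypothetical lazy-loaded page: <div class="placeholder" data-src="..."> shells.
      const observer = new IntersectionObserver((entries) => {
        for (const entry of entries) {
          if (!entry.isIntersecting) continue;
          const el = entry.target as HTMLElement;
          observer.unobserve(el);
          fetch(el.dataset.src!) // made-up content endpoint
            .then((res) => res.text())
            .then((html) => { el.innerHTML = html; })
            .catch(() => { /* no retry, no error shown: the reader sees an empty box */ });
        }
      });

      document.querySelectorAll<HTMLElement>(".placeholder[data-src]")
        .forEach((el) => observer.observe(el));
      ```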

    • schzztl@lemmy.nz

      You WILL watch this flashy, totally necessary popup video on 4G and you WILL like it!

    • scarabic@lemmy.world

      Other than a couple of top-tier ad companies like Google and FB, ad companies tend to have really bad technology. They can’t attract the best engineering talent because everyone hates ads and they are a mess to work with. As a result, advertising code is garbage and runs like shit. News sites that are not primarily tech companies are at the mercy of ad companies to run their advertising, and they end up weighed down with third-party crap. Most small ad companies are smash-and-grab efforts by a few ruthless entrepreneurs to vampire away small sites’ revenue or steal a bunch of user data and sell it. They all want to be acquired by a bigger company and walk away with some cash. No one is working on solving problems long term.

      • Resistentialism@feddit.uk

        Fuck it. Might start an ad running business to draw in people who want to run ads and then mysteriously have a severe server fire.

  • bob_wiley@lemmy.world

    A lot of it is the frameworks. Early on, a simple site might be just one small file, or three if they wanted to separate things a bit. Now when you load a page, it fetches other code libraries, fonts, and other nonsense, in addition to all the ads and tracking that surround it. Even a very simple page can end up large and bloated.

    In addition to this, a lot of sites these days are driven by APIs. So when you load the page it may make dozens of additional API calls to get all the resources. Looking at a Lemmy post, it’s making 34 API calls. It looks like they’re making a separate call for each user avatar in the comments, for example.

    You can open the Inspect panel in your browser and see this: go to the Network tab and reload the slow page you’re on. You’ll see everything it’s actually fetching and how long each request takes.
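
    If you’d rather have a number than eyeball the list, something like this in the console sums up what the Resource Timing API recorded (transferSize reads as 0 for cached or opaque cross-origin responses, so it undercounts):

    ```typescript
    // Count the requests and bytes the current page has made so far.
    const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
    const totalBytes = resources.reduce((sum, r) => sum + r.transferSize, 0);
    const apiCalls = resources.filter(
      (r) => r.initiatorType === "fetch" || r.initiatorType === "xmlhttprequest"
    );
    console.log(`${resources.length} requests, ~${Math.round(totalBytes / 1024)} KiB transferred`);
    console.log(`${apiCalls.length} of them are fetch/XHR (API) calls`);
    ```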

    • Rikudou_Sage@lemmings.world

      It looks like they’re making a separate call for each user avatar in the comments, for example.

      The browser does that. How else would you expect to get the images except one by one? It has been that way for as long as images have been supported.

  • zik@lemmy.world

    Run uBlock Origin and turn off all the ads and trackers. Then you can see for yourself how much faster it is.

    The answer is… it depends on the page but in some cases a lot, in other cases not much.
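
    If you want an actual number rather than a feeling, here’s a quick sketch using the Navigation Timing API: run it in the console once with the blocker on and once with it off, and compare.

    ```typescript
    // Times are in milliseconds since the navigation started.
    const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
    console.log(`DOM ready: ${Math.round(nav.domContentLoadedEventEnd)} ms`);
    console.log(`fully loaded: ${Math.round(nav.loadEventEnd)} ms`);
    ```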

    • IverCoder@lemm.ee

      I have been using uBlock for an extremely long time now, and sites seem to have become faster over time even though I have never upgraded my laptop in all these years.

      When I tried to disable uBlock for a day, my laptop forcefully shut down after 30 minutes.

      Software optimization improves performance over time, but that’s outweighed by the tracking crap they put in there.

  • MikeT@lemm.ee

    It is not entirely data farming; a lot of it is due to heavy assets like fonts, frameworks, images, videos, etc. Much of that is downloaded as part of loading the site initially, and then the browser has to render/compute the site’s JavaScript, CSS, and so on.

    Fonts and some JS assets are cached by browsers and CDNs to try to minimize redownloading, but that doesn’t change the fact that the average website today is much heavier than it was back in the ’90s.
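
    For the caching part, this is roughly what it looks like on the server side; a minimal Node sketch (paths and port are made up) where fingerprinted assets get a long-lived Cache-Control header while the HTML itself is always revalidated:

    ```typescript
    import { createServer } from "node:http";

    createServer((req, res) => {
      if (req.url?.startsWith("/assets/")) {
        // Fingerprinted fonts/JS/CSS: the browser can keep these for a year.
        res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
      } else {
        // The page itself: always check with the server before reusing it.
        res.setHeader("Cache-Control", "no-cache");
      }
      res.end("ok");
    }).listen(8080);
    ```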

    See how fast this site loads: https://text.npr.org/

    Or https://tildes.net/ compared to Reddit.

    • TrenchcoatFullOfBats@belfry.rip

      And install the NoScript extension and see exactly how many additional companies are getting your data when you just want to learn 15 fascinating facts about frogs.

  • jjjalljs@ttrpg.network

    Most of that stuff is async, so probably not a lot. Like, you load the page and it sends a request off to pendo, but the page doesn’t wait for that to finish before doing the next thing.

    There are a lot of ways to make pages perform badly, though, especially as they get more dynamic.

    At my job, the home page was loading extremely slowly for a while until we realized a refactor had made the query backing it extremely stupid. It was accidentally doing a separate query for every user associated with every post you could see, and then not even using the results. Oops. Fixing that got a huge performance increase, but it had nothing to do with data tracking.
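
    For anyone who hasn’t run into it, that bug is the classic “N+1 query” pattern. A self-contained sketch (an in-memory map stands in for the database, and all names are made up):

    ```typescript
    type User = { id: number; name: string };

    // Stand-in for the users table; pretend each .get() is a DB round trip.
    const userTable = new Map<number, User>([
      [1, { id: 1, name: "alice" }],
      [2, { id: 2, name: "bob" }],
    ]);
    const posts = [
      { id: 10, authorId: 1 },
      { id: 11, authorId: 2 },
      { id: 12, authorId: 1 },
    ];

    // Slow (N+1): one query per post just to fetch its author.
    const slow = posts.map((p) => ({ ...p, author: userTable.get(p.authorId) }));

    // Fast: collect the ids, fetch them in one query (WHERE id IN (...)), join in memory.
    const ids = [...new Set(posts.map((p) => p.authorId))];
    const authors = new Map<number, User>();
    for (const id of ids) authors.set(id, userTable.get(id)!);
    const fast = posts.map((p) => ({ ...p, author: authors.get(p.authorId) }));

    console.log(slow.length === fast.length); // same result, far fewer round trips
    ```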

  • PeterPoopshit@lemmy.world

    Probably not. I think the timeouts are set up differently or something. Back in the day, even if you had 1.2 kb/s dial-up internet, you could reliably load webpages and all their CSS if you were patient. Nowadays, if your internet speed dips even a little, everything stops working, and being patient about it only results in error screens.