I grew up during the dial-up era of the internet and remember how insane it was each time the technology improved: broadband, DSL, fiber, etc.
I wouldn’t expect pages to load instantly, but I have to imagine all the data farming is causing sites to be extremely bogged down.
A lot of it is the frameworks. Early on, a simple site might be only one small file, or three if they wanted to separate things a bit. Now when you load a page, it fetches code libraries, fonts, and other nonsense, on top of all the ads and tracking that surround that. Even a very simple page can end up large and bloated.
In addition to this, a lot of sites these days are driven by APIs, so when you load the page it may make dozens of additional API calls to fetch all the resources. Looking at a Lemmy post, it’s making 34 API calls. It looks like they’re making a separate call for each user avatar in the comments, for example.
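That one-call-per-avatar pattern is the classic “N+1 requests” problem. A minimal sketch of the request math (hypothetical numbers, not Lemmy’s actual API):

```javascript
// Sketch: how many requests a post page triggers if avatars are fetched
// one by one vs. through a single batched endpoint (hypothetical design).
function requestsForPost(commentCount, batched) {
  const postCall = 1; // one call for the post + comment text
  // unbatched: one avatar request per comment;
  // batched: one hypothetical /avatars?ids=... call for all of them
  const avatarCalls = batched ? 1 : commentCount;
  return postCall + avatarCalls;
}

console.log(requestsForPost(30, false)); // 31 separate requests
console.log(requestsForPost(30, true));  // 2 requests with batching
```

The arithmetic is trivial, but it shows why comment-heavy pages multiply requests so quickly.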
You can see this yourself: open your browser’s developer tools (Inspect), go to the Network tab, and reload the slow page. You’ll see everything it’s actually fetching and how long each request takes.
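The same data the Network tab shows is also exposed to scripts through the Resource Timing API, so you can summarize it from the console. A small sketch (in a browser you’d pass in `performance.getEntriesByType("resource")`; the sample entries below are made up):

```javascript
// Summarize resource-timing entries, slowest first.
// In a browser console: console.table(summarize(performance.getEntriesByType("resource")))
function summarize(entries) {
  return entries
    .map(e => ({ name: e.name, ms: Math.round(e.duration) }))
    .sort((a, b) => b.ms - a.ms);
}

// Made-up sample entries standing in for real timing data:
console.log(summarize([
  { name: "avatar.png", duration: 120.4 },
  { name: "app.js", duration: 310.2 },
]));
```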
The browser does that. How would you expect it to get the images except one by one? It has worked that way for as long as images have been supported.
There is a technique where you put all the images side by side in one file and then reposition it in the viewport to show different images. Usually this is done for icons, since profile pics change too often and the combined image would need to be rebuilt server-side each time.
that’s image sprites.
Nowadays, icons are very often SVGs embedded right in the HTML or JS.
Then you can put all possible images anyone could ever upload as a profile pic into one mega-image, and assign each user a 3 MB identifier specifying which subsection of the image to view for that user.
You know, I bet crypto mining rigs would be great at generating all those images.
Yeah that’s fair, true for other stuff though.