Why are all the sites on the internet that are just ordinary webpages, like AntWiki and the Formiculture forums and my friends' simple websites, so SLOW? Social media, YouTube, anything by a big company is fast; anything homebrew is slow.
@futurebird My personal observations are that homebrew websites using a CMS or BBS/wiki tend to be sluggish, while those with static HTML are snappy. So it might have to do with backend resources (CPU, storage) rather than network bandwidth.
@futurebird you might find it gratifying to learn how to use the “network” tab on your web browser’s developer tools to inspect this. I bet 90% of the time it’s because the page is megabytes of code that downloads more megabytes of code that then downloads more megabytes of code
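A rough way to see the first piece of this without even opening DevTools is to time the fetch of the top-level HTML and count its bytes. A minimal sketch in Python's stdlib (the URL in the example comment is just illustrative, and note this only measures the initial document, not the scripts and images it goes on to download, which is often where the real weight hides):

```python
import time
import urllib.request

def human_size(n_bytes: float) -> str:
    """Render a byte count in a readable unit (B/KB/MB/...)."""
    for unit in ("B", "KB", "MB", "GB"):
        if n_bytes < 1024:
            return f"{n_bytes:.0f} {unit}"
        n_bytes /= 1024
    return f"{n_bytes:.0f} TB"

def fetch_stats(url: str) -> tuple[float, int]:
    """Time a single GET of the top-level document and count its bytes.
    This deliberately ignores the JS/CSS/images the page loads afterwards;
    the browser's network tab shows those cascading downloads too."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)

# Example usage (any URL works):
# elapsed, size = fetch_stats("https://danluu.com/slow-device/")
# print(f"{human_size(size)} in {elapsed:.2f}s")
```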
I, like many others, largely blame standard web dev practices for much of this situation. A very, very long read on this is https://danluu.com/slow-device/
@futurebird
Big sites use a CDN (content delivery network) such as Akamai or Cloudflare.
They have servers all over the world, and cache web pages for their clients. Wherever you are, you're close to such an endpoint and the download is really quick.
Good news is that you can use a CDN even for a small website. If you don't have a lot of traffic it may be completely free.
The drawback, of course, is that there's another layer between you and your reader, one that sees the traffic.
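You can often tell whether a site sits behind a CDN just by looking at its response headers: Cloudflare adds headers like `cf-ray` and `cf-cache-status`, CloudFront adds `x-amz-cf-id`, and many caches add `x-cache`. A small sketch that probes for these, assuming the common header names below (the exact set any given CDN emits can vary):

```python
import urllib.request

# Response-header names that commonly betray a CDN or cache in front of a
# site. HTTP header names are case-insensitive, so we compare lowercased.
CDN_HINTS = {
    "cf-ray": "Cloudflare",
    "cf-cache-status": "Cloudflare",
    "x-amz-cf-id": "Amazon CloudFront",
    "x-served-by": "Fastly",
    "x-cache": "a caching proxy (CloudFront/Fastly/Varnish-style)",
}

def cdn_hints(headers: dict[str, str]) -> list[str]:
    """Return the CDN names suggested by a set of response headers."""
    lowered = {k.lower() for k in headers}
    # dict.fromkeys de-duplicates while preserving order.
    return list(dict.fromkeys(
        name for h, name in CDN_HINTS.items() if h in lowered))

def probe(url: str) -> list[str]:
    """Fetch only the headers of a page and report CDN hints."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return cdn_hints(dict(resp.headers))

# Example usage:
# print(probe("https://www.cloudflare.com/"))
```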
@futurebird If you move in more international circles, intercontinental connections definitely play a role.
Big tech can afford to drop datacenters everywhere, so they're reachable within a couple of milliseconds from anywhere on the planet, without having to go through the more congested intercontinental routes.
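The physics here is easy to sanity-check: light in glass fiber travels at roughly two thirds of its vacuum speed, about 200 km per millisecond, so distance sets a hard floor on round-trip time before congestion or handshakes add anything. A back-of-envelope sketch (the distances in the comments are rough illustrative figures, not precise measurements):

```python
# Back-of-envelope: best-case round-trip time over fiber.
# Light in glass fiber travels at roughly 2/3 of c, i.e. about
# 200,000 km/s, or ~200 km per millisecond one way.
FIBER_KM_PER_MS = 200.0

def min_rtt_ms(distance_km: float) -> float:
    """Physical lower bound on round-trip time for a given distance.
    Real RTTs are higher: routes aren't straight lines, and routers,
    congestion, and TCP/TLS handshakes all add on top."""
    return 2 * distance_km / FIBER_KM_PER_MS

# Illustrative (rough great-circle distances):
# min_rtt_ms(50)    -> 0.5 ms   nearby CDN edge: well under a millisecond
# min_rtt_ms(5800)  -> 58 ms    roughly New York <-> Paris: tens of ms floor
```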