Am I the only one who prefers low-quality media over high-quality?

I have a very slow Internet connection (5 Mbps down, and even less for upload). Given that, I always download movies at 720p, since the files are smaller, which means I can download them more quickly. Also, I don't notice much of a difference between 1080p and 720p. As for 4K, since I don't have a screen that can display it, I consider it one of the biggest disk space wasters.

Am I the only one who has this opinion?

TheHobbyist ,

To be fair, resolution alone is not enough to measure quality. The bitrate plays a huge role. You can have a high-resolution video that looks worse than a lower-resolution one if the lower one has a higher bitrate.
In general, many videos online claim to be 1080p but still look like garbage because of the low bitrate (on YouTube, for example). If you go for a high-bitrate video, you should be able to tell pretty easily: the hair, the fabric, the skin details, the grass, everything can be noticeably sharper and crisper.

Edit: so yeah, I agree with you, because often both are low bitrate anyway...

taaz ,

Great wizard of the bitrates, grant me your wisdom...

I can't wrap my head around bitrate - if I have a full HD monitor and the media is in full HD, then how is it that the rate of bits can make so much difference?
If each frame of the media contains exactly 1920 × 1080 pixels beamed into their respective positions on the display, then how can there be a difference? Does it have something to do with compression?

TheHobbyist ,

Exactly, this is about compression. Just imagine a full HD image, 1920 × 1080, with 8 bits of color for each of the 3 RGB channels. That works out to 1920 × 1080 × 8 × 3 = 49,766,400 bits, or roughly 50 Mb (about 6 MB). This is uncompressed. Now imagine a video at 24 frames per second (typical for movies): that's almost 1200 Mb per second. For a 1h30 movie, that would be an immense amount of storage, just compute it :)
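Quick back-of-the-envelope version of that computation, plain arithmetic with nothing assumed beyond the figures above:

```python
# Rough uncompressed size of a 90-minute 1080p / 24 fps movie, 8 bits per RGB channel
width, height = 1920, 1080
bits_per_pixel = 8 * 3            # 8 bits for each of the R, G and B channels
fps = 24                          # typical for movies
duration_s = 90 * 60              # 1h30 in seconds

bits_per_frame = width * height * bits_per_pixel   # ~49.8 million bits (~6 MB)
bits_per_second = bits_per_frame * fps             # ~1.2 Gb per second
total_bytes = bits_per_second * duration_s / 8

print(f"{total_bytes / 1e9:.0f} GB uncompressed")  # roughly 800 GB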

To solve this, movies are compressed (encoded). There are two types: lossless (where the information is exact and no quality is lost) and lossy (where quality is degraded). Lossy compression is the common choice because it leads to the biggest storage savings. For a given compression algorithm, the less bandwidth you allow it, the more it has to sacrifice video quality to meet your requirements. And this is what bitrate refers to, as in the sketch below.
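To see how directly bitrate drives file size, here's a small sketch; the example bitrates are illustrative figures I'm assuming, not taken from any particular release:

```python
# How bitrate translates directly into file size for a 90-minute movie
duration_s = 90 * 60

def file_size_gb(bitrate_mbps: float) -> float:
    """Size of a stream encoded at a constant bitrate, in gigabytes."""
    return bitrate_mbps * 1e6 * duration_s / 8 / 1e9

# Illustrative bitrates only:
for label, mbps in [("low-bitrate 1080p web rip", 3),
                    ("decent 1080p encode", 8),
                    ("Blu-ray-ish", 25)]:
    print(f"{label}: ~{file_size_gb(mbps):.1f} GB at {mbps} Mbps")
```

Same resolution in all three cases, yet the file sizes (and the visible quality) differ by an order of magnitude, which is why bitrate matters more than the 720p/1080p label.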

Of note: different compression algorithms are more or less efficient at storing data within the same file size. AV1, for instance, will give you significantly higher video quality than h264 at the same file size (or bitrate).
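If you want to see this for yourself, something along these lines should work, assuming ffmpeg is installed with the libx264 and libsvtav1 encoders; the input file name and the 3 Mbps target are just placeholders:

```python
# Rough sketch: encode the same source at the same target bitrate with two codecs,
# then compare how they look. "input.mkv" and 3 Mbps are purely illustrative.
import subprocess

for codec, out in [("libx264", "h264_3mbps.mkv"), ("libsvtav1", "av1_3mbps.mkv")]:
    subprocess.run(
        ["ffmpeg", "-i", "input.mkv", "-c:v", codec, "-b:v", "3M", "-c:a", "copy", out],
        check=True,
    )

# Both outputs land around the same size (same bitrate); the AV1 one should simply
# retain more detail.
```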
