zanetti
Well-Known Member
Thread Starter
- Joined
- Apr 6, 2021
- Messages
- 681
- Likes
- 326
Could be providers encode from an HDR source down to SDR. Could be. The reason I suspect bitrate is that 720p Blu-ray rips of the same content don't suffer from banding; it's almost gone there. I replaced a tonne of such 720p BR rips with 1080p versions, both AVC and HEVC, from AP, HMAX etc., and although I saved a tonne of HDD space, now I have to suffer horrible banding in dark scenes, god forbid there's fog in the scene, lol. Of course, it varies from case to case: some are better, some are worse.

I've lowered the brightness in my players, so it's maybe 10% better overall, but if I get up from the sofa and take a closer look... it's just a sea of dancing squares, lol. I'll experiment more with 2160p versions, but those are usually HDR, so I can't directly compare them with the 1080p SDR versions of the same content; that would be apples to oranges.

And to be clear, I'm not being picky, I never was. I don't need ultrasharp edges and HDR and whatnot; I'm fine with lower quality as long as there's no banding and the overall picture isn't all mushy and, well... squarey, lol. Anyway, I'll experiment more and provide screenshots so we all know what the deal is that I'm bitching about, lol. Cheers.

But are you sure what you describe is an issue with compression and not simply the difference between HDR videos and SDR?
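For what it's worth, whether the cause is HDR-to-SDR conversion or bitrate starvation, the mechanism behind banding is the same: a smooth dark gradient (like fog) gets squeezed into too few distinct levels, and the steps between levels become visible bands. Here's a toy sketch of that effect; it's just an illustration with made-up numbers, not any provider's actual encoding pipeline:

```python
def quantize(values, levels):
    """Snap 0.0-1.0 values onto `levels` evenly spaced discrete steps."""
    step = 1.0 / (levels - 1)
    return [round(v / step) * step for v in values]

# A smooth dark gradient, like fog in a night scene (luma 0.0 to 0.2).
gradient = [i / 999 * 0.2 for i in range(1000)]

fine = quantize(gradient, 1024)   # plenty of levels: the ramp stays smooth
coarse = quantize(gradient, 16)   # few effective levels: visible bands

print(len(set(fine)), "distinct steps vs", len(set(coarse)))
```

With only 16 levels across the whole range, the dark 0.0-0.2 portion collapses to a handful of flat steps, which is exactly what those "dancing squares" in dark scenes look like; a higher bitrate (or dithering/debanding filters) spreads more distinct values across the same ramp.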