I'm sorry fella, but you're wrong.
Let's look at your standalone Blu-ray player: say the input is H264. The HDMI output is a digital pixel address/data stream, NOT THE H264 input.
Well I imagine there is no address, just the pixel data stream, but otherwise that's what I was saying.
In other words, it tells the pixel at each address its luminance and hue, each one in turn, in a digital stream.
Not an expert here. I know a lot of displays use dRGB (digital Red/Green/Blue), which is where 24 bits/pixel came from (8 bits/color). No idea if that's what HDMI passes, though. I expect 3 Gbit/sec is the right ballpark for HDMI, since it boils down to describing how to show each pixel. I was curious, so I found this on photo.stackexchange.com. Not terribly relevant, but I thought it was interesting.
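To put a rough number on that ballpark, here's a quick back-of-envelope sketch (assuming 1080p at 60 Hz and 24 bits/pixel; real HDMI also carries blanking intervals and link overhead, so the actual wire rate is a bit higher):

```python
# Back-of-envelope: raw pixel data rate for uncompressed 1080p60 video.
# Assumptions: 1920x1080 resolution, 24 bits/pixel (8 per color), 60 frames/sec.
width, height = 1920, 1080
bits_per_pixel = 24            # 8 bits each for R, G, B
frames_per_second = 60

bits_per_second = width * height * bits_per_pixel * frames_per_second
print(f"{bits_per_second / 1e9:.2f} Gbit/s")   # ~2.99 Gbit/s of pixel data
```

So roughly 3 Gbit/sec of pixel data for 1080p60, before any HDMI signaling overhead.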
You can alter the Contrast/Saturation/Gamma/Brightness in your player's settings. That alone tells you it's NOT the original info.
It is an interpretation of the H264 stream; the quality sent out to the HDMI port depends on what filters (e.g. LAV) the player uses.
It is NOT the original H264 information.
I never meant to say that it was the original H264 information. But the way I see it, the "original" information is the pre-encoded content. That is, JPEG is a lossy compression format, as is H264. They are both degraded versions of the original image or digital movie. When you decode JPEG or H264, the result is as close to the original as you can possibly get. It isn't inherently degraded*.
Now if you post-filter (sharpen, change the contrast, etc.), then yes, it is different from the original. But that (in my opinion) is more like saying that if I decode a color JPEG, convert it to black and white, and create another JPEG from the black-and-white image, I have degraded the image. Indeed I have, so don't post-process.
*Fair point on LAV filters/MadVR type stuff. There are more accurate ways to decode. But by your logic, isn't what is shown on your TV *always* degraded from the H264?
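To make the JPEG analogy concrete, here's a rough sketch (assuming Pillow/NumPy and a hypothetical test file photo.jpg): decoding once just reverses the compression, but re-encoding the decoded pixels adds a second round of loss.

```python
# Rough illustration of generational loss from re-encoding a JPEG.
# Assumes Pillow and NumPy are installed and a test image exists at photo.jpg.
from PIL import Image
import numpy as np

first_gen = np.asarray(Image.open("photo.jpg").convert("RGB"), dtype=np.int16)

# Re-encode the decoded pixels as a new JPEG, then decode that copy.
Image.fromarray(first_gen.astype(np.uint8)).save("reencoded.jpg", quality=85)
second_gen = np.asarray(Image.open("reencoded.jpg").convert("RGB"), dtype=np.int16)

# The second generation is not pixel-identical to the first decode:
# each extra lossy encode degrades the image a little more.
diff = np.abs(first_gen - second_gen)
print("pixels changed:", int((diff > 0).sum()), "max error:", int(diff.max()))
```

Which is the same reason I'd rather keep the existing H264 stream than re-encode from the decoded video.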
It's a bit like using a CD player, taking the audio out and then re-encoding it to digital (that's the closest analogy; it's not an exact one).
Not sure how you mean this. If you mean the analog audio (say, from the speaker outputs), then you are correct that the information is degraded, but that isn't a proper comparison, since the degradation comes from digital-to-analog conversion and then analog-to-digital conversion (DAC/ADC). But you can rip a CD to FLAC or another lossless format and not have a degraded signal, which is basically what I'm saying, since HDMI passes digital data.
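As a quick sanity check of that lossless claim, here's a sketch (assuming the soundfile package and a hypothetical 16-bit PCM rip at track.wav): a WAV-to-FLAC round trip comes back bit-identical, unlike an analog pass through a DAC/ADC.

```python
# Sketch: a lossless (FLAC) round trip returns the exact same samples.
# Assumes the soundfile package (libsndfile) and a 16-bit PCM file track.wav.
import numpy as np
import soundfile as sf

samples, rate = sf.read("track.wav", dtype="int16")

# Encode to FLAC, then decode it again.
sf.write("track.flac", samples, rate)
decoded, _ = sf.read("track.flac", dtype="int16")

# Bit-identical: no generation loss, unlike a DAC -> ADC round trip.
print("identical:", np.array_equal(samples, decoded))
```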
Again, I'm not trying to argue for or against the original idea. I was just trying to make a technical point: the HDMI data is not inherently degraded. I purchased AnyDVD and CloneBD because, while the 3 Gbit/sec data isn't degraded, the encoded (but decrypted) H264 data is certainly easier to deal with, and it doesn't lose quality to a re-encoding step.
FYI, if you have a way to get component cabling (like from a standalone Blu-ray player or cable box), there is always the option of using an analog capture card (see, for example). But it's a huge hassle, and component is analog, so that route will be lossy in terms of the analog/digital conversion as well as the encoding quality.