
Kodi project to enable HDR in MPC to launch any movie

Interesting. A whole lot depends on the "this display's peak nits" value in the madVR settings, though. The default value of 400 nits is too high, IMHO. Try 120 nits. That should make the image brighter and help shadow detail. Also, for a comparison like this, I'd suggest disabling the "contrast" and "brightness" controls and "gamma processing" in madVR's "color & gamma" settings page.
My display's peak value is 418 nits in HDR, according to the review test sites.
 
I'm sure that's true, but when comparing different solutions, it would make sense to tweak the options in such a way that the solutions produce somewhat similar results. If one solution has a noticeably brighter gamma response than another, that makes it hard to compare them.

One could argue that doing HDR -> SDR conversion should always aim for 100 nits, because that's what SDR movies are usually mastered to by movie studios. So if your own solution tries to get near to the SDR Blu-Ray look, then your own solution seems to aim for around 100 nits. And the Samsung UHD player seems to do the same thing. So it would probably make sense to set madVR to 120 nits, too (that's the lowest value madVR supports), to get comparable results. By default it seems that madVR doesn't try to match the SDR Blu-Ray look, but instead it tries to maintain a bit more of the HDR benefits, but that makes the image darker overall, and you need a well calibrated display to still get all the shadow detail with that configuration.
 

I agree, I'll try lowering the madVR nits value. The Samsung shots were with the TV in HDR mode and the camera filming. I turned off the dynamic picture option in the Samsung picture settings, or the colors would pop more. MPC-BE was playing a 4K UHD capture of the same movie. madVR was playing the same file; since HDR switching does not work on AMD cards to trigger the TV, I used madVR's SDR method.

I felt HDR sucked at first because I was expecting the intensity of madVR, but I've watched HDR movies on in-store setups and they look like mine. So I've come to believe HDR is trying to duplicate what your eye sees. Sometimes a scene with trees, a highway, and sky looks normal, and normal in the real world is boring but accurate. It's the same with HDR. Planet Earth 2 did look wow, but movies like The Magnificent Seven looked dull. John Wick 2 had wow factor, but Aliens was dull.

Playing with levels, I found I can duplicate every movie with contrast, brightness, and saturation adjustments on the PC in non-HDR mode.

I own a player, an amp, an HDR TV, and even an HDR-capable AMD RX 470 card. I have over 40 UHD movies, 3 of them still unopened... I do HDR the official way and enjoy HDR on Netflix through the Samsung player. The PC was an experiment to see if a PC can process HDR before sending it, instead of the usual way, where metadata tells the TV to turn on HDR.

Either way, many rips of movies I've tested give wonderful results with the Kodi HDR project adjustments. HDR is still young, and many haven't crossed over yet. The PC way is way too expensive, since you have to upgrade everything to meet HDR requirements; hence the Kodi HDR project... it's cheaper to buy a 250 dollar HDR player and the movies.

DeUHD sounds great, but that's way too much. You're better off buying a 4K Mini from Blackmagic and doing your whole collection at the bit rate you want. There's more to it, but I'm not going to share how to pirate UHD, even though I could do any movie on the market. Too old to do this stuff anymore. You know time is restrictive when I have 3 of the latest UHD movies still wrapped up...

Thank you for your comments... I will see if I can disable gamma and lower the nits value in madVR. If I could solve the dark issue, madVR would be great. Many have complained about the crushed blacks.
 
Oh, I thought you had a Samsung UHD player and an SDR display and the UHD player converted HDR to SDR for you. The results you got from your TV in HDR mode are interesting. Colors seem very muted. I think too muted. I've been told every display handles HDR differently, though. So what you got with your Samsung might look quite different to what e.g. a Sony TV would do.

BTW, switching your TV to HDR with AMD *does* work with newer madVR builds, but there are some requirements: You must activate "use D3D11 for presentation", you must set your display to 10bit in the madVR "device" configuration, and you need to switch your media player to fullscreen (no start menu visible etc). Finally, the madVR HDR settings need to be set to "let madVR decide" or "passthrough". If you do all that, then madVR should be able to switch your TV into HDR mode.

I agree that the DeUHD price is pretty high. I do wish we had a solution from redfox. Maybe some day...

FWIW, from what madshi said on doom9, madVR's HDR -> SDR conversion leaves all pixels lower than 100 nits (or so) "untouched". However, the pixels are scaled according to your peak nits configuration. So e.g. if the UHD Blu-Ray asks for a pixel to be displayed at 50 nits, and if your TV really measures 400 nits, then if you tell madVR so, the pixel should measure exactly 50 nits. The purpose of this approach is that pixels in the typical SDR brightness range (0-100 nits) are not compressed by madVR at all. As a result, shadow detail should *not* be crushed. Or if it is crushed, then the UHD Blu-Ray would also look crushed on a perfect 10,000 nits display, because that display would show all pixels from 0-100 nits exactly the same way madVR does. madVR only compresses the pixels which are brighter than 100 nits.
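The behavior described above could be sketched roughly like this. To be clear, this is my own toy illustration with a made-up roll-off curve, NOT madVR's actual algorithm; only the general shape (pass-through below the knee, compression above it) matches the description:

```python
import math

def tone_map(nits, display_peak=400.0, knee=100.0, source_peak=10000.0):
    """Map an absolute HDR brightness (in nits) onto a display with a
    given peak. Below the knee the value passes through unchanged; above
    it, the remaining range is compressed with a simple logarithmic
    roll-off (invented here purely for illustration)."""
    if nits <= knee:
        return nits  # SDR-range pixels are reproduced exactly
    # Compress [knee, source_peak] into [knee, display_peak]
    t = math.log(nits / knee) / math.log(source_peak / knee)
    return knee + (display_peak - knee) * t

print(tone_map(50.0))     # 50.0  -> shadow detail untouched
print(tone_map(10000.0))  # 400.0 -> source peak lands on the display peak
```

Note how a 50 nits pixel stays at exactly 50 nits no matter what the display peak is; only highlights above 100 nits get squeezed.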

In contrast to this, most HDR TVs today apply some gamma modification curve to *all* pixels, regardless of their brightness. So even a 50 nits pixel gets modified somehow. This can make shadow detail look "better" on an HDR TV, but it's actually not reproducing the pixels "correctly", according to what the UHD Blu-Ray is asking for.

The whole concept is somewhat weird, though. UHD Blu-Ray defines an absolute nits value for each pixel. SDR content doesn't do that at all. As a result, SDR content is shown much, much brighter on today's displays. Studios master SDR content for 100 nits peak white, but if you play an SDR movie on your 400 nits display, peak white will actually be much nearer to 400 nits, which in theory is far too bright. But that's what we're used to these days...
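For context on that "absolute nits value for each pixel": UHD Blu-Ray stores brightness with the SMPTE ST 2084 (PQ) transfer function. A quick back-of-the-envelope calculation (the standard formula from the spec, nothing madVR-specific) shows that 100 nits, i.e. SDR peak white, already sits around half of the PQ signal range:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal level.
# The constants come straight from the ST 2084 spec.

m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits):
    """Return the PQ signal level (0..1) for an absolute luminance in nits."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(round(pq_encode(100), 3))    # SDR peak white: roughly half the signal range
print(round(pq_encode(10000), 3))  # 1.0 -> the format's absolute ceiling
```

So the whole 0-100 nits SDR range that madVR leaves untouched occupies about the bottom half of the PQ signal; everything above that is highlight range.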

Well, at least that's my understanding.
 
In the picture settings of the Samsung UHD player, if I enable the dynamic option, all those muted colors look deep. I disabled all enhancements to give you a true 1:1 HDR image in HDR mode. It's almost impossible to film HDR with a camcorder pointed at the screen. I will try your tips in madVR. Thank you.
 
My TV is a Sony XBR850C 64 inch... Netflix 4K Nicolas Cage in HDR looks amazing...
 
To help out XDR users, here is my latest test with XDR. These are the settings for this movie. Special thanks to max for the test sample. Save the picture and view it full screen to see the dark detail. XDR really works!

[Attachment: Untitled-1.jpg]
 
[Attachment: Untitled-2.jpg]

The water appears REALLY BRIGHT on my screen. It's hard to show this in pictures.
 
Tried disabling gamma and setting the nits value to 150, and it's fucking amazing! madVR looks beautiful at 150 nits.
 
The only downside I've seen is dithering banding because of the forced D3D11 8bit mode. The custom presenter supports 10bit fullscreen; madVR does not, so maybe another madVR update may add a 10bit D3D11 fullscreen mode. Until then you will see some banding in skies.
 
Never mind. Pressing Ctrl+J shows that if I double-click on the window, it switches to 10bit and the banding goes away. 8bit is only used in windowed mode. I still have to double-click the window at the beginning to force it into 10bit fullscreen mode.
 
Not to shoot down XDR, but I have to give madVR credit, as I finally figured out how to get the HDR light levels to show correctly on my HDR TV. You don't even need HDR metadata for this to work. This also correctly shows Blu-rays as well. Very nice results... The OSD says SDR, but it's really showing the full HDR image on my TV. Some can use the pass-through, but if that does not work, try my settings to enjoy explosive light levels.
The HDR slider on the Windows desktop is off! Forcing the nits value opens up the image on the TV with AMD video cards.

[Attachments: 1.jpg, 2.jpg]
 
Hmmmm... Not sure why that happens. Usually 10bit is supported by madVR in fullscreen exclusive mode, or in Windows 10 also in fullscreen windowed mode, but in fullscreen windowed mode only with Nvidia/AMD GPUs, not with Intel.

Banding should not occur even in 8bit, unless you have dithering disabled in the madVR settings? With 8bit and dithering you should get no banding, but you do get higher noise, due to the stronger dithering. Heck, you can even lower your display bitdepth to 3 bits in madVR (you might want to try it just for fun), and you still get no banding! But *very* high noise.
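The "dithering instead of banding" effect is easy to demonstrate with a toy example (my own sketch with crude random dither, not madVR's actual dithering, which uses much smarter algorithms): quantize a smooth ramp to 3 bits with and without dither, then look at the output.

```python
import random

def quantize(value, bits, dither=False):
    """Quantize a 0..1 value to `bits` of precision, optionally adding
    about 1 LSB of random noise before rounding (crude dither)."""
    levels = (1 << bits) - 1
    if dither:
        value += (random.random() - 0.5) / levels
    return max(0, min(levels, round(value * levels))) / levels

random.seed(0)
ramp = [i / 999 for i in range(1000)]  # smooth 0..1 gradient (a "sky")

plain = [quantize(v, 3) for v in ramp]
dithered = [quantize(v, 3, dither=True) for v in ramp]

# Without dither the ramp collapses into 8 flat bands with 7 hard edges
# (visible banding); with dither the hard edges dissolve into noise that
# averages out to the original gradient.
print(len(set(plain)))  # 8 distinct output levels at 3 bits
steps_plain = sum(a != b for a, b in zip(plain, plain[1:]))
steps_dith = sum(a != b for a, b in zip(dithered, dithered[1:]))
print(steps_plain, steps_dith)  # a few hard steps vs. many tiny ones
```

The dithered version has far more (but far smaller) transitions, which the eye reads as fine noise rather than contour lines.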

I'm glad to hear you like madVR's HDR -> SDR conversion at 150 nits. It does consume quite a bit of GPU power, though. Maybe we should tell madshi to change the default value from 400 nits down to 150 nits, or at least something nearer to that.
 
P.S.: Just had a look at your madVR setup video. Some questions/suggestions:

1) Is there a specific reason why you are blocking LAV Video? It should work the way you configured it, but it should also work with an up-to-date LAV version.
2) You have automatic display mode switching enabled in MPC and disabled in madVR. It might be worth a try to do it the other way round, to give madVR more control. Maybe that could help with the 10bit issues you experienced?
3) You have activated the MPC option "show controls in fullscreen mode". Maybe disabling that could help with the 10bit issues you experienced?
4) If you display 4K on a 4K TV with a 4K display mode, the image upscaling/downscaling options shouldn't matter, because up/downscaling isn't needed. But for 1080p Blu-Ray playback, I'd suggest disabling the "upscaling refinement" options and instead using better algorithms for image up/downscaling, and maybe also for chroma upscaling. DXVA scaling algorithms usually have rather bad quality (but they're fast!). If your GPU can handle it, try NGU Sharp Medium or High quality for image upscaling and SSIM1D100 for image downscaling, and maybe something cheap like Cubic for chroma upscaling. That should give you much nicer 1080p playback quality. Of course, upscaling will only be performed if you use a 4K display mode.
5) Error diffusion is nice, but very expensive. Using ordered dithering is much faster and looks almost the same. Maybe if you save performance there, you can use better image upscaling/downscaling algos?
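As an aside, the "Cubic" scaler mentioned in point 4 is just polynomial interpolation between neighboring pixels. Here's a toy 1D sketch of the idea using a Catmull-Rom cubic kernel; this is my own illustration of cubic interpolation in general, not madVR's actual implementation:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Catmull-Rom cubic interpolation between p1 and p2 (t in 0..1),
    using p0 and p3 as the outer neighbors."""
    return 0.5 * (2 * p1
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

def upscale_1d(samples, factor):
    """Upscale a 1D signal by an integer factor with cubic interpolation,
    clamping at the edges (a toy stand-in for 2D image scaling)."""
    n = len(samples)
    out = []
    for i in range(n - 1):
        p0 = samples[max(i - 1, 0)]
        p1, p2 = samples[i], samples[i + 1]
        p3 = samples[min(i + 2, n - 1)]
        for k in range(factor):
            out.append(catmull_rom(p0, p1, p2, p3, k / factor))
    out.append(samples[-1])
    return out

row = [0.0, 0.2, 0.9, 1.0]   # one row of "pixels"
out = upscale_1d(row, 2)
print(out)
```

The original samples survive unchanged at every other output position, and the new in-between values follow a smooth curve through the neighbors, which is why cubic beats nearest/bilinear but still can't invent detail the way NGU does.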

For your interest, madshi has posted a screenshot comparison with the Passengers Blu-Ray vs UHD Blu-Ray here, showing differences between various scaling algorithms:

http://www.avsforum.com/forum/24-di...or-mini-shootout-thread-607.html#post54983374

Upscaling the 1080p Blu-Ray with NGU Sharp comes surprisingly close to the UHD Blu-Ray! Compared to that, Lanczos looks awful. And Lanczos is usually better than DXVA scaling!
 
I'll try some of these. Thank you for all your expert advice.
 
Good news for AMD HDR card users and NVIDIA users with HDR who are still having issues. I updated Windows 10 last night and voila, perfect HDR TV switching and perfect extreme light levels on my HDR TV now! Here are my settings... I'm sure you have different ones, but this may help many out. "Purposes" was spelled wrong, but who cares. Later.

 
I noticed The Fate of the Furious looks better at 180 nits but washed out at 120 nits, while the Apes movie looks great at 120... So maybe a pixel shader XDR.Cal option would be very useful for fast non-HDR cards in madVR... What do you guys think about this? Now that HDR switching works on both cards I would never use it, but there may still be guys with non-HDR TVs who would... call it MadNITS lol
 
Thanks for the great info in this thread, gereral1.

Quick question: can you only set this up on Windows 10, or could you use Windows 7? I'm about to rebuild my HTPC, as I'm having sound dropout issues with my Atmos setup and want a clean install. The HTPC is hooked up to my Denon AVR 2200w and LG 55 OLED.

If it has to be Windows 10, does it need to have the Creators Update applied?

Thanks in advance
 