• AnyStream is currently having some DRM issues; Netflix is not available in HD for the time being.
    Situations like this will always happen with AnyStream: streaming providers continuously improve their countermeasures while we try to catch up; it's an ongoing cat-and-mouse game. Please be patient and don't flood our support or forum with requests; we are working on it 24/7 to get it resolved. Thank you.

Dolby Vision

For me that's not so relevant; to watch a movie in DV every now and then, the mp4muxer method is enough. For everything else I use the Zidoo Z9s, and everything plays fine on it. However, no media streamer can handle DV yet (it depends on the chip; it will probably only be realized with the Realtek 1395).

With the Z9s, the DV streams are then played back as plain HDR, and the LG OLED turns the static HDR into dynamic HDR. And as I said, if it really has to be DV, then just use the LG media player.
 
Yeah, but £300 is quite steep.
It is the only device that does everything you want: HD audio, DV, and no disc burning.
Compared to a "real" Oppo UHD player, it is a bargain. ;)

I personally don't care about DV, as it is IMHO not needed with currently available display technology (and I somehow doubt this will change anytime soon).
 
What about Sammy's Q90R or Vizio's P-Series Quantum X? They can reach nearly 2,000 and 3,000 nits of brightness respectively. HDR10 maxes out at only 1,000 nits, whereas Dolby Vision goes to 10,000 nits. That makes great use of their displays, maximizing both contrast ratio and the bright-room viewing experience :)
 
I'm not sure where you got the idea that HDR10 maxes out at 1,000 nits, but that's not true.

Sent from my SM-G975U using Tapatalk
 
Can you provide proof of this? HDR10 allows 1,000 nits, HDR10+ allows 1,000-4,000 nits, and Dolby Vision allows 10,000 nits.
https://www.howtogeek.com/364609/what-is-hdr10/amp/
<<HDR10+ doesn’t wholly match Dolby Vision. Dolby Vision offers 12-bit color while HDR10+ sticks to 10-bit color. And, while HDR10+ boasts 4,000 nits of brightness, up from HDR10’s 1,000 nits, Dolby Vision still offers up to 10,000 nits.>>

hopefully this site is misinformed!

Edit: I found a site saying that technically HDR10 can get to 4,000, but content is only mastered at 1,000, whereas Dolby Vision gets mastered in the several thousands. So basically, whoever you believe, it seems DV will make use of new TVs' added nits where HDR10 will not?
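One detail that helps untangle the 1,000 vs 10,000-nit debate: both HDR10 and Dolby Vision use the same SMPTE ST 2084 (PQ) transfer function, and that curve is defined up to 10,000 nits by design, so the lower figures are mastering choices rather than hard format limits. A minimal sketch of the PQ inverse EOTF (the standard constants, written out from the spec):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance (nits) -> signal in [0, 1].
# Both HDR10 and Dolby Vision carry video on this curve; it tops out at
# 10,000 cd/m^2 regardless of what level the content was mastered at.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Encode an absolute luminance (0..10000 cd/m^2) to a PQ signal value."""
    y = max(0.0, min(nits, 10000.0)) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

if __name__ == "__main__":
    for nits in (0, 100, 1000, 4000, 10000):
        print(f"{nits:>6} nits -> PQ signal {pq_encode(nits):.4f}")
```

Reference (100 nits) lands around a 0.508 signal value, and 10,000 nits encodes to exactly 1.0, which is why the curve itself leaves headroom far beyond what most current content uses.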
 
The content on several of my HDR movies is mastered to 4,000 nits, so yes, I can provide proof that the site is wrong.

 
The US version of Blade Runner 2049 is even mastered at 10,000 nits.

https://docs.google.com/spreadsheet...u4UI_yp7sxOVPIccob6fRe85_A/edit#gid=184653968

Mad Max UHD: Mastering display luminance : min: 0.0050 cd/m2, max: 4000.0000 cd/m2

Format : HEVC
Format/Info : High Efficiency Video Coding
Format profile : Main 10@L5.1@High
Codec ID : V_MPEGH/ISO/HEVC
Duration : 2 h 0 min
Bit rate : 44.1 Mb/s
Width : 3 840 pixels
Height : 2 160 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 23.976 (24000/1001) FPS
Color space : YUV
Chroma subsampling : 4:2:0 (Type 2)
Bit depth : 10 bits
Bits/(Pixel*Frame) : 0.222
Stream size : 37.1 GiB (69%)
Default : Yes
Forced : No
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Mastering display color primaries : R: x=0.680000 y=0.320000, G: x=0.265000 y=0.690000, B: x=0.150000 y=0.060000, White point: x=0.312700 y=0.329000
Mastering display luminance : min: 0.0050 cd/m2, max: 4000.0000 cd/m2
Maximum Content Light Level : 9918 cd/m2
Maximum Frame-Average Light Level : 3241 cd/m2
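If you want to check your own rips the same way, the mastering level can be pulled straight out of a MediaInfo text dump like the one above. A small helper (the function name and regex are mine, not part of any tool) that reads the "Mastering display luminance" line:

```python
import re

# Extract min/max mastering display luminance (in cd/m^2, i.e. nits) from a
# MediaInfo text dump. The field label matches MediaInfo's text output.
LUMA_RE = re.compile(
    r"Mastering display luminance\s*:\s*"
    r"min:\s*([\d.]+)\s*cd/m2,\s*max:\s*([\d.]+)\s*cd/m2"
)

def mastering_luminance(mediainfo_text: str):
    """Return (min_nits, max_nits) from a MediaInfo dump, or None if absent."""
    m = LUMA_RE.search(mediainfo_text)
    if not m:
        return None
    return float(m.group(1)), float(m.group(2))

sample = "Mastering display luminance : min: 0.0050 cd/m2, max: 4000.0000 cd/m2"
print(mastering_luminance(sample))  # -> (0.005, 4000.0)
```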
 
Wow, well that settles that then lol. I don't quite get how people are allowed to publish such blatantly false information on their tech websites, but I'm glad I have the proper information now. Thanks.
 
It's not blatantly false; I guess they are confusing technical limitations with practical restrictions.
Because most displays support less than 1,000 nits, HDR10 movies are mastered targeting those displays. Most HDR10 movies go up to 1,000 nits, some to 4,000; the 10,000 mark is new to me, though.

With HDR10+ and Dolby Vision it's easier to master a 10,000-nit movie that will still look mostly as intended (though limited) on a 1,000-nit display.

To my understanding, this summarizes it:
  • If you have a display with up to 1000 nits, you will not benefit from HDR10+ or DV at all
  • If you have a display with higher levels, a DV or HDR10+ movie is more likely to have higher brightness levels, while plain HDR10 movies may simply not have been mastered that way.
I'm guessing that if displays with >2,000 nits become more common, HDR10 movies will be mastered with those levels in mind and DV or HDR10+ will become obsolete.

Disclaimer: I may be wrong. It's super-hard to find solid information about this stuff.
 
As James has said, it's EXACTLY the opposite: HDR10+ and DV become less relevant the higher the nit level the display can manage. The reason is tone mapping down to the target nit level of the display. The fewer nits you have to play with, the more tone mapping is required. Dynamic tone mapping like HDR10+ and DV is most relevant on lower-nit displays for that reason. If your display can come close to the mastered nit level, then there's no tone mapping involved (or very little), and those dynamic algorithms become less relevant to you.
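The tone-mapping point can be put in rough numbers. Here's a toy static tone-map curve (illustrative only; real TVs and madvr use far more sophisticated roll-offs): values below a knee point pass through unchanged, and everything above it is compressed into the remaining headroom up to the 10,000-nit PQ ceiling. A display close to the mastering level barely touches the picture; a low-nit display has to squash it hard.

```python
def tone_map(nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Toy static tone map: pass-through below a knee at 75% of display peak,
    then linear compression of [knee, 10000] into the remaining headroom."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits  # within the display's comfortable range: no change
    headroom = display_peak - knee_nits
    return knee_nits + (nits - knee_nits) * headroom / (10000.0 - knee_nits)

# Compare a ~700-nit OLED against a 4,000-nit LCD on the same source levels.
for peak in (700, 4000):
    mapped = {n: round(tone_map(n, peak)) for n in (100, 500, 1000, 4000)}
    print(f"{peak}-nit display: {mapped}")
```

On the 700-nit display, a 4,000-nit highlight gets crushed down near the peak, while the 4,000-nit display passes everything up to 3,000 nits through untouched; that's the sense in which the bending (and hence the value of smarter, dynamic bending) shrinks as display peaks rise.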
 
Now before anyone has a go at me, this is my opinion and not intended to be against ANYONE in any way.

The UHD format should have been 8K with 12-bit dynamic range to give a true human experience.

4K Blu-ray is already a dead duck.

Even the studios don't want a disc-based 4K system to succeed.

iTunes etc. are offering 4K movie titles with DV where the Blu-ray discs do not. This is not the case for all movies, but quite a few.
The same goes for Atmos, where the disc-based versions lack it.

Even though BD can offer far greater bit rates.

DV (Dolby Vision) is a manipulation of the video to give better results in low-luminance scenes on displays that are NOT truly 10-bit capable in luminosity.
Not a great start, is it? A bit of a mess.

It should have been left so that the benefits of the 10-bit DR system evolved alongside display evolution/penetration.

It's a great shame. I wish I didn't think this, I really do.

Let me put it this way, I would love to be wrong.
 
You're certainly entitled to your opinion, and as it's just that, you're not "wrong". I don't share your opinion. The UHD world is extremely complex. It's all well and good to talk about 8K and 4K being obsolete, blah blah blah. But that ignores a very large reality: we haven't even STARTED in 4K yet. Not by a long shot. WHAT?! I must be an IDIOT for proclaiming that, because of COURSE we've started doing 4K. "My library is FILLED with 4K movies!" Yeah... that's a nice theory. What you mostly have, and this is the majority but not all, are 2K upscales. The reason for this is, as I said, complicated. It mostly comes down to money. Professional equipment that can render and edit in 4K is expensive. That, and most movie theaters can only display 2K anyway. Don't believe a word I'm saying? That's OK.

https://4kmedia.org/real-or-fake-4k/

https://digiraw.com/4K-UHD-ripping-service/the-real-or-fake-4K-list/

So it's cool that you think everything should be 8K, and as I said, it's your right to have that opinion, but it's not happening any time soon. As for streaming vs. disc, I'm not even going there. I have both for every UHD I own. The disc kills the streaming every single time on my equipment.

(I guess saying "the majority" is a stretch, as it seems we are FINALLY starting to get more REAL 4K instead of 2K upscales, but the point remains that enough 2K upscales are still coming out that it's still a problem.)
 
Hello SamuriHL

I don't think everything should be 8K, but 4K is sort of nowhere. It's not good enough for large displays/projection, and 8K is now a reality.
8K has longevity; 4K does not. People with existing equipment will, for the most part, not upgrade to 4K with 8K and new codecs on the horizon; they will be happy with 1080 and even 576/480 in the meantime.

The jump will come with the upgrade to 8K, and especially 8K/120.

If you have not seen this demonstrated, it's well worth seeing; it will totally blow you away.

I am NOT pleased, or in fact that displeased, about this; I am just a realist.

I don't see this as an argument and will leave it there; time will tell.
 
You're talking to someone who was ADAMANT about skipping 4K entirely for 8K. I ignored 4K for years, convinced that 8K was "just around the corner, so why bother with 4K". Then that reality thing set in. Are there 8K TVs? Yup. There have been for a couple of years now. Is there 8K content? Mmmmm, if you're in Japan, maybe a little. 4K presents enough technical challenges that it's going to take years to roll out. To think that they're just going to skip a generation and go directly to 8K... I'm not sure I agree. It was my stated goal for a LONG time, but I relented and jumped into 4K right into the deep end. I spent almost 7k USD upgrading to a fully 4K home theater environment over the past year. I came from a Panny plasma 3D HDTV, an absolutely fantastic top-of-the-line set. I think 8K will have a place in about 5 to 10 years. By then I'll be ready to upgrade again, so I'm not really concerned. If people believe that 8K is coming in a year or two so they might as well skip 4K because it's "dead", they're mistaken.

Now, all that being said, I'm VERY skeptical that we'll see a disc-based 8K format. I believe it'll be streaming only. And for that, you're going to need a frackton of bandwidth. So in addition to the content problem, you have a transmission problem. What'll solve that problem is 5G becoming ubiquitous, which again is another 5 years away for a widespread rollout, along with fixed wireless plans being realistic at gigabit speeds. I do believe that physical media will be dead in this generation. Samsung pulling out of the UHD market is not a good sign. I for one will still support physical media for as long as I can buy it and for as long as I can back it up.

We'll talk 8k again in another 5 years and see where we're at.
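The "frackton of bandwidth" point is easy to put in back-of-envelope numbers: even before any codec gets involved, 8K/120 carries 20x the raw data of 4K/24 (4x the pixels, 5x the frames). A quick sketch, assuming 10-bit 4:2:0 sampling (1.5 samples per pixel):

```python
def raw_gbps(width: int, height: int, fps: float,
             bit_depth: int = 10, chroma_factor: float = 1.5) -> float:
    """Uncompressed video bitrate in Gbit/s; 4:2:0 means 1.5 samples/pixel."""
    return width * height * fps * bit_depth * chroma_factor / 1e9

uhd_24     = raw_gbps(3840, 2160, 24)    # ~3 Gbit/s before compression
eightk_120 = raw_gbps(7680, 4320, 120)   # ~60 Gbit/s before compression
print(f"4K/24: {uhd_24:.1f} Gbit/s raw, 8K/120: {eightk_120:.1f} Gbit/s raw "
      f"({eightk_120 / uhd_24:.0f}x)")
```

Even with a generous 100:1 compression ratio, that leaves 8K/120 needing streaming bitrates well beyond what typical broadband delivers today, which is why the transmission problem is as real as the content problem.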
 
Sure, as do I. But I've found UHD media to be very susceptible to fingerprints, grease, oil, lunar eclipses, sunspots, and downright meanness when it comes to playing. Which is why I back all my stuff up to MKV and use madvr for tone mapping. But when I want to watch Dolby Vision stuff, being able to throw it on a rewritable and chuck it into the UB820 is nice. Less handling of the originals is always the way to go.

Forgive my ignorance, but what do you do with madvr? What requires tone mapping? Regular HDR10 content? I also back up to MKV to watch UHD discs from my Plex server. I realize I lose DV, which I hope will be rectified some day. But I am curious what you do with this.
 
I've got an LG C8 OLED TV that runs roughly 700 or so nits max. There are really two ways to use madvr for HDR tone mapping. There's the new dynamic method that they're constantly improving; it works really well for projectors and low-nit target levels. Or you can scan each file to create a measurement file, so madvr basically knows ahead of time how it needs to tone map the HDR. Setting a max target nit level allows madvr to compress the video quite nicely to fit within your max nit level, which should therefore not require any further tone mapping by your TV. The results are fantastic.
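The measurement-file idea boils down to making tone-mapping decisions per scene instead of once for the whole film. A toy illustration with made-up per-scene peak values (madvr's real measurement files are binary and far more detailed; these names and numbers are purely hypothetical):

```python
# Static metadata forces one worst-case assumption for the whole movie;
# pre-measured per-scene peaks let dark scenes pass through untouched.

scene_peaks = {               # hypothetical measured peak nits per scene
    "dark interior":    180,
    "daylight street":  950,
    "sun glint":       3900,
}

DISPLAY_PEAK = 700  # e.g. an OLED around 700 nits

def needs_compression(peak: float, display_peak: float = DISPLAY_PEAK) -> bool:
    """A scene only needs tone mapping if its peak exceeds the display's."""
    return peak > display_peak

for scene, peak in scene_peaks.items():
    if needs_compression(peak):
        print(f"{scene}: peak {peak} nits -> compress into {DISPLAY_PEAK} nits")
    else:
        print(f"{scene}: peak {peak} nits -> display 1:1, no tone mapping")
```

With only static metadata, the 3,900-nit "sun glint" scene would dictate the curve for the whole film, dimming the dark interior for no reason; per-scene measurements avoid exactly that.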
 
I have two UHD discs and did not realize you lost Dolby Vision when making an MKV file.

I'm confused (and easily so); the options are:

  • Make an ISO using AnyDVD HD and keep everything (HD audio & Dolby Vision). But there's no way to play it on a PC or the LG OLED TV; the only way to play it would be an NVIDIA Shield?
  • Make an MKV with CloneBD. But you lose Dolby Vision, and there's no way to play HD audio on the LG OLED TV.
  • Use the mp4muxer method. Not sure what this is.
  • Get a bd-mod device. Would this work better than the NVIDIA Shield?
 