
MKV format outdated

Well, I'm not blonde but bald....

To extend the MKV format it would be essential to reverse engineer DV and HDR10+, and that's not an easy task - I know. But without HDR features, MKV is approaching a dead-end street.
 
Um, that statement shows your ignorance of MKV, I'm afraid. MKV supports HDR10+ just fine as is. It COULD support DV without issue *IF* everything I've said repeatedly were to happen. So you can keep beating on the "mkv is dead" drum all you want, but, you are simply wrong.
 
MKV supports HDR10+ just fine as is.
Really? It does? I didn't know. HDR10 sure, but HDR10+?

BTW, a little off topic: MKV added support for frame-packed 3D, and elby added support for it quite early, too. But hardly any player added support for frame-packed MKV - PowerDVD on the PC and Kodi on the Raspberry Pi being the exceptions.
An example of a useful addition to the standard that was ignored by almost everyone. Maybe a bad comparison because "3D is dead anyway", but adding DV (or HDR10+) to MKV needs all involved parties to play along.

For DV there will be no media players, as Dolby doesn't want to license it to media players like the Nvidia Shield, let alone the Chinese Amlogic or Realtek boxes.
 
Really? It does? I didn't know. HDR10 sure, but HDR10+?

Why wouldn't it? HDR10+ is not a separate video track like DV. AFAIK, it's embedded so once you move the video track to an MKV container the HDR10+ metadata is retained. At least that's how I understand it. Now, I don't have anything to test that theory with as my LG doesn't support HDR10+ but I do believe it's kept.

P.S. 3D frame-packed MKVs were supported in LAV/madVR for quite some time. It's why we begged Elby to add support for it.

EDIT: Just checked Alien MKV and it's showing HDR10+ Profile A compatible.
 
One more thing on this topic, because it's annoying me now that MKV is being called a "dead end". MKV is NOT the limiting factor of Dolby Vision support, and this point seems to be getting lost somehow. You can absolutely mux an MKV that contains Dolby Vision data. It's just an additional video track; MKV can handle that just fine.

What WON'T happen, and the reason why we say the MKV format needs to be extended, is that *NOTHING* will be able to read that metadata video track. When we say MKV needs to be "extended", it simply means that the way things are muxed needs to be agreed upon so that splitters, decoders, renderers, etc. all work with the data that's stuffed into the container.

MKV, as a container, couldn't care less what you shove in there, DV or otherwise. It's NOT the limiting factor. The limiting factor is the LACK of all the other ecosystem components that are needed to make use of the data. THAT is the problem. And that's why the title of this thread annoys the hell out of me, because the issue is NOT understood.
 
SamuriHl,

sometimes you are just too negative:

It's NOT the limiting factor. The limiting factor is the LACK of all the other ecosystem components that are needed to make use of the data. THAT is the problem. And that's why the title of this thread annoys the hell out of me because the issue is NOT understood.

This thread is not meant to complain about MKV - as it may sound - but to initiate thoughts on how we can get rid of discs while still keeping all the features: DV, HDR10, HDR10+, Dolby Atmos. And both RedFox and Elby should have the best knowledge to make this work - for sure not alone, but together with the people who maintain Matroska, VLC, MPC, etc. This thread was supposed to give that a head start. Nothing else. Don't give up before you get started!
 
@James , @Reto ,

your opinions please, besides working on the daily bug issues. There should be a new target: establishing a non-disc platform. Of course, that goes against the trend of an industry that only wants disc-based solutions.
 
SamuriHl,

sometimes you are just too negative:

You're joking, right... you must be, because the title of this thread is "MKV format outdated"... and I'm the one being negative? Alrighty then.

This thread is not meant to complain about MKV - as it may sound

But without HDR features, MKV is approaching a dead-end street.

It sure does! Which is why this thread bothers me. Because you kept on insisting that what we have is somehow an MKV problem....and it's simply NOT.

- but to initiate thoughts how we can get rid of discs and keep still all features. DV, HDR, HDR+, Dolby Atmos. And both RedFox and Elby should have the best knowledge to make this work. For sure not alone but with people who maintain Matroska, VLC, MPC etc. It was supposed to give this a head start. Nothing else. Don't give up before you get started!

HDR10, HDR10+, Atmos, DTS:X, 3D frame-packed MVC... ALL DONE in MKV. The ONLY thing missing from your list is Dolby Vision, which, much to your surprise, is already being put in MKV containers by another product. It's UNPLAYABLE! Which is, again, NOT an MKV problem. I go back to the thread title and the post I quoted about it being a "dead end street" as to why I've taken the tone I have. And you think you're the first one to bring up the DV-in-MKV issue? LMAO! Just for the record, someone HOUNDED madshi about this for MONTHS before the mods finally gave him the boot from the madVR thread, because it's simply NOT GOING TO HAPPEN. I'm not the one "giving up before I get started". It's been asked 10000 times already, and the developers of the tools that we NEED to get ANY kind of support are telling us, REPEATEDLY, that it's not happening. So it's a non-starter that has nothing at all to do with the container.

 
Ok, I've said my piece on MKV. The reason I take issue with the entire tone set by the thread title and what I quoted is that someone may read it and think it's somehow true that MKV is a "dead end format". And that bothers me, because MKV has been the most extensible format we've ever seen. The DVD guys who were doing it LONG before I ever got on the MKV bandwagon convinced me, back when I was doing Blu-ray backups, that MKV was absolutely the way to go. I still believe that for UHD.

You have the logic completely and totally backwards, though. It's not "MKV better get with the DV program or it'll be a dead end street"... it's ACTUALLY "Dolby better open up DV or it's a completely DEAD format". Don't believe me? Is Samsung, the current market leader in TVs, a "dead end street" because their TVs don't support DV? And never will. Is LG a "dead end street" because 1) their internal player only supports MP4 for DV content and not MKV, and 2) they don't support HDR10+? You'll find VERY few consumer devices that support ALL the formats. (Screw you, Panasonic, for abandoning the US TV market. Take your beautiful HDR10/HDR10+/DV-supporting OLED and... nope, not bitter at all.)

My point is that by setting the tone of this thread as "MKV format outdated", and then continuing that by saying that IF MKV doesn't start "supporting" DV it's a dead end street, you've completely missed the mark. That's not where the issue lies. The issue lies in closed formats that have zero support outside the people who throw cash at them. You DO realize that any device you buy that contains DV support is marked up, because you've essentially given Dolby money in order to have that support, right? So my UB820 and my LG C8 are both priced higher for supporting DV. Awesome.
 
Of course you are absolutely right with all your statements. And the post header was - I admit - more than provocative. The ultimate goal is still to find non-disc solutions, which would have been the better, non-provoking header.
 
Yup. That would have set my tone quite differently. It genuinely felt like this thread was an attack on MKV because it's perceived to not be able to handle Dolby Vision, and I don't want people reading this thread to come to that conclusion, because it's not the case. Other than Dolby Vision, we already have a non-disc solution for every format out there with MKV.

Dolby Vision non-disc can be done with MP4 if you have a player that supports it (i.e. the LG built-in player on the OLED TVs like my C8). There's no technical reason it can't support MKV; they just don't. CloneBD just added support for Dolby Vision in MP4. The obvious limitation is that you lose HD audio with the MP4 container, so no Atmos or DTS:X. I've seen people take the output from MakeMKV, convert it to MP4, and play it with the LG player with full Dolby Vision, so we know that DV can be stored in MKV just fine. We just don't have any support outside official players, and we may never get that support. And even if we do, it's going to be a VERY long time.

Madshi is working on tone mapping improvements to madVR and on a new product called the Envy, a standalone box that will do high-end dynamic tone mapping, thus killing Dolby Vision even more IMO. Nev can't do anything about decoding until the Dolby Vision spec is released, so there's really, at this point in time, zero hope.
 
Thanks SamuriHL. Very good insight into the current state of things; I was not aware of that. Personally, I just don't want to lose any audio options, but if workarounds for HDR are on the way, I'd be willing to look into that. Spec-wise, DV is anyway beyond what hardware can actually display currently.
 
I wouldn't necessarily call it a workaround for HDR. It's dynamic tone mapping, TRULY dynamic, where it measures each frame in real time and processes it. It then applies processing to balance luma and chroma sampling and a whole bunch of other goodness. It also targets your set's nit level, so for my OLED I set 700 nits and the dynamic tone mapping will map to that. I also use measurement files, which take the time to FULLY process each and every frame, NOT in real time, so that you get the absolute best possible quality. This is why I'm not concerned at all about Dolby Vision. For every Dolby Vision movie I own, I've created an MKV and a measurement file, and the quality is right there with Dolby Vision.
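As a toy illustration of that measurement-file idea (my own sketch in Python - madVR's actual file format is private and certainly far richer): the point is simply that per-frame statistics can be computed offline once, so playback only has to look them up instead of analysing pixels in real time.

```python
def measure(frames):
    """Precompute each frame's peak and average luminance (in nits)
    offline, so a player can pick a tone curve per frame at playback
    time by a simple lookup.  Illustrative only - NOT madVR's format."""
    stats = []
    for frame in frames:  # frame: iterable of per-pixel nit values
        pixels = list(frame)
        stats.append({"peak": max(pixels),
                      "avg": sum(pixels) / len(pixels)})
    return stats

# Two tiny fake "frames": a dark scene and one with a 4000-nit highlight
clip = [[5, 80, 120], [5, 900, 4000]]
print(measure(clip))
```

A real measurement pass would of course read decoded video frames; the lookup-instead-of-live-analysis principle is the same.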
 
Is this post-processing? And if yes, please provide some hints where to start.

I guess it doesn't make sense for discs, only for files. But since LG doesn't support HDR10+, it might be interesting even for me.
 
Some clarification: the measurement file is PRE-processing. The live algorithm is, well, LIVE, meaning it runs as you play. So yes, it works to do dynamic HDR while playing a disc, and it works VERY well. madVR is, well, a Video Renderer, which is where the "vr" part of the name comes from. I have it set to upscale chroma using NGU anti-aliasing high, and I use madVR tone mapping with luma processing set to a target of 700 nits. I need to play with the live algorithm some more, as they've been improving it even more in the past few days. The quality I get is absolutely fantastic. Your 1050 isn't QUITE strong enough for madVR tone mapping... my 1060 could do it with a lot of trade-offs.
 
I added madVR to MPC some time ago following the recommendations. So do you mean madVR in MPC, or is there another workflow? Does that mean playing from a PC with no conversion of any files?
 
madVR in MPC is fine. I use J River MC, but it hardly matters. The thing is, I'm using the test builds of madVR, which are still being developed; they've been worked on for MONTHS since the last official release. They're working to get another official release out with the dynamic tone mapping optimized. Not sure what you mean by no conversion of any files, though? If you mean can you play a disc, then yes, as long as AnyDVD decrypts it.
 
I'll throw in my thoughts and findings - which are incomplete, some of what I'm saying is pure guesswork, but I did dive deep into the relevant specifications. Feel free to correct me.

First off - the original thing about MKV: yes, MKV is currently, in my opinion, the bestest of all existing single-file formats. I can't think of a single thing other than DV that it doesn't support or couldn't easily add if told how to.
Having dealt with the innards of MP4 after I already knew MKV, I was totally appalled at the stupidity and non-scalability of MP4. That crap only survived because of the lack of alternatives at first, and now it still isn't dead because it's everywhere.

MP4 is a very rigid container format with a lot of ugly hacks applied over time to support things that weren't originally catered for. OK, these things happen.
But somebody explain to me why an audio track needs to define "width" and "height" and other video-related properties.
It's not like the standard was finished and then somebody came around and said "Hey, new idea, let's support audio, too. We'll just use the same structure we did for video, because we didn't think of this before."
Also, in order to parse an MP4 file and its content, you need to know the exact format of all structures (atoms). And there are hundreds.
And one gets added for every measly new feature - which is probably why hardly anything ever gets added.
And when defining a new structure, you need to think of all possible future things that might change, adding reserved fields or optional fields, making a huge mess of data nobody needs.
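To make the atom point concrete, here's a minimal sketch (Python, names are mine) of walking top-level MP4 boxes: the size and four-character type of each box can be read generically, but actually interpreting any payload requires knowing the layout of that specific atom.

```python
import struct

def walk_boxes(data, pos=0, end=None):
    """Yield (box_type, payload) for top-level MP4 boxes.
    Every box starts with a 32-bit big-endian size (which includes the
    8-byte header) and a 4-character type code.  Beyond this header,
    each atom has its own hard-coded layout the parser must know."""
    end = len(data) if end is None else end
    while pos + 8 <= end:
        size, btype = struct.unpack_from(">I4s", data, pos)
        if size == 1:
            # 64-bit "largesize" variant - one of MP4's later retrofits
            (size,) = struct.unpack_from(">Q", data, pos + 8)
            payload = data[pos + 16 : pos + size]
        else:
            payload = data[pos + 8 : pos + size]
        yield btype.decode("ascii"), payload
        pos += size

# Hand-built 'ftyp' box: major brand 'isom', minor version 0
ftyp = struct.pack(">I4s4sI", 16, b"ftyp", b"isom", 0)
print(list(walk_boxes(ftyp)))
```

Even this toy version already needs a special case for the 64-bit size retrofit; a real parser needs one for each of the hundreds of atoms.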

MKV: if you know the basic types (integer, string, ....), you can parse and display the whole content of the thing.
To add a feature: just add a field. If a field is missing, there's a default value. The player doesn't know a new field? Ignore it. Done. Next.
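For contrast, a minimal sketch of the EBML idea MKV is built on (again Python, my own names): element IDs and sizes are self-describing variable-length integers, so a generic parser can walk the whole file and simply skip any element it doesn't recognize by its declared size.

```python
def read_vint(data, pos, keep_marker=False):
    """Read an EBML variable-length integer starting at pos.
    The number of leading zero bits in the first byte gives the total
    length in bytes; the length-marker bit is stripped for sizes but
    kept for element IDs (per the EBML convention)."""
    first = data[pos]
    length, mask = 1, 0x80
    while length <= 8 and not (first & mask):
        mask >>= 1
        length += 1
    value = first if keep_marker else (first & (mask - 1))
    for b in data[pos + 1 : pos + length]:
        value = (value << 8) | b
    return value, pos + length

def walk(data, pos=0, end=None):
    """Yield (element_id, payload) for each top-level EBML element.
    No per-element schema is needed: unknown IDs are skipped by size."""
    end = len(data) if end is None else end
    while pos < end:
        eid, pos = read_vint(data, pos, keep_marker=True)
        size, pos = read_vint(data, pos)
        yield eid, data[pos : pos + size]
        pos += size

# Tiny hand-built stream: a Void element (ID 0xEC) with a 3-byte
# payload, followed by a second element with a 1-byte payload.
blob = bytes([0xEC, 0x83, 1, 2, 3, 0xA3, 0x81, 0xFF])
for eid, payload in walk(blob):
    print(hex(eid), payload)
```

The "player doesn't know a new field? Ignore it" behaviour falls out for free, because the size of every element is known before its meaning.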

Adding Dolby Vision would be simple - Dolby would just have to say how. They just didn't think of it.
It's not only the extra track: you also need to define the relationship between the video and the DV track, plus the DV profile and such, somewhere in the file.
A simple thing, but a requirement, and something that needs to be agreed upon.

MKV would have been a better option, considering that it supports both Dolby "products", Vision and TrueHD.

When we get displays at 4000 nits and higher, Dolby Vision starts to become relevant.
This, I believe, could be the exact opposite: DV (and HDR10+) seems to me only useful as a workaround for displays that don't support the full range.
(Unless we're talking about the additional 2 bits we get through the extra DV track, but I'm not convinced that this would really be noticeable.)

A couple of "things" (I like things):
  • Thing 1: plain HDR10 already has all tools on board to cover the entire defined (BT2020 and ST2094) colour and brightness range. It's all there.
  • Thing 2: current TVs are limited to around 700-800 nits (ZD9 excepted), so most UHD movies are mastered up to - typically - roughly 1000 nits. That makes a lot of sense.
  • Thing 3: just throwing this in for people who are new to this: standard "historical" SDR goes up to ~100 nits, while the HDR specification speaks of up to 10,000 nits.
    This sounds like the TV is going to fry you, but brightness is perceived logarithmically. Terms like "twice as bright" probably don't make much sense in the context of the subjective perception of light (which our eyes also auto-adapt to, so they never give us an absolute measurement).
    But I'll say that 10,000 nits is probably perceived somewhere in the area of 2 to maybe 8 times as bright as 100 (though it still is 100x the energy).
    Just to put these values into perspective.
    Still - large areas of 10,000 nits (or even 4,000) would be annoyingly bright; you'd be squinting at your screen. These values only make sense for tiny things like sparkles or stars.
  • Thing 4: HDR10+ adds per-scene information about average lighting and such - optionally with different values for a number of defined areas.
  • Thing 5: Dolby Vision adds, on top of that, two extra bits per pixel (actually per 4 pixels, if that's how they map the 1920x1080 DV stream onto the 3840x2160 video stream, but there may be more magic inside, I don't know - the few specs there actually are do seem to support that theory).
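Thing 3's logarithmic-perception point can be made concrete with the PQ transfer curve (SMPTE ST 2084) that HDR10 signals are coded with: SDR's 100-nit ceiling already lands around the middle of the 10,000-nit signal range. A quick sketch of the published inverse EOTF:

```python
# SMPTE ST 2084 (PQ) inverse-EOTF constants, as defined in the spec
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (0..10000 nits) to a normalized PQ
    signal value in 0..1."""
    y = nits / 10000.0
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

for nits in (100, 1000, 4000, 10000):
    print(nits, round(pq_encode(nits), 3))  # 100 nits comes out near 0.51
```

So going from 100 to 10,000 nits "only" doubles the code value - exactly the kind of perceptual compression Thing 3 describes.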

So given all that, you can conclude: if you had a plain HDR10 movie mastered for up to 10,000 nits, and a TV that supports the required nits in full, there is nothing HDR10+ or DV could add.
Except maybe those two bits, but: there is no TV in existence that can deal with 12 bits - not even the DV ones. The panels simply can't display 12 bits, and that's not likely to change.
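Just to illustrate Thing 5's guess (and ONLY the guess - the actual Dolby Vision reconstruction is proprietary and certainly more involved than this), here is what a quarter-resolution 2-bit enhancement grid widening a 10-bit base layer to 12 bits could look like:

```python
def combine(base10, enh2):
    """Illustrative only: widen a grid of 10-bit base-layer samples to
    12 bits using a quarter-resolution grid of 2-bit enhancement
    samples (nearest-neighbour upsampled).  This mirrors the guess
    that a 1920x1080 DV enhancement layer refines a 3840x2160 10-bit
    base - it is NOT the actual (secret) Dolby Vision algorithm."""
    h, w = len(base10), len(base10[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # each enhancement sample covers a 2x2 block of base pixels
            e = enh2[y // 2][x // 2]
            row.append((base10[y][x] << 2) | e)
        out.append(row)
    return out

base = [[512, 512], [512, 512]]   # flat 10-bit grey, a 2x2 pixel patch
enh  = [[3]]                      # one 2-bit refinement for that block
print(combine(base, enh))         # every sample becomes 512*4 + 3 = 2051
```

Whether the real thing does anything like a plain bit-concatenation is exactly the kind of detail nobody outside Dolby can confirm.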

What remains is: currently, HDR10 movies are released with a maximum of 1,000 nits, sometimes up to 4,000, because that requires less compromise on TVs with a limit in that general area.
The reason is that a 750-nit LG can't simply cut off everything above 750 nits - that would produce flat white areas with no detail. Instead, splines are used to ease the bright pixels into the given limits.
The downside is that even pixels below 750 nits have to be darkened as well, instead of being shown at their intended brightness, even though they could be.
HDR10+ and DV can both help reduce that effect by telling the TV in advance whether it will need to use a smoothing curve at all (note: this can also be done by the TV itself, without HDR10+/DV, by generating a histogram of each picture - and apparently some do exactly that).
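A toy stand-in for those splines (my own simplistic curve, not what any actual TV uses): pass everything below a knee point through unchanged and ease the rest into the display's remaining headroom, so highlights are compressed instead of clipped flat.

```python
def tone_map(nits, display_peak=750.0, knee=0.75):
    """Simple soft-knee roll-off.  Content below knee*display_peak is
    passed through untouched; brighter values are eased asymptotically
    toward display_peak so highlight detail is compressed rather than
    clipped to flat white."""
    start = knee * display_peak
    if nits <= start:
        return nits
    headroom = display_peak - start
    excess = nits - start
    # maps the (possibly huge) excess range smoothly into the headroom
    return start + headroom * (excess / (excess + headroom))

for n in (100, 500, 750, 1000, 4000):
    print(n, tone_map(n))
```

Note the trade-off the text describes: with the knee at 75%, pixels between 562 and 750 nits get darkened even though the panel could display them - which is exactly what per-scene HDR10+/DV metadata (or a per-frame histogram) lets a TV avoid when a scene has no bright highlights at all.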

It is possible, but I'm not at all sure, that DV allows the HDR10 base to remain in that "modest" area far below 4,000 nits while adding the option for more on devices that can handle it.
I can't say, because all the articles I've read about this so far either just parrot the Dolby Vision marketing, which is vague and full of meaningless bullshit ("up to 10000 nits" - well, the same goes for HDR10, guys; "12 bit" - okay, but what is that really good for?),
or they really don't know what they are talking about - I've seen the prettiest contradictions.
I have a feeling that hardly anyone really understands what this DV shit does.

Which is no surprise, as it's a friggin secret.
 
Thanks for that. Very nice overview of the situation. I think we'll eventually see 4,000-nit TVs, and DV might matter for titles mastered to 10,000 nits, but I don't know how many of those are out there.

madVR also creates a histogram for its dynamic tone mapping.

 