ReClock chitchat

just ran another test, and 90 minutes later the VSYNC was still at the same location, moving very slowly one pixel up/one pixel down.

no ill effect on tearing, and I got the whole shebang in ffdshow/avisynth MT on the 4 cores of my Q6600 :D

who says MPC simply has a lousy VMR9 presenter :D
 
Maybe that happens to you. Not in my case :p
well I prefer VMR7 over VMR9 because it doesn't apply any PS sharpen script (dunno whether the ATi drivers are messing w/ my patience, or if it's actually built-in)

anyhow, in VMR9 the VSYNC usually synchronises on the very top line of the actual video.

in VMR7 it seems to synchronise in the top blanking most of the time.....I guess it's wiser as there can't possibly be any tearing, but that means I can't "see" it...

but it looks just as stable as VMR9, so that's cool.

I've also tried Overlay, but it doesn't support LUTs so I can't use it on my CRT...

anyway Reclock was meant to be used w/ these official renderers apparently; using anything else is begging for problems (my buddies on HCFR now use HR on Vista I think)...
 
hey Jong, prolly better to speak in here...that will not clutter the original topic and avoid trolls altogether :D

:bang: Last time I was active on the pstrip forum we were all assuming the pstrip camera and Reclock were the most accurate. It is enough to drive you mad. Ah.... maybe that explains it.
well the pstrip camera has always been undocumented because Rik has always been telling me that the measurements it made were not accurate.
you can filter Rik's msgs from my first post there (with the HD2600 48Hz prob); he was very clear about it.

he's got an oscilloscope and knows his sh*t I guess.

Also, James has been clear that getting 3 figures after the decimal point was already pretty impressive to achieve through DX...

anyway I'm discussing with Haali as we speak 8)
 
leeperry, still on XP or have you switched to Vista? Are you now back to trying the Haali Renderer?
 
Yeah, I find it all a bit unhelpful to tell us that we need an oscilloscope to get 100% accuracy. We all know that (or should). The questions have always been "which of the software methods is most accurate" and "how accurate do they have to be to get perfectly smooth playback"?

So maybe the answer to the former is CCC? Still not sure.

When it comes to the latter, James tells us that for smooth output from the PC using Reclock it does not have to be accurate at all - Reclock makes all the necessary adjustments. But we are still left with the display device - how close to the internally supported rate of the display (as measured by its clock) does the source need to be for perfect, judderless playback? I know that for many PAL TVs 48Hz is not close enough, even though they may accept it; but what about 50.002Hz? If that is good enough, fine; if not, what reference do we use to set our timings?
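As a rough sense of scale (a back-of-envelope of my own, not a figure from the thread), the time between forced drops/repeats is roughly the reciprocal of the mismatch between the two rates:

```python
# Back-of-envelope only: how often a frame has to be dropped or repeated when
# the source rate and the display's true refresh rate differ slightly.
# The 50.002 Hz figure is the case asked about above; the others are just
# comparison points.

def slip_interval_s(source_hz: float, display_hz: float) -> float:
    """Seconds until the accumulated phase error reaches one full frame."""
    return 1.0 / abs(source_hz - display_hz)

display = 50.000
for source in (50.002, 50.0005, 50.00005):
    print(f"{source} Hz vs {display} Hz -> one drop/repeat roughly every "
          f"{slip_interval_s(source, display):.0f} s")
# 0.002 Hz off -> a slip every ~500 s; staying clean for a 2-3 hour film
# needs the error down around a ten-thousandth of a hertz.
```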
 
This talks about the issue:

http://software.intel.com/en-us/articles/video-frame-display-synchronization

I have a feeling that with Reclock you do NOT want to force vsync. I may be wrong but I think I get the occasional "bad vsync" issues you describe, fixed by a fresh seek, when vsync is forced. They may well be caused by what is described in Fig 5. It's what we talked about some time ago with Haali. Things go wrong when you are very close, but not quite close enough to the right refresh rate. Unless you are very unlucky at the start, all is fine for a while, and then you hit the point where all hell breaks loose, as in Fig 5.

You remember that "little wobble" in the MPC-HC jitter you always hated when selecting VMR9 D3D. It may well be that this is exactly the kind of "best fit refresh cycle for each frame" described, i.e. a positive feature designed to avoid the problems illustrated when slightly imperfect clocks synchronise in the wrong part of the refresh cycle as in Fig 5.
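A toy model of that failure mode (my own sketch with assumed numbers, not code from the Intel article): a frame clock a hair faster than the refresh rate behaves perfectly until the presentation times drift onto a vsync boundary, where a little timing noise turns into a burst of drops and repeats.

```python
# Toy model of the "close but not quite" scenario (assumed numbers, my own
# sketch).  Frames are presented slightly faster than the display refreshes,
# with a bit of scheduling noise, and each frame becomes visible at the first
# vsync at or after its presentation time.  While presentation times sit
# mid-cycle every frame takes exactly one refresh; once they drift onto a
# vsync boundary, the noise lands frames on either side of it and you get a
# burst of 0s and 2s (drops and repeats) until the drift carries you past.

import math
import random

REFRESH_HZ = 50.000      # display refresh, taken as exact here
FRAME_HZ   = 50.002      # player's frame clock, slightly off
JITTER_S   = 0.0005      # +/- noise on each presentation time (assumed)

refresh_period = 1.0 / REFRESH_HZ
frame_period   = 1.0 / FRAME_HZ
random.seed(0)

def refreshes_per_frame(n_frames: int) -> list[int]:
    """Refresh cycles consumed by each frame (1 = smooth, 0/2 = drop/repeat)."""
    start_phase = refresh_period / 2          # start mid-cycle, i.e. "lucky"
    shown = []
    for n in range(n_frames):
        t = start_phase + n * frame_period + random.uniform(-JITTER_S, JITTER_S)
        shown.append(math.ceil(t / refresh_period))   # vsync that displays it
    return [b - a for a, b in zip(shown, shown[1:])]

cycles = refreshes_per_frame(30_000)          # ~10 minutes of 50 fps video
bad = [i for i, c in enumerate(cycles) if c != 1]
print(f"{len(bad)} irregular frames out of {len(cycles)}")
if bad:
    print(f"all clustered between frames {bad[0]} and {bad[-1]}")
```

The "wobble" policy described above would presumably avoid that burst by letting the frame-to-refresh mapping flex slightly instead of snapping rigidly to the next vsync.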

This bit from the article is telling:

"Therefore, it is best to have a refresh rate that exactly matches the display rate. If this were the case, then each and every frame could be drawn at its actual frame time."

And it's one reason I use true multisyncing CRT-based displays (24" monitor and CRT projector running off a professional video splitter).

My displays do not have internal refresh rates, they sync to the signal.
 
The following holds for RADEON cards - but I see no reason why it shouldn't be the same for other brands:

You cannot really rely on Reclock's refresh rate calculation. It's based on DX, and that's possibly flawed from the start in this respect. Powerstrip's display of the refresh rate was for a long time not precise enough (only 3 decimal digits, though a little more accurate than the Reclock on-the-fly calculation) to get good results. I heard that might have changed in newer versions of Powerstrip though.

Fact is: of course using the displayed refresh rates is better than nothing, but with correctly calculated timings you get no drops/repeats (that is, except for the obligatory one at the start of playback) for 4-5 hours.

So, fewer drops/repeats with the displayed refresh rate: yes. But is it near the possible optimum: no.

Unfortunately it's not as easy as adding 0.00x or subtracting 0.00y, because it depends on resolution/refresh rate/quartz dividers as well as some other stuff. But people tell me that the new Powerstrip solves this problem once and for all, as Rik implemented a precise timings search engine. I haven't tried it yet, as I have my own program for that. But if he did it correctly, you shouldn't worry about what Reclock displays, as long as you get no drops/repeats.

regards,
Zardoz
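To put some (entirely made-up) numbers behind Zardoz's point: the true refresh rate falls out of the pixel clock and the total raster size, and both are quantised, so there is no universal +/-0.00x correction - a timings search has to juggle the whole mode.

```python
# Made-up numbers, purely to illustrate the point above: the actual refresh
# rate is pixel_clock / (h_total * v_total).  The pixel clock comes from a PLL
# fed by a crystal and the totals are integers, so only discrete rates exist
# and the "right" correction depends on the whole mode, not a fixed +/-0.00x.

def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    """Vertical refresh rate implied by a modeline."""
    return pixel_clock_hz / (h_total * v_total)

pclk, v_total = 148.5e6, 1125                   # hypothetical 1080p-style mode
print(f"{refresh_hz(pclk, 2200, v_total):.5f} Hz")   # 60.00000 Hz exactly

target = 60000 / 1001                           # 59.94006 Hz, as an example
best_ht = min(range(2150, 2260),
              key=lambda ht: abs(refresh_hz(pclk, ht, v_total) - target))
got = refresh_hz(pclk, best_ht, v_total)
print(f"h_total={best_ht}: {got:.5f} Hz ({abs(got - target) * 1000:.1f} mHz off)")
# At this clock, each h_total step moves the rate by ~0.027 Hz, so hitting a
# target to within a fraction of a millihertz means searching clock dividers
# and totals together -- which is what a timings search engine does.
```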
I have played with Rik's search engine. Unfortunately, for me it just spat out a lot of timings my TV would not accept. I don't think it has changed.

pstrip camera seems to agree with Reclock on the refresh rate. Obviously the camera now offers a lot more digits, but when rounded it comes out the same as Reclock. Getting these spot on does reduce S/PDIF drops/repeats over the standard value shown in CCC or pstrip advanced timings, which for my display are typically 0.002 out with XP in "default timer mode" and 0.001 out using "/USEPMTIMER". But I'm not sure this really tells us anything very useful. Of course it improves drops/repeats - Reclock THINKS that is the refresh rate and is using that to decide when to drop/repeat. But it could still be inaccurate and lead to problems with the display device.

Probably it is better to get the display rate right. I don't think I have ever noticed a single dropped AC3 packet in passthrough mode (unlike a single dropped AC3 packet created by Reclock, which does lead to pops and crackles, but that is another matter!).
 
I'll never understand all the mystery surrounding Reclock and what exactly it does... 23.976 reported by players as 24.000 but that's just cosmetic, turn this on, leave that off... it's the single most "magical" piece of software that I know of.

As long as there's no full disclosure on its inner workings, I doubt that all this experimentation will go somewhere.
 
humm interesting link Jong!

yes I still run XP SP3, Aero on Vista makes HR hiccup constantly...and it shows.

from many ppl's tests on HCFR, only D3D exclusive mode in MPC HC can be as smooth as HR.

but to me MPC has bad karma, so this is out of the question...besides I don't like the exclusive mode w/o GUI.

anyway, ogo has been constantly telling me that media players could drop frames after a while because they didn't sync with Reclock properly in the first place (basically what they explain in Fig. 5)

Reclock will use its own "beat", but if it's not perfectly synced with the video renderer...at some point it will go berserk (which is why MPC/ZP6 will drop frames after 45-60 minutes in non-exclusive mode)

I use a CRT and a DLP pj that has a VSYNC option in its OSD, so luckily I don't have to bother about what the display likes/dislikes.

but as long as there isn't a video renderer fully synced w/ Reclock, we will suffer from what is explained in Fig. 5, I think.
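Running the earlier back-of-envelope in reverse (my own estimate, not a number from ogo): drops that only show up after 45-60 minutes imply a residual mismatch far smaller than any of the software readouts discussed above can display.

```python
# Rough inversion of the earlier slip-interval estimate (my own arithmetic,
# not something ogo stated): drops that only appear after 45-60 minutes imply
# a residual rate mismatch of roughly 1/(45*60) to 1/(60*60) Hz -- far below
# what a 3-decimal readout could ever show.

for minutes in (45, 60):
    mismatch_hz = 1.0 / (minutes * 60)
    print(f"first slip after ~{minutes} min -> mismatch ~{mismatch_hz * 1000:.2f} mHz")
```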
 
Leeperry, what player, renderer and Operating system are you using?

I'm testing ZP6, VMR9 Windowless (no V-sync in Reclock) on SP3 right now.


Vista was fantastic....until I enabled my second monitor and it went all jerky.
 
KMP/HR/XP SP3

at some point I tried ZP6 + VMR9 Windowless, but it wasn't as smooth as HR.
plus it would drop frames after a while, just like MPC
with Reclock's VSYNC indicator enabled, it was clear that it would randomly start going nuts after a while.
besides I found the jitter (the average interval between the frames) really poor...it looks way smoother w/ HR, and prolly w/ exclusive VMR9/EVR in MPC HC too...but VMR9 does some ugly sharpening, and EVR some crazy EE.
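For what it's worth (a sketch of mine, not how KMP or MPC actually compute their stats), the "jitter" being compared here is usually judged from the spread of frame-to-frame presentation intervals rather than from their average alone:

```python
# Sketch only -- not KMP/MPC code: one common way to put a number on render
# jitter is the standard deviation of the frame-to-frame presentation
# intervals, alongside their mean.

import statistics

def interval_stats_ms(present_times_s: list[float]) -> tuple[float, float]:
    """Return (mean, stdev) of frame intervals in milliseconds."""
    intervals = [b - a for a, b in zip(present_times_s, present_times_s[1:])]
    return statistics.mean(intervals) * 1e3, statistics.stdev(intervals) * 1e3

# Hypothetical presentation timestamps for ~24 fps playback:
noise = (0.0, 0.003, -0.002, 0.001, 0.0, 0.002)
times = [i / 24 + n for i, n in enumerate(noise)]
mean_ms, sd_ms = interval_stats_ms(times)
print(f"mean interval {mean_ms:.2f} ms, jitter (stdev) {sd_ms:.2f} ms")
```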

I've finally managed to make totozero from HCFR agree with me on EVR doing crazy EE.

plus HR is soon to offer PS gamut conversion, I've got a few betas but it's not too functional yet....only a matter of time, hopefully :D
 
I think VMR9 D3D with vsync turned off (don't force in the graphics drivers!) is close to perfect. I think Haali may be broken due to its use of a buffer and its strict adherence to a stable beat rate - exactly the Fig.5 scenario - it works nicely until it drifts into that boundary region and then falls apart.

I know your initial experience was not so good, but Andrew of TheaterTek is still promising a version of TT that fully supports mkv internally, not just via DS graphs, including chapter points etc. That IMO will be the perfect D3D HD player, with its OSD fully supported in D3D mode.
 
well I like KMP too much to bother using anything else...seamless playback is really great stuff :)

and VMR9 looks too sharp...does some ugly interaction with my Avisynth LimitedSharpenFaster() script :policeman:

I think VMR9 D3D with vsync turned off (don't force in the graphics drivers!) is close to perfect.
why would u wanna turn the VSYNC off in the graphics drivers?
HR is forcing it anyway, and I guess any D3D mode does too.
 
VMR9 and VMR7 look identical.

Or are you scaling stuff? I watch 1920x1080 at 1920x1080i at 96Hz.

The video card does not scale.
 
Actually I mixed up two things :eek:.

Yes, D3D forces vsync (at least I am pretty sure it does) but uses that "wobble" you see in MPC to avoid bad synchronisation between the "flip time" and "refresh time".

But, PDVD uses VMR9 renderless in non-D3D mode and THAT exhibits exactly what you describe for Haali if you force vsync in the graphics drivers - occasional horrible judder when you start playback at the wrong point in the vsync cycle and judder after an extended period of time. PDVD seems to do a perfectly adequate job of avoiding tearing by itself and using the graphics driver control panel to force vsync on it just messes things up. So, ignore what I am saying about vsync if you are not using PDVD or a non-D3D renderer. If it is not turning on vsync itself (and even maybe if it is) forced vsync will likely mess things up.

p.s. By the way, I mean turn off "vsync FORCED on" or set "off, unless the application requests it". I do not mean to "force vsync off, regardless of application". I suspect that would be very bad too!

p.p.s. I think it is inevitable that you may need to revisit any sharpening routine when changing renderers. That does not mean any one renderer is better/worse than another, but, yes, you may need more/less sharpening depending on the exact algorithm the renderer is using.
 
I've always used VMR9 Windowless, but had terrible tearing with sending 1080/24p or 480/24p. Using the latest Nvidia drivers I tested Overlay, and it works much better. No tearing, all color transforms (YUV->RGB) work right for both SD and HD content. The overlay implementation also keeps levels where they should be, and doesn't do the TV->PC levels transform. I use native resolution, so no scaling by the renderer (720x480 for DVD and 1920x1080 for BD). If you're having problems, test the lowly Overlay renderer. Works fine with VSYNC tools too.
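For readers who haven't met it, the "TV->PC levels transform" mentioned here is just the expansion of limited-range video levels (16-235 for 8-bit luma) to full range (0-255); a minimal sketch with the standard constants, not anything taken from the Nvidia driver:

```python
# Minimal sketch of the levels expansion being discussed (standard 8-bit
# constants; not driver or renderer code).

def tv_to_pc_luma(y: int) -> int:
    """Expand limited-range (16-235) luma to full-range (0-255)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

print(tv_to_pc_luma(16), tv_to_pc_luma(235))   # 0 255 -- video black / white
print(tv_to_pc_luma(125))                      # 127 -- mid grey gets rescaled
# Whether the renderer applies this (and whether the display expects video or
# PC levels) decides where black ends up, which is why it matters which
# component in the chain does it.
```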
 

Only Overlay works with 24p. All other renderers (VMR, EVR) introduce tearing or other problems. They only work with higher refresh rates.
 
anyway...I'm hearing ppl say that the nvidia cards give smoother results...thing is, the PQ on the several 8600GTs I tried was terrible (some ppl from HCFR tried the 9600GT and came to the same conclusion) :eek:

whether over 5xBNC on my old iiyama CRT, or over DVI on my HC3100.

the upgrade to the ATi was like salvation, much better blacks and much sharper picture 8)

like they've witnessed here, apparently nvidia's cheating at these HQV tests and that completely ruins the PQ (even on the Windows desktop):
http://www.xtremesystems.org/forums/showthread.php?t=157368

Thing is I like the ATi PQ, but I'm tired of pulling the monkey's tail to get my HTPC working smoothly :D

so I'm gonna try some PCI-E Matrox card, apparently they offer the best 2D picture on the market...can't get any worse than ATi I guess :D
 
What are you talking about leeperry?

If you do everything in software (RGB conversion, decoding, even (ick) scaling), then the only thing the video card could do is tear.

Nvidia and ATi would be identical - they are both passive unless you use DXVA or scale in the drivers.
 