Not sure if this is relevant but I have to under-clock my 1080Ti both core & mem by 100MHz before using it for video encoding or I get visual artifacts / a few random white dots every 150 frames or so in the output video file.
Thank you for saying this. I am in the same boat, and after turning off my GPU's overclock, I can successfully use HW acceleration for both Encoding and Decoding. Yay NO ARTIFACTING
Thanks again!
That is fascinating news. I believe we are observing two different causes for artefacting, though. I'll explain.
We're still sifting through our own observations.
Things you should know:
The nVidia card has distinct units for video decoding, video encoding and CUDA.
When transcoding, in most situations, CloneBD requires all of them.
For example, when converting UHD to BD (HDR->SDR), the video decoder (NVDEC) converts the input stream into pictures, CUDA is used to scale them down to 1920x1080 and to do the HDR to SDR conversion, then the encoder (NVENC) creates an AVC stream.
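That three-stage pipeline can be sketched as follows. This is a schematic, runnable illustration only - all function names are stand-ins I made up for clarity, not CloneBD's or nVidia's actual API; each stage just marks what the corresponding hardware unit would do to a picture:

```python
# Illustrative stand-ins for the three hardware units (hypothetical names).

def nvdec_decode(stream):                # NVDEC: bitstream -> raw UHD pictures
    for i in range(stream["frames"]):
        yield {"n": i, "w": 3840, "h": 2160, "range": "HDR"}

def cuda_scale(pic, w, h):               # CUDA: downscale 3840x2160 -> 1920x1080
    return {**pic, "w": w, "h": h}

def cuda_tonemap(pic):                   # CUDA: HDR -> SDR conversion
    return {**pic, "range": "SDR"}

def nvenc_encode(pic):                   # NVENC: picture -> AVC stream chunk
    return f"AVC#{pic['n']}:{pic['w']}x{pic['h']}:{pic['range']}"

def transcode(stream):
    return [nvenc_encode(cuda_tonemap(cuda_scale(p, 1920, 1080)))
            for p in nvdec_decode(stream)]

print(transcode({"frames": 2}))  # ['AVC#0:1920x1080:SDR', 'AVC#1:1920x1080:SDR']
```

The point of the sketch: three separate units are in play at once, so an overclock that destabilizes any one of them can corrupt the output.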
Now, overclocking most likely affects all three units - they are presumably all tied to the same base clock.
nVidia cards are popular for overclocking, and it is quite safe to assume that most stability testing is done on the CUDA engine, while NVDEC/NVENC get far less attention - they are of no interest for gaming.
It does make some sense to me that overclocking can break transcoding. (As for the need to "underclock": that depends - some manufacturers raise the base clock at the factory and sell that as the normal clock, with an option to overclock further, so the default setting may already be an overclock, I don't know.)
UPDATE: Hmm guess not all are fixed by removing the overclock. Mummy Returns and Split both still have artifacting using HW accel. :/
Now, about that.
I think we're talking about two different kinds of artefacts here.
The ones due to overclocking may just be actual miscalculations, and obviously that's something we can't do anything about.
We do know of one UHD (Terminator 2) that really causes these artefacts and jerky playback - not because of a stressed GPU, but because the nVidia decoder doesn't appear to get the frame order right.
AVC and especially HEVC are a bit difficult in that way, because some frames are built upon previous frames and others upon future frames.
For the decoder to be able to decode them right, they are not delivered to the decoder in playback order, but in the order that allows these dependencies to build up progressively.
Simplified example: pictures 1, 2, 3 in display order. Picture 1 is encoded all by itself and doesn't depend on any other picture.
Picture 2 uses information both from pictures 1 and 3 and picture 3 is based on picture 1.
The decoder gets the pictures in order 1, 3, 2, so there's never a picture missing any information.
The decoder then has to reorder the pictures on output.
These dependencies are far more complex than this on actual UHDs, but it's basically the same.
With Terminator 2, nVidia gets the ordering mostly right: out of 24 frames, 16 are ordered correctly and the other 8 are more or less messed up.
But it's not output ordering alone; there are also the artefacts themselves, which means the decoder is also mixing up reference frames (using the wrong ones as base images).
The weird thing about it is that the nVidia decoder actually knows the correct order: there are tags in the pictures identifying their correct order (simply numbers going 0, 1, 2, ..., 23).
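That's what makes it so odd: with those order tags present, restoring display order is trivially a sort. A rough sketch, using hypothetical frame names of my own:

```python
# Each decoded frame carries an order tag (0..23 here, as described above).
# (data, order tag) pairs, as they might come out of the decoder mixed up:
frames = [("frame_c", 2), ("frame_a", 0), ("frame_b", 1)]

# Restoring display order is just a sort on the tag.
display_order = sorted(frames, key=lambda f: f[1])
print([name for name, _ in display_order])   # ['frame_a', 'frame_b', 'frame_c']
```

So the information needed to output the frames correctly is right there in the stream; the decoder just doesn't seem to use it consistently for this disc.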
The reason I'm typing this lengthy techno-babble is so that you can understand the causes and maybe tell the difference between the overclocking effect and the more systematic mess.
Do these remaining videos show that kind of "jerkiness", as if an individual frame sometimes doesn't appear at the right time? Like a steady movement of an object that suddenly jerks back and then continues normally.