• AnyStream is currently having some DRM issues; Netflix is not available in HD for the time being.
    Situations like this will always happen with AnyStream: streaming providers are continuously improving their countermeasures while we try to catch up. It's an ongoing cat-and-mouse game. Please be patient and don't flood our support or the forum with requests; we are working on it 24/7 to get it resolved. Thank you.

File Size increased?

Not in encoding; in gaming, maybe.
I still haven't found the article again...but in short, the increase comes from the 1060 using GDDR5 on a 192-bit memory bus (about 192 GB/s), while the Ti came with GDDR6 and about 288 GB/s of memory bandwidth.
The 1060 clock speeds averaged around 1200 MHz while the Ti averaged around 1500 MHz. The Ti also had a 20% increase in CUDA cores, but we already know that had no effect on
encoding. I'm going by memory, but the increase varied from 13% to 42%...albeit mostly due to differences in the systems' other components.
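For reference, here is a quick back-of-the-envelope check of where those two bandwidth figures come from; the per-pin data rates in the sketch are my assumptions for the memory typically fitted to these cards, not numbers from the article.
Code:
# Back-of-the-envelope check of the memory-bandwidth figures above. The per-pin
# data rates (8 Gbps GDDR5, 12 Gbps GDDR6) are typical values and are an
# assumption here.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Theoretical bandwidth = bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 8))   # GTX 1060:    192.0 GB/s
print(bandwidth_gb_s(192, 12))  # GTX 1660 Ti: 288.0 GB/s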
 
You realize I was never talking about the 1060?
The 1660 is a real thing, only one generation below the 3060 (Turing vs. Ampere), and it has the same encoding hardware.
See the surprising test here (and it's even against a 3070):
Code:
https://www.youtube.com/watch?v=YuV0ujgR5Qo
 
Your post..."but in the results I found, the 1060 was NOT slower."

The 1660 is two generations different. It's a 10 series; then there is the 20 series and then the 30 series....
If you want to get technical, it's a 16 series...still two generations below.

The hardware is NOT the same, and the Ti cards are not the same as the regular ones.

EDIUS isn't software I would use or test anything on...sorry, it's out of date and simply doesn't even support the latest GPUs.

Read the comments on the video you sent...one user confirmed the 1070 Ti rendered the file in about half the time with Adobe Premiere.
It wasn't a surprising video at all...it was actually a joke.

How about a REAL test using software designed to re-encode video, like HandBrake or VidCoder, instead of software RECORDING video from a video game?
 
OK, there was that one typo ... I apologize. :bowdown:
Just corrected it.
How about a REAL test using software designed to re-encode video, like HandBrake or VidCoder, instead of software RECORDING video from a video game?
Sure, what source and settings?
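To keep the playing field level, here is a minimal sketch of what I mean: time the same NVENC encode of an agreed-upon clip with HandBrakeCLI on each card. It assumes HandBrakeCLI (with NVENC support) is on the PATH; "source.mkv" and the quality value are just placeholders until we agree on a source and settings.
Code:
# Time one NVENC encode with HandBrakeCLI so two cards can be compared on
# identical work. Source clip and quality value are placeholders.
import subprocess
import time

SOURCE = "source.mkv"       # agreed-upon test clip (placeholder)
OUTPUT = "nvenc_test.mkv"   # throwaway output file

cmd = [
    "HandBrakeCLI",
    "-i", SOURCE,
    "-o", OUTPUT,
    "-e", "nvenc_h264",     # hardware NVENC encoder (nvenc_h265 also exists)
    "-q", "22",             # constant quality, so both cards do the same work
]

start = time.perf_counter()
subprocess.run(cmd, check=True)
print(f"NVENC encode took {time.perf_counter() - start:.1f} s")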
 
You mean you compared the 1060 with the 3060?
If so, then yes, you are completely correct.

But I really want to compare my 1660 Ti with your 3060 Ti ... as I'm really curious what the result would be...
 
I am using a 3090 Ti, and I really only see maybe a couple of seconds of difference between it and my old 1080 Ti. The NVENC.dll is in the GeForce software, not on the video card. Yes, there is a chipset on the video card, but there have only been 2 versions in 10 years. I know you all spent a lot of cash on your cards and would like to believe there is a big difference, but alas, it is not there.
 
@RedFox 1 Care to do some testing? Jimc seems to be getting cold feet ;)
And I really want to know if my 300-buck card can perform as fast as (or similar to) one that costs 1500 :dance:
 
@RedFox 1 Care to do some testing? Jimc seems to be getting cold feet ;)
And I really want to know if my 300-buck card can perform as fast as (or similar to) one that costs 1500 :dance:
I have tested that for years, and I have no interest in wasting my time on Nvidia software. The answer is: it can. It all depends on whether you have the new NVENC chip in your card or the old one. Yes, the newer card is much better for gaming, ray tracing, and other things, but for encoding you may only see a few seconds of difference between the old NVENC chip and the new one; all the information is in the GeForce software. Have fun with this thread, but all the information above my post is related to gaming, not encoding. This will be my last post in this thread; I may move this to the Chat forum. Since the 780 Ti there have only been 2 versions of the Nvidia NVENC chipset. It's great for gaming, yes, more FPS, but not for encoding. Have fun. I didn't pay for my card; it was a freebie for services rendered (from Asus). I would have been just as happy with my old 1080. I am not a gamer.
 
I can tell you that without any testing at all. It won't; there's a price difference for a reason. Two of those reasons are the available VRAM and CUDA cores.

This should give you a general idea:
https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1660-Ti-vs-Nvidia-RTX-3090-Ti/4037vsm1818101

The 3090 Ti is about 2.5x faster.

In regards to 3D performance ... yes, I totally agree.
Video encoding ... not so much.
That UserBenchmark comparison is all about 3D... that is also where you need all the VRAM, core speed, and memory bandwidth... that's where the money went.

One thing that is great about NVENC on the GeForce RTX 20 and 30-series and GeForce GTX 1650 Super and up is that all GPUs have the same NVENC with the same performance and quality, from the RTX 2060 to the RTX 3090

Guess where I found that quote ... on the NVIDIA page.

That's what I'm trying to tell you ... the video encoding hardware did not change (much) from the 1660 up ... and AFAIK it will not change on the 4000 series.
 
The technology, no; the CUDA core count, yes. The 1660 Ti has 1,536 CUDA cores, the 3090 Ti has 10,752; that's 7x as many, and the number of CUDA cores WILL significantly impact encoding speed.
 
1660? 3090? TI ?

Pfffttt, you are all posers because I have a "Bitchin Fast! 3D 2000"!!!!!!!!!!!!!!
(If you even recall this from when it came out, it means you ain't no spring chicken.)

My apologies but I felt the need to inject humor. :)
 

Attachments

  • BitchinFast3D2000.jpg
The technology, no; the CUDA core count, yes. The 1660 Ti has 1,536 CUDA cores, the 3090 Ti has 10,752; that's 7x as many, and the number of CUDA cores WILL significantly impact encoding speed.

NVIDIA GPUs contain one or more hardware-based decoder and encoder(s) (separate from the CUDA cores) which provides fully-accelerated hardware-based video decoding and encoding for several popular codecs. With decoding/encoding offloaded, the graphics engine and the CPU are free for other operations.

from
Code:
https://developer.nvidia.com/nvidia-video-codec-sdk

come again?

For anyone interested, and since this is the wrong forum, I'm trying to create a new post in Third Party Software to try to find real results... something along the lines of the sketch below.
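As a starting point, here is a rough sketch of the kind of re-encode test I mean: run the same clip once through the hardware encoder (h264_nvenc) and once through the CPU encoder (libx264), and let ffmpeg's -benchmark output report the elapsed time for each. It assumes an ffmpeg build with NVENC support on the PATH; "source.mkv" is only a placeholder for whatever clip we agree on.
Code:
# Encode the same clip with NVENC and with libx264; -benchmark prints timing
# at the end of each run. Source file name is a placeholder.
import subprocess

SOURCE = "source.mkv"

for codec in ("h264_nvenc", "libx264"):
    print(f"--- {codec} ---")
    subprocess.run([
        "ffmpeg", "-hide_banner", "-benchmark",
        "-i", SOURCE,
        "-c:v", codec,
        "-an",              # drop audio so only video encoding is measured
        "-f", "null", "-",  # encode, then discard the result
    ], check=True)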
 
If the OP (King_Pin) doesn't mind, I can move this thread to Gen Chat or the Third Party forum. I really do not care if it stays here.
 
I've noticed Amazon offers CBR and CVBR; what is the latter?
There are two main types of encoding you can choose from: constant bitrate (CBR) and variable bitrate (VBR). CVBR is something that someone made up at another company (I think). Maybe they mixed the two together and came up with CVBR.
 
CVBR is something that someone made up at another company (I think). Maybe they mixed the two together and came up with CVBR.
Yes, for some reason they call it CVBR; no idea why. I think it is just VBR and they simply call it that, but I am no encoding expert. It is only a guess.
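For reference, this is roughly how the two standard modes look when you set them up yourself in an encoder such as ffmpeg/libx264. It's only an illustration: the bitrate value and file names are placeholders, and the CBR variant is the usual approximation (average, max, and buffer pinned to the same value), not whatever Amazon does internally.
Code:
# CBR-style vs. VBR-style rate control with ffmpeg/libx264 (illustration only).
import subprocess

SOURCE = "source.mkv"  # placeholder source clip

# CBR-ish: pin average, maximum, and buffer to the same value so the bitrate
# stays (nearly) constant regardless of scene complexity.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264",
    "-b:v", "5M", "-maxrate", "5M", "-bufsize", "5M",
    "cbr_out.mkv",
], check=True)

# VBR: give the encoder an average target and let it spend more bits on
# complex scenes and fewer on simple ones.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264",
    "-b:v", "5M",
    "vbr_out.mkv",
], check=True)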
 
I am using a 3090 Ti, and I really only see maybe a couple of seconds of difference between it and my old 1080 Ti. The NVENC.dll is in the GeForce software, not on the video card. Yes, there is a chipset on the video card, but there have only been 2 versions in 10 years. I know you all spent a lot of cash on your cards and would like to believe there is a big difference, but alas, it is not there.

There have been 3 GPU architectures in the last 6 years, with multiple generations.
The encoder has advanced 3 generations and the decoder 5 generations.

And NO, the NVENC.dll is not the same for the 1080 and the 3090. From 2019 to date, the NVENC.dll has gone through 74 version changes.

You should be flying high at gaming or encoding with an RTX 3090 Ti. It's actually considered more of a workstation graphics card than a gaming card.
For gaming...it's total overkill. If you can only see a few seconds of difference between the 1080 and the 3090, you have a major system bottleneck somewhere.
With the exception of memory...the 3090 Ti basically replaced the Quadro RTX 8000.

Did you run the program to completely remove any previous versions of the drivers?
Are you running the Game Ready Driver or the Studio Driver?
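Before comparing anything, it's worth confirming which driver each system is actually running. nvidia-smi won't tell you Game Ready vs. Studio, but it does report the installed version; a tiny sketch (it only assumes nvidia-smi is on the PATH, which it is once the NVIDIA driver is installed):
Code:
# Print the GPU name and installed driver version via nvidia-smi.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # one "name, version" line per GPU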
 
Yes, for some reason they call it CVBR; no idea why. I think it is just VBR and they simply call it that, but I am no encoding expert. It is only a guess.
Stupidly...they call it CVBR because they took the video from the CBR mpd and the audio from the VBR mpd. It apparently worked at first, and you got CBR video and 640k or 782k audio.
NOW...people are complaining that neither version will get the higher audio most of the time.
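If anyone wants to check that for themselves, here is a rough sketch of how you could list the audio bitrates advertised in two DASH manifests and compare them. It only assumes the standard DASH layout (AdaptationSet / Representation elements with a bandwidth attribute); the file names are placeholders, not actual Amazon manifests.
Code:
# Compare the audio bitrates advertised in two DASH manifests.
# Assumes the standard DASH namespace and layout; file names are placeholders.
import xml.etree.ElementTree as ET

DASH = "{urn:mpeg:dash:schema:mpd:2011}"

def audio_bitrates(mpd_path):
    """Return the sorted 'bandwidth' values of all audio Representations."""
    root = ET.parse(mpd_path).getroot()
    rates = []
    for aset in root.iter(DASH + "AdaptationSet"):
        kind = (aset.get("contentType") or aset.get("mimeType") or "")
        if "audio" in kind:
            for rep in aset.iter(DASH + "Representation"):
                if rep.get("bandwidth"):
                    rates.append(int(rep.get("bandwidth")))
    return sorted(rates)

print("CBR mpd audio:", audio_bitrates("cbr.mpd"))  # placeholder file
print("VBR mpd audio:", audio_bitrates("vbr.mpd"))  # placeholder file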
 