• AnyStream is currently having some DRM issues; Netflix is not available in HD for the time being.
    Situations like this will always happen with AnyStream: streaming providers are continuously improving their countermeasures while we try to catch up; it's an ongoing cat-and-mouse game. Please be patient and don't flood our support or forum with requests. We are working on it 24/7 to get it resolved. Thank you.

File Size increased?

What is your preferred program to do the re-encode with, out of the multitude? I've only had to re-encode Blu-rays of my own and have never taken a streamed file and redone it. The Nvidia card I've got is a 1660 Ti. I may try a few. Thanks, by the way; it finished up overnight and the numbers are in: 144 GB for all 7 seasons (98 episodes) at 720p.

I have used Handbrake with good success to re-encode files (larger to smaller size)
 
What is your preferred program to do the re-encode with, out of the multitude?
StaxRip ...
A 1660 Ti is sufficient to support all settings; I've got that one myself. From my experience, the more expensive models are no faster.
Have a look at the Encode/Decode Matrix:
Code:
https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new
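If you want to confirm what your card's NVENC block can actually handle before committing to an overnight batch, one quick check (a rough sketch, assuming ffmpeg was built with NVENC support; the test-clip parameters are arbitrary) is a one-second test encode, e.g. from Python:
Code:
# Rough NVENC capability check: attempt a one-second HEVC test encode.
# Assumes ffmpeg with NVENC support is installed and on PATH.
import subprocess

def nvenc_hevc_works() -> bool:
    cmd = [
        "ffmpeg", "-hide_banner", "-loglevel", "error",
        "-f", "lavfi", "-i", "testsrc=duration=1:size=1280x720:rate=30",
        "-c:v", "hevc_nvenc",
        "-f", "null", "-",   # discard the output; we only care whether encoding succeeds
    ]
    return subprocess.run(cmd).returncode == 0

print("HEVC NVENC OK" if nvenc_hevc_works() else "HEVC NVENC not available")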
 
I have used Handbrake with good success to re-encode files (larger to smaller size)

I also use HandBrake with NVENC and it works wonders. Be careful about one particular setting (placebo?), as I have had that one fail. It could have been a bug at the time that has since been fixed.
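For anyone who would rather script that kind of HandBrake/NVENC re-encode than click through the GUI, here is a minimal sketch using HandBrakeCLI from Python. The encoder name and quality value are just common starting points, not settings recommended in this thread, and the file names are placeholders:
Code:
# Minimal sketch: re-encode one file with HandBrakeCLI using the NVENC HEVC encoder.
# Assumes HandBrakeCLI is installed and on PATH; paths and quality are placeholders.
import subprocess
from pathlib import Path

def reencode(src: Path, dst: Path, quality: int = 24) -> None:
    cmd = [
        "HandBrakeCLI",
        "-i", str(src),
        "-o", str(dst),
        "-e", "nvenc_h265",    # NVENC HEVC encoder
        "-q", str(quality),    # constant-quality mode; lower value = higher quality, bigger file
        "--all-audio",         # keep every audio track
        "--all-subtitles",     # keep every subtitle track
    ]
    subprocess.run(cmd, check=True)

reencode(Path("episode01.mkv"), Path("episode01_hevc.mkv"))
Wrap that call in a loop over a season's worth of files and it will happily chew through them overnight, much like the workflow described later in the thread.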
 
What is your preferred program to do the re-encode with, out of the multitude? I've only had to re-encode Blu-rays of my own and have never taken a streamed file and redone it. The Nvidia card I've got is a 1660 Ti. I may try a few. Thanks, by the way; it finished up overnight and the numbers are in: 144 GB for all 7 seasons (98 episodes) at 720p.

Certain channels do practice overkill. AMC (or its equivalent in Germany) is one of them: all the Walking Dead shows are about 5 GB an episode as well.
Starz was running a few old Western series at about 7 GB an episode.
Sony (AXN for you) is another that tends to lean toward overkill.

VidCoder is a simple and easy-to-use offshoot of HandBrake, and it's also free.

The NVENC encoder is fast; I used to run a season at a time overnight without a problem using my old 1060.
With the 3060 Ti, it takes about a third of the time.
 
StaxRip ...
A 1660 Ti is sufficient to support all settings; I've got that one myself. From my experience, the more expensive models are no faster.
Have a look at the Encode/Decode Matrix:
Code:
https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

I got a huge increase in speed with the 3060 Ti. The 3070 Ti is rumored to be another huge jump, but I wasn't willing to drop $1,200 on a card; $500 was more than enough.
 
You have to learn what kind of content you will be encoding.

Simple animations like Family Guy can withstand a very low bitrate because of their simplicity (e.g. little to no colour gradients, and characters and backgrounds that are often barely moving, if at all).

Action movies will require higher bitrates due to the amount of motion and complexity on screen.

Movies shot on film will have film grain and will require higher bitrates to retain that grain. Digitally shot films may have little to no grain and can therefore handle lower bitrates without looking bad.
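As a rough sanity check when picking a bitrate for a given kind of content, the resulting file size follows directly from average bitrate times runtime. A quick back-of-the-envelope helper (the numbers below are only illustrations, not recommendations):
Code:
# Rough file-size estimate from average video bitrate and runtime.
# Audio tracks and container overhead are ignored; the figures are illustrative only.
def size_gb(avg_mbps: float, minutes: float) -> float:
    return avg_mbps * minutes * 60 / 8 / 1000   # Mbit/s -> MB -> GB

print(f"{size_gb(1.5, 22):.2f} GB")   # a simple 22-minute animated episode: ~0.25 GB
print(f"{size_gb(12, 120):.2f} GB")   # a grainy 2-hour film at a higher bitrate: ~10.8 GB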

The whole CBR vs VBR debate is very misunderstood. Despite what a lot of people say on this forum, VBR is generally the superior option. CBR stands for Constant Bitrate (the exact same bitrate from the start of the video file to the end of it) and VBR stands for Variable Bitrate (different bitrates throughout the video file, going up and down as needed). VBR should provide higher quality at a lower bitrate.

Picture a 5-minute scene that has Arnold Schwarzenegger blowing up a tank with a rocket launcher right before the 5-minute credits start to roll. With CBR at 5 Mbps, the tank scene and the credits will both be encoded at 5 Mbps. The credits will look fine but the tank scene will look like rubbish. With VBR, bitrates can go higher or lower throughout the video file. The credits could go down to 1 Mbps but look indistinguishable from credits encoded at 5 Mbps, because they are very simple: white text on a black background. The 4 Mbps you sacrificed from the credits can then be dedicated to the tank scene, boosting it to 9 Mbps. The resulting tank scene at 9 Mbps will look substantially better than the tank scene at 5 Mbps, but the resulting file size will be the same on average. DVDs, Blu-rays and 4K Blu-rays all use VBR for their video encodes.
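To put numbers on the tank-scene example (purely illustrative figures): with the same average bitrate, CBR and VBR produce the same total size; VBR just redistributes the bits toward the hard scenes.
Code:
# Illustrative arithmetic for the tank-scene example above:
# 5 minutes of action plus 5 minutes of credits, same average bitrate either way.
def megabytes(mbps: float, minutes: float) -> float:
    return mbps * minutes * 60 / 8   # Mbit/s * seconds / 8 -> MB

cbr = megabytes(5, 5) + megabytes(5, 5)   # both halves encoded at 5 Mbps
vbr = megabytes(9, 5) + megabytes(1, 5)   # 9 Mbps for the action, 1 Mbps for the credits
print(cbr, vbr)                           # 375.0 375.0 -> identical size, better-looking action scene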

The only time CBR should look better than VBR is if CBR's bitrate is substantially higher than the average VBR bitrate; however, that will result in a massive file that you would need to re-encode should you wish to conserve disk space.

I think in the past some playback hardware couldn't handle VBR well, so that might be where the negative connotations stem from.
 
Couldn't care less about disk space. The only items I'm downloading are things not available on disc, which is always older stuff shot on film. So yeah - higher bitrate matters.

That said - I've downloaded 6 movies thus far. Zero difference. Same file size as before.

FYI, I'm not complaining about wanting CBR files and then not liking the resulting size; in this case the 1080p and 720p are both VBR. It's just that a difference of roughly 3.5x is notably odd.

Yeah. I'm thinking these claims of CBR are BS as well. Nothing is CBR that I've seen thus far. Same sizes as always.

Amazon has *ALWAYS* had the highest bitrate of all streaming platforms. If someone isn't used to that, then it might explain the shock. Or maybe only some movies/tv shows have CBR?
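One way to settle whether a given download is actually CBR, or just a different average bitrate, is to inspect it with ffprobe rather than going by file size alone. A rough sketch (assumes ffprobe is installed and on PATH; the file name is a placeholder):
Code:
# Report a file's overall average bitrate using ffprobe (assumed to be on PATH).
import json, subprocess

def overall_bitrate_mbps(path: str) -> float:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=bit_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(json.loads(out)["format"]["bit_rate"]) / 1_000_000

print(f"{overall_bitrate_mbps('movie.mkv'):.1f} Mbit/s average")
Keep in mind this only reports the overall average; a genuinely constant-bitrate encode would also show very little variation from segment to segment.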
 
Amazon has *ALWAYS* had the highest bitrate of all streaming platforms.

I agree with the sentiment that Amazon has always had the highest bitrate of all the streaming platforms; however, in my experience so far, Paramount+ content via Amazon Prime has a much higher audio bitrate but a much lower video bitrate than Paramount+ directly. This may not always be the case, though.
 
After 12 downloads with absolutely no difference, I'm not going to bother with any others. There's no difference in size whatsoever.
 
I got a huge increase in speed with the 3060 Ti. The 3070 Ti is rumored to be another huge jump, but I wasn't willing to drop $1,200 on a card; $500 was more than enough.
I couldn't find much on the web comparing the 1660 and the 3060 for video encoding, but in the results I did find, the 1660 was NOT slower. Why would it be? Nvidia put the same encoding hardware on the newer cards.
In other 3D-related applications the 3060 clearly beats the 1660.
 
I couldn't find much on the web comparing the 1660 and the 3060 for video encoding, but in the results I did find, the 1060 was NOT slower. Why would it be? Nvidia put the same encoding hardware on the newer cards.
In other 3D-related applications the 3060 clearly beats the 1660.
Correct, all those CUDA cores mean nothing for encoding; the only thing that helps is the NVENC.dll. That's it.
 
I couldn't find much on the web comparing the 1660 and the 3060 for video encoding, but in the results I did find, the 1060 was NOT slower. Why would it be? Nvidia put the same encoding hardware on the newer cards.
In other 3D-related applications the 3060 clearly beats the 1660.

No offense, but I don't care what you couldn't find on the web. In REAL-LIFE USE, the RTX 3060 Ti performs much faster than the GTX 1060 at encoding video. The GPU architecture has completely changed, with the introduction of Tensor cores and ray-tracing cores that are not present on the 1060. The 1060 is also PCIe 3 while the 3060 Ti is PCIe 4 compatible. AI software also runs considerably faster, dropping an operation on the same video from 7 hours of processing time to just over 3 hours with the 3060 Ti. And while the codec is still called NVENC, it's two generations further along in development.

Even considering just the difference between a 1060 and a 1060 Ti, you are looking at a 35% improvement.

I think you need to consider all the hardware combined. If I pulled a 1060 out of a 5-year-old computer and slapped in a 3060 Ti, I wouldn't expect a great deal of a performance boost, but considering all the improvements I would expect moderate gains. I'm not running a 5-year-old computer, though.

Motherboard: ASRock X570 Taichi Razer Edition
Processor: AMD Ryzen 9 5950X
GPU: ASUS NVIDIA GeForce RTX 3060 Ti V2 Edition 8GB GDDR6 PCIe 4.0
Memory: Corsair Vengeance Pro 128GB DDR4 4000
Boot Drive: Samsung 970 EVO Plus 2TB M.2 NVMe
Scratch Drive: Samsung 960 EVO 1TB M.2 NVMe

Search for "3060 Ti video encoding improvements" for more detailed info.
And, as I stated, no offense intended. I believe the system as a whole makes a big difference.
 