GTX 1660Ti / 1660 VS GTX 1080 Encoding Bandwidth

Solved by ninbura:

Sad to report that the GTX 1660 has way less bandwidth than the GTX 1080:

https://postimg.cc/yDyN08Xp

https://postimg.cc/PLf5ynSX

 

Encoding 3 sources in FFmpeg (3440x1440@100FPS, 4K60, 1080p60) simultaneously resulted in the GTX 1660 capping out on decode immediately. This caused system memory to climb continuously until it was fully saturated, wreaking havoc across the entire PC. As you can see in the images above, the GTX 1660 still wasn't encoding in real-time even 5 minutes into the recording.
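For a rough sense of scale, the three-source workload described above can be expressed as raw pixel throughput. This is only a proxy for NVENC/NVDEC load (codec, preset, and bit depth all matter), so treat it as a back-of-the-envelope sketch, not a measurement:

```python
# Rough pixel-throughput estimate of the three-source FFmpeg workload
# described above. Pixel rate is only a proxy for encoder load.

def pixel_rate(width, height, fps):
    """Pixels per second a stream pushes through the encoder."""
    return width * height * fps

streams = {
    "3440x1440@100": pixel_rate(3440, 1440, 100),
    "4K60 (3840x2160@60)": pixel_rate(3840, 2160, 60),
    "1080p60": pixel_rate(1920, 1080, 60),
}

total = sum(streams.values())
for name, rate in streams.items():
    print(f"{name}: {rate / 1e6:.0f} Mpx/s")
print(f"combined: {total / 1e6:.0f} Mpx/s")  # ~1117 Mpx/s
```

In other words, the combined workload is over a billion pixels per second, more than double a single 4K60 stream.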
 

In comparison, the GTX 1080 was at a measly 40% load... almost reaching real-time encoding instantly. This puts the GTX 1660 at less than half the bandwidth of the GTX 1080!


Kinda sucks that there isn't a dual-chip RTX option, but oh well... I was hoping to get higher quality streams with an RTX NVENC chip while retaining the ability to record high-quality source footage separately. Really goes to show how worthless the NVIDIA Encode Matrix is.


PS: I was using a patch to bypass NVIDIA's two-stream encode limit.

Hey, I was wondering if anyone could confirm (preferably someone who actually has one) whether the GTX 1660 Ti / 1660 has the same or more encoding bandwidth compared to the GTX 1080.

When I say bandwidth I mean bandwidth, not performance; I'm aware that the Turing NVENC chip produces a higher quality image. What I'd like to know is whether it can handle the same load. For example, my GTX 1080 can encode two 4K streams at a 288M bitrate simultaneously, no sweat, while the GTX 1070 and below can't. The GTX 1080 actually has two NVENC engines, giving it twice the bandwidth of its like-series brethren; see NVIDIA's Encode Matrix. You'll also see that the GTX 1070 is listed as having two NVENC chips, but for reasons unknown to me only one of them is active, so it remains true that the GTX 1080 has higher encoding bandwidth. It's for reasons like these that I ask rather than assume; NVENC chips have historically been very tricky to judge on paper.
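As a quick illustration of what the dual-engine claim implies: if each 4K60 stream lands on its own NVENC engine, each engine only has to sustain one stream's pixel rate. How the driver actually balances sessions across engines is an assumption here; this just shows the arithmetic involved:

```python
# Back-of-the-envelope: two 4K60 streams on the GTX 1080's two NVENC
# engines. Driver load-balancing behavior is an assumption, not a fact.

px_per_4k60 = 3840 * 2160 * 60   # ~498 Mpx/s per stream
total_px = 2 * px_per_4k60       # ~995 Mpx/s for both streams
per_engine = total_px / 2        # one stream per engine

print(f"per stream: {px_per_4k60 / 1e6:.0f} Mpx/s")
print(f"both streams: {total_px / 1e6:.0f} Mpx/s")
print(f"per engine (2 engines): {per_engine / 1e6:.0f} Mpx/s")
```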

 

Basically, if someone could encode a high-bitrate video in something like OBS and screenshot the GPU section of Task Manager while doing so, I would be highly appreciative. Preferably a 4K60 video at a 288M VBR (variable bitrate), but a 1080p60 video at a 288M VBR will still theoretically work and give me something to reference against my GTX 1080. It's also worth mentioning that something has to be happening on screen to get an accurate measure; the more fast-paced the game, the more stress it puts on the encoder, so something like Rocket League or Apex Legends works great. I'm really just looking to see what percentage the "Video Encode" section of Task Manager sits at while encoding one of the aforementioned streams.
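For anyone wondering what a 288M target actually means in practice, here is the nominal arithmetic (actual VBR output fluctuates around the target, so these are ballpark figures):

```python
# What a 288M (288 Mb/s) VBR target means for disk throughput and
# file size. Nominal figures only; real VBR output varies.

bitrate_bps = 288_000_000        # 288 megabits per second
bytes_per_sec = bitrate_bps / 8  # 36 MB/s written to disk
gb_per_minute = bytes_per_sec * 60 / 1e9

print(f"{bytes_per_sec / 1e6:.0f} MB/s, ~{gb_per_minute:.2f} GB per minute of footage")
```

So one such stream writes roughly 36 MB/s, or about 2.16 GB per minute, which is why this is really a recording bitrate rather than a livestream one.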

I consistently encode two 4K streams simultaneously, so it's important for me to know that a 1660 Ti / 1660 can do the same before picking one up. If it can, however, they're cheaper, produce a higher quality image, and consume less power, so there would be no reason not to make the swap. PS: this is for a dedicated encoding rig, so the card's 3D performance is inconsequential.

Main PC: Corsair 900D | ProArt Z690-Creator | Intel 13900K | RTX 4090 | Trident Z5 (2x32GB) | 1TB 980 Pro, 2TB Sabrent Rocket 4+, 2TB 980 Pro, 1TB Sabrent Rocket | HX1200i

Capture PC: Meshify XL | Designare TRX40 | AMD 3960X | 2xRTX 4070 TI | Trident Z (4x16GB) | 2TB 970 Evo Plus, 1TB 970 Evo Plus | Dual HDMI 4K Plus LT, 2xElgato 4K 60 Pro, HX850

Media / Render PC: Corsair 900D (shared) | ASRock X399M | AMD 2970WX | RTX 4070 TI | Trident Z (2x16GB) | 2TB Samsung 970 Evo | 2xElgato HD60 Pro | HX750
Full Room Watercooling: EK X3 400 | EK-XTOP Revo Dual D5 | 4xHardware Labs 560GTX | 16xSilentWings 4 Pro  | EVGA 450 B3

Peripherals: Logitech G502 X |  Wooting 60HE | Xbox Elite Controller Series 2 | Logitech G502 Wireless | Logitech MX Keys Mechanical

Displays: Asus XG35VQ | 2xLG 24UD58-B | LG 65UH6030 | Asus VH242H | BenQ GW2480 | HP 22CWA | Kenowa CNC-1080P | Asus VC39H

Audio Interfaces: RME Fireface UFX+, Scarlett 18i20, RME HDSPe RayDAT, RME HDSPe MADI FX, RME ADI-648, RME ADI-192 DD

Audio Playback: 2xYamaha HS5 & Yamaha HS8s | Sennheiser HD820, Sennheiser IE 500 Pro, Ultimate Ears RR CIEMs



That kinda sucks. Was hoping to swap mine for one. Figured their new fancy encoder would be worth it. Thanks for the info. 

Main Rig Corsair Air 540, i7 9900K, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090 FE, EVGA 1000 G5, Acer Nitro XZ3 2560 x 1440@240hz

 

Spare Rig Lian Li O11 AIR MINI, i7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32GB, EVGA 1080 Ti, 1080 SC, 1070 SC & 1060 SSC, EVGA 850 GA, Acer KG251Q 1920x1080@240hz

 


  • 1 year later...

Hey guys, I've got a question that might pique your interest. To start with, some background: I want to build a PC for streaming that can hopefully also play AAA games for the next 3 years, but my budget is quite limited.

Looking at the local market, my options within budget come down to a used / second-hand GTX 1070, a GTX 1660 or 1660 Super, or saving some money and going with a 1650 Super.

I was pretty set on the GTX 1070, since it gives better FPS across the board. But then I found a video you can check out; the link is at the bottom of my post, but please read the whole post first.

 

In the video, he tested and said that his GTX 1080 performs worse in streaming compared to the 1650 Super. He said he can only stream up to 900p60 with the GTX 1080, compared to 1080p60 with the 1650 Super, and the game seems to drop a lot of FPS when he streams with the GTX 1080.

 

From this, I've heard that the GTX 1080 produces worse results when streaming compared to the 1650 because the 16xx series (in this case the 1650) has a dedicated chip to encode the stream and the 1080 doesn't, even though both have NVENC. This chip supposedly lets the 16xx series keep its VRAM free for the game, instead of having some of it taken up by encoding the stream, which happens on the 1080 because it lacks the dedicated streaming-encode chip.

 

So I'm asking anyone who has streaming experience with both cards (1070/1080 vs 1650 Super/1660, or the 10xx series vs the 16xx series): is it true that even a 1650 Super will perform better in streaming compared to a 1070/1080?

 

The video:

https://youtu.be/FeCm10Xdkno?t=33


On 7/1/2020 at 7:18 AM, PCForStreaming said:

Is it true that even a 1650 Super will perform better in streaming compared to a 1070/1080?

The GTX 1080 has two Pascal NVENC chips, so it has more bandwidth; the 1650 has a newer Turing NVENC chip, so it has better quality.

I encode three 4K60 streams with my GTX 1080 in real-time, and there is no resolution bottleneck with the GTX 1080 (unless you're doing over 8K60 in a single stream), so him not being able to stream 1080p with the GTX 1080 doesn't make any sense. That being said, as I mentioned earlier, the 1650 has a Turing NVENC chip, which gives better quality but worse bandwidth.

For 99.99% of people the 1650 Super will make more sense, since it's cheaper and produces a higher quality encode, and there really aren't many people looking to encode more than 4K60, which is what would overload its available bandwidth.
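The "unless you're doing over 8K60" point can be sanity-checked in raw pixel throughput: three 4K60 streams together are still smaller than a single 8K60 stream. (The 8K60 ceiling itself is the claim above, not a measured limit, and pixel rate is only a rough proxy for NVENC load.)

```python
# Comparing three 4K60 sessions against one 8K60 stream in raw pixel
# throughput. Pixel rate is a rough proxy for encoder load only.

def pixel_rate(w, h, fps):
    return w * h * fps

one_8k60 = pixel_rate(7680, 4320, 60)        # ~1991 Mpx/s
three_4k60 = 3 * pixel_rate(3840, 2160, 60)  # ~1493 Mpx/s

print(f"8K60: {one_8k60 / 1e6:.0f} Mpx/s, 3x4K60: {three_4k60 / 1e6:.0f} Mpx/s")
```

So a card that can keep up with an 8K60-class workload has headroom for three 4K60 sessions, which matches the real-time result reported above.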

