GabenJr

Nvidia, you PROMISED!

Recommended Posts

Posted · Original Poster

Nvidia’s RTX 3080 is here, bringing Ampere to the masses and promising DOUBLE the performance of the RTX 2080! Could they really have been telling us the truth this whole time?

 

 

Buy an RTX 3080 (starting Sept 17th)
On Amazon (PAID LINK): TBD
On Newegg (PAID LINK): TBD
On B&H (PAID LINK): TBD


Anthony @ LINUS MEDIA GROUP             

I'm a handsome man with a charming personality. - Gabe Newell


So it was a bit overhyped, but still a great product?

 

The title's tricking me.


Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


[Image: "You're a la-la-la-la liar" meme]

 

but still great performance


|Ryzen 7 3700x || Asus strix 5700 XT || Asus strix x570-E || G.skill tridentZ Neo 32gb 3666mhz cl16-19-19-19-39 || cooler master ml360 RGB AIO || Kolink Horizon RGB || Thermaltake grand rgb 750w 80+gold psu || samsung 250 gb ssd || toshiba 5400rpm 1tb hdd || HyperX cloud alpha || Corsair k55 || Asus gladius 2 origin gaming mouse || Monitor 1 Asus 1080p 144hz || Monitor 2 AOC 1080p 75hz ||   QUOTE ME, FOR ANSWER.


What happened to Linus' audio at the 9-minute mark when discussing Nvidia Reflex? It sounds like a recording from a Skype call and sticks out like a sore thumb from the audio recorded in the studio throughout the rest of the video.

 

@GabenJr Not sure if this is from the writers or the editors, but a few thoughts about the graphs in the video...

 

I find the graphs a little confusing with the minimum framerate aligned to the far left of the bar. I think it would look better placed at the end of the blue bar, where the value actually lies, in the same way the other values (in this case average FPS) are shown.

[Screenshot: graph from the video, minimum framerate aligned to the far left]

 

An example of what I think makes more sense:

[Screenshot: mock-up with the minimum framerate shown at the end of the blue bar]

 

 

 

I think the Blender tests should be shown as two separate bars, since the BMW and Pavilion tests are independent of each other and shouldn't be displayed in the same bar. I'm not sure why the Blender test is displayed that way when other tests later in the video, like the SPECviewperf results, give each separate test its own bar. (And again, I don't like the way the time is aligned to the far left of the blue bar.)

[Screenshot: Blender benchmark graph from the video]

 

 

Also, RIP AMD, who decided to call in sick on RTX 3080 benchmark day.


CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x8GB 3000MHz G.Skill Ripjaws 5 | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Corsair H100i AIO | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB | HDD: Seagate Ironwolf 8TB + 2x Seagate Ironwolf 6TB | Monitor: Acer Predator XB271HU + Samsung BX2450


I'm disappointed that Linus's reviews don't go as in-depth as they used to.

 

I got more out of JayzTwoCents' video than Linus's, as the benchmarks Linus used aren't sufficient and are a bit confusing.

 

 

 


 


Intel i7 4790K (4.0 GHz) | MSI Z97-GAMING 5 | Corsair Vengeance 8GB DDR3-1866 2x8GB | Asus GeForce GTX 780 Ti DirectCU II OC | Samsung 840 Pro Series 256GB | Corsair RM 850W | Corsair H90 94.0 CFM | Logitech® Wireless Combo MK330 | Cooler Master HAF XM | Dell S2240L 60Hz 21.5 IPS | 

PCPartPicker 

11 minutes ago, Rohith_Kumar_Sp said:

I'm disappointed that Linus's reviews don't go as in-depth as they used to.

They haven't covered GPUs or CPUs in depth for a long time. As far as I'm concerned, they're an entertainment channel, not for those who want deep dives.

39 minutes ago, HenrySalayne said:

Noise? Nvidia claimed it will be "10x" quieter than the previous generation (so probably 10 dB; SPL is not measured linearly).

A level is just a level, and dB is one way of expressing levels on a logarithmic scale. Without going back to check exactly what was claimed, a 10x change in sound pressure level is a 20 dB change.

 

A 10 dB change is roughly 3x level change.
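To make the arithmetic concrete, here's a minimal Python sketch of the conversions being discussed (sound pressure is a field quantity, so it uses 20·log10; the function names are just for illustration):

```python
import math

def pressure_ratio_to_db(ratio):
    # Field quantities like sound pressure use 20 * log10(ratio)
    return 20 * math.log10(ratio)

def db_to_pressure_ratio(db):
    # Inverse: convert a dB change back to a pressure ratio
    return 10 ** (db / 20)

print(pressure_ratio_to_db(10))  # 10x pressure -> 20.0 dB
print(db_to_pressure_ratio(10))  # 10 dB -> ~3.16x pressure
```

(Power quantities would use 10·log10 instead, which is where the "10x = 10 dB" confusion usually comes from.)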


Main system: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Desktop Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, Corsair Vengeance Pro RGB 3200 4x16GB, Asus Strix 1080Ti, NZXT E850 PSU, Cooler Master MasterBox 5, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

TV Gaming system: Asus X299 TUF mark 2, 7920X @ 8c8t, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, Gigabyte RTX 2070, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, LG OLED55B9PLA

VR system: Asus Z170I Pro Gaming, i7-6700T stock, Scythe Kozuti, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, Crucial BX500 1TB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB + 480GB SSD


Why would you use a 3950X for the test bench? Or rather, why only a 3950X, and not also a 10700K/10900K to see whether Nvidia's claim that "a faster CPU still is more important than PCIe 4.0" holds true (which it does)?

 

It was already known that Zen 2 holds back the 2080Ti even at 1440p, if not lower-end SKUs as well, so imo the testing methodology for gaming benchmarks made no sense. Just like including CS:GO in the benchmark suite, which is known to be a mostly CPU bound title regardless of resolution, making the CPU choice even more questionable.


Desktop: Intel Core i9-9900K | be quiet! Dark Rock Pro 4 | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB, 3200MHz 14-14-14-34 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13" (i5-8259U | 8GB LPDDR3 | 512GB NVMe)

Peripherals: Ducky Shine 7 (Cherry MX Brown) | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | Beyerdynamic Custom One Pro Plus | Audio-Technica AT2020USB+

Displays: Alienware AW2521HF & BenQ BL2420PT

7 minutes ago, Mateyyy said:

Why would you use a 3950X for the test bench? Or rather, why only a 3950X, and not also a 10700K/10900K to see whether Nvidia's claim that "a faster CPU still is more important than PCIe 4.0" holds true?

 

It was already known that Zen 2 holds back the 2080Ti even at 1440p, if not lower-end SKUs as well, so imo the testing methodology for gaming benchmarks made no sense. Just like including CS:GO in the benchmark suite, which is known to be a mostly CPU bound title regardless of resolution, making the CPU choice even more questionable.

This is true. A PCIe 3.0 10900K gives more performance than a PCIe 4.0 Ryzen in Digital Foundry's review. Hence my confusion and disappointment with the review.


 


Intel i7 4790K (4.0 GHz) | MSI Z97-GAMING 5 | Corsair Vengeance 8GB DDR3-1866 2x8GB | Asus GeForce GTX 780 Ti DirectCU II OC | Samsung 840 Pro Series 256GB | Corsair RM 850W | Corsair H90 94.0 CFM | Logitech® Wireless Combo MK330 | Cooler Master HAF XM | Dell S2240L 60Hz 21.5 IPS | 

PCPartPicker 


My comments are based not only on the LTT video, but on benchmark videos from other sources as well.

 

I went in ready to be disappointed, and I wasn't disappointed in being disappointed.

 

I'm coming from a 4K 120Hz standpoint. It just can't do it, unfortunately.

 

Great gains vs the 2080 for sure, but the 2080 was such a disappointing card, due to its price and its competition with the 1080 Ti, that it really didn't stand a chance. A roughly 60% average gain over it is nice, but expected, and somewhat disappointing, since 60% is not 100% (a doubling).

 

The biggest disappointment for me is the 4K results: it can now safely be said that the 3080 is THE 4K gaming card, but it's not 120Hz-capable in the most demanding AAA titles commonly tested, which is a shame.

 

The 3090 being so heavily overpriced won't help here either, even if it can handle 4K 120Hz. However, the 3080 Ti we should expect can hopefully fill that gap, so long as Nvidia doesn't try to raise the price of the x80 Ti tier like they did during the 20 series: it should slot in at the current launch price of the 3080, with the 3080 brought down in price at that point, same as in previous generations.

 

Still, the 3080 is a great option for those looking to finally upgrade from a 1080 Ti, just not as good as what those running 4K were hoping for. If it had truly doubled the 2080's performance, it probably could have handled 4K 120Hz. Alas, and unsurprisingly, Nvidia lied.


CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w | VDU: Panasonic 42" Plasma |

GPU: Gigabyte 1080ti Gaming OC w/OC & Barrow Block | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + Samsung 850 Evo 256GB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P |


You all are missing the greatest part of the RTX 3080 - the design means that we will finally be able to see one of the GPU fans from a normal case layout


 

37 minutes ago, Mateyyy said:

Why would you use a 3950X for the test bench? Or rather, why only a 3950X, and not also a 10700K/10900K to see whether Nvidia's claim that "a faster CPU still is more important than PCIe 4.0" holds true (which it does)?

 

It was already known that Zen 2 holds back the 2080Ti even at 1440p, if not lower-end SKUs as well, so imo the testing methodology for gaming benchmarks made no sense. Just like including CS:GO in the benchmark suite, which is known to be a mostly CPU bound title regardless of resolution, making the CPU choice even more questionable.

So many other reviewers use Intel; it's nice to see an AMD test.


This review below is very interesting.

It shows the effect of CPUs on the 3080; the most important part IMO is the 4770K and how the 3080 performs with it at 4K.

https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

TL;DR: if you're running an older CPU and want to get an RTX 3080, 4K should be your target resolution, as it almost completely negates the increase in CPU performance over the generations.


CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w | VDU: Panasonic 42" Plasma |

GPU: Gigabyte 1080ti Gaming OC w/OC & Barrow Block | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + Samsung 850 Evo 256GB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P |

1 minute ago, spartaman64 said:

so many other reviewers use intel its nice to see an amd test

It's a GPU review. Logically you'd expect the respective GPUs to be paired with hardware that'll allow them to run at their full potential, so the viewer can get an idea of how fast GPU X is in comparison with GPU Y.

Might as well run the benchmarks on Bulldozer at 900p and say "look, the 3080 performs the same as a 1660!!".


Desktop: Intel Core i9-9900K | be quiet! Dark Rock Pro 4 | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB, 3200MHz 14-14-14-34 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13" (i5-8259U | 8GB LPDDR3 | 512GB NVMe)

Peripherals: Ducky Shine 7 (Cherry MX Brown) | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | Beyerdynamic Custom One Pro Plus | Audio-Technica AT2020USB+

Displays: Alienware AW2521HF & BenQ BL2420PT

4 minutes ago, Mateyyy said:

It's a GPU review. Logically you'd expect the respective GPUs to be paired with hardware that'll allow them to run at their full potential, so the viewer can get an idea of how fast GPU X is in comparison with GPU Y.

Might as well run the benchmarks on Bulldozer at 900p and say "look, the 3080 performs the same as a 1660!!".

Except the difference isn't that extreme, and there's a significant number of people running AMD instead of Intel.

16 minutes ago, spartaman64 said:

except the difference isn't that extreme

It's still a difference regardless.

16 minutes ago, spartaman64 said:

and there's a significant number of people running AMD instead of Intel

That's irrelevant in this context.


Desktop: Intel Core i9-9900K | be quiet! Dark Rock Pro 4 | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB, 3200MHz 14-14-14-34 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13" (i5-8259U | 8GB LPDDR3 | 512GB NVMe)

Peripherals: Ducky Shine 7 (Cherry MX Brown) | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | Beyerdynamic Custom One Pro Plus | Audio-Technica AT2020USB+

Displays: Alienware AW2521HF & BenQ BL2420PT


Lol, at 9:55 he said "that Microsoft is implementing into their Xbox Series X and Xbox Series X".


I am still TechWizardThatNeedsHelp, just less of a mouthful.

 

My beautiful, but not that powerful, main PC:

My new PC I'm saving for:

  • NZXT H1 Matte Black
    • Comes with a 650W NZXT PSU
    • NZXT AIO
  • Ryzen 5 3600
  • MSI B450i GAMING PLUS AC
  • XPG ADATA 2800MHz
  • RX 480 until I have more for a 2060 or 2070
  • 2TB Sabrent Rocket Q PCIe Gen 3 NVMe SSD
  • Samsung 470 128GB SATA SSD

Hey hey, just cruisin' by to leave an interesting 3080 FE vs 2080 FE benchmark done in Black Desert Online.

Pretty demanding title, more CPU-bound than GPU-bound, but apparently the 2080 FE was a huge bottleneck (just see the CPU utilization in both cases).

The difference is around a 33% raw-FPS increase, but it's pretty interesting to see nonetheless (fun fact: in BDO one of the playerbase's sayings is more FPS = more damage, mostly due to how poorly optimized the game is, so after the OC models are released, I'll probably go for one, since this is my most played title so far).
 

 

 


🇧🇷 - Average 3080 FE price after taxes: 992 dollars

Current PC specs - not the worst, just 2011 😕:

Case: NOX NXLITE010 (older model, PSU on top)

PSU: One Power 500W

Motherboard: Gigabyte H61M-S1
CPU: Intel Core i7 2600
GPU: GTX 1060 6GB
RAM: DDR3 Kingston Hyperx Fury 16gb (2x8) 1600MHz
Storage: 240gb Kingston SSD, 500gb HD
Cooling solutions: NZXT AER F120 (case), Cooler Master Hyper 212 Turbo (CPU)

"Why not upgrade, buddy?" First of all, I'm on a scholarship (college, Computer Engineering) and my financial situation isn't the best. Once I have a stable job and have helped my family with their debts, I'll consider going to a Ryzen 5 2600 or something in that range.


"Double the performance! ...in very specific scenarios and workloads."

 

At $1k CAD it's way out of my budget, but it's nice to see that they've at least made it cheaper than their previous top-of-the-line cards. The used market five years from now is looking very attractive.


CPU: AMD Ryzen 3600 / GPU: Radeon HD7970 GHz 3GB(upgrade pending) / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro

26 minutes ago, SolarNova said:

This review below is very interesting.

It shows the effect of CPUs on the 3080, the most important part IMO is the 4770k performance and its performance of the 3080 at 4k.

https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

TLDR: if ur running an older CPU and want to get a RTX 3080, 4k should be ur target resolution as it negates, almost completely, the increase in CPU performance over the generations.

Welp, I have a 4790K and I'm planning to get a 3080, but for 1080p/1440p at most. Do I need a CPU upgrade? That'd cost a lot, since my rig is like 6 years old.


edit: dafuq?
[screenshot]
holy Batman

 


 


Intel i7 4790K (4.0 GHz) | MSI Z97-GAMING 5 | Corsair Vengeance 8GB DDR3-1866 2x8GB | Asus GeForce GTX 780 Ti DirectCU II OC | Samsung 840 Pro Series 256GB | Corsair RM 850W | Corsair H90 94.0 CFM | Logitech® Wireless Combo MK330 | Cooler Master HAF XM | Dell S2240L 60Hz 21.5 IPS | 

PCPartPicker 

31 minutes ago, Rohith_Kumar_Sp said:

Welp, I have a 4790K and I'm planning to get a 3080, but for 1080p/1440p at most. Do I need a CPU upgrade? That'd cost a lot, since my rig is like 6 years old.

As mentioned in the review I linked, on a GPU as powerful as a 3080 at 1080p or 1440p, the difference in FPS between a 4790K (or equivalent CPU) and a more modern CPU like a Ryzen 3600 or better is rather significant: as much as multiple 'average' generational GPU performance jumps.

 

What I'd suggest is that if you can't afford a full system upgrade and you do get a 3080, make sure you run games at an internal render resolution of 4K. That way you gain some visual improvement while pushing all the processing onto the GPU. You won't get the full effect of 4K, but it certainly looks better than native 1080p. (You'll most likely want to disable any anti-aliasing (AA) when running 4K DSR; you likely won't need it, and AA nowadays is usually limited to the lower-quality versions like FXAA and TAA.)

 

This is in fact what I already do with my 1080 Ti: in games that can get over 120 FPS, I use DSR to render at 4K, fully utilizing the card. I'm still rocking a 1080p plasma until I buy a 48" OLED later this year or early next, which will coincide with my purchase of a 3080/3080 Ti/Big Navi so I can use HDMI 2.1 with the OLED for 4K 120Hz.

On a 42" 1080p plasma there is a significant, noticeable difference between rendering at native 1080p and rendering at 4K, so it's certainly worth it for me.


CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w | VDU: Panasonic 42" Plasma |

GPU: Gigabyte 1080ti Gaming OC w/OC & Barrow Block | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + Samsung 850 Evo 256GB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P |

