
RTX Sucks

Wormhole
16 hours ago, GoodBytes said:

Don't know. To me the RTX 20 series is just a filler. Technically, the next architecture is Volta.

We didn't get it (ignoring the Titan V). I think Nvidia needed more time with Volta, as they were not happy with the performance they were getting, and in the meantime had a team on the side working on this RTX 20 series lineup. That is just my speculation.

I hope you're right on this one. I was actually missing Volta myself when the RTX cards came out :( 


Just what in the hell are GPU manufacturers doing?

 

Dear NVIDIA and AMD, NO ONE FUCKING GAMES AT 4K. Talk about one huge manufacturer push for a technology that the public doesn't want/need.

NVIDIA: "Spend more money so your rig can run games at 4K."
User: "But I don't have a 4K monitor, they're expensive."
NVIDIA: "Well then buy a better monitor, because only plebs would play at 1080p and 1440p."
NVIDIA: "Look at all the features you can enjoy if you decide to play at a stupidly high resolution!"

 

Ugh


5 minutes ago, corrado33 said:

Dear NVIDIA and AMD, NO ONE FUCKING GAMES AT 4K.

That's why you should buy a 2080 Ti and enable RTX to play at 1080p60.

 

What do you want them to do, btw?

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


I have to say I am disappointed with DLSS... I was hoping for more from the technology. Maybe it will improve over time, who knows. For a laugh I switched DLSS on at 1080p; it felt like I had put on someone else's spectacles. Horrendous! And they used that as a selling point for the 2060: "Hey, turn on RTX and DLSS for 60 fps @ 1080p!" No one should ever do that. Ever.


7 minutes ago, Lexicalje said:

I have to say I am disappointed with DLSS... I was hoping for more from the technology.

Always skipping every 1st gen of new tech ~ bugs / unoptimised everywhere.


RT is pointless; you don't notice it when you're immersed in a game.

(◣_◢) Ryzen 5 3600,   Aorus X370 K7,   XPG 16GB 3200,   Gigabyte 2070 Windforce Corsair RM650x,   LG 32GK650F-B 31.5" 144Hz QHD FreeSync VA,   Kingston 120GB SSD,   Samsung 1TB 860 QVO,   2TB HDD,   Fractal Design Meshify C,   Corsair K63 Wireless,   Corsair Gaming M65 PRO,   Audio Technica ATH M50x,   Windows 10 ProCorsair H100x 240mm.  (◣_◢)

(◣_◢) Ryzen 5 1600,   Noctua NH-L12S,   Gigabyte GTX 1060 6G,   ASUS Prime B350 Plus,   HyperX Fury 8GB DDR4 (2666MHz - 1.3v),   SilverStone ET550-B,   Kingston 120GB SSD 2TB HDD,   Cougar MX330,   Windows 10 Pro.  (◣_◢)


As an RTX owner, I bought for the raw horsepower; RTX was always gonna fail, and there simply is no other choice for 4K gaming. I expect the die shrink to blow the current RTX cards out of the water, and the longevity of the current cards won't come close to a 1080 Ti's. However, I also expect the top cards to be $1500-2000 USD, as it took 5 months for 2080 Ti supply to meet demand.

 

As for Nvidia lying, AMD did the same thing with Vega. Just gimme the raw horsepower; there is no brand loyalty at this point.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


8 minutes ago, xg32 said:

As an RTX owner, I bought for the raw horsepower; RTX was always gonna fail, and there simply is no other choice for 4K gaming. I expect the die shrink to blow the current RTX cards out of the water, and the longevity of the current cards won't come close to a 1080 Ti's. However, I also expect the top cards to be $1500-2000 USD, as it took 5 months for 2080 Ti supply to meet demand.

 

As for Nvidia lying, AMD did the same thing with Vega. Just gimme the raw horsepower; there is no brand loyalty at this point.

Agreed. Have a 2080ti FTW3 on water. No regrets. You're gonna pay for the best regardless of extra features.


19 hours ago, RobbinM said:

You realize you don't have to use DLSS?

For a moment I thought you said "You realize you can't use DLSS anyway"... given there are exactly 2 games with it after 6 frigging months of waiting. If they made a reverse DSR that works in any game, it would be far more useful than this "deep learning" garbage. There is nothing deep learning about it, except that no one can understand why they think this nonsense is worth waiting months for, just so you can get basically as blurry an image as with regular scaling, and only in specifically coded titles. It's absurd and idiotic. When you make features like this, they should support as many games as possible to give gamers as wide a selection of games to use them on. But no, they opted for "deep learning" gimmicky nonsense.

 

For RTX I get why games need to be coded in a specific way, because you can't just slap the feature on any game. But for DLSS I damn well know it's possible, just not by arrogant companies like NVIDIA, who need to push their garbage in their own dumb way to make it look "smart", even though it's just HW-accelerated resolution scaling like we've seen in games for the last half decade...
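To be clear about what I mean by regular scaling: render at a lower internal resolution and stretch it to the output with plain filtering. Here's a minimal sketch of naive bilinear upscaling (my own toy function, nothing to do with NVIDIA's actual DLSS pipeline):

```python
# Naive resolution scaling: "render" low-res, stretch to the target resolution
# with bilinear filtering. Illustrative only; real engines/drivers do this on
# the GPU, and DLSS replaces the filtering step with a trained network.

def bilinear_upscale(img, out_w, out_h):
    """img is a 2D list of grayscale values; returns an out_h x out_w image."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for y in range(out_h):
        # Map the output pixel back to a fractional source coordinate.
        sy = y * (in_h - 1) / (out_h - 1) if out_h > 1 else 0
        y0, fy = int(sy), sy - int(sy)
        y1 = min(y0 + 1, in_h - 1)
        row = []
        for x in range(out_w):
            sx = x * (in_w - 1) / (out_w - 1) if out_w > 1 else 0
            x0, fx = int(sx), sx - int(sx)
            x1 = min(x0 + 1, in_w - 1)
            # Blend the four neighbouring source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# Example: a 2x2 "render target" stretched to 4x4.
print(bilinear_upscale([[0, 100], [100, 200]], 4, 4))
```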


3 hours ago, xAcid9 said:

The 1080 uses lower power actually.

I'd ask you to give examples, but I don't have any on hand I could quickly point to the other way either.

 

Having said that, I'm currently running some compute tasks on my GPUs, so this isn't necessarily representative of gaming. Also, I don't have a 1080, but the following are the Pascal and Turing cards I currently have running it.

 

Power consumption (reported by GPU-Z), task time (lower is better), GPU

 

100W, 208s, 1060 3GB

113W, 158s, 1070

120W, 87s, 2070

170W, 123s, 1080 Ti
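If it helps to put those numbers together, a rough comparison is the energy each card spends per task (just a quick sketch, assuming the GPU-Z figure holds as a steady average over the whole run):

```python
# Rough energy-per-task estimate from the numbers above.
# Assumes the GPU-Z power reading is a steady average for the whole run,
# which is only an approximation.
results = [
    ("1060 3GB", 100, 208),
    ("1070",     113, 158),
    ("2070",     120,  87),
    ("1080 Ti",  170, 123),
]

for gpu, watts, seconds in results:
    joules = watts * seconds  # energy = power * time
    print(f"{gpu}: {joules / 1000:.1f} kJ per task")
# 1060 3GB: 20.8 kJ, 1070: 17.9 kJ, 2070: 10.4 kJ, 1080 Ti: 20.9 kJ
```

So on this particular workload the Turing card finishes the task on roughly half the energy of the Pascal cards, even if its instantaneous power isn't dramatically lower.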

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


41 minutes ago, porina said:

I'd ask you to give examples, but I don't have any on hand I could quickly point to the other way either.

 

Having said that, I'm currently running some compute tasks on my GPUs, so this isn't necessarily representative of gaming. Also, I don't have a 1080, but the following are the Pascal and Turing cards I currently have running it.

 

Power consumption (reported by GPU-Z), task time (lower is better), GPU

 

100W, 208s, 1060 3GB

113W, 158s, 1070

120W, 87s, 2070

170W, 123s, 1080 Ti

What kind of compute task? Maybe Turing provides better hardware for that specific task.

 

From TPU reviews, the reference 1080 uses 166 W while the reference 2070 uses 195 W.

[TPU chart: average power consumption]

[TPU chart: relative performance at 1920x1080]

So from both graphs above, the 2070 performed 12% better while using 17% more power.
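Doing the rough perf-per-watt math on those two numbers (a quick sketch, taking the 12% and 17% deltas from the charts at face value):

```python
# Perf-per-watt of the reference 2070 vs the reference 1080, using the
# ~12% performance delta and the 166 W / 195 W power figures cited above.
perf_ratio = 1.12          # 2070 relative gaming performance vs 1080
power_ratio = 195 / 166    # ~1.17, 2070 power draw vs 1080

efficiency_ratio = perf_ratio / power_ratio
print(f"2070 perf-per-watt vs 1080: {efficiency_ratio:.2f}x")  # ~0.95x, slightly worse
```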

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


2 hours ago, xg32 said:

As for Nvidia lying, AMD did the same thing with Vega. Just gimme the raw horsepower; there is no brand loyalty at this point.

Regarding the raw performance I have to agree, but I still feel bad for anyone who paid the premium for the extra features, because those features haven't lived up to the extra price.

My system specs:

Spoiler

CPU: Intel Core i7-8700K, 5GHz Delidded LM || CPU Cooler: Noctua NH-C14S w/ NF-A15 & NF-A14 Chromax fans in push-pull cofiguration || Motherboard: MSI Z370i Gaming Pro Carbon AC || RAM: Corsair Vengeance LPX DDR4 2x8Gb 2666 || GPU: EVGA GTX 1060 6Gb FTW2+ DT || Storage: Samsung 860 Evo M.2 SATA SSD 250Gb, 2x 2.5" HDDs 1Tb & 500Gb || ODD: 9mm Slim DVD RW || PSU: Corsair SF600 80+ Platinum || Case: Cougar QBX + 1x Noctua NF-R8 front intake + 2x Noctua NF-F12 iPPC top exhaust + Cougar stock 92mm DC fan rear exhaust || Monitor: ASUS VG248QE || Keyboard: Ducky One 2 Mini Cherry MX Red || Mouse: Logitech G703 || Audio: Corsair HS70 Wireless || Other: XBox One S Controler

My build logs:

 


33 minutes ago, xAcid9 said:

What kind of compute task? Maybe Turing provides better hardware for that specific task.

I'm looking for Generalised Fermat prime numbers, using software called genefer. I'm not sure what code path is used in this case, as it includes multiple implementations depending on the test and the hardware it runs on. A quick search suggests it uses fixed-point math for the specific tasks I'm running at the moment.
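If anyone's curious what those candidates look like, here's a toy sketch of the Generalised Fermat form and a basic Fermat probable-prime check. To be clear, this is not genefer's actual code path (genefer runs heavily optimised transforms on the GPU); it only illustrates the shape of the numbers being tested:

```python
# Toy sketch: Generalised Fermat numbers have the form b^(2^n) + 1.
# The probable-prime check below is the textbook Fermat test, not genefer's
# optimised GPU implementation.

def gfn(b: int, n: int) -> int:
    """Generalised Fermat number b^(2^n) + 1."""
    return b ** (2 ** n) + 1

def is_probable_prime(candidate: int, base: int = 3) -> bool:
    """Fermat test: probable primes satisfy base^(candidate-1) == 1 (mod candidate)."""
    return pow(base, candidate - 1, candidate) == 1

if __name__ == "__main__":
    # Small example: 2^(2^2) + 1 = 17 is a (Generalised) Fermat prime.
    candidate = gfn(2, 2)
    print(candidate, is_probable_prime(candidate))
```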

 

33 minutes ago, xAcid9 said:

From TPU reviews, the reference 1080 uses 166 W while the reference 2070 uses 195 W.

 

So from both graphs above, the 2070 performed 12% better while using 17% more power.

Thanks. I'll have to look more closely. I have to wonder what I saw, and where, to get my earlier impression, but that will have to wait a little.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Just signed up to wade in on this. Can't disagree that DLSS and RTX are not really what people are buying these cards for - it's the raw power. The problem with that, however, is that Moore's law is coming to an end, and even allowing for innovation/improvement cycles, the performance increments from silicon have been getting smaller and smaller and will continue to do so. That's why companies like Nvidia are trying to add features which rely on advances in AI/machine learning, or which they can offload to dedicated chips. It's the only way they will be able to maintain a competitive advantage.

 

You can get mad about Nvidia selling you new features, and you can get mad that they are trying to sell 4K gaming when you don't game at 4K, but you don't have to use these features, you don't have to game at 4K, and you don't have to buy their new cards. In fact, why would you, if your current hardware is doing the job for you?

