GabenJr

Was RTX a big scam? – Performance & image quality analysis


7 minutes ago, valdyrgramr said:

Okay then let's call it false advertising and not a scam.  Just like they did with the VRAM on the 970.

The GTX 970 was false advertising because the hardware didn't work as Nvidia marketed it; the RTX cards are simply overpriced because they're an early-adopter product.

17 minutes ago, valdyrgramr said:

I'm talking about price based on MSRP, as shown in the video. The 2080 Ti launched at Titan prices, where the 1080 Ti and every other 80 Ti launched at less than a grand. Asking me for hundreds more for a feature that barely works right and is barely supported is a "go fuck yourself" from Nvidia, which is why I refuse to buy a 2080 Ti. The extra features and lack of competition equated to hundreds more than previous versions of each card. A 2080's MSRP is higher than that of a 1080, 980, and even the 780. It's the same with Turing's 2060, 2070, and Titan. The MSRP is the price hike I was talking about, not the scalping of older cards versus the price of Turing cards.

You can't compare the 2080 Ti to the 1080 Ti, or the 2080 to the 1080; there's far too big a gap in performance.

Essentially the 2060 is a 1070, the 2070 is a 1080, and the 2080 is a 1080 Ti; the 2080 Ti would be the equivalent of a Titan V. With that comparison, their MSRPs are relative to their respective performance and hardware.


Spoiler

Intel i7 3770K @ 4.6ghz | EVGA Z77 FTW | 2 x EVGA GTX1070 FTW | 32GB (4x8GB) Corsair Vengeance DDR3-1600 | Corsair H105 AIO, NZXT Sentry 3, Corsair SP120's | 2 x 256GB Samsung 850EVO, 4TB WD Black | Phanteks Enthoo Pro | OCZ ZX 1250w | Samsung 28" 4K Display | Ducky Shine 3 Keyboard, Logitech G502, MicroLab Solo 7C Speakers, Razer Goliathus Extended, X360 Controller | Windows 10 Pro | SteelSeries Siberia 350 Headphones

 

Spoiler

Corsair 400R, IcyDock MB998SP & MB455SPF, Seasonic X-Series 650w PSU, 2 x Xeon E5540's, 24GB DDR3-ECC, Asus Z8NA-D6C Motherboard, AOC-SAS2LP-MV8, LSI MegaRAID 9271-8i, RES2SV240 SAS Expander, Samsung 840Evo 120GB, 2 x 8TB Seagate Archives, 12 x 3TB WD Red

 

5 minutes ago, Blademaster91 said:

The GTX 970 was false advertising because the hardware didn't work as Nvidia marketed it; the RTX cards are simply overpriced because they're an early-adopter product.

"It just works!" -Jensen marketing his product.

https://www.techpowerup.com/252550/nvidia-dlss-and-its-surprising-resolution-limitations 

 

^Doesn't seem like it "just works!"


VashTheStampede 4.0:

CPU: AMD Threadripper 1950x | CPU Cooling: EKWB Liquid Cooling(EK-Supremacy sTR4 RGB - Nickel, EK-CoolStream SE 280, EK-Vardar EVO 140ER Black x 2, EK-XRES 100 SPC-60 MX PWM (incl. pump), EK-ACF Fitting 10/13mm - Red (6-pack), EK-DuraClear 9,5/12,7mm 3M, and Scarlet Red Premix) | Compound: Thermal Grizzly Kryonaut | Mobo: Asrock X399 Taichi | Ram: G.Skill Ripjaws V 32GBs (2x16) DDR4-3200 | Storage: Crucial MX500 500GB M.2-2280 SSD/PNY CS900 240GB SSD/Toshiba X300 4TB 7200RPM | GPU: Zotac Geforce GTX 1080 8GB AMP! Edition(Replacing with a Radeon VII | Case: Fractal Define R5 Blackout Edition w/Window | PSU: EVGA SuperNOVA G2 750W 80+ Gold | Operating System: Windows 10 Pro | Keyboard: Ducky Shine 7 Blackout Edition with Cherry MX Silent Reds | Mouse: Corsair M65 Pro RGB FPS | Headphones:  AKG K7XX Mass Drop Editions(Replacing with k712s) | Mic: Audio-Technica ATR2500 | Speakers: Mackie MR624 Studio Monitors

 

Surtr:

CPU: AMD Ryzen 3 2200G(Temp/Upping to a Zen 2 Ryzen 7) | CPU Cooling: Wraith(Dark Rock Pro 4 when I get the 3700x or 3800x) | Compound: Thermal Grizzly Kryronaut | Mobo: Asrock x470 Taichi | Ram: G.Skill Ripjaws V 16GBs (2x8) DDR4-3200 | Storage: PNY - BX500 240 GB SSD+Seagate Constellation ES.3 1TB 7200RPM | GPU: PowerColor - Radeon RX VEGA 64 8 GB RED DEVIL | Case: Corsair - SPEC-DELTA RGB | PSU: EVGA SuperNOVA G2 750W 80+ Gold | Optical Drive: Random HP DVD Drive | Operating System: Windows 10 | Keyboard: Corsair K70 with Cherry MX Reds | Mouse: Corsair M65 Pro RGB FPS Speakers: JBL LSR 305 Studio Monitors(At some point)

 

Prince of Dark Rock:

CPU: AMD Ryzen 5 2600 | CPU Cooling: be quiet! - Dark Rock Pro 4 | Compound: Thermal Grizzly Kryronaut | Mobo: MSI B450 Tomahawki | Ram: G.Skill Ripjaws V 8GBs (2x4) DDR4-3200 | Storage: Crucial - BX200 240 GB SSD+Seagate Constellation ES.3 1TB 7200RPM | GPU: EVGA - GeForce GTX 1060 6GB SSC | Case: Cooler Master - MasterBox MB511 | PSU: Corsair - CXM 550W | Optical Drive: Random HP DVD Drive | Operating System: Windows 10 Home | Keyboard: Rosewill - NEON K85 RGB BR | Mouse: Razer DeathAdder Elite Destiny 2 Edition 

1 hour ago, Jarsky said:

You can't compare the 2080 Ti to the 1080 Ti, or the 2080 to the 1080; there's far too big a gap in performance.

Essentially the 2060 is a 1070, the 2070 is a 1080, and the 2080 is a 1080 Ti; the 2080 Ti would be the equivalent of a Titan V. With that comparison, their MSRPs are relative to their respective performance and hardware.

The 1080 Ti's MSRP was $700; the 2080 Ti's MSRP is $1,200. There is no justification for the 2080 Ti's MSRP. The so-called justification for the price hike on the 2080 was the real-time ray tracing and DLSS features, which are barely supported, and when they are, it's a hot mess. The price increase is not justified by performance alone. It's literally them trying to say real-time ray tracing and DLSS are worth the price increase, until you get to the Titan and the 2080 Ti; those are just a fuck-you to the consumer because there's no competition.



6 minutes ago, Jarsky said:

You can't compare the 2080 Ti to the 1080 Ti, or the 2080 to the 1080; there's far too big a gap in performance.

Essentially the 2060 is a 1070, the 2070 is a 1080, and the 2080 is a 1080 Ti; the 2080 Ti would be the equivalent of a Titan V. With that comparison, their MSRPs are relative to their respective performance and hardware.

If it worked like that, the 1060 would have cost $600, or the 970 would have been $700, which would put the 1060 over $1,000. Luckily it doesn't work like that, otherwise prices would double every generation.

Also, if you follow what you're saying, the 2000 series is barely better than the three-year-old 1000 series while costing the same, with no significant improvement in efficiency or noise levels, which comes back to how useless the main features of the 2000 series are.
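The "prices would double every generation" point can be put in numbers. A minimal sketch, where the base price and the ~1.5x per-generation uplift factors are illustrative assumptions, not actual MSRPs:

```python
def compound_msrp(base, uplifts):
    """Apply successive generational price multipliers to a base MSRP.

    Models the (hypothetical) policy of pricing each new generation
    proportionally to its performance uplift over the previous one.
    """
    prices = [base]
    for u in uplifts:
        prices.append(round(prices[-1] * u, 2))
    return prices

# Assumed ~1.5x performance uplift per generation, three generations out:
print(compound_msrp(250.0, [1.5, 1.5, 1.5]))
# A $250 midrange card balloons past $800 by the third generation,
# which is why MSRPs have historically stayed tied to the tier, not the uplift.
```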

1 hour ago, Jarsky said:

You can't compare the 2080 Ti to the 1080 Ti, or the 2080 to the 1080; there's far too big a gap in performance.

Essentially the 2060 is a 1070, the 2070 is a 1080, and the 2080 is a 1080 Ti; the 2080 Ti would be the equivalent of a Titan V. With that comparison, their MSRPs are relative to their respective performance and hardware.

You could say we were spoiled by the gains from Kepler to Maxwell and Maxwell to Pascal. Pascal to Turing brought a LOT of price hike, way more than the general performance gain, just because of an extra feature called RTX that is rarely used (and ignore how badly some of those features get used).


"What's under the heatsink?" ep1, "Why it's not as good as it seem?" AMD fanboy edition out, episode 2 "Why my gaming board is a scam?" Intel fanboy edition coming soon (this is a link)

Hardware specs below

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.4?V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: 1TB HP EX920 PCIe x4 M.2 SSD + 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172), 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


I like how in this thread people are actually arguing about whether it's a scam or not, but in the YouTube comments section people are just praising Anthony.


Ray tracing is great, exciting technology and will truly bring life-like graphics to games while reducing the workload of game developers. It's how rendering should be done and I really respect and admire Nvidia for boldly bringing it to everyone's attention with their RTX cards. Really looking forward to ray tracing becoming more mainstream!

37 minutes ago, recoSNIPE13 said:

Ray tracing is great, exciting technology and will truly bring life-like graphics to games while reducing the workload of game developers. It's how rendering should be done and I really respect and admire Nvidia for boldly bringing it to everyone's attention with their RTX cards. Really looking forward to ray tracing becoming more mainstream!

That really won't be a thing for about two years, though, when Turing is replaced. Plus, Turing didn't even bring ray tracing to developers; it's been around for several years.




My only gripe: about 2 minutes in, the video makes it sound like explicit driver support is needed for DXR. It isn't, because back in November 2018 or so there were people on Reddit running the DXR fallback samples on non-RTX hardware: https://www.reddit.com/r/nvidia/comments/9lcs4u/microsoft_dxr_demos_compiled_for_windows_10/

 

All NVIDIA did was make it driver-wide rather than require explicit application support. AMD can do the same thing if they want.


I have a 1080 Ti, which replaced my old GTX 1060 6GB. Do you think I could use the GTX 1060 for the RTX calculations and the 1080 Ti just to meld everything together?

 

I'm quite curious if the unlocked drivers would allow such a thing.

5 hours ago, valdyrgramr said:

The 1080 Ti's MSRP was $700; the 2080 Ti's MSRP is $1,200. There is no justification for the 2080 Ti's MSRP. The so-called justification for the price hike on the 2080 was the real-time ray tracing and DLSS features, which are barely supported, and when they are, it's a hot mess. The price increase is not justified by performance alone. It's literally them trying to say real-time ray tracing and DLSS are worth the price increase, until you get to the Titan and the 2080 Ti; those are just a fuck-you to the consumer because there's no competition.

You're uh, new to this whole "cost of new tech" thing, aren't you? First generation of something that hasn't really been done before (remember that this is not GPGPU -- this is not CUDA -- this is adding special silicon for things like Tensor cores and ray tracing) will be more expensive. This isn't re-purposing texture shaders for something else, this is literally dedicated, first generation silicon. There's R&D cost that has to be recouped somewhere. That cost goes to you, the consumer.

 

When CUDA came out, the cards cost more. Hell, dedicated compute cards like the nVidia Titan still cost a pretty penny.

 

I remember when you had to make sure the game you were buying supported your sound card. Nothing like getting home from Fry's only to realize that your fancy new game (now out of the shrink wrap) doesn't support AdLib except for a few crappy MIDI renditions of the soundtrack, because it was built for the SoundBlaster's PCM/FM channels, so you go and buy a SoundBlaster only to realize it's the same MIDI renditions with slightly better-sounding instruments. Today, there's little difference: software support is still getting there, and the first few generations of software and hardware are going to be... spotty, if not a little lacklustre.

 

Remember when HDR was new? And how we all oohed and ahhed when it was all over our faces, only to realize it was just Bloom and some subjective lighting? How about when everybody realized Doom wasn't really 3D? Or when you really honestly needed an SLI setup to make Unreal work right?

17 minutes ago, indrora said:

You're uh, new to this whole "cost of new tech" thing, aren't you? First generation of something that hasn't really been done before (remember that this is not GPGPU -- this is not CUDA -- this is adding special silicon for things like Tensor cores and ray tracing) will be more expensive. This isn't re-purposing texture shaders for something else, this is literally dedicated, first generation silicon. There's R&D cost that has to be recouped somewhere. That cost goes to you, the consumer.

 

When CUDA came out, the cards cost more. Hell, dedicated compute cards like the nVidia Titan still cost a pretty penny.

 

I remember when you had to make sure the game you were buying supported your sound card. Nothing like getting home from Fry's only to realize that your fancy new game (now out of the shrink wrap) doesn't support AdLib except for a few crappy MIDI renditions of the soundtrack, because it was built for the SoundBlaster's PCM/FM channels, so you go and buy a SoundBlaster only to realize it's the same MIDI renditions with slightly better-sounding instruments. Today, there's little difference: software support is still getting there, and the first few generations of software and hardware are going to be... spotty, if not a little lacklustre.

 

Remember when HDR was new? And how we all oohed and ahhed when it was all over our faces, only to realize it was just Bloom and some subjective lighting? How about when everybody realized Doom wasn't really 3D? Or when you really honestly needed an SLI setup to make Unreal work right?

I'm not new to this; I'm saying 3 games of support is not worth the price hike. The 2080 Ti is more of a fuck-you to the consumer due to a lack of competition. Nvidia focuses on marketing, gimmicks, and screwing the customer when it comes to pricing; that's always been their thing. Remember how much they hyped shit like HairWorks? PhysX? Nobody cares about those anymore, but they made money on them through marketing campaigns. RTX crap is only supported in 3 titles and 2 demos; is that enough of a reason to pay $500 more? It's just marketing and blind conformity at this point. When people first saw that price, all the big reviewers speculated there would be no Titan because of it. But then, a few months later, the RTX Titan came to be! Nobody can justify the price increase from the 1080 Ti to the 2080 Ti. The extra hundreds for this gimmick are not worth it. Performance is one thing, but Turing is riding hard on a gimmick that won't be widely supported for 2 years, if at all.

 

6 hours ago, KaitouX said:

If it worked like that, the 1060 would have cost $600, or the 970 would have been $700, which would put the 1060 over $1,000. Luckily it doesn't work like that, otherwise prices would double every generation.

Also, if you follow what you're saying, the 2000 series is barely better than the three-year-old 1000 series while costing the same, with no significant improvement in efficiency or noise levels, which comes back to how useless the main features of the 2000 series are.

^This really hits the nail on the head.



13 hours ago, GabenJr said:

We didn’t care much for RTX when it launched… And now that RTX is available on GTX cards, has ANYTHING changed?

 

 

Buy a GeForce RTX 2080:
On Amazon: https://geni.us/wmWXXvN
On Newegg: https://lmg.gg/8KVRb

 

Buy a GeForce RTX 2060:
On Amazon: https://geni.us/vWn7vq
On Newegg: https://lmg.gg/8KVRQ

 

Buy a GeForce GTX 1660 Ti:
On Amazon: https://geni.us/Y9y6Gv
On Newegg: https://lmg.gg/8KVR6

 

RTX is at present stuck in the same doldrums as the CUDA cores. Nothing will ever use it directly, and only nVidia's libraries will ever make use of it, without telling developers specifically how to use it. If a GTX card could do everything the RTX can, they wouldn't have devoted die space to it in the first place.

 

You have to look at it the same way as hardware offload back when AGP cards had "MPEG motion calculation" as a defining feature for MPEG-1 and MPEG-2 video (DVD): a capability may exist as a fixed-function feature until the central chip logic can do it in programmable logic at a reasonable cost. In theory, the low-end RTX card can probably do ray tracing in its RT fixed-function blocks at a level better than the GTX 1080 Ti manages in programmable logic, but that doesn't mean the RTX 2060 is going to run circles around it in any game, because the way the RT stuff has been set up is to extend the programmable shader logic at present, not to do real path tracing.

 

At any rate, I don't see this being the same issue as VR or "3D screens", both of which are essentially dead tech that's been revived only to see customers ignore it.

 

54 minutes ago, indrora said:

Remember when HDR was new? And how we all oohed and ahhed when it was all over our faces, only to realize it was just Bloom and some subjective lighting? How about when everybody realized Doom wasn't really 3D? Or when you really honestly needed an SLI setup to make Unreal work right?

 

That's wrong. HDR is used in camera photography by sampling multiple exposures so you get more detail from the same image. That was before 4K UHD and the Rec. 2020 color space. With Rec. 2020 you actually get "whiter than white" and "blacker than black" on OLED screens, so things like lightning, mirrors, and water surfaces actually reflect light in a way that looks natural rather than clipping at the maximum color reproduction value.

 

In a 3D game, HDR is essentially unused except when you transition between an indoor and an outdoor environment. Almost nobody has a 4K Rec. 2020 screen; most gamers still use TN screens with 6-bit color channel depth, so of course they don't see any HDR effects. You need 10-bit channels for that data to be present.
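The 6-bit versus 10-bit point can be made concrete. A minimal sketch, assuming simple per-channel quantization (real panels add tricks like FRC dithering, which this ignores):

```python
def channel_levels(bits):
    """Number of distinct shades a single color channel can encode."""
    return 2 ** bits

def quantize(value, bits):
    """Snap a 0.0-1.0 intensity to the nearest representable level."""
    steps = channel_levels(bits) - 1
    return round(value * steps) / steps

print(channel_levels(6))   # 64 shades per channel on a 6-bit TN panel
print(channel_levels(10))  # 1024 shades per channel for 10-bit HDR content

# Two nearby highlight intensities collapse to the same 6-bit level
# (visible banding), but stay distinct with 10-bit channels:
print(quantize(0.500, 6) == quantize(0.505, 6))    # True
print(quantize(0.500, 10) == quantize(0.505, 10))  # False
```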

2 hours ago, Kisai said:

At any rate, I don't see this being the same issue as VR or "3D screens", both of which are essentially dead tech that's been revived only to see customers ignore it.

 

I think for VR it's the price; always has been, always will be. 3D screens are annoying to many people (which is why the 3DS had the option to turn it off). IMO, 3D glasses and VR are the best of the three: the glasses are cheaper than the screens, and VR allows a fully immersive experience. As tech moves forward I can see VR picking up pace (it's not dead, it's in a 10-year infancy). The problem with VR, much like all the EV makers and their charging methods, is that their heads are up their butts trying to make you, the consumer, buy theirs instead of working together. We'll get there, where VR can be had for sub-$250 without using a phone, and no, I don't count the Oculus Go.

On 5/17/2019 at 12:22 AM, Arika S said:

scam? no

overpriced? yes

Thus making it a scam.


INTERREX : CPU: Ryzen 7 2700 @ 4.0GhZ || CPU COOLER : Cooler Master ML240L || MOBO : MSi X370 Gaming Pro Carbon || GPU: ASUS GTX 1080 Strix OC || RAM: 2x8GB G.SKILL Aegis (3000) || SSDs: Crucial P1 500GB (Boot), MX500 1TB || HDD: Seagate Barracuda 2TB || PSU: SeaSonic Focus+ Gold 650W w/ Black & Red CableMod Extensions || CASE: Phanteks Eclipse P300 || Monitor: Acer Predator X34A, Acer KG271U || KEYBOARD: Corsair K95 RGB Platinum (MX Browns, White Keycaps) || Mouse: Logitech G502 Hero with deskpad || Audio: Hyper, Blue Snowball iCE || Case Fans : 2x Noctua NF-F12 Chromax, 1x Noctua NF-F12 iPPC-2000 ||

JUSTINIAN - Dell G5 15" Special Edition: CPU: Core i7-9750H || GPU: RTX 2060 Mobile|| RAM: 2*8GB 2666MhZ DDR4 SODIMM || SSD: 512GB M.2 PCIe || CASE: 15.6" Laptop with dBrand skin || Monitor: 15" 3840x2160 Touchscreen || KEYBOARD: 4-Zone RGB Keyboard || Mouse: Logitech G502 || Audio: HyperX Cloud II

MARCUS AURELIUS II (HTPC) : CPU: Ryzen 5 1600 @3.8GhZ || CPU COOLER : ARCTIC Freezer 34 eSports Duo || MOBO : Gigabyte B450M-DS3H || GPU: EVGA GTX 980 Ti FTW ACX 2.0+ || RAM: 2x8GB Corsair Vengeance White (3000) || SSD: 850 Evo 1TB || HDD: Seagate Barracuda 3TB || PSU: Corsair RM650x w/ Corsair Premium Cables || CASE: Corsair Crystal 280X White || Monitor: TCL 65R617 65" 4K TV || KEYBOARD: Corsair K63 Wireless || Mouse: Logitech M720 (Poor man's MX Master) || Audio: Bose Solo 5 Sound Bar || Case Fans : 2x Corsair ML140, 1x Cooler Master Masterfan Pro 120

OTHER : Old laptop (i5, iGPU, 500GB SSD) || Various Q2Q Dells || White MacBook with 240GB TeamGroup L5 Lite 3D

MOBILE : Galaxy S7 (32GB + 64GB uSD, main phone) || Honor 7X (Europe) || iPad Air 2 || Rooted Kindle FIre 7" ||

ONGOING : PowerMac G5 workstation ||

Forum salt merchant so granted by @dizmo, Grand Deacon of the Glorious Brethren and Guardians of the Most High Overlords Asus and Arctic Cooling. All hail thy lord and savior the Arctic Freezer 33 eSports ONE

2 minutes ago, Arika S said:

Being overpriced doesn’t automatically make something a scam...

Depends on your definition of "scam", I guess. Personally, I define most Apple products as a scam because you're paying an above-average price for below-average performance, as an example. With RTX the improvement is minimal (having tried it on my 1080 Ti) and you pay $100-250 more for the video card as a result, making it a scam: it doesn't deliver the promised improvements, not to mention only FOUR games support it.




Nope, it's been great so far IMO.

 

New technology has to start someplace just like always.

 

Those of us who have been around a long time remember the OLD DAYS and all of the improvements over the years, because we lived through them.

 

 

 

 


i9 9900K @ 5.0 GHz, NH D15, 32 GB GSKILL Trident Z RGB, AORUS Z390 MASTER, EVGA RTX 2080Ti FTW3 Ultra, Samsung 970 EVO Plus 500GB, Samsung 860 EVO 1TB, Samsung 860 EVO 500GB, ASUS ROG Swift PG279Q 27", HYPERX Alloy Elite KB, Logitech Gaming Pro Mouse, CM Master Case 5, Seasonic Prime Titanium Ultra 750W. 

 

NEW PSU Tier List      MB Tier List

On 5/17/2019 at 2:19 AM, Kisai said:

That's wrong. HDR is used in camera photography by sampling multiple exposures so you get more detail from the same image. That was before 4K UHD and the Rec. 2020 color space. With Rec. 2020 you actually get "whiter than white" and "blacker than black" on OLED screens.

I'm not talking about the camera trick that produces funky-looking images; I'm talking about the HDR we get in games. It also doesn't inherently need multiple exposure samples (you can do a certain amount of HDR with RAW DNG and compensation techniques, a cheater HDR some photographers use when handling moving subjects and wanting to avoid ghosts).

No, I'm talking about games supporting HDR. Some of the first were in Unreal 3 and an early demo (HL2: Lost Coast), and a lot of the work was done by keeping two textures around and, in other areas, applying a bloom shader effect to the bright bits of the scene. It wasn't real HDR, but it was close enough. For an interesting look at how Valve cheated, they talked about it at SIGGRAPH 2006, but the basic gist is that they weren't really doing the ray-traced HDR that you would want to do (and which RTX/DXR enables today); instead they pre-baked all their HDR stuff before rendering it fully.
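For a concrete picture of that cheat, here's a toy NumPy sketch of the clip-blur-add bloom pass described above. This is my own illustration of the general technique, not Valve's actual shader:

```python
import numpy as np

def fake_hdr_bloom(image, threshold=0.8, strength=0.5):
    """Crude LDR 'bloom': extract the bright pixels, blur them, and add
    the blurred copy back in, approximating the glow real HDR gives you."""
    bright = np.clip(image - threshold, 0.0, None)  # keep only the bright bits
    # box blur as a stand-in for the Gaussian blur a real bloom pass uses
    blurred = bright.copy()
    for axis in (0, 1):
        blurred = (np.roll(blurred, 1, axis) + blurred
                   + np.roll(blurred, -1, axis)) / 3.0
    return np.clip(image + strength * blurred, 0.0, 1.0)

# A single hot pixel bleeds glow into its neighbours:
frame = np.zeros((5, 5))
frame[2, 2] = 1.0
out = fake_hdr_bloom(frame)
```

(np.roll wraps at the edges, which a real renderer wouldn't do, but it keeps the sketch short.)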

 

The long and short of it is thus:

  • Microsoft released DXR into the world
  • nVidia then released their RTX cards which have hardware support for DXR
  • A few games kinda-sorta rushed DXR support in
  • nVidia announces they're kinda supporting DXR on GTX 10-series cards through CUDA cores, but that it comes with a significant performance hit
  • Unreal Engine is slowly adding baked-in support for DXR
  • People get angry that there's a performance hit on GTX cards
  • Everyone completely loses their mind and the rational folk in the room apply Hanlon's Razor
  • Look where we are now.

PhysX needed special physics cores that were later integrated into GPUs by nVidia and then replaced with CUDA GPGPU kernels -- in fact, GPU support wasn't added until much later. Only two games supported the accelerator card anywhere near its launch, and when GPU support was added (which required quite high-spec cards), only five games supported PhysX on the GPU. Was PhysX a scam too? You could later run the physics simulations on the CPU, but your framerate would tank.

If we really apply the logic that's been going on here, the Voodoo2 cards were a scam! There were a handful of games that depended on the Voodoo2 to provide 3D-accelerated graphics and only later got software rendering support. Hell, we can go back as far as the Intel 8087 math coprocessor, since there were games that demanded it but worked fine after patching that requirement out... with tanked framerates, because of the integer approximations.

 

Hmmm... It's like we've been down this path before.

On 5/18/2019 at 1:19 PM, indrora said:

 

PhysX needed special physics cores that were later integrated into GPUs by nVidia and then replaced with CUDA GPGPU kernels -- in fact, GPU support wasn't added until much later. Only two games supported the accelerator card anywhere near its launch, and when GPU support was added (which required quite high-spec cards), only five games supported PhysX on the GPU. Was PhysX a scam too? You could later run the physics simulations on the CPU, but your framerate would tank.

 

We always go down this path with supposedly new tech, or re-wrapped new-old tech. Certain kinds of tech have to be introduced together for mass adoption; otherwise it just falls on its face as a niche product.

 

Everything old is new again: stereoscopic photography has been around since 1838, and you'd think someone would have figured out how to do it properly in 180 years, yet it remains a niche. Don't believe me? Look up the View-Master, which was the most successful incarnation of this. If you owned one, you'd see what the limitation is: it doesn't look "3D", it looks like a pop-up book. Some were better than others. 3D glasses for 3D films? The effect isn't 3D; the effect is "things being thrown at the audience for a throwaway gag". The 3DS was only slightly better at this, since it works on a different effect: instead of popping out of the screen, it has "depth" into the screen. Current-generation VR headsets make people sick, and current-generation "3D" stereo glasses give people headaches. That could be fixed, but it requires increasing the frame rates.

 

Surround sound? It got basically no adoption outside of the very-expensive-man-cave crowd. The problem with adoption is that you can't add a $1000 surround receiver to a $200 TV/monitor. A $300 headset? Still stereo. USB ones claim to be surround sound, but they don't actually process surround sound the way a home theater would.

 

Haptic feedback? Remember the DualShock controllers on the PlayStation? That was the first mass-market practical use of this tech (it was previously available in chairs and vests, or as the .1 in a 5.1 surround theater via the subwoofer signal). There's still no haptic feedback in VR; the closest we've come is haptic feedback on the keyboards of mobile phones and tablets. We seem to be coming back to the vests again for VR.

 

There's actually an ultrasound tech that could combine the surround sound and the haptic feedback. It has yet to be seen in a gaming environment.

 

All of this tech exists in the theater in some form (if you pay for the D-BOX experience at theaters in BC, you basically pay $50 for seats that move, vibrate, and push air; I can't remember all of it, as I've only paid for it twice).

 

You'd think that surround sound and 3D graphics would be in the same hardware, since the GPU is processing the 3D visuals...

 

Ray tracing is the last piece of the puzzle here. Where 3D positional audio requires knowing where sound originates, ray tracing requires knowing where light originates. Trace the sounds and the light, and you will finally get a truly immersive experience. Current VR experiences are low-resolution, mostly just stereo experiences that feel gimmicky in the same way the View-Master reels do. At present, VR's only real practical use isn't games; it's real estate and engineering, where it lets engineers and potential buyers see how a building will look before it's built, instead of constructing "showcase" units that get demolished later.
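The "trace where the light (or sound) comes from" idea reduces to one core primitive: intersecting a ray with scene geometry. Here's a toy Python version for a sphere, purely illustrative and not any engine's actual code:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Core ray-tracing primitive: does a ray from `origin` along a unit
    `direction` hit a sphere? Returns the distance to the nearest hit in
    front of the origin, or None on a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c          # quadratic discriminant; a == 1 for a unit direction
    if disc < 0.0:
        return None                 # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A ray down the z-axis hits a radius-1 sphere centered 5 units away at t = 4:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

Audio ray tracing runs the same kind of test against room geometry, then accumulates delay and attenuation per bounce instead of color, which is why the two problems fit the same hardware.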

 

So what I predict is that we may see "VR" escape rooms before we see home versions of VR that aren't bad. 


I'd say that RTX wasn't really a scam. Even after watching the video, I'm still convinced the RTX line itself (the cards and the features they offer, not the ray tracing specifically) wasn't a scam at all. It was a refresh that Nvidia needed, especially as 4K gaming has gone mainstream and even 1080 Ti SLI was struggling to keep up.

The ray-tracing functionality of the cards? Yeah, at least a little bit, kind of, sort of, maybe a scam. As the numbers come out, I predict we'll see that the GTX cards are absolutely crippled with "RTX on"; even the RTX series has a hard time keeping frames smooth with ray tracing enabled. Still, with ray tracing disabled (and this was before the whole RTX-on-GTX kerfuffle), the 2080 is pretty comparable to the 1080 Ti in raw performance for only a very slight bump in price, and for enthusiasts the 2080 Ti is perfect, maintaining what even 1080 Ti SLI struggled to handle.

There were a lot of GPU failures early on, especially with the 2080 Ti, but we haven't seen many bad apples since that first batch, so at least Nvidia and its board partners were able to replace the duds that shipped initially. The GTX series is still a great gaming line, but for those who want a bit more (and again, let's not forget the bang-for-the-buck the 2080 and 2070 add), I'd say the RTX cards, and yes, even a little bit of that magical ray tracing, are the future.

