Why are people hating on RT?

Smackaroy

Recently I have seen more and more comments hating on RT, calling it a gimmick. I don't get it; RT has enabled near photo-realistic lighting. This could just be a coincidence, but it seems to have mostly started when people realized the RX 6000 series loses in RT. Are people just getting extra fanboyish?

RT will be a gimmick, just like PhysX, until it sees mainstream game adoption.

Main: AMD Ryzen 7 5800X3D, Nvidia GTX 1080 Ti, 16 GB 4400 MHz DDR4 Fedora 38 x86_64

Secondary: AMD Ryzen 5 5600G, 16 GB 2667 MHz DDR4, Fedora 38 x86_64

Server: AMD Athlon PRO 3125GE, 32 GB 2667 MHz DDR4 ECC, TrueNAS Core 13.0-U5.1

Home Laptop: Intel Core i5-L16G7, 8 GB 4267 MHz LPDDR4x, Windows 11 Home 22H2 x86_64

Work Laptop: Intel Core i7-10510U, NVIDIA Quadro P520, 8 GB 2667 MHz DDR4, Windows 10 Pro 22H2 x86_64

People just don't like the fact that we're going backwards in frame rates and resolutions while spending more on graphics cards each year just to run RT. The same thing happened before, when Nvidia made tessellation (via GameWorks) overused in so many games.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single-thread: 168 Multi-thread: 833

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s

5 minutes ago, Smackaroy said:

This could just be a coincidence, but it seems to have mostly started when people realized the RX 6000 series loses in RT. Are people just getting extra fanboyish?

It's probably a combination of that, it being a new thing and people not liking change, people not actually knowing what it is, and Nvidia's crappy launch of the RTX series. 

It is a gimmick. Sure, it's cool and all, but the performance loss on both AMD and Nvidia is just too expensive.
If I need to play a game competitively, I'm not gonna use RT on any platform, and most RT implementations in games are just... not awesome. Sure, the shadows are a little more realistic or whatever. If I'm gonna play a game where I care about it (let's say, maybe Cyberpunk when it releases?), I'll have to find a balance between visual fidelity, quality, and framerate.
Give it 5+ years, when all GPUs can push 144 Hz+ in RT and the implementations are better (I think one of the best is Star Citizen's volumetric fog, which can't be done in real time without heavy use of ray tracing).
Until then, it'll be a gimmick.

 

The only "legit" use in my opinion is in professional workloads, same as CUDA.

 

Edit: And I bought an RX 6800 KNOWING RT was a gimmick; I didn't talk myself into it. The 16 GB of VRAM and higher performance are a bigger selling point for me than Nvidia's proprietary technologies, since I'm not gonna use them and I'd much rather have the extra rasterization performance.

Planning on trying Star Citizen (highly recommended)? STAR-NR5P-CJFR is my referral link

8 minutes ago, svmlegacy said:

RT will be a gimmick, just like PhysX, until it sees mainstream game adoption.

It did gain mainstream adoption, though, with both AMD and the new consoles supporting it. What mainstream gaming platform out there, other than Nintendo's, doesn't have it?

 

I think people don't understand that this technology is still in its infancy and its potential isn't even fully realized yet. It will take time for a full transformation to happen, until every aspect of lighting in every game is ray traced with no rasterization fallback, but it will happen eventually. It also lets developers focus on other aspects of the game rather than having to work on lighting with baked lights and such.

Quote or Tag people so they know that you've replied.

Meh, don't get an RT card then.

 

I like the versatility of the cards, and the lighting effects can be a real step up when a game is designed for them. CP2077 might look pretty good; we'll see.

 

 

RT in games is mostly a gimmick; it isn't worth it in most games, as it just makes a few things shinier or adds some more realistic shadows. Until developers can agree on a standard, not just Nvidia's proprietary RTX versus AMD's RT through DXR, I don't see the feature being used enough to reach mainstream adoption.

It isn't worth the performance impact to me; I care more about performance and overall game quality than about RT effects. I also don't like how it has driven up graphics card prices, making me pay more for a feature I won't have a use for.

The fact that you don't see much change between ray tracing and traditional rasterization is because the latter has been perfected and meticulously worked on. Achieving that takes a lot of work, though, and the lighting isn't dynamic; ray tracing makes the process much simpler. It might be slower to run now, but it will keep getting better, and once GPUs with ray tracing hardware are standard among the general population, rasterization won't be required anymore.

 

It really is more of a developer feature than a customer feature; you likely won't notice much change when the industry fully switches over. The main advantage is that the lighting is dynamic, which makes the world feel more realistic. People likely won't care much, and that's understandable, but it's happening regardless, imo.
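
To make that contrast concrete, here's a minimal toy sketch (my own illustration under simplified assumptions, not code from any engine): the baked path is a precomputed texture read that goes stale whenever a light or occluder moves, while the ray-traced path re-tests occlusion fresh every frame.

```cpp
// Toy contrast: baked lightmap lookup vs. per-frame shadow-ray test.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; float radius; };

// Rasterized path: lighting was solved offline and stored in a texture.
// One array read, essentially free, but stale if anything moves at runtime.
float bakedLight(const std::vector<float>& lightmap, int u, int v, int width) {
    return lightmap[v * width + u];
}

// Ray-traced path: cast a shadow ray from the surface point toward the
// light and test occluders. Correct for fully dynamic scenes, but costs
// an intersection test per ray, per frame.
bool inShadow(Vec3 point, Vec3 lightPos, const std::vector<Sphere>& scene) {
    Vec3 dir = sub(lightPos, point);        // unnormalized; t = 1 reaches the light
    float a = dot(dir, dir);
    for (const Sphere& s : scene) {
        // Ray-sphere test: solve |point + t*dir - center|^2 = r^2 for t.
        Vec3 oc = sub(point, s.center);
        float b = 2.0f * dot(oc, dir);
        float c = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - 4.0f * a * c;
        if (disc < 0.0f) continue;          // ray misses this sphere
        float t = (-b - std::sqrt(disc)) / (2.0f * a);
        if (t > 0.001f && t < 1.0f)         // occluder between point and light
            return true;
    }
    return false;
}

int main() {
    // Pretend bake: a 64x64 lightmap filled offline with "fully lit".
    std::vector<float> lightmap(64 * 64, 1.0f);
    std::printf("baked sample:   %.1f\n", bakedLight(lightmap, 10, 10, 64));

    // Dynamic query: a sphere hangs between the ground point and the light.
    std::vector<Sphere> scene = {{{0.0f, 1.0f, 0.0f}, 0.5f}};
    Vec3 ground = {0.0f, 0.0f, 0.0f};
    Vec3 light  = {0.0f, 5.0f, 0.0f};
    std::printf("shadow ray hit: %s\n", inShadow(ground, light, scene) ? "yes" : "no");
}
```

The point of the sketch: the baked result is cheap precisely because all the work happened offline, which is also why it can't react to a moving light; the shadow ray pays its cost every frame and never goes stale.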

Quote or Tag people so they know that you've replied.

2 hours ago, .Apex. said:

The fact that you don't see much change between ray tracing and traditional rasterization is because the latter has been perfected and meticulously worked on. Achieving that takes a lot of work, though, and the lighting isn't dynamic; ray tracing makes the process much simpler. It might be slower to run now, but it will keep getting better, and once GPUs with ray tracing hardware are standard among the general population, rasterization won't be required anymore.

 

It really is more of a developer feature than a customer feature; you likely won't notice much change when the industry fully switches over. The main advantage is that the lighting is dynamic, which makes the world feel more realistic. People likely won't care much, and that's understandable, but it's happening regardless, imo.

It does seem to be a bit of a crutch at the moment. I feel like in games that have RTX effects, the devs don't put as much effort into the rasterized visuals, so turning RTX on looks particularly better; but if you compare current RTX implementations to good rasterization, rasterization looks pretty damn good and runs twice as fast. Some effects just aren't feasible with rasterization, like @gabrielcarvfer said. As for the realism part, ray-traced effects help, but NPC/character model quality and game physics in general hurt so much more. Like in WD Legion: aside from the laughably terrible optimization, it actually looks pretty decent scene-wise, but the NPCs running around break any sort of immersion with how they look in comparison, lol.

 

I may wait to see how Cyberpunk handles RTX before buying a new card. I'm also interested in seeing how AMD's RT hardware performs in upcoming releases that will presumably be optimized for consoles/RDNA 2. Might just get a 2080 Ti and call it a day, I dunno. It's a weird time for GPUs right now :/


my "oops i bought intel right before zen 3 releases" build

CPU: Ryzen 5 3600 (placeholder)

GPU: Gigabyte 980ti Xtreme (also placeholder), deshroud w/ generic 1200rpm 120mm fans x2, stock bios 130% power, no voltage offset: +70 core +400 mem 

Memory: 2x16gb GSkill Trident Z RGB 3600C16, 14-15-30-288@1.45v

Motherboard: Asus ROG Strix X570-E Gaming

Cooler: Noctua NH-D15S w/ white chromax bling
OS Drive: Samsung PM981 1tb (OEM 970 Evo)

Storage Drive: XPG SX8200 Pro 2tb

Backup Storage: Seagate Barracuda Compute 4TB

PSU: Seasonic Prime Ultra Titanium 750W w/ black/white Cablemod extensions
Case: Fractal Design Meshify C Dark (to be replaced with a good case shortly)

Basically everything was bought used off of Reddit or here; the only new component was the case. Absolutely nutty deals on some of these parts, I'll have to tally it all up once it's "done" :D

Link to comment
Share on other sites

Link to post
Share on other sites

It's OPTIONAL anyway, guys...

Always has been. People are upset over having more, finer control. You can disable it or enable it.

As humans you have the right to use your brain and disable or enable it. Tweak and tune, or don't...

 

The lighting is genuinely nice, and reflections (done well) are nice in campaign titles.

The shadows are not my priority; I disable those.

It's USER choice.

Want frames? Disable what you don't want to use.

 

Is it your brain telling you it's no longer MAXED out when RTX is disabled... is that what bothers you?

Treat THAT as your Ultra then, right?

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 

5 hours ago, Smackaroy said:

the RX 6000 series loses in RT

Only in existing games, which either have no optimizations at all or are optimized only for RTX. The upcoming Dirt 5 has AMD optimizations for ray tracing, making even the non-XT 6800 outperform the 3090 in ray tracing. Expect to see more of that in the future, since AMD has a monopoly on console ray tracing.

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms

Who watches the mirrors and water puddles while gaming?

Old-game shadows are fine; most players don't care and lower or disable shadows anyway.

RT is not a gimmick; it's just new to gaming, so it will take time to be adopted. It's always like this with any new tech coming into gaming.

 

Especially now that the PCs-in-a-box, I mean consoles, are advertising RT as a selling point, game developers will use it more.

 

Folding Stats

 

SYSTEM SPEC

AMD Ryzen 5 5600X | Motherboard Asus Strix B550i | RAM 32gb 3200 Crucial Ballistix | GPU Nvidia RTX 3070 Founder Edition | Cooling Barrow CPU/PUMP Block, EKWB Vector GPU Block, Corsair 280mm Radiator | Case NZXT H1 | Storage Sabrent Rocket 2tb, Samsung SM951 1tb

PSU NZXT S650 SFX Gold | Display Acer Predator XB271HU | Keyboard Corsair K70 Lux | Mouse Corsair M65 Pro  

Sound Logitech Z560 THX | Operating System Windows 10 Pro

Ray tracing today is like pixel shaders on the GeForce 3 Ti. Remember that? How great was pixel shading on the GeForce 3? 


It was fucking awful.

 

Morrowind had a massive performance penalty for using it on nothing more than water effects.

 

 

In 15 years, RT will be where pixel shading is now: just an expected feature, past its current garish super-shine days, and it'll run super smooth on even the cheapest of cards.

Work Rigs - 2015 15" MBP | 2019 15" MBP | 2021 16" M1 Max MBP | Lenovo ThinkPad T490 |

 

AMD Ryzen 9 5900X  |  MSI B550 Gaming Plus  |  64GB G.SKILL 3200 CL16 4x8GB |  AMD Reference RX 6800  |  WD Black SN750 1TB NVMe  |  Corsair RM750  |  Corsair H115i RGB Pro XT  |  Corsair 4000D  |  Dell S2721DGF  |
 

Fun Rig - AMD Ryzen 5 5600X  |  MSI B550 Tomahawk  |  32GB G.SKILL 3600 CL16 4x8GB |  AMD Reference 6800XT  | Creative Sound Blaster Z  |  WD Black SN850 500GB NVMe  |  WD Black SN750 2TB NVMe  |  WD Blue 1TB SATA SSD  |  Corsair RM850x  |  Corsair H100i RGB Pro XT  |  Corsair 4000D  |  LG 27GP850  |

It's a gimmick until it's properly implemented and takes less of a hit to performance.

 

I couldn't care less about the ray tracing in Watch Dogs Legion, tbh.

AMD Ryzen 9 5900X - Nvidia RTX 3090 FE - Corsair Vengeance Pro RGB 32GB DDR4 3200MHz - Samsung 980 Pro 250GB NVMe m.2 PCIE 4.0 - 970 Evo 1TB NVMe m.2 - T5 500GB External SSD - Asus ROG Strix B550-F Gaming (Wi-Fi 6) - Corsair H150i Pro RGB 360mm - 3 x 120mm Corsair AF120 Quiet Edition - 3 x 120mm Corsair ML120 - Corsair RM850X - Corsair Carbide 275R - Asus ROG PG279Q IPS 1440p 165hz G-Sync - Logitech G513 Linear - Logitech G502 Lightsync Wireless - Steelseries Arctic 7 Wireless

So what I understand is that people think it's a gimmick because of the performance loss, as jurrunio pointed out. Personally I think it's not a gimmick, because in a game like Spider-Man, the city reflected in the buildings really looks amazing; but I can understand people not thinking it's enough of an improvement to justify the loss of FPS.

3 hours ago, BTGbullseye said:

Only in existing games, which either have no optimizations at all or are optimized only for RTX. The upcoming Dirt 5 has AMD optimizations for ray tracing, making even the non-XT 6800 outperform the 3090 in ray tracing. Expect to see more of that in the future, since AMD has a monopoly on console ray tracing.

Guess we'll find out, but I do expect Nvidia to pull ahead because of their extra generation of experience implementing it.

9 hours ago, Xaring said:

It is a gimmick. Sure, it's cool and all, but the performance loss on both AMD and Nvidia is just too expensive.
If I need to play a game competitively, I'm not gonna use RT on any platform, and most RT implementations in games are just... not awesome. Sure, the shadows are a little more realistic or whatever. If I'm gonna play a game where I care about it (let's say, maybe Cyberpunk when it releases?), I'll have to find a balance between visual fidelity, quality, and framerate.
Give it 5+ years, when all GPUs can push 144 Hz+ in RT and the implementations are better (I think one of the best is Star Citizen's volumetric fog, which can't be done in real time without heavy use of ray tracing).
Until then, it'll be a gimmick.

 

The only "legit" use in my opinion is in professional workloads, same as CUDA.

 

Edit: And I bought an RX 6800 KNOWING RT was a gimmick; I didn't talk myself into it. The 16 GB of VRAM and higher performance are a bigger selling point for me than Nvidia's proprietary technologies, since I'm not gonna use them and I'd much rather have the extra rasterization performance.

It doesn't have extra performance; it trades blows at 4K, with the 3080 slightly pulling ahead, whereas at 1440p the 6800 XT pulls slightly ahead. Oh, you got the non-XT and are comparing it to the 3070, never mind.

Edited by Smackaroy
not reading his message correctly
10 hours ago, Smackaroy said:

Recently I have seen more and more comments hating on RT, calling it a gimmick. I don't get it; RT has enabled near photo-realistic lighting. This could just be a coincidence, but it seems to have mostly started when people realized the RX 6000 series loses in RT. Are people just getting extra fanboyish?

The real reason RT is still a "gimmick" is that it's still in development. You have Nvidia's RT cores and API, you have AMD's RT units baked into the CUs with their own API, and then you have Microsoft's DirectX Raytracing (DXR). RT needs standardization for wide adoption; otherwise you will always see a one-sided battle with either Nvidia or AMD trading punches, because game devs won't be bothered to put in the extra work of optimizing for BOTH APIs and will have to choose one, and the unsupported hardware won't run it as well. So if you want RT to get out of its "gimmick" status, you have to hope for a unified standard. That could come from either Nvidia's design of dedicated RT cores or AMD's baked-in units with a high-speed interconnect. I predict there will be a symbiotic middle ground, with dedicated RT/generic compute cores and a high-speed interconnect for scalability. Or Nvidia will make its own dedicated "gaming" architecture without any productivity-specific "cores", as AMD did with CDNA and RDNA.
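
For what it's worth, the Microsoft layer is already the closest thing to that unified standard: here's a minimal sketch (error handling trimmed) of how a game can ask Direct3D 12 whether the GPU supports DXR at all, without touching either vendor's own extensions.

```cpp
// Minimal DXR capability query via the vendor-neutral D3D12 API.
// Link against d3d12.lib (e.g. #pragma comment(lib, "d3d12.lib") on MSVC).
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter; nullptr picks the system default.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    // OPTIONS5 carries the RaytracingTier capability field.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR supported: same API path on RTX and RX 6000 cards");
    } else {
        std::puts("DXR not supported on this GPU/driver");
    }
    return 0;
}
```

Of course, a shared API only solves the compatibility half; devs still end up tuning ray counts and effects per vendor, which is where the one-sided benchmarks come from.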

 

Until then... only time will tell which approach pans out.

Simply because the performance drop is horrible and what it gives in return is barely noticeable in some cases.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

Ray tracing doesn't always make sense, but it does *more often* with DLSS.
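
A rough back-of-the-envelope illustration of why: the primary-ray count scales with the internal render resolution, so upscaling from a lower internal resolution (as DLSS's quality mode does, at roughly 67% scale) cuts the per-frame ray budget before any AI reconstruction even happens. The one-ray-per-pixel figure below is an assumption for illustration, not a measured workload.

```cpp
// Toy ray-budget arithmetic: native 1440p vs. a ~67%-scale internal
// resolution that gets upscaled back to 1440p.
#include <cstdio>

int main() {
    const long native   = 2560L * 1440;   // native 1440p pixel count
    const long internal = 1707L * 960;    // ~67% scale internal resolution
    const int raysPerPixel = 1;           // assumed primary rays per pixel

    std::printf("native rays/frame:   %ld\n", native * raysPerPixel);
    std::printf("internal rays/frame: %ld\n", internal * raysPerPixel);
    std::printf("ray work saved:      %.0f%%\n",
                100.0 * (1.0 - double(internal) / double(native)));
}
```

Roughly 55% of the primary-ray work disappears at that scale factor, which is most of the reason RT framerates become tolerable with upscaling enabled.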

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator

6 hours ago, BTGbullseye said:

has AMD optimizations for ray tracing, making even the non-XT 6800 outperform the 3090 in ray tracing

Yeah, no. Nvidia simply has much more ray tracing horsepower. That'll never happen unless Nvidia cards are intentionally gimped in a certain game.

 

From what I've seen, AMD only does decently in certain titles with "lighter" ray tracing implementations.

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1
