
Why are people hating on RT

Smackaroy
12 hours ago, Jurrunio said:

People just don't like the fact that we're going backwards in terms of frame rates and resolutions when we spend more on graphics cards each year just to run RT. The same happened before, when Nvidia made tessellation (via GameWorks) overused in so many games.

Yes, I have a feeling more RT than necessary is getting turned on. If you run the new RT benchmark and drop the sample count, it jumps from 40-ish fps to 100 fps...
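That kind of jump is consistent with path-tracing cost being roughly linear in samples per pixel (spp). A toy model, with every number hypothetical, just to show why cutting spp can more than double the frame rate:

```python
# Toy model: ray-traced frame time = fixed raster/denoise overhead + per-sample cost.
# All figures here are made up for illustration, not measured from any benchmark.

def frame_time_ms(spp, cost_per_spp_ms=5.0, fixed_overhead_ms=5.0):
    """Estimated frame time: fixed work plus cost linear in samples per pixel."""
    return fixed_overhead_ms + spp * cost_per_spp_ms

def fps(spp):
    return 1000.0 / frame_time_ms(spp)

print(round(fps(4), 1))  # high sample count -> 40.0 fps
print(round(fps(1), 1))  # reduced spp (leaning on a denoiser) -> 100.0 fps
```

With these made-up constants, dropping from 4 spp to 1 spp takes you from ~40 fps to ~100 fps, which is why sample count is usually the first knob benchmarks expose.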

 

Almost as if settings are being exaggerated to make AMD raytracing look worse...

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


2 minutes ago, Hymenopus_Coronatus said:

Yeah no. Nvidia simply has much more raytracing horsepower. That'll never happen unless Nvidia cards are intentionally gimped in a certain game.

 

From what I've seen, AMD only does decently in certain titles with "lighter" ray tracing implementations.

Both have different ways of doing it; if a game favors one way, then the other will suffer.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Just now, xAcid9 said:

Both have different ways of doing it; if a game favors one way, then the other will suffer.

That is true, but in "raw" horsepower Nvidia does have a big advantage.

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


Just now, Hymenopus_Coronatus said:

That is true, but in "raw" horsepower Nvidia does have a big advantage.

Doesn't matter if the game isn't optimized for it. Just like AMD GCN's "raw" compute power.



32 minutes ago, Hymenopus_Coronatus said:

Yeah no. Nvidia simply has much more raytracing horsepower. That'll never happen unless Nvidia cards are intentionally gimped in a certain game.

 

From what I've seen, AMD only does decently okay in certain titles with "lighter" raytracing implementations

Nvidia's 3000 series cards are on their 2nd-gen RT cores, so they've had longer to implement ray tracing. AMD's version of ray tracing isn't bad for a first attempt, though both Nvidia RTX and AMD RT have too much of a performance impact to actually use in a lot of games. What would really be terrible is if some games get optimized for Nvidia and others for AMD; I'd rather developers have a solution that supports everything well.


Basically, Nvidia advertises ray tracing as game-changing "magic", but its current implementation is limited and a gimmick.

 

First of all, all things aside, it's not worth the drop in performance; smoothness at high frame rates is often much more impactful than something being prettier.

The astonishing FPS drop with RTX on isn't worth it.

Worse still, in some titles ray tracing does little to nothing.

 

It has gotten better though; with DLSS, ray tracing could definitely become mainstream.

We have titles like Control where it works well.

 

However, too few titles have DLSS and ray tracing as it is.

It could become a thing if game devs implemented these features properly rather than having them "tacked on".


1 hour ago, Hymenopus_Coronatus said:

Yeah no. Nvidia simply has much more raytracing horsepower. That'll never happen unless Nvidia cards are intentionally gimped in a certain game.

So, your assertion is that they're intentionally gimped in Dirt 5?

 

CPU: Ryzen 7 5800X with Arctic Liquid Freezer II 120mm AIO & push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

PSU: Be Quiet! Pure Power 12 M 1000W GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO & 2x Arctic P12 PWM fans Case: Antec P5

Motherboard: ASRock X570M Pro4 Monitor: ASUS ROG Strix XG32VC Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


3 hours ago, BTGbullseye said:

So, your assertion is that they're intentionally gimped in Dirt 5?

 

Dirt 5 is an AMD title that'll obviously perform better on AMD. I'm pretty sure it has a very light raytracing implementation as well



Brutally honest answer: because AMD's RT solution isn't competitive enough. As soon as it is, it'll be an expected, normal feature.

I like cute animal pics.

Mac Studio | Ryzen 7 5800X3D + RTX 3090


1 hour ago, n0stalghia said:

Brutally honest answer: because AMD's RT solution isn't competitive enough. As soon as it is, it'll be an expected, normal feature.

I was going to say it probably comes down to this. Same with DLSS and AMD working on that. Once it's ubiquitous, it'll be almost expected.

 

I really can't understand the people who poo on DLSS either. When it was 1.0 and looked like a blurry oil painting, it was appropriate to sh*t on it, as it looked just bad. But it's done so well now, I don't know how people don't love it. Control with DLSS and RT on was awesome: it looked almost as good as native 4K, was so smooth, and the RT effects were so good. I just wish more games had it, TBH. Maybe I suppose that could make it a gimmick? Either way, I like it.

El Zoido:  9900k + RTX 4090 / 32 gb 3600mHz RAM / z390 Aorus Master 

 

The Box:  3900x + RTX 3080 /  32 gb 3000mHz RAM / B550 MSI mortar 


Everyone calling it a gimmick hasn't been around long enough to base an opinion on. Mid to late 90s: we had shadows!! Those things that are in every game, that before were just a generic blob under an object; they were called gimmicks, not worth the performance hit. Well, now look at games; think they're still a gimmick? And then the PhysX argument: the reason that's not thrown around anymore is that it became such a norm that it isn't a toggle feature any more.

 

The vast majority of people shit on RT because when it first came out they didn't like the cost of the 2080 Ti, thus making it "bad". The vast majority of people hating on it won't have an RTX card; you'll see 1060s, 1650s, etc. That's like me saying pagan Hondas are dumb because the fuel consumption is bad... while rolling around in my Defender... that's shaped like a house brick.

Gaming PC: • AMD Ryzen 7 3900x • 16gb Corsair Vengeance RGB Pro 3200mhz • Founders Edition 2080ti • 2x Crucial 1tb nvme ssd • NZXT H1• Logitech G915TKL • Logitech G Pro • Asus ROG XG32VQ • SteelSeries Arctis Pro Wireless

Laptop: MacBook Pro M1 512gb


3 hours ago, gabrielcarvfer said:

I think AMD, MSFT and Sony will come up with something completely different.

From the way it's pictured and the name, it's likely just a well-done super-resolution method, where it takes the aggregate of multiple frames to estimate a higher-resolution image. This would actually produce a better image when there is a slight change in the image between frames (meaning anything in motion will look better).
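A minimal sketch of that multi-frame idea, in 1D for brevity: each "frame" samples the scene at low resolution with a different sub-pixel jitter, and scattering all the jittered samples into a finer grid recovers a higher-resolution estimate. Everything here is illustrative, not how AMD's actual method works; a real temporal upscaler also has to handle motion, occlusion, and history rejection.

```python
# Multi-frame super-resolution sketch (1D): jittered low-res frames are
# accumulated into a high-res grid. Purely illustrative.

def ground_truth(x):
    return x * x  # some smooth "scene" signal

def render_low_res(n_pixels, jitter):
    """One low-res frame: sample at pixel centers offset by `jitter` (in pixels)."""
    return [(i + 0.5 + jitter, ground_truth(i + 0.5 + jitter)) for i in range(n_pixels)]

def accumulate(frames, hi_res):
    """Scatter all jittered samples into a hi-res grid and average per bin."""
    sums = [0.0] * hi_res
    counts = [0] * hi_res
    scale = hi_res / 4  # low-res domain is 4 pixels wide in this example
    for frame in frames:
        for x, v in frame:
            bin_i = min(int(x * scale), hi_res - 1)
            sums[bin_i] += v
            counts[bin_i] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# Two 4-pixel frames with opposite jitters fill every bin of an 8-pixel grid.
frames = [render_low_res(4, j) for j in (-0.25, 0.25)]
hi = accumulate(frames, 8)
print(hi)  # every hi-res bin receives at least one sample
```

With static content and well-chosen jitters, two half-resolution frames are enough to populate the full-resolution grid; with motion between frames you get even more distinct sample positions, which is why moving content can resolve extra detail.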



It *is* a gimmick. We're decades away from fully ray traced games... 

 

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used: Corsair Link (Anime Edition), MSI Afterburner, OpenRGB, Lively Wallpaper, OBS Studio, Shutter Encoder, Avidemux, FSResizer, Audacity, VLC, WMP, GIMP, HWiNFO64, Paint, 3D Paint, GitHub Desktop, Superposition, Prime95, Aida64, GPUZ, CPUZ, Generic Logviewer


On 11/20/2020 at 11:35 AM, PriitM said:

The real reason why RT is still a "gimmick" is that it's still in development. You have Nvidia's RT cores and API, you have AMD's baked-in RT chunks in the CUs and API, and then you have Microsoft's DirectX RT. RT needs standardization for wide adoption; otherwise you will always see a one-sided battle, either Nvidia or AMD trading punches. That way game devs won't be bothered to put extra work into optimizing for BOTH APIs but will have to choose one, and the unsupported hardware won't work as well. So, if you want RT to get out of its "gimmick" status, you have to hope for a unified standard. And that can come from either Nvidia's design of dedicated RT cores or AMD's baked-in chunks with a high-speed interconnect. I predict there will be a symbiotic middle ground, with dedicated RT/generic compute cores and a high-speed interconnect for scalability. Or Nvidia will make their own dedicated "gaming" architecture without any specific "cores" for productivity, as AMD did with CDNA and RDNA.

 

Until then... only time will tell which approach will pan out.

There are only 2 RT APIs, Vulkan and DirectX. Nvidia uses an unreleased modified version of the Vulkan API.


12 hours ago, gabrielcarvfer said:

It is an awesome tech, but severely limited in support, which means it isn't worth shit for most people.

 

It would be way different if it supported a ton of additional games, but how do you add support for 16k rendering to old games in order to train the AI? Not happening.

 

I think AMD, MSFT and Sony will come up with something completely different.

Old games are easier to run, so there will be no need to add it.


10 minutes ago, Smackaroy said:

There are only 2 RT APIs, Vulkan and DirectX. Nvidia uses an unreleased modified version of the Vulkan API.

Still, game devs have to choose one to build on. You won't see widespread adoption until a standard is agreed upon.


1 hour ago, PriitM said:

Still, game devs have to choose one to build on. You won't see widespread adoption until a standard is agreed upon.

I would say there isn't a standard API for normal rendering either; some games use DirectX 11, some DirectX 12, some Vulkan, and some OpenGL to a lesser extent.
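Engines usually deal with that multiplicity through a thin backend abstraction: one renderer interface, one backend per API, picked at startup. A hypothetical sketch of the pattern (the class and function names are invented; `DispatchRays` and `vkCmdTraceRaysKHR` are the real entry points of DXR and Vulkan's KHR ray tracing extension):

```python
# Hypothetical sketch of an engine abstracting over multiple ray tracing APIs.
# The backend classes are invented for illustration; only the quoted API call
# names (DispatchRays, vkCmdTraceRaysKHR) correspond to real entry points.

from abc import ABC, abstractmethod

class RaytracingBackend(ABC):
    @abstractmethod
    def build_acceleration_structure(self, triangles): ...
    @abstractmethod
    def trace(self, ray): ...

class DXRBackend(RaytracingBackend):
    def build_acceleration_structure(self, triangles):
        return f"DXR BLAS ({len(triangles)} tris)"
    def trace(self, ray):
        return "DispatchRays"          # DXR's ray dispatch call

class VulkanRTBackend(RaytracingBackend):
    def build_acceleration_structure(self, triangles):
        return f"VK_KHR acceleration structure ({len(triangles)} tris)"
    def trace(self, ray):
        return "vkCmdTraceRaysKHR"     # Vulkan KHR ray tracing dispatch call

def pick_backend(api: str) -> RaytracingBackend:
    """Choose the backend once at startup; the rest of the engine stays API-agnostic."""
    return {"dxr": DXRBackend, "vulkan": VulkanRTBackend}[api]()

backend = pick_backend("vulkan")
print(backend.trace(ray=None))  # prints: vkCmdTraceRaysKHR
```

The abstraction is why "no standard" hasn't blocked rasterization: the per-API cost is contained in one backend module rather than spread through the whole renderer.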

