
Could Ray Tracing be like PhysX?

Considering that ray tracing is supposed to be the new way to experience games *when it is implemented*, would it not be possible for Nvidia to make a card that only computes the ray tracing effects?  This brings me back to when PhysX was being implemented in video games.  With PhysX you could have two cards in SLI and a different video card by itself dedicated to PhysX.  Already having two decent cards (GTX 980 Tis), I would actually find it more beneficial if Nvidia sold a card dedicated just to ray tracing, like how we could set up a card dedicated to PhysX.  That way, instead of hopping straight onto the RTX hype train, people could go with what could be a cheaper option and possibly get on board when Nvidia releases its next series.  What does everyone else think?
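For context on what such a card would actually be computing: at its heart, ray tracing fires a ray through each pixel and intersects it against scene geometry. A minimal sketch of that core test (my own illustrative code, not how RTX hardware implements it; the function name is made up):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest hit on a sphere, or None on a
    miss.  Solves |origin + t*direction - center|^2 = radius^2 for t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t >= 0.0 else None       # ignore hits behind the camera

# Camera at the origin looking down -z at a unit sphere centered at (0, 0, -5):
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

A GPU repeats a test like this against millions of triangles (via acceleration structures) for every pixel, every frame; that is the workload the RT cores, or a hypothetical dedicated card, would be accelerating.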


No, it would not happen. Nvidia wants you to buy their RTX cards; there would be no benefit to them in releasing a dedicated RT card.

 

It will only become cheaper once other companies release similar ray tracing cards to compete with the RTX line. AMD is apparently working on it in some capacity, and with Intel looking to get into the GPU market, it's something we may well see them invest in too.

🌲🌲🌲

Judge the product by its own merits, not by the Company that created it.


The problem is that an add-in card means adoption of the new technology is practically zilch. Very few developers, if any, are going to care about it, because if they do implement it, you're now asking someone to buy yet another thing to enjoy the new technology. If you can integrate it into something the consumer will likely buy anyway, adoption and the reasons to use it go way up.

 

There's a reason why, the moment NVIDIA purchased Ageia, they ported the PhysX API to CUDA and stopped production of the PhysX processor: it let them get rid of the need for an add-in card.

 

EDIT: Basically, if you want your new technology to not become a footnote in history, you have to force it on everyone going forward.


There was another thread (I forget which) where someone mentioned something similar. What they said was something to the effect of "ray tracing might be the next big thing, but it could also be PhysX, and we all know how that turned out..." Basically, ray tracing could be a dud that no one uses due to its chicken-and-egg scenario: NVIDIA has a card that processes it, so now it's on game developers to implement it in games. But they don't want to spend time on it because ray tracing cards are expensive. Ray tracing cards are expensive because it's a new technology that could be the next big thing. But we don't actually know what it looks like because game developers haven't implemented it in games yet... and the cycle repeats.

My Build, v2.1 --- CPU: i7-8700K @ 5.2GHz/1.288v || MoBo: Asus ROG STRIX Z390-E Gaming || RAM: 4x4GB G.SKILL Ripjaws 4 2666 14-14-14-33 || Cooler: Custom Loop || GPU: EVGA GTX 1080 Ti SC Black, on water || PSU: EVGA G2 850W || Case: Corsair 450D || SSD: 850 Evo 250GB, Intel 660p 2TB || Storage: WD Blue 2TB || G502 & Glorious PCGR Fully Custom 80% Keyboard || MX34VQ, PG278Q, PB278Q

Audio --- Headphones: Massdrop x Sennheiser HD 6XX || Amp: Schiit Audio Magni 3 || DAC: Schiit Audio Modi 3 || Mic: Blue Yeti

 

[Under Construction]

 

My Truck --- 2002 F-350 7.3 Powerstroke || 6-speed

My Car --- 2006 Mustang GT || 5-speed || BBK LTs, O/R X, MBRP Cat-back || BBK Lowering Springs, LCAs || 2007 GT500 wheels w/ 245s/285s

 

The Experiment --- CPU: i5-3570K @ 4.0 GHz || MoBo: Asus P8Z77-V LK || RAM: 16GB Corsair 1600 4x4 || Cooler: CM Hyper 212 Evo || GPUs: Asus GTX 750 Ti, || PSU: Corsair TX750M Gold || Case: Thermaltake Core G21 TG || SSD: 840 Pro 128GB || HDD: Seagate Barracuda 2TB

 

R.I.P. Asus X99-A motherboard, April 2016 - October 2018, may you rest in peace. 5820K, if I ever buy you a new board, it'll be a good one.


29 minutes ago, Mysteryman2000 said:


Well, RTX itself is already a kind of PhysX, considering it's already an underused and obsolete technology for consumers, so there is no reason for a dedicated ray tracing card. I guarantee ray tracing will be a niche thing found only in the same way PhysX effects are: in triple-A titles, with a half-baked effort on the developers' part and a performance loss that won't be worth the horsepower except for screenshots.

8086k Winner BABY!!

My tech stuff

 

Main rig

Cpu: R5 1600AF 3.95ghz 1.32v (literally the worst bin I've ever seen)

Mobo: MSI b450 A-Pro MAX

Ram: 16gb Team Group T-Force Xtreem 3600 cl18 (3466 14-14-14-14-28) kinda tuned subs

Gpu: MSI 1080 ti Duke OC

PSU: Bitfenix Formula Gold 650w

SSD: 512gb Inland premium nvme

HDD: 2tb Seagate Barracuda Compute

 

Samsung Galaxy S9 | SD845 | Adreno 640

 


16 minutes ago, M.Yurizaki said:


I hear what you're saying, but I have always leaned more toward bridging technologies: having the option to participate without going all in.  I find you can get more people to buy the full product if they've had a taste of it.  But I guess I'm just an optimist.

1 minute ago, TheDankKoosh said:


That makes sense, but I sort of want it to succeed rather than fail like other technologies *cough, 3D*.


2 minutes ago, Mysteryman2000 said:


 

14 minutes ago, TheDankKoosh said:


 

Practically every new 3D graphics technology, including 3D accelerators themselves, had very little software to show for it when it was released. And usually the generation that launched with the new technology was, at best, only as good as whatever people were already enjoying with the last-generation technology, and often worse.

 

Basically, we're spoiled by the jump from the GeForce 900 series to the GeForce 10 series. And I'd argue the GeForce 10 series' leap happened solely because of pent-up Moore's Law gains waiting to burst (it skipped a process node).

 

If this technology dies because the market doesn't care for it, that's fine. But I'd rather see someone try it, especially since developers kept saying this was the holy grail of real-time graphics, than see nobody ever try because end users keep saying "it'll never catch on."


RTX is frankly a pointless tech until it becomes so mainstream that everyone supports it by default. Until that happens, it's basically a dead gimmick tech, just like PhysX. Hardware PhysX never took off and never will for as long as it's an NVIDIA-only thing: no one can use it for core gameplay, because then they'd be leaving out everyone with AMD and Intel graphics. The same goes for RTX, where NVIDIA is basically begging and paying developers to implement it. And even then it sits firmly in gimmick territory, because they are intentionally dumbing down effects we already had, effects that looked WAY better than what they're being showcased against with RTX. They did the same thing with PhysX, intentionally dumbing down CPU physics to make GPU PhysX look better (like glass shattering into 5 pieces and disappearing when it hits the ground, even though games like Red Faction from 2001, running on a single-core CPU, did physically convincing glass shattering with debris remaining on the ground for minutes). Yeah, I'll never forget that nonsense in Mirror's Edge...

 

So, yeah, RTX is frankly a pointless tech that needs to be shipped in graphics cards for generations to come, by everyone, until everyone has it; only then will it become a thing, because devs would actually have a reason to use it by default and not as some extra gimmick smeared on top of what we already have with standard rasterization. You may say "but that's not how it works," but it's exactly how it works. RTX has been out for months and all the promised RTX stuff still doesn't exist ANYWHERE. It's pathetic that they're selling us €1000+ graphics cards built around a feature you still can't use months after launch. It's almost laughable...

AMD Ryzen 7 5800X | ASUS Strix X570-E | G.Skill 32GB 3733MHz CL16 | PALIT RTX 3080 10GB GamingPro | Samsung 850 Pro 2TB | Seagate Barracuda 8TB | Sound Blaster AE-9 MUSES Edition | Altec Lansing MX5021 Nichicon/MUSES Edition

Link to comment
Share on other sites

Link to post
Share on other sites

No, ray tracing will be like DX12:

loads of hype and nothing to show, then it comes out and runs worse than DX11.

Then they'll get it working with DLSS,

and it'll be like how Shadow of the Tomb Raider runs better in DX12.

Not saying it will take that long, but it will get worse before it gets better.

And 5 years down the line you'll be on here complaining about Call of Booty: Black Cops 7 not having enough ray tracing in it.

 


6 hours ago, TheDankKoosh said:


 

6 hours ago, M.Yurizaki said:

 

 


 

6 hours ago, RejZoR said:


Comparing it to PhysX isn't even close; comparing it to GameWorks would be closer.

I've heard it makes the developers' job easier, like GameWorks: better graphics in less time.

There is no way this should die or become a niche. Developers wanted this a couple of decades ago: Intel was pushing for it with Larrabee, and Nvidia and AMD declined to support it back then. The problem was getting it done in real time.

DX10, DX11, DX12, and Vulkan didn't get games right away either. How long has DX12 been around? This stuff takes time, especially considering games take longer to produce nowadays.

Hopefully RT will help with that.
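To put "getting it done in real time" in perspective, some rough arithmetic (my own illustrative figures, not vendor numbers):

```python
# Even a single primary ray per pixel at 1080p/60 fps is a lot of rays,
# and realistic lighting needs several secondary rays per pixel on top.
width, height, fps = 1920, 1080, 60
primary = width * height * fps           # one primary ray per pixel
print(primary)                           # 124416000 primary rays per second

# With, say, 4 secondary rays per pixel (shadows, reflections, GI):
total = primary * (1 + 4)
print(total)                             # 622080000 rays per second
```

That scale is why offline renderers have had ray tracing for decades while real-time graphics stuck with rasterization.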

