Quake II RTX is out now for free

illegalwater

A shame, since I just got a 1050 Ti.
Even if I'm planning to buy a 1660 Ti, I won't be able to fully enjoy this Quake II edition as it should be. :C

(I suppose I would get around 10 fps? Taking into account that it's not an RTX-enabled card, even if the drivers support it.)


51 minutes ago, steelo said:

Yes, it's exciting that RT may be a 'thing' in the next 5-10 years.

For some people, it's a thing right now.

 

51 minutes ago, steelo said:

NVIDIA's PR strategy is idiotic, to say the least.

Not really, they're demonstrating what type of lighting RTX and similar technologies permit.


54 minutes ago, Drak3 said:

For some people, it's a thing right now.

 

Not really, they're demonstrating what type of lighting RTX and similar technologies permit.

I've seen RT videos with an RX 580 and a 3rd-gen i5. A $150 card vs. a $1,100 one.


12 minutes ago, steelo said:

I've seen RT videos with an RX 580 and a 3rd-gen i5. A $150 card vs. a $1,100 one.

And?

 

The $1,100 card is more powerful and performs better.


1 minute ago, comander said:

Game looks great on my 2080. Doesn't support ultra-wide resolutions, which is eugh.
I did have to lower the resolution, though, and it's not as smooth as I'd like.

I didn't play for long, but my general takeaway is that we're probably 1-2 more generations of cards away from ray tracing being fully viable.

Or it could be done the way Crytek is doing it, with adaptive precision for reflections and a smooth framerate on a mid-range raster graphics card. But this has been the same old story with NVIDIA: overkill simulation of effects just because. Remember PhysX? It didn't really show anything that couldn't generally be done on a CPU, apart from fluids, and yet a decade has passed and hardware-accelerated physics still isn't a thing in games. In fact, we still get either the most basic physics, with ragdolls and a few pieces of debris, or super-precise PhysX hardware doing the same thing, but slower. I still can't get over the nonsense I saw in Mirror's Edge with shattered glass: a few pieces of glass bogged down a pretty capable system. The same few pieces of glass were simulated in much the same way on a single-core CPU back in 2001 in Red Faction, and it almost felt better. PhysX was unnecessarily precise where there was no need for it. RTX is doing the same thing all over again.

 

They do full-precision, super-sharp reflections of everything in everything. Not only is it stupidly demanding, it's not even realistic or good-looking. Windows don't reflect things like a mirror. Cars don't reflect things like a mirror. Most metals don't reflect like mirrors. Not even water does. And yet RTX does exactly that. It's a wasteful approach that doesn't even give good results. Crytek's RT, on the other hand, runs on any card, gives a good framerate, and actually looks more natural because of the "worse" precision of its reflections. I don't care if it's "faked". Faking the effect itself is one thing; faking the accuracy of a reflection that never needed to be a 100% 1:1 mirror of the world is another. Less precise but still realistic behaviour is what I want, not super-sharp reflections in everything that just look fake and overdone in the end.
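To illustrate what "adaptive precision" roughly means in practice: blur the reflection ray according to surface roughness instead of tracing a perfect mirror for every material. A minimal hypothetical sketch (the jitter is crude, not a proper GGX importance sample, and this is nobody's actual engine code):

```cpp
// Hypothetical sketch of roughness-aware reflection rays: rough surfaces get
// blurry, cheap reflections instead of perfect mirrors. Names are
// illustrative, not from any real engine.
#include <cmath>
#include <cstdlib>

struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Standard mirror reflection: r = i - 2(i.n)n
Vec3 reflect(Vec3 i, Vec3 n) {
    float d = 2.0f * (i.x * n.x + i.y * n.y + i.z * n.z);
    return {i.x - d * n.x, i.y - d * n.y, i.z - d * n.z};
}

float rand01() { return (float)std::rand() / RAND_MAX; }

// Perturb the mirror direction inside a cone whose width grows with surface
// roughness. roughness == 0 -> perfect mirror (expensive to keep noise-free);
// moderate roughness -> car paint or window glass; high roughness could fall
// back to a cheap cubemap lookup entirely.
Vec3 glossyReflectionDir(Vec3 incident, Vec3 normal, float roughness) {
    Vec3 r = reflect(incident, normal);
    Vec3 jitter = {rand01() - 0.5f, rand01() - 0.5f, rand01() - 0.5f};
    return normalize({r.x + roughness * jitter.x,
                      r.y + roughness * jitter.y,
                      r.z + roughness * jitter.z});
}
```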


1 hour ago, RejZoR said:

Remember PhysX? It didn't really show anything that couldn't generally be done on a CPU, apart from fluids, and yet a decade has passed and hardware-accelerated physics still isn't a thing in games.

DirectCompute mostly superseded PhysX. Why use a proprietary solution when you can have a standard one?

 

Quote

In fact, we still get either the most basic physics, with ragdolls and a few pieces of debris, or super-precise PhysX hardware doing the same thing, but slower. I still can't get over the nonsense I saw in Mirror's Edge with shattered glass: a few pieces of glass bogged down a pretty capable system. The same few pieces of glass were simulated in much the same way on a single-core CPU back in 2001 in Red Faction, and it almost felt better. PhysX was unnecessarily precise where there was no need for it. RTX is doing the same thing all over again.

While I'll agree that it doesn't seem as impressive, there's still the question of how it was implemented in the first place, because a poor implementation can create performance issues regardless of hardware. Much like how Crysis doesn't seem to perform better on later hardware.

 

However, I will argue that the glass in Red Faction reacts in only a single way. Once you shoot a window, the pieces just fall straight down (or maybe at a random angle) and aren't affected by anything except level geometry, which doesn't look as impressive as the glass actually being thrown around. Plus, I'm willing to bet the glass shards aren't procedurally generated. It's likely a pre-generated mesh map minus where the glass was hit.
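As a sketch of what that pre-generated approach could look like (purely illustrative, not Red Faction's actual code): pick one of a handful of authored crack patterns and translate it to the hit point. With enough patterns and enough shards, the repetition goes unnoticed.

```cpp
// Illustrative sketch of pre-generated shatter patterns: no procedural
// fracturing, just authored crack layouts aligned to the bullet hole.
#include <cstdint>
#include <utility>
#include <vector>

struct Vec2 { float x, y; };

struct ShatterPattern {
    // Shard outlines authored offline, stored relative to the impact point.
    std::vector<std::vector<Vec2>> shards;
};

struct GlassPane {
    std::vector<std::vector<Vec2>> activeShards; // shards now falling/simulated
};

void shatterGlass(GlassPane& pane, Vec2 hitPoint,
                  const std::vector<ShatterPattern>& patterns,
                  uint32_t seed) {
    // Deterministic "random" pick so the same shot reproduces the same crack.
    const ShatterPattern& p = patterns[seed % patterns.size()];
    for (const auto& shard : p.shards) {
        std::vector<Vec2> placed;
        placed.reserve(shard.size());
        for (Vec2 v : shard)
            placed.push_back({v.x + hitPoint.x, v.y + hitPoint.y}); // move to hit
        pane.activeShards.push_back(std::move(placed));
        // (A real implementation would also clip shards to the pane bounds
        //  and drop the shard containing the bullet hole.)
    }
}
```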

 

Quote

They do full-precision, super-sharp reflections of everything in everything. Not only is it stupidly demanding, it's not even realistic or good-looking. Windows don't reflect things like a mirror. Cars don't reflect things like a mirror. Most metals don't reflect like mirrors. Not even water does. And yet RTX does exactly that. It's a wasteful approach that doesn't even give good results. Crytek's RT, on the other hand, runs on any card, gives a good framerate, and actually looks more natural because of the "worse" precision of its reflections.

I'm going to sound mean but... have you actually been outside?

[Photos: mirror-like reflections in office-building glass, street puddles, and a polished car]


But it ran on a freaking 1 GHz single core from 18 years ago. Considering its age, it looks freaking amazing even today. And no, the glass wasn't pre-animated. Yes, it shattered from the point of the shot, but the glass on the ground was thrown around by explosions, and windows blew out depending on where the explosion happened. It looked, and still looks, amazing almost two decades later.
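For a sense of scale, the kind of shard physics described here is genuinely cheap. A hypothetical single-threaded sketch, nothing like Red Faction's real code but of the same computational order (a few hundred shards per frame is trivial even for a 1 GHz CPU):

```cpp
// Each shard is a point mass under gravity; an explosion just adds a radial
// impulse with distance falloff. Purely illustrative.
#include <cmath>
#include <vector>

struct Shard {
    float px, py, pz;  // position
    float vx, vy, vz;  // velocity
};

void applyExplosion(std::vector<Shard>& shards,
                    float ex, float ey, float ez,
                    float strength, float radius) {
    for (Shard& s : shards) {
        float dx = s.px - ex, dy = s.py - ey, dz = s.pz - ez;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (dist > radius || dist < 1e-4f) continue;
        float falloff = strength * (1.0f - dist / radius) / dist;
        s.vx += dx * falloff;  // push shard away from the blast center
        s.vy += dy * falloff;
        s.vz += dz * falloff;
    }
}

void step(std::vector<Shard>& shards, float dt, float floorY) {
    const float g = -9.81f;
    for (Shard& s : shards) {
        s.vy += g * dt;                     // gravity
        s.px += s.vx * dt;
        s.py += s.vy * dt;
        s.pz += s.vz * dt;
        if (s.py < floorY) {                // crude floor collision
            s.py = floorY;
            s.vy *= -0.2f;                  // tiny bounce, heavy damping
            s.vx *= 0.8f; s.vz *= 0.8f;
        }
    }
}
```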

 

[Embedded video: my ancient recording of Red Faction's glass physics]

My ancient recording is above. Crytek's solution is to RTX what Red Faction's glass was to PhysX glass. It's just a shame so few games use CryEngine, otherwise this style of ray tracing would be hugely popular. Unreal, on the other hand, is chasing the same overkill approach, and it won't take off for quite some time despite the engine's popularity.


1 hour ago, comander said:

Game looks great on my 2080. Doesn't support ultra-wide resolutions, which is eugh.
I did have to lower the resolution, though, and it's not as smooth as I'd like.

I didn't play for long, but my general takeaway is that we're probably 1-2 more generations of cards away from ray tracing being fully viable.

Or you could just play Doom (2016), which looks far better and runs at a native 21:9 200 fps on an RTX 2080...


I get about 45-50 fps on a 2080 Ti, and some serious input lag, with all the settings on high.


22 minutes ago, RejZoR said:

And no, the glass wasn't pre-animated.

I didn't say it was pre-animated. I said it used pre-generated mesh maps to apply the shatter pattern, i.e., the shards aren't procedurally generated, but there are enough patterns, and enough shards, that you don't really notice.

 

Quote

Yes, it shattered from the point of the shot, but the glass on the ground was thrown around by explosions.

At an arguably weak trajectory that also looks uniform, which isn't realistic.

 

Quote

And windows blew out depending on where the explosion happened. It looked, and still looks, amazing almost two decades later.

Meh, when seen next to glass shattering in a modern physics simulator, it kind of doesn't:

[Embedded video: glass destruction physics demo]

(and okay, a game that doesn't use PhysX)

[Embedded video: glass destruction in a non-PhysX engine]

I mean, it's impressive for the time, but compared to glass being simulated in today's physics engines? Nah.

 

EDIT: Also, I find using 2008's Mirror's Edge as an example of PhysX being useless to be kind of a weak argument, given it was a more or less "first-generation" PhysX game after NVIDIA ported PhysX to CUDA. A combination of Moore's Law and software improvements has (hopefully) spared later games the issues that plagued it.

 

Quote

My ancient recording is above. Crytek's solution is to RTX what Red Faction's glass was to PhysX glass. It's just a shame so few games use CryEngine, otherwise this style of ray tracing would be hugely popular. Unreal, on the other hand, is chasing the same overkill approach, and it won't take off for quite some time despite the engine's popularity.

And it'll take time before developers figure out the "best practices" for shiny new features. Take tessellation, for example: the first thing that showed it off prominently was Unigine's Heaven benchmark, and the amount it used was overkill and exaggerated in many places. And I would argue tessellation is an even more subtle effect than ray tracing when used more or less properly.
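One "best practice" that did emerge later is scaling tessellation by projected screen size instead of applying a fixed, maximal factor everywhere the way Heaven did. A hypothetical sketch of such a heuristic (not from any real engine):

```cpp
// Scale the tessellation factor by how big the patch is on screen. Patches
// covering few pixels get little or no subdivision; only large, close-up
// patches pay the full cost. "Overkill" would be returning maxFactor
// unconditionally. Requires C++17 for std::clamp.
#include <algorithm>
#include <cmath>

float tessellationFactor(float patchWorldSize, float distanceToCamera,
                         float screenHeightPx, float fovY, float maxFactor) {
    // Approximate projected size of the patch in pixels.
    float projected = (patchWorldSize /
                       (2.0f * distanceToCamera * std::tan(fovY * 0.5f))) *
                      screenHeightPx;
    // Aim for roughly one subdivision per N pixels of projected size.
    const float pixelsPerSubdivision = 16.0f;
    float f = projected / pixelsPerSubdivision;
    return std::clamp(f, 1.0f, maxFactor);
}
```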


2 hours ago, RejZoR said:

Or it could be done the way Crytek is doing it, with adaptive precision for reflections and a smooth framerate on a mid-range raster graphics card. But this has been the same old story with NVIDIA: overkill simulation of effects just because. Remember PhysX? It didn't really show anything that couldn't generally be done on a CPU, apart from fluids, and yet a decade has passed and hardware-accelerated physics still isn't a thing in games. In fact, we still get either the most basic physics, with ragdolls and a few pieces of debris, or super-precise PhysX hardware doing the same thing, but slower. I still can't get over the nonsense I saw in Mirror's Edge with shattered glass: a few pieces of glass bogged down a pretty capable system. The same few pieces of glass were simulated in much the same way on a single-core CPU back in 2001 in Red Faction, and it almost felt better. PhysX was unnecessarily precise where there was no need for it. RTX is doing the same thing all over again.

 

They do full-precision, super-sharp reflections of everything in everything. Not only is it stupidly demanding, it's not even realistic or good-looking. Windows don't reflect things like a mirror. Cars don't reflect things like a mirror. Most metals don't reflect like mirrors. Not even water does. And yet RTX does exactly that. It's a wasteful approach that doesn't even give good results. Crytek's RT, on the other hand, runs on any card, gives a good framerate, and actually looks more natural because of the "worse" precision of its reflections. I don't care if it's "faked". Faking the effect itself is one thing; faking the accuracy of a reflection that never needed to be a 100% 1:1 mirror of the world is another. Less precise but still realistic behaviour is what I want, not super-sharp reflections in everything that just look fake and overdone in the end.

A smooth framerate..? It was running at 1080p 30 FPS, lol, and that was in a tech demo, not a real game.

https://www.techspot.com/news/80004-performance-details-behind-crytek-rtx-free-neon-noir.html


Just now, System32.exe said:

A smooth framerate..? It was running at 1080p 30 FPS, lol, and that was in a tech demo, not a real game.

https://www.techspot.com/news/80004-performance-details-behind-crytek-rtx-free-neon-noir.html

One that we can't even get our hands on to verify anything (as far as I know).

 

At least with a lot of the RT stuff NVIDIA's been parading, we can run the same demos ourselves now.


7 minutes ago, System32.exe said:

A smooth framerate..? It was running at 1080p 30 FPS, lol

If framerates are consistent, 30 FPS is a smooth frame rate.
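For the arithmetic behind that: 30 FPS is 33.3 ms per frame, and perceived smoothness tracks frame-time variance at least as much as the average. A small illustrative C++ sketch with made-up numbers, not measurements from any real game:

```cpp
// "Smooth" is about frame-time consistency, not just the average. A locked
// 30 FPS is 33.3 ms every frame; an *average* of 30 FPS that alternates
// 16.6 ms / 50 ms feels far worse despite the identical mean.
#include <cmath>
#include <cstdio>
#include <vector>

void report(const char* label, const std::vector<double>& frameMs) {
    double sum = 0.0;
    for (double t : frameMs) sum += t;
    double mean = sum / frameMs.size();
    double var = 0.0;
    for (double t : frameMs) var += (t - mean) * (t - mean);
    double stddev = std::sqrt(var / frameMs.size());
    std::printf("%s: avg %.1f ms (%.1f FPS), frame-time stddev %.1f ms\n",
                label, mean, 1000.0 / mean, stddev);
}

int main() {
    std::vector<double> locked(60, 33.3);   // consistent 30 FPS
    std::vector<double> jittery;            // same average, uneven pacing
    for (int i = 0; i < 30; ++i) { jittery.push_back(16.6); jittery.push_back(50.0); }
    report("Locked 30 FPS  ", locked);   // stddev ~0 ms  -> perceived as smooth
    report("Jittery ~30 FPS", jittery);  // stddev ~17 ms -> visible stutter
}
```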


39 minutes ago, Drak3 said:

If framerates are consistent, 30 FPS is a smooth frame rate.

Nah, 30 FPS is pretty bad, even when consistent. It's the bare minimum.

 

Regardless, 1080p 30 FPS (in a controlled environment, too) is terrible compared to what the RTX cards can do with RT in real games. I easily get over double that at 1440p with my 2080 in all three DXR games. The 2060, a card close to the Vega 56's price bracket, trounces the V56 at 1080p with RT, getting over double the FPS.


 

9 hours ago, RejZoR said:

LMAO. RTX is DXR, just neatly packaged under their own name so it can be pitched as an exclusive feature locked to NVIDIA cards. There is no special magic there, other than that they stuffed dedicated ray tracing hardware into the GPU (which conforms to the DirectX ray tracing feature set, aka DXR). Also, what push? 3 games, lmao. They can "announce" 5,000 and it matters dick if all we get is 3 games so far. And a tech demo based on a two-decade-old game. So, 4.

 

Can you please stop posting? You clearly don't know what you are talking about; RTX is not the same as DXR. That's why this whole thing is called RTX and not just RT.

8 hours ago, Princess Luna said:

This has to be one of the most idiotic things nVidia has done this year... and that's saying a lot when it's nVidia.

 

Yes, the game only cares about RT cores; the TITAN V tanks just as much as the 1080 Ti does. No, a game from 1997 using DXR is not a way to promote the RTX 2080 over the GTX 1080 Ti and Radeon VII.

 

 

Of course it only cares about RT cores; it was expressly created to showcase the RTX implementation on NVIDIA cards.

 

Honestly, do you people cry "marketing stunt" every time you don't understand a product?


9 hours ago, Mira Yurizaki said:

 

While NVIDIA lit the fire, I think the PS5 will add fuel to the fire, if what Cerny said about it having "ray tracing support" meant anything.

 

Though then again, Epic did recently add RT capabilities to Unreal Engine such that all the developer seemingly has to do is flip a switch.

Who's Cerny?


5 minutes ago, Mira Yurizaki said:

Mark Cerny, the PlayStation system architect.

Well, it would be pretty stupid to tell everyone your product has a feature it doesn't. But like all things of this nature, how well it does it is yet to be realised.


This looks awful and way too overdramatic.

 

I don't think I've ever seen a better example of how to ruin a classic game.


5 hours ago, mr moose said:

 

 

Can you please stop posting? You clearly don't know what you are talking about; RTX is not the same as DXR. That's why this whole thing is called RTX and not just RT.

 

Of course it only cares about RT cores; it was expressly created to showcase the RTX implementation on NVIDIA cards.

 

Honestly, do you people cry "marketing stunt" every time you don't understand a product?

LMAO. It's the same shit. Just because they pack it into their framework under their fancy name doesn't mean it's an entirely different, super special thing.


43 minutes ago, RejZoR said:

LMAO. It's the same shit. Just because they pack it into their framework under their fancy name doesn't mean it's an entirely different, super special thing.

https://pcper.com/2018/03/nvidia-rtx-technology-accelerates-ray-tracing-for-microsoft-directx-raytracing-api/

 

Quote

 

Alongside support and verbal commitment to DXR, NVIDIA is announcing RTX Technology. This is a combination of hardware and software advances to improve the performance of ray tracing algorithms on its hardware and it works hand in hand with DXR.

 

 

It is more than just DXR; it is hardware specifically utilised on top of DXR. You can use DXR on just about anything fast enough, but GameWorks RTX acceleration cannot be.
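The practical upshot of that split: a game targets the vendor-neutral DXR API and simply asks the device whether it exposes a ray tracing tier; whether that tier is backed by dedicated RT cores (RTX) or a slower fallback path is the driver's business. A minimal sketch using the standard D3D12 feature query (assumes an already-created ID3D12Device and a Windows 10 SDK recent enough to define OPTIONS5):

```cpp
// Vendor-neutral DXR support check: the game never asks "is this RTX?",
// only "does this device expose a ray tracing tier?".
#include <windows.h>
#include <d3d12.h>

bool supportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```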

 

 

 

EDIT: here's some more:

 

http://cgicoffee.com/blog/2018/03/what-is-nvidia-rtx-directx-dxr
 

Quote


Ray tracing and NVIDIA RTX, which without any explanation can be quite confusing, seeing how NVIDIA heavily focuses on the native hardware-accelerated tech which RTX is, whilst Microsoft stresses that DirectX DXR is an extension of an existing DX tool-set and compatible with all future certified DX12-capable graphics cards

 

 

 

 


It is more than just DXR; it is hardware specifically utilised on top of DXR. You can use DXR on just about anything fast enough, but GameWorks RTX acceleration cannot be.

 

How is DXR named RTX and locked to NVIDIA any different from DXR, lol? Hint: it's not. The principles are the same; they just stuck it in their framework, which is allegedly "easier" to implement in games by dropping their SDK in instead of coding it all yourself. Like every single other proprietary thing NVIDIA has done in the last two decades.


3 minutes ago, RejZoR said:

It is more than just DXR; it is hardware specifically utilised on top of DXR. You can use DXR on just about anything fast enough, but GameWorks RTX acceleration cannot be.

 

How is DXR named RTX and locked to NVIDIA any different from DXR, lol? Hint: it's not. The principles are the same; they just stuck it in their framework, which is allegedly "easier" to implement in games by dropping their SDK in instead of coding it all yourself. Like every single other proprietary thing NVIDIA has done in the last two decades.

 

How is Unix named OSX and locked to Apple any different from Unix, lol? Please...

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, mr moose said:

 

How is Unix named OSX and locked to Apple any different from Unix, lol? Please...

How is Unix (DXR) named OSX (RTX) and locked to Apple (NVIDIA) any different from Unix (DXR), lol? Please...

 

Did you literally just confirm my point? LMAO

