
Battlefield V with DXR on, tested by TechSpot (Hardware Unboxed)

kiska3
3 hours ago, pas008 said:

visuals are performance

 

you play at 480p or a higher resolution? ok then why do you play at higher resolutions? oh visuals hmm

lol smh

Game performance is measured in frames per second. When that is severely decreased for gimmicks that no one cared about until Nvidia said you should care, that's taking a performance hit.

 

A Core 2 Duo can play games at 1080p. If that's done at 15 FPS then that is shit performance and a sacrifice. It's a pretty straightforward concept, but I guess as long as you believe your own nonsensical excuses, that's all that really matters. Carry on.

 

If your next excuse is that it's being done at 4K resolution, then I will direct you to Linus's recent video on whether average people can actually tell the difference between higher resolutions at the same panel size. Spoiler: you can't. I've argued this before here while other members with just as little understanding of basic fundamentals as you claimed to have superhuman senses that could.

 

It never gets old seeing fanboys attempt to justify shitty console-level frame rates because "omg ray tranccccccccccccccccing durrrrrrrr"…..

 

"lol smh".

 

 

What do Windows 10 and E.T. have in common?

 

They are both constantly trying to phone home.


14 hours ago, leadeater said:

Well, Volta was the start of Tensor development; Turing is just reusing that along with the front-end improvements. RT cores, I think, are an evolution/repurposing of Tensor cores, because of how restrictive the utilization of the Tensor cores is.

I believe Nvidia is investing in multi-chip designs now. A few publications, as well as AdoredTV, reported on this. Coincidentally, spending jumped after this was published.

6 hours ago, Brooksie359 said:

Yeah, why would people play with shadows on at all, or at high settings, if they would rather not sacrifice performance for visuals? If all anyone cared about was performance, everyone would just run at the lowest visual settings.

I think reasonably sacrificing performance is the part you're not grasping. I'm perfectly happy to jump from low textures to ultra because it costs a margin-of-error performance decrease while bumping object detail from something like PS3 quality to far beyond what today's consoles can achieve, or to go from shadows off to high because that's the difference between a game looking like arse and one that looks great, at a cost of something like 20% performance. What I'm not OK with is something that looks decently better than traditional methods but at a 60% extra performance hit. Games today look great, and with current hardware there is no justifiable reason to take such a hit for what is a modest visual improvement. Plus, resolution is another element of visuals that needs to be considered.
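
For a sense of what those percentages mean in frame time, here's a quick back-of-the-envelope sketch (illustrative only; the 120 FPS baseline is an assumed number, not a figure from the review):

```python
def apply_perf_hit(base_fps, hit_pct):
    """Return (new_fps, extra_ms_per_frame) after losing hit_pct percent of FPS."""
    new_fps = base_fps * (1 - hit_pct / 100)
    extra_ms = 1000 / new_fps - 1000 / base_fps
    return new_fps, extra_ms

# Assumed 120 FPS baseline, purely for illustration.
print(apply_perf_hit(120, 20))  # (96.0, ~2.1 ms extra per frame)
print(apply_perf_hit(120, 60))  # (48.0, ~12.5 ms extra per frame)
```

A 20% hit costs about 2 ms per frame at that baseline; a 60% hit costs over 12 ms, which is why it reads as a different class of sacrifice.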

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


5 hours ago, Hellion said:

Game performance is measured in frames per second. When that is severely decreased for gimmicks that no one cared about until Nvidia said you should care, that's taking a performance hit.

 

 

A Core 2 Duo can play games at 1080p. If that's done at 15 FPS then that is shit performance and a sacrifice. It's a pretty straightforward concept, but I guess as long as you believe your own nonsensical excuses, that's all that really matters. Carry on.

 

 

If your next excuse is that it's being done at 4K resolution, then I will direct you to Linus's recent video on whether average people can actually tell the difference between higher resolutions at the same panel size. Spoiler: you can't. I've argued this before here while other members with just as little understanding of basic fundamentals as you claimed to have superhuman senses that could.

 

 

It never gets old seeing fanboys attempt to justify shitty console-level frame rates because "omg ray tranccccccccccccccccing durrrrrrrr"…..

 

"lol smh".

 

 

lol so you play at 480p?

and ray tracing does give a very noticeable visual difference if you aren't blind, especially in reflections of light, which is the first step of ray tracing they are taking right now

 

and fps is part of the visuals, like everyone said

 

game performance is measured in fps when comparing at the same visual settings, fyi

smh

hence why reviewers state a resolution at very high settings with no AA, or this AA or that AA

wow

you are lost

visuals are performance plain and simple

 

and lol at the underlined part

guess there is no reason for 4k to exist then?


7 minutes ago, pas008 said:

guess there is no reason for 4k to exist then?

Honestly not really, 1440p and 1600p (16:10) are about as high as you realistically need to go. Once you stop chasing 4K and focus on actual visual improvement, not pixel density, games will actually start to look different. Original Doom won't look any better at 4K.


8 minutes ago, leadeater said:

Honestly not really, 1440p and 1600p (16:10) are about as high as you realistically need to go. Once you stop chasing 4K and focus on actual visual improvement, not pixel density, games will actually start to look different. Original Doom won't look any better at 4K.

hmm

ppi is very important too

like 42-inch 1080p vs 1440p vs 4K is noticeable

you can notice at 34 inches also

27 is tough

viewing distance comes into play also


10 minutes ago, leadeater said:

Honestly not really, 1440p and 1600p (16:10) are about as high as you realistically need to go. Once you stop chasing 4K and focus on actual visual improvement, not pixel density, games will actually start to look different. Original Doom won't look any better at 4K.

and here I am, perfectly happy with my 1080p. I have no plans to go up in resolution; if anything I'll just get a higher refresh rate monitor. I see no issues with 1080p.

🌲🌲🌲

 

 

 

◒ ◒ 


28 minutes ago, pas008 said:

hmm

ppi is very important too

like 42-inch 1080p vs 1440p vs 4K is noticeable

you can notice at 34 inches also

27 is tough

viewing distance comes into play also

At the 30-inch mark 4K doesn't actually do much unless you're far too close to the screen to see the edges. The benefit is much smaller than what you get from visual improvements in other areas, like higher-detail game assets.

 

There's a difference between looking at a static image trying to spot the improvement of 4K vs 1440p and actual gameplay with movement: once images start moving, our ability to discern detail drops, so most side-by-side comparisons are rather flawed because of that.

 

Edit:

Not once while watching a movie on my 1080p projector have I wanted or needed to upgrade to 4K. It's a Panasonic AE7000 projecting a 96-inch image, and it's by far a better experience than a large 4K monitor/TV, even OLED.
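
As a rough back-of-the-envelope check on the pixel-density argument (my numbers, not from the thread; 20/20 acuity is commonly approximated as about 60 pixels per degree of visual angle):

```python
import math

def pixels_per_degree(h_res, v_res, diagonal_in, distance_in):
    """Pixels per degree of visual angle for a panel viewed at a given distance."""
    ppi = math.hypot(h_res, v_res) / diagonal_in              # pixel density
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# 27" panels viewed from an assumed 30 inches.
print(round(pixels_per_degree(2560, 1440, 27, 30)))  # ~57 ppd at 1440p
print(round(pixels_per_degree(3840, 2160, 27, 30)))  # ~85 ppd at 4K
```

On that estimate a 27-inch 1440p panel at 30 inches sits just under the ~60 ppd acuity figure and 4K well above it, which roughly lines up with the "27 is tough" comment above.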


35 minutes ago, leadeater said:

At the 30-inch mark 4K doesn't actually do much unless you're far too close to the screen to see the edges. The benefit is much smaller than what you get from visual improvements in other areas, like higher-detail game assets.

 

There's a difference between looking at a static image trying to spot the improvement of 4K vs 1440p and actual gameplay with movement: once images start moving, our ability to discern detail drops, so most side-by-side comparisons are rather flawed because of that.

I notice the difference easily on work-related stuff, browser games, and videos at work, lol; I'm becoming a monitor whore there too

 

home office I am 1080p surround user with 2 1080p monitors so no comparison there and i'm exactly 24 inches from the center monitor

everyone bitches about how I can play with those bezels though

I don't see them after a while, but I will notice gameplay detail, so I can see some of your point

 

but not everyone sees things the same; it's like the 'how many fps can your eyes see' argument

I know I have huge peripheral vision, hence my wanting surround/Eyefinity/SoftTH/TH2G-type setups

 

 

 

 


1 hour ago, pas008 said:

home office I am 1080p surround user with 2 1080p monitors so no comparison there and i'm exactly 24 inches from the center monitor

2 is kind of a bad setup if you're playing games where the center image matters a lot; 3 monitors is an excellent setup, though, because it doesn't disrupt that center line. I've done dual monitors before and most often only used 1 while gaming.

 

1 hour ago, pas008 said:

I notice the difference easily on work-related stuff, browser games, and videos at work, lol; I'm becoming a monitor whore there too

Both work-related stuff and browser games come under that static-image profile where you're much more likely to notice the difference. Watching a video, the difference is negligible unless you're specifically trying to find it; if you specifically try, sure, most people can, but it's nothing like 480p to 1080p.

 

On two equivalent-quality panels, one at 1440p and one at 4K, playing anything with full-screen image movement (i.e. not a browser game), you're just not going to notice while actually paying attention to playing the game. It's a simple cognitive limitation; it doesn't matter how good one's eyesight is, we have limits.

 

The above applies in real life to what we see, not just through a monitor: turning your head or moving at speed, you lose the ability to discern as much detail as when you are not moving.

 

That's why improving realism or actual graphical quality is, in my opinion, a better spend of resources than rendering at 4K rather than 1440p with the exact same game assets and effects; it's wasted computation for barely any noticeable improvement. A bad-quality photo of something still looks like what it's supposed to be; a good render of something still looks fake.


48 minutes ago, leadeater said:

2 is kind of a bad setup if you're playing games where the center image matters a lot; 3 monitors is an excellent setup, though, because it doesn't disrupt that center line. I've done dual monitors before and most often only used 1 while gaming.

 

Both work-related stuff and browser games come under that static-image profile where you're much more likely to notice the difference. Watching a video, the difference is negligible unless you're specifically trying to find it; if you specifically try, sure, most people can, but it's nothing like 480p to 1080p.

 

On two equivalent-quality panels, one at 1440p and one at 4K, playing anything with full-screen image movement (i.e. not a browser game), you're just not going to notice while actually paying attention to playing the game. It's a simple cognitive limitation; it doesn't matter how good one's eyesight is, we have limits.

 

The above applies in real life to what we see, not just through a monitor: turning your head or moving at speed, you lose the ability to discern as much detail as when you are not moving.

 

That's why improving realism or actual graphical quality is, in my opinion, a better spend of resources than rendering at 4K rather than 1440p with the exact same game assets and effects; it's wasted computation for barely any noticeable improvement. A bad-quality photo of something still looks like what it's supposed to be; a good render of something still looks fake.

With 2 extra monitors sry forgot a word sry

 

So 5 total

 

CAD work, 3D-modeling-wise, I can't say is static at all

there is a huge difference

 

 

And i like this article

www.techspot.com/amp/article/1113-4k-monitor-see-difference/

 

 

 

 

 

 


On 11/15/2018 at 10:11 AM, VegetableStu said:

people don't get this though ._. they see 60 fps evaporating and people go OMG RAYTRACING SUX

That happens because of nVidia's marketing.

When you hype Ray Tracing like it's the Second Coming or something along those lines, people will expect something good. 

Reducing your performance to 1/3 to enable these effects is not good in the eyes of the consumer... that little number will make most people disregard the fact that the 2080 Ti is a good chunk faster than the 1080 Ti and really, really good at 4K.

 

Had they reversed the marketing technique, emphasizing classic performance, followed by the concept of RT, people would've responded better. Oh and the whole "Dying 2080Tis" isn't helping either.

 

 

MARS_PROJECT V2 --- RYZEN RIG

Spoiler

 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 


1 hour ago, pas008 said:

And i like this article

www.techspot.com/amp/article/1113-4k-monitor-see-difference/

Like all such comparisons, this doesn't take into account when the image is moving and your brain can't actually process the information. That's why eyewitness accounts are so unreliable; our brain lies to us all the time, and what we see is often what our brain created, not what is actually there. There are some really interesting experiments you can do to demonstrate just how bad our brains are, like totally missing a man in an ape suit walking past.

 

Then there is another interesting one that salespeople use: if you're told something looks or sounds better, then it will. When a salesperson says something has more bass than another, cheaper product, even if it doesn't, it'll sound like it does. If you do a 'blind test' on two monitors and you believe one is 4K and the other is less, you will then believe that the supposed 4K monitor looks better when it might actually be exactly the same monitor. If you believe something is better you'll perceive it as better; our brains suck sometimes.

 

Edit: Here's a simple experiment you can do: get an image of a sign with text on it. Look at the image, then drag it around your screen. While it's moving, does it look as sharp as when it's not moving? Make sure you move the image slowly enough that your monitor doesn't start ghosting, though.

 

1 hour ago, pas008 said:

CAD work, 3D-modeling-wise, I can't say is static at all

there is a huge difference

CAD design involves way less movement than, say, an FPS game; you're focused on the part of the screen you are working on, not the screen image as a whole. Our cognitive limits here are well documented and understood, just like we know how superior an eagle is in this regard compared to us. It's not an eyesight issue.


6 minutes ago, leadeater said:

Like all such comparisons, this doesn't take into account when the image is moving and your brain can't actually process the information. That's why eyewitness accounts are so unreliable; our brain lies to us all the time, and what we see is often what our brain created, not what is actually there. There are some really interesting experiments you can do to demonstrate just how bad our brains are, like totally missing a man in an ape suit walking past.

 

Then there is another interesting one that salespeople use: if you're told something looks or sounds better, then it will. When a salesperson says something has more bass than another, cheaper product, even if it doesn't, it'll sound like it does. If you do a blind test on two monitors and you believe one is 4K and the other is less, you will then believe that the supposed 4K monitor looks better when it might actually be exactly the same monitor. If you believe something is better you'll perceive it as better; our brains suck sometimes.

 

CAD design involves way less movement than, say, an FPS game; you're focused on the part of the screen you are working on, not the screen image as a whole. Our cognitive limits here are well documented and understood, just like we know how superior an eagle is in this regard compared to us.

 

it's about noticing it when you are looking at it. if you can notice just a small hint numerous times, that means you noticed something

you can state eyewitness stuff and how bad or easily tricked our brain is, I know this

same with our ears, are you going to tell me what I hear now too of my unique hearing?

 

plz head over to the display forum and tell them they aren't seeing anything with their eyes because you say so

side by side comparison on my work screens I notice the hell out of them

and movement too, sports, movies, games even being browser games

 

find this funny its like people telling me what fps I see

or what frequencies I hear or prefer

 

even after I link an article

oh wait, that is on a static image; no, if I keep noticing changes then I notice a change, which is enough

just like resolution in audio, it's there in visuals

 

 


14 hours ago, CarlBar said:

Ray tracing has no relation to paraxial whatsoever. I suggest watching this: 

I'm sorry, you seem to have misunderstood. I mentioned paraxial optics because it is a branch of optics that can formulate ray paths as pure matrix products. Apparently it is referred to differently in English (my optics professor used archaic naming; transfer matrices are the more common term: https://en.wikipedia.org/wiki/Ray_transfer_matrix_analysis).

 

 

This was in support of my question about what the differences between RT and Tensor cores are.
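
For readers unfamiliar with the formalism: in ray transfer matrix (paraxial) analysis a ray is a column vector of height and angle, and each optical element is a 2×2 matrix, so a ray path is literally a matrix product. The matrices below are the standard free-space and thin-lens examples from the linked article, shown only to illustrate that point:

```latex
\begin{pmatrix} x_2 \\ \theta_2 \end{pmatrix}
=
\underbrace{\begin{pmatrix} 1 & 0 \\ -1/f & 1 \end{pmatrix}}_{\text{thin lens, focal length } f}
\underbrace{\begin{pmatrix} 1 & d \\ 0 & 1 \end{pmatrix}}_{\text{free space, distance } d}
\begin{pmatrix} x_1 \\ \theta_1 \end{pmatrix}
```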


15 hours ago, Brooksie359 said:

You may call the 2080 Ti overpriced but it isn't a piece of crap. It is objectively the fastest consumer GPU on the market for gaming and can put out higher FPS than any other card on the market at 4K by a significant margin. As a tech enthusiast I will 100% sacrifice performance for ray tracing.

I am sick of hearing this argument repeated, "the fastest GPU". So I guess everyone was rocking a Titan until now? And if we apply the same argument to CPUs, everyone was pairing theirs with a 7980XE?

When did price vs performance become irrelevant?

As for the other argument, I guess we should just start the old "the human eye can't see more than 60fps anyway" debate again

.


1 hour ago, pas008 said:

 

it's about noticing it when you are looking at it. if you can notice just a small hint numerous times, that means you noticed something

you can state eyewitness stuff and how bad or easily tricked our brain is, I know this

same with our ears, are you going to tell me what I hear now too of my unique hearing?

 

plz head over to the display forum and tell them they aren't seeing anything with their eyes because you say so

side by side comparison on my work screens I notice the hell out of them

and movement too, sports, movies, games even being browser games

 

find this funny its like people telling me what fps I see

or what frequencies I hear or prefer

 

even after I link an article

oh wait, that is on a static image; no, if I keep noticing changes then I notice a change, which is enough

just like resolution in audio, it's there in visuals

 

 

Have you actually tried what I said to do? Actually move the image around the screen and see it become less clear. Or focus on a word on the screen then move your head around.

 

The article is not wrong; the math and information are true, but for static images only.

 

1 hour ago, pas008 said:

side by side comparison on my work screens I notice the hell out of them

This can be deeply flawed if you're not comparing equivalent quality panels with equivalent input processors on the monitor. That's exactly why a $100 1080p screen looks and performs worse than a $400 1080p screen.

 

1 hour ago, pas008 said:

oh wait, that is on a static image; no, if I keep noticing changes then I notice a change, which is enough

But do you notice while playing a full-screen game with lots of movement? Situation matters, you know; we are talking about games here, not CAD design, looking at photos, or reading text.

 

At no point did I say it's not possible to notice the difference; the point is that you will not while playing and concentrating on the game.

 

Edit:

Plus you're missing the point I actually made: 4K is a complete waste of resources for improving visual quality compared to the other options we have. 4K doesn't actually make the game look better; like I said, original Doom at 4K still looks like garbage.


1 hour ago, leadeater said:

Like all such comparisons, this doesn't take into account when the image is moving and your brain can't actually process the information. That's why eyewitness accounts are so unreliable; our brain lies to us all the time, and what we see is often what our brain created, not what is actually there. There are some really interesting experiments you can do to demonstrate just how bad our brains are, like totally missing a man in an ape suit walking past.

 

Then there is another interesting one that salespeople use: if you're told something looks or sounds better, then it will. When a salesperson says something has more bass than another, cheaper product, even if it doesn't, it'll sound like it does. If you do a 'blind test' on two monitors and you believe one is 4K and the other is less, you will then believe that the supposed 4K monitor looks better when it might actually be exactly the same monitor. If you believe something is better you'll perceive it as better; our brains suck sometimes.

 

Edit: Here's a simple experiment you can do: get an image of a sign with text on it. Look at the image, then drag it around your screen. While it's moving, does it look as sharp as when it's not moving? Make sure you move the image slowly enough that your monitor doesn't start ghosting, though.

 

CAD design involves way less movement than, say, an FPS game; you're focused on the part of the screen you are working on, not the screen image as a whole. Our cognitive limits here are well documented and understood, just like we know how superior an eagle is in this regard compared to us. It's not an eyesight issue.

Like no... almost. People "say it looks better" because you force them to (indirectly/socially), not because it *does* look better to them.

 

Our brains do very well; however, small defects or improvements are noticed. It comes down to individual visual ability (vision/eye health/the brain's natural or trained responses) as to what people like and, to some degree, can actually see.

 

However, some improvements and "tricks" are very much real, as are some defects, in monitor type or technology. So for 4K, there is some difference in colour/clarity. Some of this might be from removing some JPEG/MPEG artifacting, or from smoother motion (effectively sub-pixel, even if not visible at the sitting distance, and even at lower FPS).

 

But I agree, it is diminishing returns; at 8K+, no, the human eye cannot see the difference (with the exception of edge cases, say a 99.999% gradient, or a super-sharp text edge where line wobble removes the strobe effects lower resolutions get).


11 minutes ago, TechyBen said:

Like no... almost. People "say it looks better" because you force them to (indirectly/socially), not because it *does* look better to them.

 

Our brains do very well; however, small defects or improvements are noticed. It comes down to individual visual ability (vision/eye health/the brain's natural or trained responses) as to what people like and, to some degree, can actually see.

 

However, some improvements and "tricks" are very much real, as are some defects, in monitor type or technology. So for 4K, there is some difference in colour/clarity. Some of this might be from removing some JPEG/MPEG artifacting, or from smoother motion (effectively sub-pixel, even if not visible at the sitting distance, and even at lower FPS).

 

But I agree, it is diminishing returns; at 8K+, no, the human eye cannot see the difference (with the exception of edge cases, say a 99.999% gradient, or a super-sharp text edge where line wobble removes the strobe effects lower resolutions get).

Well yes, however, like I said, my entire point is specific to moving images, i.e. playing a game. When you stop concentrating on the game to specifically look for a visual difference between the resolutions, your ability to do so vastly improves. If, however, you are concentrating on the game, your ability to notice the difference decreases, a lot.

 

Our brains do very well but are easily overloaded; in that situation anyone who says they can significantly notice the difference between 1440p and 4K is most likely a victim of the placebo effect: I expect a difference, so I perceive a difference.


1 hour ago, daimonie said:

This was in support of my question about what the differences between RT and Tensor cores are.

I'm not sure anyone outside of Nvidia actually knows; I haven't seen any detailed white papers on RT cores.

 

Here is some information I mentioned about how you need to use the Tensor cores.

[image: excerpt from the Volta architecture whitepaper on Tensor Core usage]

http://images.nvidia.com/content/volta-architecture/pdf/volta-architecture-whitepaper.pdf
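
For reference, the operation the Volta whitepaper describes (paraphrased here, not quoted) is a fused multiply-add on small matrices, roughly:

```latex
D = A \times B + C,\qquad A,\,B:\ 4\times4\ \text{FP16},\qquad C,\,D:\ 4\times4\ \text{FP16 or FP32}
```

Work has to be expressed as these small matrix multiply-accumulates to benefit, which is presumably the restrictive utilization being referred to above.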

 

Edit:

@daimonie Scratch that, looks like Nvidia does have a decent enough white paper.

 

Quote

At the heart of Turing’s hardware-based ray tracing acceleration is the new RT Core included in each SM. RT Cores accelerate Bounding Volume Hierarchy (BVH) traversal and ray/triangle intersection testing (ray casting) functions. (See Appendix D Ray Tracing Overview on page 68 for more details on how BVH acceleration structures work). RT Cores perform visibility testing on behalf of threads running in the SM.

 

RT Cores traverse the BVH autonomously, and by accelerating traversal and ray/triangle intersection tests, they offload the SM, allowing it to handle other vertex, pixel, and compute shading work. Functions such as BVH building and refitting are handled by the driver, and ray generation and shading is managed by the application through new types of shaders.

 

Essentially, the process of BVH traversal would need to be performed by shader operations and take thousands of instruction slots per ray cast to test against bounding box intersections in the BVH until finally hitting a triangle and the color at the point of intersection contributes to final pixel color (or if no triangle is hit, background color may be used to shade a pixel).

 

Ray tracing without hardware acceleration requires thousands of software instruction slots per ray to test successively smaller bounding boxes in the BVH structure until possibly hitting a triangle. It’s a computationally intensive process making it impossible to do on GPUs in real-time without hardware-based ray tracing acceleration (see Figure 19).

 

The RT Cores in Turing can process all the BVH traversal and ray-triangle intersection testing, saving the SM from spending the thousands of instruction slots per ray, which could be an enormous amount of instructions for an entire scene. The RT Core includes two specialized units. The first unit does bounding box tests, and the second unit does ray-triangle intersection tests. The SM only has to launch a ray probe, and the RT core does the BVH traversal and ray-triangle tests, and return a hit or no hit to the SM.

https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/technologies/turing-architecture/NVIDIA-Turing-Architecture-Whitepaper.pdf
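
To make the quoted description concrete, here is a minimal software sketch of BVH traversal plus a ray/triangle test, i.e. the work the whitepaper says the RT core replaces with fixed-function hardware (illustrative only; the structure and function names are mine, not Nvidia's):

```python
from dataclasses import dataclass

@dataclass
class Node:
    lo: tuple                 # axis-aligned bounding box, min corner
    hi: tuple                 # axis-aligned bounding box, max corner
    left: "Node" = None
    right: "Node" = None
    tri: tuple = None         # (v0, v1, v2) if this node is a leaf

def hit_aabb(orig, inv_dir, lo, hi):
    """Slab test: does the ray intersect the bounding box?"""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        t1 = (lo[a] - orig[a]) * inv_dir[a]
        t2 = (hi[a] - orig[a]) * inv_dir[a]
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

def hit_triangle(orig, d, v0, v1, v2, eps=1e-7):
    """Moeller-Trumbore ray/triangle intersection; returns hit distance or None."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(d, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None
    t_vec = sub(orig, v0)
    u = dot(t_vec, p) / det
    if not 0.0 <= u <= 1.0:
        return None
    q = cross(t_vec, e1)
    v = dot(d, q) / det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) / det
    return t if t > eps else None

def trace(node, orig, d):
    """Recursive BVH traversal: box tests until a leaf, then a triangle test."""
    inv_dir = tuple((1.0 / c) if c != 0 else float("inf") for c in d)
    if node is None or not hit_aabb(orig, inv_dir, node.lo, node.hi):
        return None
    if node.tri is not None:
        return hit_triangle(orig, d, *node.tri)
    hits = [t for t in (trace(node.left, orig, d), trace(node.right, orig, d)) if t is not None]
    return min(hits) if hits else None
```

Every one of those box and triangle tests would otherwise consume shader instruction slots per ray; with Turing the SM just launches the ray probe and the RT core returns the hit result.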


There's a reason ray tracing hasn't been used: it's quite simply not worth it. The extra visual quality in reflections is almost meaningless, and anyone who plays games seriously will be keeping settings such as reflections, shadows, etc. on low if not the lowest detail settings. Nvidia is simply cashing in on a gimmick. (At least for now; the technology to power ray tracing will not be affordable in the near future if the 2080 Ti with its specifications cannot handle 60 FPS at 1080p now.)

i7-7700k | GTX 1080 Ti | 32GB 3100MHz | Prime Z270-AR

 


1 hour ago, qqqqqq said:

There's a reason ray tracing hasn't been used: it's quite simply not worth it. The extra visual quality in reflections is almost meaningless, and anyone who plays games seriously will be keeping settings such as reflections, shadows, etc. on low if not the lowest detail settings. Nvidia is simply cashing in on a gimmick. (At least for now; the technology to power ray tracing will not be affordable in the near future if the 2080 Ti with its specifications cannot handle 60 FPS at 1080p now.)

I would phrase that quite differently:

  • It has not been used because it was too expensive computationally
  • The extra visual quality was worth it, but not for real time gaming.  
  • Anyone who plays competitive games will keep detail settings low.
  • NVidia reached the point where 60 FPS is doable, and wants to get some of its R&D costs back
  • The AI features of the cards (DLSS etc) aren't used yet, but seem promising
  • Those same features are worth it in some contexts. I've used GPUs in academic research for RT calculations, e.g. for a high-frequency tricolour fluorescent microscope. Tensor cores would've been awesome.

All of that seems fine to me. The RTX 2070 was competitive in pricing with the 1080, so I bought it.

 

Edit: Regarding Battlefield, it seems their implementation is quite bad. From what I can gather, their "enable it everywhere" approach gave too big a performance drop, so they filter out RTX features to control performance. That gives weird artefacts in addition to being inconsistent.


40 minutes ago, daimonie said:

Edit: Regarding Battlefield, it seems their implementation is quite bad. From what I can gather, their "enable it everywhere" approach gave too big a performance drop, so they filter out RTX features to control performance. That gives weird artefacts in addition to being inconsistent.

the point of RTX is less work for the coders; if you have to go to every eye, tree, piece of grass, pool of water, bridge, window... and say this one has RTX, this one doesn't, then it would be worse than doing it the old-fashioned way. I think that when you enable it, it works for everything; any other way it really makes no sense.

.


Just now, asus killer said:

the point of RTX is less work for the coders; if you have to go to every eye, tree, piece of grass, pool of water, bridge, window... and say this one has RTX, this one doesn't, then it would be worse than doing it the old-fashioned way. I think that when you enable it, it works for everything; any other way it really makes no sense.

Seems like it. But then, because BF enabled it later in their process, they had all those entities in place already. So things that shouldn't reflect, e.g. tree bark, were reflecting when they turned it all on (what I called "enable it everywhere"). It's not an RTX problem, just that they developed without RTX and enabled it later. Clearly you don't want to go refactor everything, so they tried to fix it with some kind of filtering.
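
As a purely illustrative sketch of the kind of filtering being described (not DICE's actual code; the material fields and the threshold value are made up), the idea is to spend rays only on surfaces that are flagged as reflective and smooth enough to show a mirror-like reflection, and fall back to a cheaper technique elsewhere:

```python
from dataclasses import dataclass

@dataclass
class Material:
    roughness: float      # 0.0 = mirror-like, 1.0 = fully diffuse
    reflective: bool      # authored flag: should this surface reflect at all?

# Hypothetical cutoff; a real engine would tune this per scene/quality preset.
RT_ROUGHNESS_CUTOFF = 0.3

def reflection_path(mat):
    """Choose how reflections are rendered for one surface."""
    if not mat.reflective:
        return "none"
    if mat.roughness <= RT_ROUGHNESS_CUTOFF:
        return "ray_traced"       # expensive rays only where they are clearly visible
    return "screen_space"         # cheap fallback for rough surfaces

print(reflection_path(Material(roughness=0.05, reflective=True)))  # ray_traced
print(reflection_path(Material(roughness=0.80, reflective=True)))  # screen_space
```

Surfaces that sit right at a boundary like this switch rendering paths, which is one plausible source of the inconsistency and artefacts mentioned above.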

