Turkish website finally tests 2080 Ti vs 1080 Ti in 4K with benchmarks

asim1999

Hmm, it's starting to look like this "Ray Tracing" stuff isn't some extremely fancy new cores so much as brute force: if you get 60 fps average at 4K you can probably get 120 average at 1080p, then cut that to a third to land at 40 fps and boom: ray tracing in real time!

 

The real story is that the 2080 will basically be really close to the 1080 Ti, so my pre-emptive advice would be to snatch a 1080 Ti while you can for less than a 2080 and just wait for the 3080 in two years to jump into ray tracing.

-------

Current Rig

-------

1 minute ago, Misanthrope said:

Hmm, it's starting to look like this "Ray Tracing" stuff isn't some extremely fancy new cores so much as brute force: if you get 60 fps average at 4K you can probably get 120 average at 1080p, then cut that to a third to land at 40 fps and boom: ray tracing in real time!

 

The real story is that the 2080 will basically be really close to the 1080 Ti, so my pre-emptive advice would be to snatch a 1080 Ti while you can for less than a 2080 and just wait for the 3080 in two years to jump into ray tracing.

My guess is 2080 > 1080 Ti and the 2070 is close to the 1080 Ti, which is why we aren't seeing it right away: they want more of the 1080 Ti inventory gone first.

10 minutes ago, TOMPPIX said:

Right now it is inefficient, but who knows if it still will be in... 5 years?

Inefficient designs have been used in games for years, and oftentimes it's just to get something out the door now so that people can see what it's used for. Crysis had SSAO, which helped make it look impressive, but it was likely an unoptimized algorithm. Today, SSAO is a relatively cheap way to improve image quality.

 

I made a post somewhere else saying that, many times, when something new comes to GPUs, the first generation with the feature doesn't vastly improve performance over the previous one unless you give the GPU an application that was designed for that feature.

 

You have to start somewhere. NVIDIA wants game developers to break new ground in graphics development. If all you care about is FPS, then okay, this isn't for you yet.

What I want to know is: who are these fictional "2080 Ti" users?

If they exist, they are clearly breaking the NDA.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7

1 hour ago, AverageAfro said:

I'm honestly not feeling that this is enough of a performance increase to justify the $1.1k.

This plus ray tracing makes it worth it for me, tbh.

20 minutes ago, M.Yurizaki said:

But that's just you. Some people are willing to sacrifice some FPS for higher quality.

 

If you don't like the card's performance with the new features turned on, then don't buy it. Buy whatever suits you and leave it at that.

 

Requirements are vague. What's "real" ray tracing?

Ray tracing is the simulation of real light photons: you start by sending a bunch of photons from the light source and let them bounce all over the place to build the image, so real ray tracing is extremely intensive. What NVIDIA is doing is reducing the number of rays a lot and then denoising the result to make it less noticeable. They are also limiting which parts of the scene get this partial ray trace to reduce the performance impact. For real ray tracing we'd need more than 10x the compute power we have now.
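A toy sketch of that tradeoff in Python: shoot only a few random "rays" per pixel, then denoise. The 0/1 ray samples and the box-filter denoiser here are crude stand-ins for real path tracing and for NVIDIA's AI denoiser, so this only illustrates the noise-vs-ray-count relationship, nothing more:

```python
import random

def render_row(width, rays_per_pixel, rng):
    """Estimate each pixel's brightness from a few random ray samples.

    Each "ray" hits light (1) or shadow (0) with equal chance, so the
    true brightness is 0.5; fewer rays means a noisier estimate.
    """
    row = []
    for _ in range(width):
        hits = sum(rng.random() < 0.5 for _ in range(rays_per_pixel))
        row.append(hits / rays_per_pixel)
    return row

def denoise(row):
    """Average each pixel with its immediate neighbours (trades noise for blur)."""
    return [sum(row[max(0, i - 1):i + 2]) / len(row[max(0, i - 1):i + 2])
            for i in range(len(row))]

def noise(row, true_value=0.5):
    """Mean squared error against the known true brightness."""
    return sum((p - true_value) ** 2 for p in row) / len(row)

rng = random.Random(42)
few_rays = render_row(1000, rays_per_pixel=4, rng=rng)    # cheap but noisy
many_rays = render_row(1000, rays_per_pixel=64, rng=rng)  # 16x the ray budget

print(f"4 rays/px:           noise {noise(few_rays):.4f}")
print(f"4 rays/px + denoise: noise {noise(denoise(few_rays)):.4f}")
print(f"64 rays/px:          noise {noise(many_rays):.4f}")
```

Denoising claws back a lot of the quality of the expensive render at a fraction of the ray budget, which is exactly the bet RTX is making (with a far smarter, tensor-core-driven denoiser).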

9 minutes ago, cj09beira said:

Ray tracing is the simulation of real light photons: you start by sending a bunch of photons from the light source and let them bounce all over the place to build the image,

The requirement is vague. "A bunch" is relative. 1,000 rays can sound like "a bunch" to me. 1,000,000 sounds like "a bunch" too. NVIDIA claims the RTX 2080 can do 10,000,000,000. That sounds like a "bunch". It also doesn't really answer why that's "real" ray tracing. What makes, say, 1 trillion rays "real" ray tracing but 9,999,999,999 not?

 

Yes I'm sounding like a pedantic ass, but being vague opens the floor to goal-post shifting.
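For scale, a quick back-of-the-envelope script, reading the 10-billion figure above as rays per second (NVIDIA's marketing unit) and assuming an example 4K/60 fps target:

```python
# How far does "10 billion rays per second" stretch at 4K / 60 fps?
rays_per_second = 10_000_000_000   # the figure quoted above
fps = 60                           # assumed target frame rate
pixels = 3840 * 2160               # 4K resolution

rays_per_pixel = rays_per_second / fps / pixels
print(f"~{rays_per_pixel:.0f} rays per pixel per frame")
```

That works out to roughly 20 rays per pixel per frame, while offline film renderers routinely spend hundreds to thousands per pixel, which is why the ray budget plus denoising question matters so much.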

18 minutes ago, cj09beira said:

Ray tracing is the simulation of real light photons: you start by sending a bunch of photons from the light source and let them bounce all over the place to build the image, so real ray tracing is extremely intensive. What NVIDIA is doing is reducing the number of rays a lot and then denoising the result to make it less noticeable. They are also limiting which parts of the scene get this partial ray trace to reduce the performance impact. For real ray tracing we'd need more than 10x the compute power we have now.

Who even cares? They are using a ray tracing implementation to make light in games more realistic, improving the overall image quality. Saying "but it's not ray tracing" is dumb when it gives the desired effect and uses ray tracing principles.

So we don't know how legitimate these numbers are, but seeing as most people are treating them as legitimate, doesn't it make you wonder why some people see a 30% improvement and claim it's barely better than the old generation, while others see the same 30% figure and claim it's a good number for a new card? We are all looking at the same figures here; surely it's not hard to acknowledge when a product is actually better. Or are we so hell-bent on hating Nvidia that we are going to poo-poo them for everything, whether it's rational or not?

 

Like it or not, they have packed this new card with extra processing power to handle ray tracing and AI. You don't have to buy it; you can wait for AMD to release something with equivalent 4K performance without all the ray tracing if you want. Not wanting to pay Nvidia for ray tracing doesn't make the card overpriced, it just means your personal desires are unmet.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

How come nobody is mentioning the obvious? Where did those folks get their values from? I HEAVILY doubt they got an engineering sample or even a review card, since that would mean they'd have had to sign an NDA. Breaching the NDA would be disastrous for their channel (nobody would trust them anymore) and potentially VERY expensive. And if they actually ran the benchmarks themselves, that breach of NDA would basically free all other reviewers from it, since it's only forbidden to post information and data not already published.

 

Side note: I actually doubt reviewers have even received their cards at this point, so how would these folks get their hands on one?

 

So there are just two possibilities for how they got those values:

a) a wild guess (heck, I'll even give it to them and call it an educated guess if there's no actual data), or

b) a compilation of values taken from NVIDIA promotional material, recorded under unknown circumstances on (partially) unknown hardware running unknown drivers and games at unknown settings.

 

Either way those values mean literally nothing. 

 

Never trust just one source of data. Sure, those values could prove to be right, but there's no way of knowing until reviewers actually start posting similar benchmark results.

Use the quote function when answering! Mark people directly if you want an answer from them!

1 hour ago, pas008 said:

My guess is 2080 > 1080 Ti and the 2070 is close to the 1080 Ti, which is why we aren't seeing it right away: they want more of the 1080 Ti inventory gone first.

Possibly, but I'm just not that optimistic: it seems too much of a jump to get a card that's two steps above the previous generation's high-end bracket. Nvidia will claim that for sure, but on select titles with really good ray tracing optimization and really tiny FPS numbers (if you're at 40 FPS, 10 frames actually means 25%), and with every other setting and feature in play (these numbers might only be possible with both ray tracing and that machine-learning AA, whatever it's called).

 

I think my estimate (the 2080 being maybe 5 to 10% faster than the 1080 Ti in most non-ray-tracing games) is more realistic outside of fringe cases.

28 minutes ago, mr moose said:

So we don't know how legitimate these numbers are, but seeing as most people are treating them as legitimate, doesn't it make you wonder why some people see a 30% improvement and claim it's barely better than the old generation, while others see the same 30% figure and claim it's a good number for a new card? We are all looking at the same figures here; surely it's not hard to acknowledge when a product is actually better. Or are we so hell-bent on hating Nvidia that we are going to poo-poo them for everything, whether it's rational or not?

 

Like it or not, they have packed this new card with extra processing power to handle ray tracing and AI. You don't have to buy it; you can wait for AMD to release something with equivalent 4K performance without all the ray tracing if you want. Not wanting to pay Nvidia for ray tracing doesn't make the card overpriced, it just means your personal desires are unmet.

The issue is that for some reason people expect a linear price increase (20% more money gets you 20% more performance), probably because for a while, especially for low and mid-range cards, this has been true.

 

People don't seem to understand that Nvidia is basically untouchable in the high-end market right now, so they can do exponential price increments: you want 20% more performance? 20% more money. Another 20% on top? That's now a 35% increase. You want the very best, which we know will have basically no competition for at least two years? That's another 20% increase in performance, but this time a 50% increase in price.
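Compounding the hypothetical step percentages from above (these are illustrative numbers, not real SKU prices) shows how quickly price outruns performance:

```python
# Hypothetical tiers: each step adds 20% performance but an
# ever-larger price premium (20%, then 35%, then 50%).
perf_steps = [1.20, 1.20, 1.20]
price_steps = [1.20, 1.35, 1.50]

perf, price = 1.0, 1.0
for tier, (gain, premium) in enumerate(zip(perf_steps, price_steps), start=1):
    perf *= gain
    price *= premium
    print(f"tier {tier}: perf x{perf:.2f}, price x{price:.2f}")
```

By the top tier you're paying roughly 2.4x for roughly 1.7x the performance, and the performance-per-dollar curve only bends further the higher you go.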

30 minutes ago, mr moose said:

So we don't know how legitimate these numbers are, but seeing as most people are treating them as legitimate, doesn't it make you wonder why some people see a 30% improvement and claim it's barely better than the old generation, while others see the same 30% figure and claim it's a good number for a new card? We are all looking at the same figures here; surely it's not hard to acknowledge when a product is actually better. Or are we so hell-bent on hating Nvidia that we are going to poo-poo them for everything, whether it's rational or not?

 

Like it or not, they have packed this new card with extra processing power to handle ray tracing and AI. You don't have to buy it; you can wait for AMD to release something with equivalent 4K performance without all the ray tracing if you want. Not wanting to pay Nvidia for ray tracing doesn't make the card overpriced, it just means your personal desires are unmet.

People are still butt-hurt about the 2080 Ti replacing the Titan and costing 1200 bucks. If it were named Titan instead of 2080 Ti, nobody would care at all.

41 minutes ago, M.Yurizaki said:

The requirement is vague. "A bunch" is relative. 1,000 rays can sound like "a bunch" to me. 1,000,000 sounds like "a bunch" too. NVIDIA claims the RTX 2080 can do 10,000,000,000. That sounds like a "bunch". It also doesn't really answer why that's "real" ray tracing. What makes, say, 1 trillion rays "real" ray tracing but 9,999,999,999 not?

 

Yes I'm sounding like a pedantic ass, but being vague opens the floor to goal-post shifting.

I guess real ray tracing is when you don't use rasterization at all and every pixel on screen is rendered by following rays; having more or fewer rays then just changes the quality (if very few are used, it looks like a bunch of small light sources instead of a single image).

Rasterization is the art of employing a bunch of tricks to approximate the image; many of those tricks need extra work, with some extra info about the scene stored in the game files.
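A toy 1D sketch of that distinction, using made-up segment "primitives": rasterization loops over primitives and fills the pixels they cover, while ray tracing loops over pixels and asks what each ray hits. With nothing but visibility involved, both produce the same image:

```python
# Two ways to draw the same tiny 1D "scene" of segments (start, end, id).
scene = [(2, 5, "A"), (7, 9, "B")]
WIDTH = 12

def rasterize(scene):
    buf = ["."] * WIDTH
    for start, end, obj in scene:      # outer loop: per primitive
        for x in range(start, end):    # fill the pixels it covers
            buf[x] = obj
    return "".join(buf)

def raytrace(scene):
    buf = []
    for x in range(WIDTH):             # outer loop: per pixel
        hit = "."
        for start, end, obj in scene:  # intersect the "ray" with the scene
            if start <= x < end:
                hit = obj
                break
        buf.append(hit)
    return "".join(buf)

print(rasterize(scene))  # ..AAA..BB...
print(raytrace(scene))   # identical output here
```

The difference only shows up once lighting enters: per-pixel rays can keep bouncing to gather reflections and shadows, whereas rasterization has to fall back on precomputed tricks and baked scene data.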

7 minutes ago, Misanthrope said:

People don't seem to understand that Nvidia is basically untouchable in the high-end market right now, so they can do exponential price increments: you want 20% more performance? 20% more money. Another 20% on top? That's now a 35% increase. You want the very best, which we know will have basically no competition for at least two years? That's another 20% increase in performance, but this time a 50% increase in price.

People understand that very well, but this card is not a 1080ti replacement. You might be paying 50% more, but you are paying for 30% more performance and 20% more features.

40 minutes ago, Brooksie359 said:

Who even cares? They are using a ray tracing implementation to make light in games more realistic, improving the overall image quality. Saying "but it's not ray tracing" is dumb when it gives the desired effect and uses ray tracing principles.

Who said I did? Just responding to a question :|

1. This is the reason NVIDIA wants to limit reviewers, and also the reason I agree with them.

This video is full of crap: it collects benchmarks from "users" and NVIDIA that were all run on different systems. At best it's a useless comparison that shows neither real performance numbers nor any of the added RTX features; at worst it's flat-out wrong altogether.

 

2. We really need more topics about what Tensor cores are, what RT cores are, and how good DLSS works and looks. It is amazing to read so many replies that flat-out talk as if Tensor cores are used for RT and therefore can't be used for DLSS at the same time, for example.

 

Please, peeps, no matter how hard you want to hate on the prices, at least try to look at the full picture and don't fall for videos like that.

34 minutes ago, cj09beira said:

I guess real ray tracing is when you don't use rasterization at all and every pixel on screen is rendered by following rays; having more or fewer rays then just changes the quality (if very few are used, it looks like a bunch of small light sources instead of a single image).

Rasterization is the art of employing a bunch of tricks to approximate the image; many of those tricks need extra work, with some extra info about the scene stored in the game files.

If I took that reasoning for what "real" ray tracing is, then I could say RTX using DXR isn't doing "real" rasterization either, because the rendering pipeline isn't completely using traditional rendering methods.

 

At this point I feel like people are arguing semantics just to hate on something because it doesn't meet their arbitrary requirements

1 minute ago, M.Yurizaki said:

If I took that reasoning for what "real" ray tracing is, then I could say RTX using DXR isn't doing "real" rasterization either, because it's not completely rasterized.

 

At this point I feel like people are arguing semantics just to hate on something because it doesn't meet their arbitrary requirements

People are just trying to figure out what's going on; I haven't seen a drop of hate in here.

This is why NVIDIA wants to verify the reviewers and make sure they aren't incorrectly presenting their product. This could be misleading, and may or may not represent how the product will actually perform at launch with release drivers.

 

I would personally suggest keeping this at the back of your mind, with enough salt to cause hypernatremia.

2 hours ago, TOMPPIX said:

Remove the tensor cores and all that other bullshit and drop the price; then it might be worth it. Don't force everyone to become an "early adopter".

I'd rather join in on the ray tracing when it's actually ready.

 

What exactly would be the point of releasing another card that competes directly with their existing ones?  There is only so much they can do without a die shrink.

 

While ray tracing may be inefficient in many ways, fixed-function hardware is MORE efficient at its particular job than programmable shaders. Why would they invest so much into this if they could make just as much profit by slapping on twice as many CUDA cores? Clearly it's not that simple.

Nvidia have put a lot of R&D into this hardware, so of course they are trying to earn their investment back on the high-end release. It's not like it's something they have never done before, nor are you forced to buy it.

Maybe they WILL release a GTX 2080 Ti later, but odds are it would just be RTX 2080 Ti chips where the RT hardware is faulty. So I very much doubt it would be more powerful; almost everyone seems to think that has to wait for a die shrink.

53 minutes ago, mr moose said:

People understand that very well, but this card is not a 1080ti replacement. You might be paying 50% more, but you are paying for 30% more performance and 20% more features.

Well, I sort of agree, but this implies a sad reality where the actual 1080 Ti replacement is barely faster than the 1080 Ti. If we take the average here as a 37.5% increase (averaged over all 10 games from the OP), then we have to assume the true replacement will be, say, 10% faster than the 1080 Ti, plus ray tracing it can barely handle (and if the 2080 Ti can barely handle ray tracing, why even consider the 2080 for that feature).

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones

13 minutes ago, Lathlaer said:

Well, I sort of agree, but this implies a sad reality where the actual 1080 Ti replacement is barely faster than the 1080 Ti. If we take the average here as a 37.5% increase (averaged over all 10 games from the OP), then we have to assume the true replacement will be, say, 10% faster than the 1080 Ti, plus ray tracing it can barely handle (and if the 2080 Ti can barely handle ray tracing, why even consider the 2080 for that feature).

Not really, because we don't yet know what the performance impact of ray tracing is, and these figures may not be reflective of the card. So far, without it, it's 37% faster, which means the other features only have to add another 13% to its value to justify the 50% increase in cost.

 

EDIT: and of course this only applies if we want to talk about improvements in terms of raw percentages. There really isn't a way to value some aspects of the card; the AI cores might be the best thing since sliced bread, adding a whole next level to gaming, but not everyone will agree on how much that experience is worth. I game at 1080p and don't care much for anything above that, so its 4K performance isn't worth the cost to me, but that doesn't mean it isn't worth the cost to someone else who loves their 4K, and it certainly doesn't mean the card is artificially overpriced, as some are trying to argue.
