
Turkish website finally tests 2080 Ti vs 1080 Ti in 4K with benchmarks

asim1999
1 hour ago, Misanthrope said:

Possibly, but I'm just not that optimistic. It seems too much of a jump to get a card that's two steps above the older generation's high-end bracket. Nvidia will claim that for sure, but on select titles with really good ray tracing optimization and really tiny FPS numbers (if you're at 40 FPS, 10 frames actually means 25%), and with every other setting in play (those numbers might only be possible with both ray tracing and the machine-learning AA).

 

I think my estimate (the 2080 being maybe 5 to 10% faster than the 1080 Ti in most non-ray-tracing games) is more realistic outside of fringe cases.

The issue is that for some reason people expect linear price scaling (20% more money gets you 20% more performance), probably because for a while, especially for low- and mid-range cards, this has been true.

 

People don't seem to understand that Nvidia is basically unchallenged in the high-end market right now, so they can charge exponentially: you want 20% more performance? 20% more money. Another 20% on top? That's now a 35% increase. You want the very best, which we know will have basically no competition for at least two years? That's another 20% increase in performance, but this time a 50% increase in price.
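To make the compounding concrete, here is a minimal Python sketch; every price and percentage below is a made-up placeholder for illustration, not an actual Nvidia figure:

```python
# Illustrative only: hypothetical numbers, not real Nvidia pricing.
# Each performance step up costs proportionally more than the last.
base_price = 500.0
base_perf = 100.0

# (relative performance gain, relative price increase) per tier step
steps = [(0.20, 0.20), (0.20, 0.35), (0.20, 0.50)]

perf, price = base_perf, base_price
for gain, markup in steps:
    perf *= 1 + gain
    price *= 1 + markup
    print(f"perf {perf:6.1f}  price ${price:7.2f}  "
          f"$ per perf point {price / perf:.2f}")
```

Each tier buys the same relative performance step at a steeper relative price step, so dollars per performance point climb with every rung of the ladder.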

You know the 970 outdid the 780 Ti, and the 1070 the 980 Ti, right?

This gen might be close to the same.

 

And I find it funny that people are whining about RT.

If RT gives way more realistic visuals, then isn't that like whining about raising the resolution?

1080p with RT might actually look better than 4K. I don't think it will this gen, but you get the drift, I hope.


These are really good numbers for alpha/beta drivers, if they're real. It can only get better from here, honestly. Hopefully we're now about to play at 4K reliably above 60 FPS.

 

Again, the price is just fine. If you can't afford it and don't play at 4K, just go get a discounted 10-series card or save up. While AMD isn't competitive enough, Nvidia can charge more. I'm a strong believer that the 2080 Ti is now squeezing in between the Ti and Titan tiers. It just makes sense.

*Insert Name* R̶y̶z̶e̶n̶ Intel Build!  https://linustechtips.com/main/topic/748542-insert-name-r̶y̶z̶e̶n̶-intel-build/

Case: NZXT S340 Elite Matte White Motherboard: Gigabyte AORUS Z270X Gaming 5 CPU: Intel Core i7 7700K GPU: ASUS STRIX OC GTX 1080 RAM: Corsair Ballistix Sport LT 2400MHz Cooler: Enermax ETS-T40F-BK PSU: Corsair CX750M SSD: PNY CS1311 120GB HDD: Seagate Momentum 2.5" 7200RPM 500GB

 


When the 2080 Ti is around $600-700, it'll probably be time to buy.

That'll also probably mean new cards are coming out, so not time to buy.

Wait for AMD Navi? Got a good feeling about them...

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


I would have been happy with this if it was a normal new generation GPU launch.

2080ti replaces the 1080ti

2080 replaces the 1080

2070 replaces the 1070

 

But what they are doing instead is creating a new market segment and making high end PC gaming more elitist.

 

A new gen GPU launch is supposed to improve perf/$, which they could have done here by replacing the old gen.

 


3 hours ago, mr moose said:

EDIT: and of course this only accounts for improvements in terms of raw percentages. There really isn't a way to value some aspects of the card. AI cores might be the best thing since sliced bread, adding a whole next level to gaming, but not everyone will agree on how much that experience is worth. For me, I game at 1080p and don't really care for anything above that, so its performance at 4K is not worth the cost to me. But that doesn't mean it isn't worth the cost to someone else who loves their 4K, and it certainly doesn't mean the card is artificially overpriced as some are trying to argue.

I value graphical fidelity over raw performance or outright resolution; resolution is not a direct reflection of graphical fidelity, which seems to get lost quickly in these conversations. I also much prefer smarter and better game logic/AI. I'd honestly be fine with a new AoE game with the same graphics as AoE 2 but with super amazing new features and much smarter AI, or with Metal Gear Solid enemies that have realistic human sight and aren't totally blind when you're just one pixel outside their static detection boxes.


1 hour ago, Humbug said:

I would have been happy with this if it was a normal new generation GPU launch.

2080ti replaces the 1080ti

2080 replaces the 1080

2070 replaces the 1070

 

But what they are doing instead is creating a new market segment and making high end PC gaming more elitist.

 

A new gen GPU launch is supposed to improve perf/$, which they could have done here by replacing the old gen.

 

Visuals are performance too:

resolution, AA, etc.

Not only FPS.

 


1 hour ago, Humbug said:

But what they are doing instead is creating a new market segment and making high end PC gaming more elitist.

And what's wrong with that? They are adding a new product that does new things; why is that bad? You don't have to buy it, and you can still get all the other cards that were available before the 20-series launch. I have no doubt both AMD and Nvidia will still release cards that don't have the RT and tensor stuff.

1 hour ago, Humbug said:

A new gen GPU launch is supposed to improve perf/$, which they could have done here by replacing the old gen.

 

Says who? There are no guarantees in life; just because there has been a general trend in one market doesn't mean that market is not going to diversify and become larger.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


2 hours ago, dgsddfgdfhgs said:

Everyone will buy 2 GTX 1070 and SLI

Nah. Already preordered one from EVGA.


48 minutes ago, mr moose said:

And what's wrong with that? They are adding a new product that does new things; why is that bad? You don't have to buy it, and you can still get all the other cards that were available before the 20-series launch. I have no doubt both AMD and Nvidia will still release cards that don't have the RT and tensor stuff.

I like the new RT and tensor stuff. I like that we are just starting to dip our toes in ray tracing even in a limited way. I think that this new hardware from Nvidia is good, and the engineers seem to have done a good job.

 

I just don't agree with the product positioning from corporate/marketing. More than two years have passed since the GTX 1080/1070 launched, and I think now would have been a good time to say, "here are the replacements, based on the Turing architecture; they are faster and have some new features." Nvidia has developed the right hardware to do exactly that, but they chose not to...

 

54 minutes ago, mr moose said:

Says who? There are no guarantees in life; just because there has been a general trend in one market doesn't mean that market is not going to diversify and become larger.

Fair enough... it's not the end of the world. I can easily wait for 7nm GPUs next year, since my R9 290 still kicks ass at 1080p/60fps... I do understand that Nvidia is a business and wants to maximize its margins while people are willing to pay $1200. I was just speaking from a consumer point of view: it was an opportunity for Nvidia (since they have the hardware) to move the PC gaming market forward with wider adoption of higher-performance GPUs...


Why would anyone take this video seriously? I mean, the cards ARE NOT RELEASED YET.

And what stops Nvidia from benchmarking the 2080 Ti at medium settings while people run the 1080 Ti at max?

Seriously, stop, get help.

Ex-EX build: Liquidfy C+... R.I.P.

Ex-build:

Meshify C – sold

Ryzen 5 1600x @4.0 GHz/1.4V – sold

Gigabyte X370 Aorus Gaming K7 – sold

Corsair Vengeance LPX 2x8 GB @3200 MHz – sold

Alpenfoehn Brocken 3 Black Edition – it's somewhere

Sapphire Vega 56 Pulse – ded

Intel SSD 660p 1TB – sold

be Quiet! Straight Power 11 750W – sold


57 minutes ago, Quadriplegic said:

Why would anyone take this video seriously? I mean, the cards ARE NOT RELEASED YET.

And what stops Nvidia from benchmarking the 2080 Ti at medium settings while people run the 1080 Ti at max?

Seriously, stop, get help.

The video should definitely be viewed with incredulity, as there are many unknowns.

 

To be fair, the contrary could be said as well. The gist is, no one should draw conclusions until the product is officially released to the public.

 

Please don't take this the wrong way, as I truly mean no disrespect, but people will be more receptive to your ideas and points if you don't lambaste them.


9 hours ago, Dan Castellaneta said:

Any proof that it isn't true?

I'm not saying it is true nor am I saying for anyone to take this as definitive results, but seriously. Tell me how this is definitively not true.

How obvious does it need to be before you can see through a lie and people's intentions?

He's trying to push some information about the RTX cards because it's a hot topic right now; since he's not a big YouTube channel, he needs relevancy. So he made a video that doesn't show any gameplay with or without the new card. There's just some game footage on a small screen in the top right, with no FPS meter: footage of a game running on an unknown GPU. Making 3-4 tables with numbers on them while talking like other tech channels doesn't make it trustworthy. He DOESN'T have the card, nor did he run benchmarks; he made them up based on some articles online, with maybe his own grain of salt mixed in.

Sometimes a lie doesn't need hard proof for you to know it's a lie.


10 hours ago, M.Yurizaki said:

Tell me more.

cringe


2 hours ago, Humbug said:

 

Fair enough... it's not the end of the world. I can easily wait for 7nm GPUs next year, since my R9 290 still kicks ass at 1080p/60fps... I do understand that Nvidia is a business and wants to maximize its margins while people are willing to pay $1200. I was just speaking from a consumer point of view: it was an opportunity for Nvidia (since they have the hardware) to move the PC gaming market forward with wider adoption of higher-performance GPUs...

My R9 380 is still good enough for what I do. Only so much of the market is going to move toward high-end gaming.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


9 hours ago, Brooksie359 said:

People are still butthurt about the 2080 Ti replacing the Titan and costing 1200 bucks. If it had been named Titan instead of 2080 Ti, nobody would care at all.

This, and I think Nvidia took it a step further and actually intended the 2080 Ti to be the Titan, the 2080 to be the 1080 Ti, and the 2070 to be the 1080. Which would explain the higher prices for the current gen.

"To the wise, life is a problem; to the fool, a solution" (Marcus Aurelius)


11 hours ago, JCBiggs said:

Too bad CPUs don't improve at the rate GPUs can.

Prior to Ryzen, a mid-range CPU such as an i5 had a CB score of around 600; now a mid-range chip such as the R5 2600 scores around 1400 at the same price.

11 hours ago, JCBiggs said:

30% at 4K is a MASSIVE improvement. 23 frames at 4K is like 80 at 1080p.

It is a ~30% improvement at every resolution; the FPS gain is only bigger at 1080p because 1080p starts from higher frame rates. The smaller absolute difference at 4K would actually be more noticeable, since the perceived difference shrinks as framerate rises (120Hz is smoother than 60Hz, but not "twice as smooth").
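A quick sketch of that point, using made-up baseline frame rates: the relative uplift is identical everywhere, but the absolute FPS gain and the frame-time saving diverge:

```python
# Rough sketch: the same ~30% relative uplift yields different absolute
# FPS gains per resolution. Baseline FPS figures are illustrative only.
uplift = 0.30
baselines = {"4K": 45.0, "1440p": 80.0, "1080p": 120.0}

for res, fps in baselines.items():
    new_fps = fps * (1 + uplift)
    gain = new_fps - fps
    # frame-time saving in ms: where the perceived smoothness change lives
    dt = 1000 / fps - 1000 / new_fps
    print(f"{res:>6}: {fps:.0f} -> {new_fps:.0f} FPS "
          f"(+{gain:.0f} FPS, -{dt:.1f} ms per frame)")
```

The 4K row gains the fewest FPS but sheds the most milliseconds per frame, which is why the smaller absolute FPS difference at low frame rates is the more noticeable one.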


Well, I sold my 1080 Ti and will be on the lookout for a new card. The 2080 seems the better value buy, but being at 3440 ultrawide resolution makes me hanker after the 2080 Ti!


25 minutes ago, mr moose said:

My R9 380 is still good enough for what I do. Only so much of the market is going to move toward high-end gaming.

I'm moving to the high end, but like the high end of four years ago.


2 minutes ago, JediFragger said:

Well, I sold my 1080 Ti and will be on the lookout for a new card. The 2080 seems the better value buy, but being at 3440 ultrawide resolution makes me hanker after the 2080 Ti!

Sorry to be that guy, but the 2080 has fewer CUDA cores than the 1080 Ti... so unless you are ray tracing, you should really have waited until reviews started popping up.


10 hours ago, TOMPPIX said:

When they can actually do real ray tracing without having to use AI to denoise it.

Real ray tracing is not needed. The graphical fidelity will be mostly irrelevant in shooters such as Battlefield: anyone stopping to admire how shiny that random tin can is will die.

 

The law of diminishing returns comes into this too: 10x the light rays will not make the game, or even the reflections, look 10x better.


12 hours ago, JCBiggs said:

Too bad CPUs don't improve at the rate GPUs can.

Actually, they did, but they increased core count, not IPC. GPUs don't get a 60% performance boost in terms of IPC either; they benefit massively from parallel shader cores. That's why, if you add 25% more shader cores to the same GPU, you can expect up to a 25% performance increase in the best case.

 

CPUs can't give you 60% more frame rate because you simply can't gain that much IPC every two years, but you can add more cores. If the game engine doesn't use those cores properly, the game doesn't scale, so it looks like you get 5-10% from 2-4 extra cores, while GPUs get almost 100% scaling from extra shader cores.
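This scaling contrast is essentially Amdahl's law. Here is a tiny sketch; the parallel fractions for shader work and game logic are assumed, illustrative values, not measurements:

```python
# A minimal Amdahl's-law sketch of why extra cores help GPUs more than
# game CPU logic: speedup is limited by the fraction of work that is
# parallel. The fractions below are illustrative assumptions.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup for a workload that is only partly parallel."""
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / cores)

# Shader work is almost perfectly parallel; typical game logic is not.
for label, frac in [("GPU shaders", 0.99), ("game logic", 0.40)]:
    s4, s8 = amdahl_speedup(frac, 4), amdahl_speedup(frac, 8)
    print(f"{label}: 4 cores -> {s4:.2f}x, 8 cores -> {s8:.2f}x")
```

With only 40% of the work parallel, going from 4 to 8 cores adds under 8%; at 99% parallel, the speedup nearly doubles, matching the engine-versus-shader contrast above.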

 

Games have complex engines, and you need a solid trio for more frames: a high-core-count, high-clock CPU, good RAM, and a good GPU. Simply using a better CPU can't get you more frames if the software doesn't make use of it. Take AVX/AVX2, for example: most games don't use these instructions; they still use SSE2 for compatibility reasons, which is lame.

 

To get the benefit of a newer CPU, you need better algorithms that scale across cores, the new AVX instructions, and game engines with good cache/memory layout and usage. But if you do all that, the CPU is no longer the bottleneck. I'm sure you've noticed there are some games that are very well multithreaded, use little CPU, and show little performance difference even between very old and new CPUs.

 

On the other hand, if you throw a parallel load like video rendering at a CPU and compare a 4-core against an 8-core from the same generation, the 8-core is 2x faster, or more if it has extra cache. The game logic that runs on the CPU simply isn't parallel enough (or, better said, it's not parallel until you make it so), so you don't see the benefit of two extra cores. But CPUs did actually improve, especially since Ryzen came out and Intel had to respond by adding extra cores.

 

Nowadays you gain a lot more CPU-side performance from better algorithms than from overclocking or extra cores; that's just how it works.


I agree with a few other commenters that the reason people are so upset is that they called it the 2080 Ti. The Ti line has historically been a cut-down version of the Titan lineup at a more reasonable price point; the 1080 Ti is basically exactly the same as my Titan X (Pascal) at half the price. They really should have just called the 2080 Ti the "Nvidia GeForce RTX Titan T", and nobody would have batted an eye at anything other than the terrible naming scheme.

 

I don't know why anybody even cares about this, though. Reading the comments on this article was kind of hilarious. Wait until the cards come out and you have real, verifiable numbers before talking value propositions and whether a card is worth its price. Repeat after me: "Never pre-order".


37 minutes ago, GoldenLag said:

Sorry to be that guy, but the 2080 has fewer CUDA cores than the 1080 Ti... so unless you are ray tracing, you should really have waited until reviews started popping up.

Nah, I got a really good price for my Ti, so it'll be almost a straight swap. No way are they going to let it be slower than the old Ti ;)

