Nvidia GeForce RTX 2080 confirmed at up to 2x GTX 1080 performance. Evidence for the 2080 Ti as well.

50 minutes ago, mr moose said:

Hence my question: why are people expecting it to be cheaper or the same price when there is already a clear difference between this new one and the old one? Even before we get to actual performance figures, there are notable differences.

 

Yet people are still making claims that Nvidia "don't understand the market" or "are confused" or "are greedy", etc. These people literally expect the 4x4 model of an SUV to be the same price as the two-year-old 2x4 model.

Because some people are sometimes unreasonable on the internet? I guess I’m not sure what sort of answer you’re after. 

35 minutes ago, Rattenmann said:

 

That doesn't answer my question. Are you using 3DMark 11, Vantage, Fire Strike, Fire Strike Extreme, Fire Strike Ultra, Time Spy or Time Spy Extreme to gather these results? Not counting the mobile/low-end versions such as Sky Diver and Ice Storm.

7 minutes ago, DildorTheDecent said:

That doesn't answer my question. Are you using 3DMark 11, Vantage, Fire Strike, Fire Strike Extreme, Fire Strike Ultra, Time Spy or Time Spy Extreme to gather these results? Not counting the mobile/low-end versions such as Sky Diver and Ice Storm.

It does answer your question, or else I am not understanding it correctly.

There is only one high score I can find on that webpage for the 1080 Ti, and I took those values. I don't use the software at all.

 

https://benchmarks.ul.com/hardware/gpu/NVIDIA+GeForce+GTX+1080+Ti+review

 

Maybe you are looking at those overclocking charts? The ones with liquid nitrogen? I obviously did not include THOSE scores.

2 hours ago, WMGroomAK said:

Another thing to consider in all of this: from the best I can figure, Nvidia basically added roughly 15-20% more CUDA cores across all of the cards (at least based on the specs I saw on AnandTech)...

 

1080 Ti (3584 CUDA) -> 2080 Ti (4352 CUDA): ~21% increase

1080 (2560 CUDA) -> 2080 (2944 CUDA): ~15% increase

1070 (1920 CUDA) -> 2070 (2304 CUDA): ~20% increase

 

We also don't know what clocks each of the cards was running at (were they at stock configs, similar clocks, etc.) and/or how much of the performance gain comes from switching from the 16 nm process to 12 nm.
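
For anyone who wants to double-check those percentages, here is a minimal Python sketch; the core counts are just the spec figures quoted above, and nothing else is assumed:

# Rough sanity check of the CUDA core increases quoted above (Pascal -> Turing).
cards = {
    "1080 Ti -> 2080 Ti": (3584, 4352),
    "1080    -> 2080":    (2560, 2944),
    "1070    -> 2070":    (1920, 2304),
}
for name, (old, new) in cards.items():
    print(f"{name}: +{100 * (new - old) / old:.1f}% CUDA cores")
# Prints roughly +21.4%, +15.0% and +20.0%, matching the figures above.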

Gen jumps always bring more CUDA cores, but this gen also has RT and tensor cores, the software needed for them, and expensive RAM.

 

1 hour ago, mr moose said:

But it is exactly what people are doing: they are pointing back to the 900 series, the 700 series and even an 8800 GTX in this thread. If they fail to account for the actual differences and only complain about the price, then they do expect it to be the same.

I brought up past gens because the performance increases were smaller in older gens and that was fine; now we are looking at significantly larger jumps, but there are still complaints.

1 hour ago, mr moose said:

Hence my question: why are people expecting it to be cheaper or the same price when there is already a clear difference between this new one and the old one? Even before we get to actual performance figures, there are notable differences.

 

Yet people are still making claims that Nvidia "don't understand the market" or "are confused" or "are greedy", etc. These people literally expect the 4x4 model of an SUV to be the same price as the two-year-old 2x4 model.

Along with features like leather seats, a sunroof, etc.

14 minutes ago, Rattenmann said:

I am not understanding it correctly.

That.

14 minutes ago, Rattenmann said:

https://benchmarks.ul.com/hardware/gpu/NVIDIA+GeForce+GTX+1080+Ti+review

That's a terrible website. I already know the test was 3DMark Fire Strike, having used it quite a bit, but if some other poor sod comes across that page they certainly won't have a clue.

14 minutes ago, Rattenmann said:

Maybe you are looking at those overclocking charts? The ones with liquid nitrogen? I obviously did not include THOSE scores.

I'm looking at HWBOT and only looking at the stock cooling results. You'll have to dig for stock clock results a bit.

38 minutes ago, DildorTheDecent said:

That's a terrible website. I already know the test was 3DMark Fire Strike, having used it quite a bit, but if some other poor sod comes across that page they certainly won't have a clue.

 

Well, I'm pretty sure it does not matter which page lists the score, as long as the score is verified. So I picked the one that 3DMark directly links to themselves and did not try to dig into any third-party sites.

Quote

I'm looking at HWBOT and only looking at the stock cooling results. You'll have to dig for stock clock results a bit.

I did not want to dig for stock results; I wanted OCed results, as a stock 1080 Ti does not hit 45 fps in the demo shown on stage. I would have skewed the potential result big time if I had taken a score that cannot be correct for the data point I was using, which is 45 fps in Infiltrator. Skimming over the page you linked, the score I used is pretty spot on: OCed by a lot, but not the number one extreme OC either. Within 5% is fine for me, as it is just a wild extrapolation guess anyway, and I am sure the best OCed 1080 Ti would be higher than random Reddit claims. ;-)
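
For what it's worth, the back-of-the-envelope extrapolation being described here amounts to scaling a known 1080 Ti score by a claimed fps ratio. Below is a minimal Python sketch of that idea; only the 45 fps Infiltrator figure comes from the discussion above, and the other numbers are placeholders you would swap for whatever sources you trust:

# Hypothetical ratio-based extrapolation, as discussed above.
# Only the 45 fps figure comes from this thread; the score and the claimed
# fps for the new card are placeholders, not measured results.
def extrapolate(known_score: float, known_fps: float, claimed_fps: float) -> float:
    """Scale a known benchmark score by a claimed fps ratio."""
    return known_score * claimed_fps / known_fps

oc_1080ti_score = 30000   # placeholder: a heavily OC'd 1080 Ti graphics score
oc_1080ti_fps = 45        # the Infiltrator demo figure discussed above
claimed_new_fps = 60      # placeholder: whatever fps the new card is claimed to hit
print(f"Extrapolated score: {extrapolate(oc_1080ti_score, oc_1080ti_fps, claimed_new_fps):.0f}")

Since the scaling is linear, a ~5% error in the input score only shifts the estimate by ~5%, which is why "within 5%" is good enough for a rough guess like this.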

7 hours ago, mr moose said:

Hence my question: why are people expecting it to be cheaper or the same price when there is already a clear difference between this new one and the old one? Even before we get to actual performance figures, there are notable differences.

 

Yet people are still making claims that Nvidia "don't understand the market" or "are confused" or "are greedy", etc. These people literally expect the 4x4 model of an SUV to be the same price as the two-year-old 2x4 model.

Because that's how it always was: the 280, 680, 480 and 780 launched at an MSRP of $499, and then the 1080 launched at $549. So people expect the 2080 to be in the same price bracket as the 1080.

I really want to know the performance of the RTX 2070...

1 hour ago, spartaman64 said:

Because that's how it always was: the 280, 680, 480 and 780 launched at an MSRP of $499, and then the 1080 launched at $549. So people expect the 2080 to be in the same price bracket as the 1080.

The GTX 280 had a launch MSRP of $649

GTX 480 $499 (~$577.82)

GTX 580 $499 (~$574.72)

GTX 680 $499 (~$548.19)

GTX 780 $650 (~$702.11)

GTX 980 $549 (~$581.23)

GTX 1080 $549 (totally unnecessary but oh well I'm on a roll lol ~$567.48)

 

Something to keep in mind, though: the GTX 280 was launched June 17, 2008. If you factor in inflation, that's roughly ~$747.44. If you continue looking back further than that, you'll see the prices increase.
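
Those inflation-adjusted figures are simply the launch MSRP scaled by the ratio of the consumer price index between 2018 and the launch year. Here is a minimal Python sketch of the calculation; the multipliers below are rough, illustrative values rather than official CPI data:

# Approximate inflation adjustment of launch MSRPs to 2018 dollars.
# The multipliers are rough CPI ratios (2018 vs. launch year), included only
# to illustrate the calculation; substitute real CPI data for exact numbers.
launches = {
    "GTX 280 (2008)":  (649, 1.15),
    "GTX 480 (2010)":  (499, 1.16),
    "GTX 580 (2010)":  (499, 1.15),
    "GTX 680 (2012)":  (499, 1.10),
    "GTX 780 (2013)":  (650, 1.08),
    "GTX 980 (2014)":  (549, 1.06),
    "GTX 1080 (2016)": (549, 1.03),
}
for card, (msrp, ratio) in launches.items():
    print(f"{card}: ${msrp} -> ~${msrp * ratio:.2f} in 2018 dollars")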

I'm just going to wait for reviews, then for the inevitable news article along the lines of "You know that card you just bought? Turns out it has this slightly crippling issue", and maybe for eventual price drops.

 

 

$1000 is too much for me to justify right now.

2 hours ago, Dylanc1500 said:

The GTX 280 had a launch MSRP of $649

GTX 480 $499 (~$577.82)

GTX 580 $499 (~$574.72)

GTX 680 $499 (~$548.19)

Those are their top-end cards; shouldn't you compare them with the 780 Ti/980 Ti/1080 Ti/2080 Ti instead?

I may be wrong, but I don't expect gaming performance to change a lot on the RTX 2080 Ti vs the 1080 Ti.

 

Even though I'm not preordering (pre-orders are always a bad idea), I will be getting a 2080 Ti. The why is simple: the 2080 Ti is the first consumer card with tensor cores that's not a Titan V, and it is less than half the price of the Titan.

 

I think Nvidia did a poor job focusing on ray tracing alone (ray tracing in real time is a huge leap), but I think this will be a huge deal starting now. I feel this is pretty much the same move AMD is making with TR2. I feel like games are going to start using those extra CUDA cores and tensor cores not only for ray tracing (I don't think GI and reflections/refractions fit all art styles); they can use those tensor cores for better game AI, or even (and this is fucking exciting) to locally train the AI while you play so it will adapt and learn from what you do.

 

I feel like this is just the beginning of throwing the tech out there, so companies will start implementing this stuff. The closest cards to the RTX lineup are the Volta GV100 or the Teslas, and the starting price point on those is crazy.

I guess NVIDIA saved me from myself. Unlike with Pascal and Volta, they're refusing to sell Turing cards to Ireland.

Despite the cards being sold and shipped for the EU by Digital River, which is based in Ireland.

Oh well, guess I'll hold onto this obsolete card for a while longer.

 



On 8/23/2018 at 8:21 AM, Lathlaer said:

You are trying to apply some kind of logic to their plans and I'm trying to say that there isn't one to find.

 

Might I remind you that this is the same company that first sold people the Titan X, then slapped them in the face with a 1080 Ti that was just as fast but $400 cheaper, and then slapped them again with a Titan Xp that cost the same amount but was another 15% stronger. So you tell me, out of those three, which card didn't make sense?

 

Sure, there was some time between those launches - the Titan X premiered in August 2016 (by the way, two months AFTER the 1080) - but this only proves my point. I bet those who bought the Titan X were pretty sure that they had the best Pascal had to offer, but how could they foresee what would happen in 6 months (1080 Ti) or in 7 months (Titan Xp)?

 

So tell me now, how sure are you that someone who bought the RTX 2080 Ti now has the best that Turing has to offer in this generation?

 

Anyone who was paying any attention or did any basic research knew that Nvidia would release a Pascal Titan followed by a 1080 Ti after the 'had to have the best' crowd picked up the Titan. There was only one 'surprising' aspect of Pascal: the launch of a second Titan, although even that wasn't very surprising considering how cut down the original Titan was.

13 hours ago, pas008 said:

Gen jumps always bring more CUDA cores, but this gen also has RT and tensor cores, the software needed for them, and expensive RAM.

 

I brought up past gens because the performance increases were smaller in older gens and that was fine; now we are looking at significantly larger jumps, but there are still complaints.

Along with features like leather seats, a sunroof, etc.

A lot of people don't seem to remember when the 100 or 110 chip was in the x80 cards and the 104 was in the x60 cards.

 

With Kepler, the generational improvement was so large that the first 680 (GK104) used the chip that would have been the successor to the 560 Ti's GF114. That's because it was both a new microarchitecture and a node shrink.

 

Nowadays you can't even get the big chip (GV100) in a consumer part, unless you consider the Titan V a consumer GPU.

 

The chip code for the 2080 Ti and the Quadro RTX 8000 is TU102, meaning it's the medium-sized chip. The TU100 has yet to be seen, but I doubt it will use GDDR6; it will likely use HBM2, since HBM2 has no penalty for ECC, and all Teslas and higher-end Quadros have ECC.

 

Considering that NEC put 48GB of HBM2 on their SX-Aurora TSUBASA, I suspect Nvidia will have 32-48GB as well. TU100 will also need ECC registers and double-precision capability, things that the 102 and 104 chips lack.

29 minutes ago, 79wjd said:

Anyone who was paying any attention or did any basic research knew that Nvidia would release a Pascal Titan followed by a 1080 Ti after the 'had to have the best' crowd picked up the Titan. There was only one 'surprising' aspect of Pascal: the launch of a second Titan, although even that wasn't very surprising considering how cut down the original Titan was.

The second Titan, meaning the Titan V? That's a Volta GV100 and not a Pascal part at all.

2 minutes ago, Amazonsucks said:

The second Titan, meaning the Titan V? That's a Volta GV100 and not a Pascal part at all.

The second Titan, as in the official Xp.

11 minutes ago, Amazonsucks said:

The second Titan, meaning the Titan V? That's a Volta GV100 and not a Pascal part at all.

 

There was the Titan X (Pascal), followed later by the 1080 Ti, and then another Titan Xp, which was also Pascal-based.

After all of those, the Titan V dropped.

4 minutes ago, 79wjd said:

The second Titan, as in the official Xp.

Oh yeah, that was a lame move on their part. The Maxwell and Pascal Titans don't really deserve the name, considering none of them have DP performance and they're not even the real big chip. The Titan V is the first Titan since Kepler that deserves the name.

10 hours ago, spartaman64 said:

Because that's how it always was: the 280, 680, 480 and 780 launched at an MSRP of $499, and then the 1080 launched at $549. So people expect the 2080 to be in the same price bracket as the 1080.

Didn't the 780 and 280 release at $650?

And the 1080 at $700?

1 hour ago, pas008 said:

Didn't the 780 and 280 release at $650?

And the 1080 at $700?

An interesting series of reads over on HardOCP has the 780 releasing at $650, the 980 at $550, and the 1080 at $700 for the FE or $550 MSRP.

 

The more interesting portion of the series is the comparison of the performance uplift between generations, from the 7XX series through the 10XX series.

 

Part 1: https://www.hardocp.com/article/2018/07/25/nvidia_gpu_generational_performance_part_1/

Part 2: https://www.hardocp.com/article/2018/08/07/nvidia_gpu_generational_performance_part_2/

Part 3: https://www.hardocp.com/article/2018/08/16/nvidia_gpu_generational_performance_part_3/

 

On 8/23/2018 at 9:11 AM, pas008 said:

No, only the supposed increase in manufacturing cost, because they might have high yields.

 

But top-tier cards supposedly were not increasing as much as in the last couple of gens either.

 

The 7xx to 9xx series was a big jump over many previous gens in overall % increase,

9xx to 1xxx was even bigger,

and if this one is even bigger, then along with the CUDA/RT/tensor cores and the RAM prices it makes complete sense.

 

 

46 minutes ago, WMGroomAK said:

An interesting series of reads over on HardOCP has the 780 releasing at $650, the 980 at $550, and the 1080 at $700 for the FE or $550 MSRP.

 

The more interesting portion of the series is the comparison of the performance uplift between generations, from the 7XX series through the 10XX series.

 

Part 1: https://www.hardocp.com/article/2018/07/25/nvidia_gpu_generational_performance_part_1/

Part 2: https://www.hardocp.com/article/2018/08/07/nvidia_gpu_generational_performance_part_2/

Part 3: https://www.hardocp.com/article/2018/08/16/nvidia_gpu_generational_performance_part_3/

 

Thanks for the articles, I was looking for those.

In my quote above I was trying to point that out.

23 hours ago, Morgan Everett said:

Because some people are sometimes unreasonable on the internet? I guess I’m not sure what sort of answer you’re after. 

I kinda just want them to explain their logic, because for the life of me I can't see it.

23 hours ago, pas008 said:

Gen jumps always bring more CUDA cores, but this gen also has RT and tensor cores, the software needed for them, and expensive RAM.

 

I brought up past gens because the performance increases were smaller in older gens and that was fine; now we are looking at significantly larger jumps, but there are still complaints.

Along with features like leather seats, a sunroof, etc.

But you didn't bring up older gens as evidence that Nvidia was charging too much.

17 hours ago, spartaman64 said:

Because that's how it always was: the 280, 680, 480 and 780 launched at an MSRP of $499, and then the 1080 launched at $549. So people expect the 2080 to be in the same price bracket as the 1080.

Even if the 2080 was a direct replacement for the 1080 and the 1080 was already out of stock, following a trend is not evidence that a product is overpriced, especially when it has already been demonstrated to have far more features, let alone the prospective performance improvements.

 

"It has always been like that" is a moot argument when you have not only a large change in the product but large changes in the market i.e ram prices, mining, consumer demand as well.  

So I'm late to the party. Did anyone call them out on using their new anti-aliasing, meant for the new architecture, to measure the performance increase rather than using a normal benchmark?

If anyone asks, you never saw me.
