NVIDIA GeForce 3070/3080/3080 Ti (Ampere): RTX Has No Perf Hit & x80 Ti Card 50% Faster in 4K! (Update 6 ~ Specs / Overview / Details)

Can't wait for these to come out; I got my GTX 1080 (non-Ti) back at launch.

Watched the 1080 Ti exceed it by 20% while using more power and costing the same.

Watched the 2080 exceed it by 25% while using more power and costing more $$.

Watched the 2080 Ti exceed it by 40% while using more power and costing many more $$.

 

Hopefully the 3080 Ti will use only a bit more power than my GTX 1080 while being 50% faster, and still draw less power than the 20XX series.
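(Aside: just to put those rough figures in one place, here's a quick back-of-envelope sketch in Python. The uplift percentages are the ones quoted above, and reading the rumoured "50% faster in 4K" as being relative to the 2080 Ti is my assumption, not something confirmed.)

```python
# Rough relative-performance ladder, normalised to a GTX 1080 = 1.00.
# These are the approximate uplifts quoted above, not benchmark results.
gtx_1080    = 1.00
gtx_1080_ti = gtx_1080 * 1.20   # ~20% faster than the 1080
rtx_2080    = gtx_1080 * 1.25   # ~25% faster
rtx_2080_ti = gtx_1080 * 1.40   # ~40% faster

# Assumption: the rumoured "50% faster in 4K" is measured against the 2080 Ti.
rtx_3080_ti_rumour = rtx_2080_ti * 1.50

for name, perf in [("GTX 1080", gtx_1080), ("GTX 1080 Ti", gtx_1080_ti),
                   ("RTX 2080", rtx_2080), ("RTX 2080 Ti", rtx_2080_ti),
                   ("RTX 3080 Ti (rumour)", rtx_3080_ti_rumour)]:
    print(f"{name:22s} ~{perf:.2f}x a GTX 1080")
```

On that reading, the rumoured card would land around 2.1x a GTX 1080; if the 50% is against the 1080 instead, it's obviously a much smaller jump.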

Certainly time for an upgrade... 

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX 3080 | PSU | Corsair RM850i | RAM | 2x16GB X5 6000MHz CL32 | MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | STORAGE | 2x Samsung 970 Evo 256GB NVMe | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity | MONITOR | Samsung G9 Neo


9 hours ago, thechinchinsong said:

Big if true, as with all things related to tech rumors. Again, no matter how much more performance Nvidia is going to push out compared to Turing, pricing is still going to matter, especially for the mid to lower end cards. I don't care if the RTX 3080 is going to be as fast as the RTX 2080 Ti; if it's the same price, it's still pretty sad.

A $1200 RTX3080. How will Nvidia price the RTX3080 Ti? $1800, because Nvidia can.


5 hours ago, AngryBeaver said:

I would take all of this information with a HUGE grain of salt.

 

I mean, there could be some truth to the price drops; it just comes down to several things. First, how much 7nm improves yields (this alone could be a huge cost cutter). Then you have competition with AMD, which could make them try to be more competitive. Then the last big IF is where Intel will be with their cards by the end of the year.

 

I mean, this newest card will just be a refresh of sorts. It is just building on the current RTX lineup with a smaller node and potentially more VRAM. The rest is the same... so there aren't any big R&D costs. Ray tracing has also been out for a bit now, and the competition is bringing it to the table too. That was another justification for the cost of the initial RTX launch.

 

I mean, ultimately they should have a decent bit of cost savings this gen that they could pass on to the customer. It just comes down to whether they will. Either way, I will probably pick up a 3080 Ti, since even if it's more expensive, it's just a drop in the bucket compared to what I'll spend on the new PC I will build around it. 4th-gen Ryzen is coming later this year too, it looks like.

"cost saving and pass on to customer."

 

These words don't exist in Jensen Huang's dictionary.


1 hour ago, Deli said:

A $1200 RTX3080. How will Nvidia price the RTX3080 Ti? $1800, because Nvidia can.

I was rather disappointed by the price tiers, because as soon as you exceed $500 there's essentially no distinction left between the GeForce gaming part and the Quadro CAD part, so they might as well just sell it as a Quadro with a "gaming/workstation" optimization selection in the driver. It's not like that hasn't been doable ever since they started doing unified drivers.


All I need is something that can hold its own against the Navi 21 APU going into the new consoles. Faster than that and I cease to care. That number is as yet unknown though, so I can't buy anything. We'll see what happens.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Here's to hoping AMD releases Big Navi around the time Nvidia releases the 3000 series to combat overpricing. I want a 3080 ti but do I want it bad enough??

AMD Ryzen 9 5900X - Nvidia RTX 3090 FE - Corsair Vengeance Pro RGB 32GB DDR4 3200MHz - Samsung 980 Pro 250GB NVMe m.2 PCIE 4.0 - 970 Evo 1TB NVMe m.2 - T5 500GB External SSD - Asus ROG Strix B550-F Gaming (Wi-Fi 6) - Corsair H150i Pro RGB 360mm - 3 x 120mm Corsair AF120 Quiet Edition - 3 x 120mm Corsair ML120 - Corsair RM850X - Corsair Carbide 275R - Asus ROG PG279Q IPS 1440p 165hz G-Sync - Logitech G513 Linear - Logitech G502 Lightsync Wireless - Steelseries Arctic 7 Wireless


18 minutes ago, Reytime said:

Here's to hoping AMD releases Big Navi around the time Nvidia releases the 3000 series to combat overpricing. I want a 3080 ti but do I want it bad enough??

My strong suspicion is they’re both ready to go and the two companies are jockeying for position unless there was some sort of agreement between them to release at different times (this would be fantastically illegal btw.  It’s more or less the definition of an anti-trust violation)



On 1/5/2020 at 7:05 PM, lbmoney33 said:

I'm hoping they are going to be released this year; I'm saving to replace this 1080 Ti.

Same. Even after more than 3 years, the 1080 Ti is an amazing card. Its performance hasn't become outdated yet.


I feel that, being realistic... it's not going to be cheaper. People already proved that they're stupid enough to pay a 180% markup for their next-generation cards.

Even if it were, by some miracle, cheaper, these cards are going to be nowhere near the proportionate prices of before, and NV will probably tout them as huge bargains, provided they are cheaper than the current gen.

Maybe I'm mad for keeping £600-800 in my mind for a proper top-end card, but £1000+ is very hard to justify, especially when looking at the performance over previous cards.

 

As others have said... the performance gain over my 1080 Ti would have to be huge for me to seriously consider upgrading at these prices.


Tbh, I doubt it will be cheaper, as others have said. Prices will be the same or jacked up higher. Right now I'm rocking my 1080 Ti, and Nvidia will have to seduce me into grabbing either the 3080 or the 3080 Ti, and it's not gonna work.

DAC/AMPs: Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


10 hours ago, Deli said:

"cost saving and pass on to customer."

 

These words don't exist in Jensen Huang's dictionary.

Neither does handing over market share. So if they have the ability to do it and can still turn enough profit to make stakeholders happy... then I think they will do it.


7 hours ago, AngryBeaver said:

Neither does handing over market share. So if they have the ability to do it and can still turn enough profit to make stakeholders happy... then I think they will do it.

If somehow the next-gen Nvidia cards see a price drop, it will not be because Jensen Huang wants to pass savings on to the customers, but because AMD has a proper response, so Nvidia has to price their cards in a reasonable way. If Nvidia can get away with selling an RTX 3080 for $5000, they will, even if the production cost plus R&D is $0.50.

I don't know how many RTX 2080 Tis Nvidia sold. However, it seems everyone has at least one in their rig. Except me...


16 minutes ago, Deli said:

If somehow the next-gen Nvidia cards see a price drop, it will not be because Jensen Huang wants to pass savings on to the customers, but because AMD has a proper response, so Nvidia has to price their cards in a reasonable way. If Nvidia can get away with selling an RTX 3080 for $5000, they will, even if the production cost plus R&D is $0.50.

I don't know how many RTX 2080 Tis Nvidia sold. However, it seems everyone has at least one in their rig. Except me...

It isn't just because of AMD. It is because people still buy the cards even when they're overpriced. If consumers didn't buy those high-priced cards, then the prices would fall.

This is the world we live in now. All they care about is their bottom line and how much profit they can make. The only plus is that at least these are only graphics cards and not something people depend on, like medicine, where it has gotten worse than all the others combined.


I realise that the Steam survey is hardly the most accurate representation....

However, looking at the 1080-series cards (Ti and non) against the 2080 series (Super, Ti and non), we're looking at about 4% vs 1.8% in favour of the 1080 series.

So not everyone has one.... I guess if people were stupid enough to spend that kind of money on one, then they were certainly happy to shout about it :P


As far as the Steam survey is concerned, I believe Turing got the lowest market share out of the recent Nvidia GPUs. It seems Turing got even less market share than Kepler (considering only the GTX 7xx cards), which would be really disappointing for Nvidia after the success Pascal and Maxwell both had in terms of Steam market share.


2 hours ago, SADS said:

I realise that the Steam survey is hardly the most accurate representation....

However, looking at the 1080-series cards (Ti and non) against the 2080 series (Super, Ti and non), we're looking at about 4% vs 1.8% in favour of the 1080 series.

So not everyone has one.... I guess if people were stupid enough to spend that kind of money on one, then they were certainly happy to shout about it :P

It is interesting to see these kinds of comparisons, but keep in mind that the 10 series has been out far longer than the 20 series and not everyone upgrades every generation. In that sense, it isn't looking bad. For perspective, it might be more interesting to see what kind of market share the high(er)-end AMD GPUs are taking in comparison. Navi is probably too new to make a dent, and the VII was a rather niche card. That doesn't really leave a lot, does it? Maybe Vega 56 + 64 would be a comparison point, but they were impacted by the mining era...

 

8 minutes ago, KaitouX said:

As far as the Steam survey is concerned, I believe Turing got the lowest market share out of the recent Nvidia GPUs. It seems Turing got even less market share than Kepler (considering only the GTX 7xx cards), which would be really disappointing for Nvidia after the success Pascal and Maxwell both had in terms of Steam market share.

It is hard to do like-for-like comparisons between different generations. Maybe if you compare launch + 1 year numbers for each it could be interesting? Turing has been out for over a year, so that kind of comparison could be possible.

If I have some time later I might try to do something along the above lines.
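(For anyone who wants to attempt it, here's a minimal sketch of what that launch + 1 year comparison could look like, assuming you pull the share figures out of the Steam survey archive by hand. The generation labels and None placeholders below are illustrative, not real survey data.)

```python
# Sketch: compare Steam survey share of each generation's 80-class cards at
# roughly launch + 1 year. Fill the None values in from the survey archive;
# nothing here is real data.
launch_plus_one_year_share = {
    "Maxwell (980 / 980 Ti)": None,
    "Pascal (1080 / 1080 Ti)": None,
    "Turing (2080 / 2080 Super / 2080 Ti)": None,
}

def compare(shares):
    known = {gen: s for gen, s in shares.items() if s is not None}
    if not known:
        print("No survey figures filled in yet.")
        return
    for gen, share in sorted(known.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{gen:40s} {share:.2f}% at launch + 1 year")

compare(launch_plus_one_year_share)
```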

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


9 minutes ago, porina said:

It is hard to do like-for-like comparisons between different generations. Maybe if you compare launch + 1 year numbers for each it could be interesting? Turing has been out for over a year, so that kind of comparison could be possible.

If I have some time later I might try to do something along the above lines.

I kinda did that: I picked a month close to the release of the next cards and did a quick comparison (for Pascal I checked closer to the middle because of the larger gap); only for Turing did I just use the most recent numbers. So most likely Turing is the one that got the advantage.


12 hours ago, Bombastinator said:

My strong suspicion is they’re both ready to go and the two companies are jockeying for position unless there was some sort of agreement between them to release at different times (this would be fantastically illegal btw.  It’s more or less the definition of an anti-trust violation)

That almost makes me wonder what Intel and AMD are doing with each other.

 

"Alright, AMD's turn to shine for a decade. Back down, Intel."



6 hours ago, porina said:

It is interesting to see these kinds of comparisons, but keep in mind that the 10 series has been out far longer than the 20 series and not everyone upgrades every generation. In that sense, it isn't looking bad. For perspective, it might be more interesting to see what kind of market share the high(er)-end AMD GPUs are taking in comparison. Navi is probably too new to make a dent, and the VII was a rather niche card. That doesn't really leave a lot, does it? Maybe Vega 56 + 64 would be a comparison point, but they were impacted by the mining era...

I don't think it's too unrepresentative. Sure, the 10 series has been out longer, but don't forget that they discontinue the old series when the new ones come out.

The 10 series was released May 27, 2016, and the 20 series September 20, 2018. So we're looking at just over 2 years of sales for the 10 series versus just over a year and a half for the 20 series.
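(Quick sanity check of that gap in Python; the "survey point" date is an assumption on my part, since the exact survey month being compared isn't stated.)

```python
from datetime import date

gtx_10_launch = date(2016, 5, 27)   # GTX 1080 launch
rtx_20_launch = date(2018, 9, 20)   # RTX 2080 launch
survey_point  = date(2020, 3, 1)    # assumed "now" for this comparison

print((rtx_20_launch - gtx_10_launch).days / 365.25)  # ~2.3 years of 10-series sales
print((survey_point - rtx_20_launch).days / 365.25)   # 20-series time on market so far
```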


https://www.techradar.com/news/nvidia-geforce-rtx-3080-and-3070-leaked-specs-seem-too-good-to-be-true

 

RTX 3080 with 20GB VRAM? That would be a very fruitful endeavor.



2 minutes ago, Reytime said:

https://www.techradar.com/news/nvidia-geforce-rtx-3080-and-3070-leaked-specs-seem-too-good-to-be-true

 

RTX 3080 with 20GB VRAM? That would be a very fruitful endeavor.

What would one do with 20 GB of VRAM? Outside of those whatsit-trons at sporting events, anyway, or some sort of simulation/compute stuff. Even 12 is sort of pointless. 8 GB gives one 4K, which is already way too much for anything that fits inside a house.



7 minutes ago, Reytime said:

https://www.techradar.com/news/nvidia-geforce-rtx-3080-and-3070-leaked-specs-seem-too-good-to-be-true

 

RTX 3080 with 20GB VRAM? That would be a very fruitful endeavor.

More is better than barely enough, in my eyes.

3 minutes ago, Bombastinator said:

What would one do with 20 GB of VRAM? Outside of those whatsit-trons at sporting events, anyway, or some sort of simulation/compute stuff. Even 12 is sort of pointless. 8 GB gives one 4K, which is already way too much for anything that fits inside a house.

See above.

Streaming uses VRAM, doesn't it?

They're pushing that 8K shit soon, so it's a step in the right direction if it's all true.

Not to mention developers are allocating a lot of extra VRAM these days, but yeah.


8 minutes ago, Bombastinator said:

What would one do with 20 GB of VRAM? Outside of those whatsit-trons at sporting events, anyway, or some sort of simulation/compute stuff. Even 12 is sort of pointless. 8 GB gives one 4K, which is already way too much for anything that fits inside a house.

20GB of VRAM would help with 4K gaming.

But really, this way we can keep pointing and laughing at console gamers, even after the PS5 and Xbox Series X.



Just now, Reytime said:

20GB of VRAM would help with 4K gaming.

But really, this way we can keep pointing and laughing at console gamers, even after the PS5 and Xbox Series X.

I will choose to keep pointing and laughing at 4k



3 minutes ago, Bombastinator said:

I will choose to keep pointing and laughing at 4k

Nvidia is aware of that, hence the 20GB VRAM. Hopefully, lol.

 

I would really like to go full 4K for gaming but currently it's just not worth the frame rate drop.


