NVIDIA GeForce 3070/3080/3080 Ti (Ampere): RTX Has No Perf Hit & x80 Ti Card 50% Faster in 4K! (Update 6 ~ Specs / Overview / Details)

33 minutes ago, Mira Yurizaki said:

That too, but you can toggle it off. If it's an application like Stadia, they likely won't have it on because there's no point.

IIRC, higher-end Quadros have double the VRAM because the ECC feature requires that the pool be split in half and mirrored. Might be wrong.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


28 minutes ago, tankyx said:

The problem is more about driver support.

Even if Quadro drivers aren't "game ready," it doesn't matter for something like Stadia. All they need to hit is 60 FPS, and it's likely they won't run the games at maximum quality anyway. Reliability trumps performance in a datacenter environment, and consumer cards aren't built for that, at least not at the specs we'd want them pushed to.


The 12 GB is interesting. Making a ray-tracing system more powerful than Turing is kind of a no-brainer, though; that's like an absolute minimum requirement. The RTX stuff was more of a "look! This exists!" than something even vaguely useful.

Traditionally it takes three generations of tech to make something worth using. This makes gen 2.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Wondering if it would be a worthwhile upgrade from a GTX 1080 Ti to an RTX 3080 or an RTX 3080 Ti?

I7 9700k- MSI Z390 Gaming Edge AC - Be Quiet Dark Pro 4 - Corsair Vengeance LPX 3200 2X8GB - EVGA RTX 2080 TI Black Edition - WD BLACK NVMe M.2 500GB -  EVGA Supernova 650 P2, 80+ Platinum -Nzxt H500 - Display Dell S2721DGF 27" - Keyboard- Razer Huntsman Mini 60%  - Mouse- Logitech G203  - Headset  Astro A50 2019 Edition - Speakers - Logitech Z623


2 minutes ago, lbmoney33 said:

Wondering if it would be a worthwhile upgrade from a GTX 1080 Ti to an RTX 3080 or an RTX 3080 Ti?

If history is any guide, the 3080 will land around 2080 Ti-level performance, maybe more considering 7nm along with other variables, unless they dial back the jump at first.


22 minutes ago, pas008 said:

If history is any guide, the 3080 will land around 2080 Ti-level performance, maybe more considering 7nm along with other variables, unless they dial back the jump at first.

That's what I was going to assume. My 1080 Ti is still great at 144 Hz on 1080p, but games like Odyssey and Red Dead Redemption 2 are very demanding.



On 10/30/2019 at 1:18 PM, OlympicAssEater said:

Bitcoin miners will destroy the MSRP really quick. I hate those mf bitcoin miners ruining MSRP and supply.

You don't mine Bitcoin on GPUs. And for the cryptocurrencies you do mine on GPUs, the high-end cards are mostly significantly less price-efficient than lower-end SKUs, which is why pretty much nobody uses the RTX 2xxx series to mine.


On 10/30/2019 at 1:42 PM, Kongou said:

Depending on how much "cheaper", this could be the upgrade I've been waiting for. My 1080 has been great from day 1, but I'll see how it endures by the time the 3000 series is released.

 

I always thought of my 1080 as still being high end until I started up RDR2 this week... My baby is suffering so much, even at 1080p.


5 hours ago, lbmoney33 said:

Wondering if it would be a worthwhile upgrade from a GTX 1080 Ti to an RTX 3080 or an RTX 3080 Ti?

Depends on what your demands for upgrading are. Personally, I follow my "internal" rule: a new graphics card from the same price segment has to double the framerate of my existing card before it becomes worth considering. This rule works best if you stick with mid-range cards and becomes less relevant for high-end ones. In my experience it usually means skipping one or two generations before the rule applies, and games never end up struggling on your card while you're always playing at max details. Not to mention game engines these days seem to push really high framerates compared to games from the past. I mean, would you ever have imagined playing Doom 3 on a mid-range card at max details back in the day? Whereas the Doom reboot from a few years ago ran absolutely smoothly at max settings on mid-range cards. So I'm just sticking with my GTX 1080 Ti. I skipped the RTX series entirely, skipped the RX 5700 series entirely, and when something doubles my framerate, I'll upgrade, unless they bring something new feature-wise that makes me curious. RTX just wasn't it yet, mostly because of the lack of games to justify it.
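For what it's worth, here's a minimal Python sketch of that rule of thumb. The card names and FPS figures are made up purely for illustration, not benchmarks:

```python
# Hypothetical sketch of the "double the framerate" upgrade rule.
# The card names and FPS figures below are illustrative, not benchmarks.

def worth_upgrading(current_fps: float, candidate_fps: float) -> bool:
    """A new card in the same price segment is worth considering
    only once it at least doubles the current card's framerate."""
    return candidate_fps >= 2 * current_fps

current_card, current_fps = "GTX 1080 Ti", 60.0
candidates = {
    "RTX 2080 (hypothetical FPS)": 75.0,      # ~1.25x: skip this generation
    "RTX 3080 Ti (hypothetical FPS)": 125.0,  # ~2.1x: worth considering
}

for card, fps in candidates.items():
    verdict = "upgrade" if worth_upgrading(current_fps, fps) else "skip"
    print(f"{current_card} -> {card}: {fps / current_fps:.2f}x ({verdict})")
```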


6 hours ago, COTG said:

I always thought of my 1080 as still being high end until I started up RDR2 this week... My baby is suffering so much, even at 1080p.

That's likely down to your CPU, not your GPU; a 1080 is still a good 1440p card, so it's not your card. I only just upgraded from a 1080 to a 2080 Ti for 4K.

My Current Build: https://uk.pcpartpicker.com/list/36jXwh

 

CPU: AMD - Ryzen 5 3600X | CPU Cooler: Corsair H150i PRO XT | Motherboard: Asus - STRIX X370-F GAMING | RAM: G.SKILL Trident Z RGB 2x8Gb DDR4 @3000MHz | GPU: Gigabyte - GeForce RTX 2080 Ti 11 GB AORUS XTREME Video Card | Storage: Samsung - 860 EVO 250GB M.2-2280 - Sandisk SSD 240GB - Sandisk SSD 1TB - WD Blue 4TB| PSU: Corsair RM (2019) 850 W 80+ Gold Certified Fully Modular ATX Power Supply | Case: Corsair - Corsair Obsidian 500D RGB SE ATX Mid Tower Case | System Fans: Corsair - ML120 PRO RGB 47.3 CFM 120mm x 4 & Corsair - ML140 PRO RGB 55.4 CFM 140mm x 2 | Display: Samsung KS9000 |Keyboard: Logitech - G613 | Mouse: Logitech - G703 | Operating System: Windows 10 Pro


What will happen to the 16 series cards?

Will NVIDIA cancel those once they can get ray tracing into the low-end parts (taking into consideration the improvements in ray tracing)?

Or will they just ignore the budget market?

 

Please quote or tag me @Void Master, so I can see your reply.

 

Everyone was a noob at the beginning; don't be discouraged by toxic trolls even if you lose 15 times in a row. Keep training and pushing yourself further and further, so you can show those sorry lots how it's done!

Be a supportive player, and make sure to reflect a good image of the game community you are a part of. 

Don't kick a player unless they willingly want to ruin your experience.

We are the gamer community, we should take care of each other !


14 hours ago, pas008 said:

If history is any guide, the 3080 will land around 2080 Ti-level performance, maybe more considering 7nm along with other variables, unless they dial back the jump at first.

The 12nm to 7nm jump appears to be two jumps in process node (the next step from 12nm should be 8nm-10nm). Considering what happened with Pascal, which was also two jumps, I'd imagine we'll be spoiled again.

 

6 hours ago, Void Master said:

What will happen to the 16 series cards?

Will NVIDIA cancel those once they can get ray tracing into the low-end parts (taking into consideration the improvements in ray tracing)?

Or will they just ignore the budget market?

The 16 series cards will be replaced by another series targeted at the low to midrange markets. Whether that will be part of the GeForce 30 (or whatever) series or something else is up for grabs.


54 minutes ago, Mira Yurizaki said:

The 12nm to 7nm jump appears to be two jumps in process node (the next step from 12nm should be 8nm-10nm). Considering what happened with Pascal, which was also two jumps, I'd imagine we'll be spoiled again.

 

The 16 series cards will be replaced by another series targeted at the low to midrange markets. Whether that will be part of the GeForce 30 (or whatever) series or something else is up for grabs.

3 GHz stock clocks, you think?


8 hours ago, Bravo1cc said:

That's likely down to your CPU, not your GPU; a 1080 is still a good 1440p card, so it's not your card. I only just upgraded from a 1080 to a 2080 Ti for 4K.

Nah, I've got a 3700X. The game is just very visually demanding at high/ultra settings. It looks amazing, though, so I'll take the FPS hit.


5 minutes ago, Mira Yurizaki said:

No, they'll just stuff in more execution units.

Hopefully both.


7 hours ago, Mira Yurizaki said:

The 16 series cards will be replaced by another series targeted at the low to midrange markets. Whether that will be part of the GeForce 30 (or whatever) series or something else is up for grabs.

I wouldn't be too surprised if, as RT cores get more efficient, we start to see the low-end cards get bumped up to RTX compatibility. Even if they only have a fraction of the RT cores/Tensor cores of the bigger cards and technically don't do great in ray-traced games, it would be a big bullet point on the box and would dramatically simplify the product lines. I doubt it'll happen this next gen (unless RT cores get a really substantial boost, which is possible), but I could see an RTX 4050 or something similar in a few years.


16 minutes ago, Waffles13 said:

I wouldn't be too surprised if, as RT cores get more efficient, we start to see the low-end cards get bumped up to RTX compatibility. Even if they only have a fraction of the RT cores/Tensor cores of the bigger cards and technically don't do great in ray-traced games, it would be a big bullet point on the box and would dramatically simplify the product lines. I doubt it'll happen this next gen (unless RT cores get a really substantial boost, which is possible), but I could see an RTX 4050 or something similar in a few years.

The RTX 2060 is rated for a 160W TDP. Even the Super variant is rated at 175W. Conceivably, I think they could get 2060 specs into the $250 market, given they were able to get GTX 980 performance at its 160W TDP down to the GTX 1060 and its 120W TDP. Heck, they may include it for laughs in the $150-$200 market. But I can definitely see the $220-$250 market getting RT cores.
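To put rough numbers on that, here's a back-of-the-envelope sketch; the TDPs are the rated figures mentioned above, and the assumption that Ampere repeats a similar perf/W gain is mine:

```python
# Back-of-the-envelope perf-per-watt arithmetic using the rated TDPs above.
# Assumes roughly equal performance within each pair, which is a simplification.

gtx_980_tdp = 160   # W, roughly GTX 1060-class performance
gtx_1060_tdp = 120  # W
pascal_gain = gtx_980_tdp / gtx_1060_tdp
print(f"Maxwell -> Pascal: ~{pascal_gain:.2f}x perf/W at the same performance tier")

# If Ampere managed a similar ratio, RTX 2060-class performance (160 W today)
# would land around this power budget (speculative, of course):
rtx_2060_tdp = 160
print(f"Hypothetical Ampere card with 2060-class performance: "
      f"~{rtx_2060_tdp / pascal_gain:.0f} W")
```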


1 hour ago, Mira Yurizaki said:

The RTX 2060 is rated for a 160W TDP. Even the Super variant is rated at 175W. Conceivably, I think they could get 2060 specs into the $250 market, given they were able to get GTX 980 performance at its 160W TDP down to the GTX 1060 and its 120W TDP. Heck, they may include it for laughs in the $150-$200 market. But I can definitely see the $220-$250 market getting RT cores.

Plus, with the node shrink, they should theoretically be able to get 2060 performance for under 100W. Theoretically being the key word.

 

The 2060 has 30 RT cores, so assuming that in the next generation or two we see approximately 2x efficiency per core (maybe a bit optimistic, but with such a new technology I'd assume there are plenty of "easy" optimizations to go around), I think a low-end card could probably get away with 8-10, along with the applicable number of Tensor cores. Not a great ray-tracing experience, mind you, but for a low-end card on 7nm I'd assume the cost in die area would be pretty low for that handful of cores. Low enough that it may be worth it just so they can write RTX on the box and encourage more casual PC players to upgrade.
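As a quick sanity check on that estimate, a purely hypothetical sketch, assuming RT throughput scales linearly with core count and that the 2x per-core gain actually materializes:

```python
# Hypothetical estimate: how many next-gen RT cores would match an RTX 2060?
# Linear scaling with core count is assumed; the 2x per-core gain is a guess.

RTX_2060_RT_CORES = 30
ASSUMED_PER_CORE_GAIN = 2.0  # speculative next-generation improvement

for cores in (8, 10, 15):
    relative = cores * ASSUMED_PER_CORE_GAIN / RTX_2060_RT_CORES
    print(f"{cores} next-gen RT cores ~= {relative:.0%} "
          f"of an RTX 2060's RT throughput")
```

So 8-10 doubled-efficiency cores would land around half to two-thirds of a 2060's ray-tracing grunt, which fits the "not great, but good enough for the box" idea.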

 

Then again, I'm not a businessman, so maybe they'll just spin out a third line of GTX cards to spite me.


Another compelling update to this story: conceivable specifications for the Ampere GPUs, courtesy of our German friends at 3DCenter.org:

Quote

[Two images: speculated Ampere GPU specification tables from 3DCenter.org]

 

SE (Shader-Einheiten) translates to "shader units" (in this case, NVIDIA CUDA cores). It appears they are suggesting that the Tesla and Titan cards will both be based on the same silicon, as well as both having HBM2, which is very intriguing. They don't specify memory setups on the other (consumer) cards, but I would imagine sticking with GDDR6 is still best for those applications. Also, looking at the proposed CUDA core counts for, say, the RTX 3080 Ti, that chip would be around 30-50% faster than an RTX 2080 Ti / RTX Titan. As for the purported Ampere Titan (GA100), that would be a colossal 70-80% faster than an RTX 2080 Ti / RTX Titan (in best-case scenarios).
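For anyone wondering where those percentages roughly come from, here's a naive sketch scaling raw shader throughput with CUDA core count alone. The 2080 Ti count is the known figure; the Ampere counts are placeholder guesses within the rumored range, and equal clocks and per-core throughput are assumed (the big caveat):

```python
# Naive throughput comparison from CUDA core counts alone.
# Equal clocks and equal per-core throughput are assumed -- the big caveat.
# The Ampere core counts below are placeholders, not confirmed specifications.

TU102_CORES = 4352            # RTX 2080 Ti (known)
RUMORED_GA10X_CORES = 6144    # hypothetical RTX 3080 Ti-class figure
RUMORED_GA100_CORES = 7680    # hypothetical Ampere Titan-class figure

for name, cores in [("RTX 3080 Ti (rumored)", RUMORED_GA10X_CORES),
                    ("Ampere Titan (rumored)", RUMORED_GA100_CORES)]:
    speedup = cores / TU102_CORES - 1
    print(f"{name}: ~{speedup:.0%} more raw shader throughput "
          f"than an RTX 2080 Ti")
```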

 

Source 4

Source 5 


On 11/14/2019 at 3:29 AM, lbmoney33 said:

That's what I was going to assume. My 1080 Ti is still great at 144 Hz on 1080p, but games like Odyssey and Red Dead Redemption 2 are very demanding.

Yeah, I would think it would be worth upgrading.

I use my 1080 Ti with a 3440x1440 ultrawide monitor and it's really at its limit; newer games like Red Dead kill it at that resolution. I usually upgrade my graphics card every two generations if I get the top-end one anyway.

So yeah, I'm hoping this new generation of NVIDIA cards won't be silly expensive like the 2000 series.


On 11/15/2019 at 5:38 AM, caldrin said:

Yeah, I would think it would be worth upgrading.

I use my 1080 Ti with a 3440x1440 ultrawide monitor and it's really at its limit; newer games like Red Dead kill it at that resolution. I usually upgrade my graphics card every two generations if I get the top-end one anyway.

So yeah, I'm hoping this new generation of NVIDIA cards won't be silly expensive like the 2000 series.

I'm sure at that resolution it would be bringing the card to its knees in games like Red Dead 2.

 

And I like that philosophy as well: a top-end card every other generation, meaning an 80 or 80 Ti kind of card.



On 10/30/2019 at 2:36 PM, HarryNyquist said:

If they are, I'm gonna be mad AF cuz I literally just bought a 2080 Super to replace my dead 1080 Ti

Well, I mean, new GPUs are always 1-2 years away; there will always be a new king of the hill.

Optical Drive Poll: https://linustechtips.com/main/topic/1006309-optical-drive-survey/

Main Rig (Pulsar)

CPU: Ryzen 7 5800X MOBO: MSI MEG X570 Unify RAM: Corsair Dominator Platinum RG(4x8GB) 3200MHz 16-16-16-32 GPU: EVGA GTX 1080 FTW ACX 3.0 Cooler: Noctua NH-U12S Chromax (LTT Edition) Storage: Intel 6000p 128gb boot drive, Intel 665p 1tb (x2), Samsung 850 EVO 1tb, Case: Phanteks Enthoo Pro M Tempered Glass PSU: EVGA SuperNOVA G1+ 750W OS: Windows 10 Pro 64 bit

FreeNAS Server (The Vault)

CPU: Xeon E5-2603 v3 MOBO: MSI X99 Tomahawk RAM: G.Skill Aegis (4x8gb) 3000Mhz 16-18-18-38 GPU: EVGA GT 710 Cooler: Cooler Master Hyper 212 LED Storage: Intel S3520 x2 for boot, x16 in RAIDZ for storage, Seagate Ironwolf 2tb (Striped will be a steam cache) Case: Phanteks Enthoo Pro Tempered Glass PSU: Corsair CX750 750W


On 11/15/2019 at 3:35 AM, Rune said:

If it's actually 50% faster, I don't care what it costs; it's a day-one buy. Trying to drive 4K 144 Hz is hard.

It's been weird for a while: SLI is going away, yet single cards struggle at decent monitor resolutions. I have a pile of games waiting for the right GPU, which never comes.

