
Nvidia Ampere teased

Jensen: And the moment you've all been waiting for. Next gen Ray Tracing. Let me introduce you to...
Gamer crowd: RTX 3000! RTX 3000! RTX 3000!
Jensen: Tesla V200
Gamer crowd: NUUUUUUUU!!!!

Intel Xeon E5-1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4-3000 Trident Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850W / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


1 hour ago, AnonymousGuy said:

I know it's largely irrelevant to most people, but waterblock costs also make me less inclined to upgrade graphics cards every generation. GPU waterblocks have basically no resale value, because you're not going to easily find someone who wants to a) buy a used GPU, b) watercool it, and c) watercool it with a used block.

I've never had a problem selling my GPUs with water blocks on them. I do keep my OEM cooling and send that along with the GPUs in case the buyer wants to downgrade the cooling. Since it's generally a 2-year cycle for me, I don't try to get any of my investment back when I price them for sale. I consider the "fun" I've had with them to be more than enough of a payback.

 

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display |

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


On 4/25/2020 at 2:19 AM, RejZoR said:

GTX 1080Ti 

"Garbage"

 

Me with an R5 3600U + its Vega iGPU.

If it is not broken, let's fix till it is. 


Finally, a somewhat decent leak has come out (after such a long while) regarding these GPUs:

 

Quote

 

[Image: leaked Ampere spec sheet ("absolute monster GPU") from the source article]

 

(New) NVIDIA Ampere GA100 Specs:
8192 CUDA cores @ 1750MHz boost
1024 Tensor Cores
256 RT Cores
Unknown amount of GDDR6 @ 16Gbps
Unknown TDP
7nm

 

Source: https://www.tweaktown.com/news/72152/nvidia-geforce-rtx-3080-ti-leaked-specs-teases-an-absolute-monster-gpu/index.html

Source: https://www.guru3d.com/news-story/rumor-nvidia-geforce-rtx-3070-and-3080-coming-q3-2020-specs.html
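
As a rough sanity check on those leaked numbers, here's a back-of-the-envelope sketch (assuming the rumored figures above and the usual 2 FLOPs per CUDA core per clock for FP32 FMA throughput):

```python
# Theoretical FP32 throughput implied by the leaked GA100 figures.
# Assumes the rumored specs above and 2 FLOPs (one FMA) per CUDA core per clock.
cuda_cores = 8192
boost_clock_hz = 1750e6  # 1750 MHz boost

tflops = cuda_cores * boost_clock_hz * 2 / 1e12
print(f"Theoretical FP32: {tflops:.1f} TFLOPS")  # ~28.7 TFLOPS
```

For comparison, that would be roughly double the 2080 Ti's ~13.4 TFLOPS FP32 figure, which is part of why some here are skeptical of the leak.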

 

Also, related news (for those that don't really need the latest and greatest GPU tech):

 

Quote

 

NVIDIA's Ampere GPU Ramp Could Deliver Sweet GeForce RTX 20 Series (Turing) Discounts

 

According to a new report from China Times, NVIDIA is already ramping production of Ampere GPUs that will be used in first-party and third-party graphics cards. That should mean that good things are on the way for those looking to score a deal on the current-generation GeForce RTX 20 series.

The report indicates that Ampere will debut during the third quarter of 2020, and alleges that some AIBs are already reducing spot prices to clear out excess inventory of Turing-based GeForce RTX graphics cards. ASUS is reportedly leading the way with this "inventory management", while competitors Gigabyte and MSI are making similar moves. 

 

Source: https://hothardware.com/news/nvidias-ampere-ramp-geforce-rtx-turing-sales


Unless NVIDIA adds generic support for DLSS 2.0 and possibly generic SSRT (Screen Space Ray Tracing; unlikely, but still), which I could use in almost any game, it's very unlikely I'm gonna buy one. I hate buying expensive new stuff and then not being able to use any of the new features it offers, except in those 2-3 games that are specifically coded for it. It's why I still have mixed feelings about the whole RTX thing, regardless of how cool and realistic ray tracing is, but I welcome the Image Sharpening feature they added after AMD added it to Radeon Settings. Features like this excite me more, even if they aren't as significant, because you can use them this moment, right now, in all games. And if they can accelerate them using RT and Tensor cores, even better. I have a feeling it's all just wishful thinking...
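
To illustrate why a driver-level sharpening filter works in every game: it's just a generic post-process over the finished frame. Here's a minimal unsharp-mask sketch in Python (purely illustrative; not NVIDIA's actual implementation, and the function name is made up):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img: np.ndarray, amount: float = 0.5, sigma: float = 1.0) -> np.ndarray:
    """Sharpen a single-channel float image in [0, 1] by adding back
    the high-frequency detail (the image minus a Gaussian blur of it)."""
    blurred = gaussian_filter(img, sigma=sigma)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# Example: sharpen a random "frame" standing in for a rendered image.
frame = np.random.rand(1080, 1920)
sharpened = unsharp_mask(frame, amount=0.8)
```

Because it only needs the final frame, no per-game integration is required, which is exactly why it works everywhere, unlike DLSS or the RTX effects.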


3 hours ago, BiG StroOnZ said:

Finally, a somewhat decent leak has come out (after such a long while) regarding these GPUs:

Those die product codes look wrong to me, and so does the difference between the 3080 and 3080 Ti; it's a bit out of whack.

 

Also, GA100: Gx100-coded dies won't be used for gaming GPUs; that died long ago. GA102 will be the 3080 Ti, GA100 will be Teslas, etc.


3 hours ago, leadeater said:

Also, GA100: Gx100-coded dies won't be used for gaming GPUs; that died long ago. GA102 will be the 3080 Ti, GA100 will be Teslas, etc.

Thought something looked a little suspicious about that.  It'd completely slipped my mind that the XX102s are the Titans and Ti chips.



On 4/28/2020 at 7:55 PM, SADS said:

If you went to your Ford dealer every 4 years and bought (let's just say) a new Fiesta, you got a decent spec and paid £20,000.

And each time you did this, you generally got a decent-spec Fiesta; sure, there are some fluctuations in price due to inflation, but each time you got a little more tech, a few extra bhp, and some headroom.

Then it's time for an upgrade, you turn up to the dealer, and suddenly Ford goes "oh yeah, here's your new Fiesta, but it's £40,000 now."

It's still a new iteration of the same model in the lineup, still a new infotainment system, with maybe some extra bhp and a little more headroom.

I don't care how much Ford spent on R&D or what it cost to build; that's not my problem. As a consumer, that product is suddenly overpriced.

You say "fk you Ford, I'll keep my old Fiesta", or you go buy something else.

The Steam survey isn't gospel, I know. But even now, 1080 Ti usage is over double that of the 2080 Ti, and I believe they'd been out for roughly the same amount of time before discontinuation. Sure, people will still buy them, but that doesn't make it sensible or good for the consumer.

I think you are misunderstanding something. Making a chip with a significant performance jump over the 1080 Ti cost quite a bit, and they priced it accordingly. They could just as easily never have released the 2080 Ti at all, but I would rather they spend more to make a new flagship with a sizable leap in performance over its predecessor, and have to pay more, than have them not release one at all. I think people sometimes assume prices should stay at a certain level because that's how it has always been, rather than realizing there are actual factors behind them, and unfortunately those factors changed with the addition of Tensor cores. Just because you think it isn't worth the money doesn't mean it is overpriced, especially when the cost to make the product can justify the price.


13 hours ago, RejZoR said:

Unless NVIDIA adds generic support for DLSS 2.0 and possibly generic SSRT (Screen Space Ray Tracing; unlikely, but still), which I could use in almost any game, it's very unlikely I'm gonna buy one. I hate buying expensive new stuff and then not being able to use any of the new features it offers, except in those 2-3 games that are specifically coded for it. It's why I still have mixed feelings about the whole RTX thing, regardless of how cool and realistic ray tracing is, but I welcome the Image Sharpening feature they added after AMD added it to Radeon Settings. Features like this excite me more, even if they aren't as significant, because you can use them this moment, right now, in all games. And if they can accelerate them using RT and Tensor cores, even better. I have a feeling it's all just wishful thinking...

Along with Nvidia's other big selling points that they never put any effort into and just let die off: 3D Vision is completely gone (despite it never being all that reasonable to assume people had the hardware for it), their VR thing has gone literally nowhere, triple-monitor support sucks, etc. Ray tracing is guaranteed to come to games, but it certainly wasn't ready yet. It still seems a bit dumb to lock off parts of the GPU specifically for ray tracing and AI when they aren't strong enough to really matter. It's kind of a chicken-and-egg thing.

 

As for me, I'm on an increasingly aging 2GB GTX 960, and I'm about ready to pull the trigger on a 1660, but I don't know if Nvidia is going to release anything new in a month or two. Probably not, but it would be nice.

#Muricaparrotgang


@JZStudios

I don't think NVIDIA will release anything really soon; I'm guessing September 2020. They've always done releases around that time, just before the holiday shopping frenzy. The only time they released something outside that window was the GTX 1080 Ti, which launched in March. IIRC, that is.


15 hours ago, Brooksie359 said:

I think you are misunderstanding something. Making a chip with a significant performance jump over the 1080 Ti cost quite a bit, and they priced it accordingly. They could just as easily never have released the 2080 Ti at all, but I would rather they spend more to make a new flagship with a sizable leap in performance over its predecessor, and have to pay more, than have them not release one at all. I think people sometimes assume prices should stay at a certain level because that's how it has always been, rather than realizing there are actual factors behind them, and unfortunately those factors changed with the addition of Tensor cores. Just because you think it isn't worth the money doesn't mean it is overpriced, especially when the cost to make the product can justify the price.

This would be fine if the leap in performance from the 10-series to the 20-series matched the leap between the 980 Ti and the 1080 Ti.

Otherwise, are we going to be paying £3,000 for a 30-series card for only a 10% performance uplift this time around?


4 hours ago, SADS said:

This would be fine if the leap in performance from the 10-series to the 20-series matched the leap between the 980 Ti and the 1080 Ti.

Otherwise, are we going to be paying £3,000 for a 30-series card for only a 10% performance uplift this time around?

 

This discussion was hashed out over and over again almost two years ago. If you're going to focus on rasterization performance increases, then you're completely missing the point behind why the RTX GPUs are so expensive. Whether YOU like it or not, NVidia decided to use Turing to pivot and introduce consumer-consumable real-time ray tracing in a GPU. That's... a big, huge fucking deal. You may not like it or care about it. As it turns out, I don't care about it either. But I can completely grok that what they accomplished is a big deal.

And they're still faster than Pascal. Like it or not, you can't do the same things in DX11/DX12/etc. with a Pascal card as you can with a Turing card. They perform better. Perhaps not to your specific expectation, but that's not NVidia's problem. They're moving into a different technology that game makers will eventually catch up to, like they did with rasterization ages ago. All of that R&D costs a ton of money, and NVidia needs ROI. See: being a public company.

Buy the new cards and enjoy the increased performance over Pascal or earlier GPUs. Or don't, and continue to slug along at lower resolutions, lower detail levels, and/or lower frame rates. That's all on you.



8 hours ago, SADS said:

This would be fine if the leap in performance from the 10-series to the 20-series matched the leap between the 980 Ti and the 1080 Ti.

Otherwise, are we going to be paying £3,000 for a 30-series card for only a 10% performance uplift this time around?

This is a slippery slope fallacy; that is not how it works at all. First, there was still a significant improvement in performance between the 1080 Ti and the 2080 Ti, so conflating that with spending $3,000 next gen for a 10% increase in performance is not supported at all. You are also neglecting the fact that they added more than just CUDA cores, which is why it cost more to make than the 1080 Ti. And I do not get why everyone is so fixated on getting the same performance jump generation over generation at the same price points, as if it were somehow owed to them and anything else is suddenly "overpriced". The simple fact is that Nvidia spent a lot of money to make an innovative product, and they are charging accordingly. If you do not like the product, then don't buy it, but do not complain that it's overpriced when the cost to make it can justify its price.


9 hours ago, Brooksie359 said:

This is a slippery slope fallacy; that is not how it works at all. First, there was still a significant improvement in performance between the 1080 Ti and the 2080 Ti, so conflating that with spending $3,000 next gen for a 10% increase in performance is not supported at all. You are also neglecting the fact that they added more than just CUDA cores, which is why it cost more to make than the 1080 Ti. And I do not get why everyone is so fixated on getting the same performance jump generation over generation at the same price points, as if it were somehow owed to them and anything else is suddenly "overpriced". The simple fact is that Nvidia spent a lot of money to make an innovative product, and they are charging accordingly. If you do not like the product, then don't buy it, but do not complain that it's overpriced when the cost to make it can justify its price.

They complain because it's a similar situation to when the Xbox One released: no one asked for Kinect, and no one asked for RTX; we asked for more power.

CPU: 6700K | Case: Corsair Air 740 | CPU Cooler: H110i GTX | Storage: 2 x 250GB SSD, 960GB SSD | PSU: Corsair 1200W | GPU: EVGA 1080 Ti FTW3 | RAM: 16GB DDR4

Other Stuffs: red sleeved cables, white LED lighting, 2 Noctua fans on CPU cooler, and Be Quiet! PWM fans on case.


19 hours ago, RejZoR said:

@JZStudios

I don't think NVIDIA will release anything really soon; I'm guessing September 2020. They've always done releases around that time, just before the holiday shopping frenzy. The only time they released something outside that window was the GTX 1080 Ti, which launched in March. IIRC, that is.

Yeah, probably. Also seems unlikely they'll have a new ~$200 card.

 

9 hours ago, Brooksie359 said:

This is a slippery slope fallacy; that is not how it works at all. First, there was still a significant improvement in performance between the 1080 Ti and the 2080 Ti, so conflating that with spending $3,000 next gen for a 10% increase in performance is not supported at all. You are also neglecting the fact that they added more than just CUDA cores, which is why it cost more to make than the 1080 Ti. And I do not get why everyone is so fixated on getting the same performance jump generation over generation at the same price points, as if it were somehow owed to them and anything else is suddenly "overpriced". The simple fact is that Nvidia spent a lot of money to make an innovative product, and they are charging accordingly. If you do not like the product, then don't buy it, but do not complain that it's overpriced when the cost to make it can justify its price.

This is a slippery slope fallacy too. There's not much preventing Nvidia from just inflating the price of the cards with little increase in performance; look at Intel and their CPUs. The RTX cores aren't strong enough to be marketed and sold as a mass-market feature; they should've stayed in the professional cards until the cost came down enough to really utilize them fully, which is how everything else they do works.



41 minutes ago, MadyTehWolfie said:

They complain because it's a similar situation to when the Xbox One released: no one asked for Kinect, and no one asked for RTX; we asked for more power.

And you got more power.  Argue against it all you want, but the simple fact is: Turing is more capable at simple rasterization than Pascal is.



5 minutes ago, jasonvp said:

And you got more power.  Argue against it all you want, but the simple fact is: Turing is more capable at simple rasterization than Pascal is.

I think you missed what I was talking about. People don't want a 10% increase in performance each generation; that's the lazy Intel way of doing things. Giving 10% more performance and adding a feature people didn't ask for is a bad decision. Microsoft learned that the hard way. I was replying to the hypothetical you gave. In my opinion, while RTX is cool, it's only worth it if you buy a 2080 Ti; for the cards below that, if you don't want ray tracing, they're not really worth what Nvidia is asking, especially if you own a 1080 Ti. The only real reason to upgrade to Turing is if you have a 900-series card or want a 2080 Ti; everything else is a meh reason to buy Turing. And before you call me a hater: I bought 4-6 different 2080 Tis over the course of a year and a half, two of them being Kingpins.



16 minutes ago, MadyTehWolfie said:

I think you missed what I was talking about. People don't want a 10% increase in performance each generation; that's the lazy Intel way of doing things. Giving 10% more performance and adding a feature people didn't ask for is a bad decision.

I didn't miss what you were talking about at all. Your point is neither valid nor backed up by any data. The cards clearly continue to sell, so "bad decision" doesn't really fit.

 

But you and others will continue to bitch about them, creating your mini tempests in teapots.  NVidia's laughing all the way to the bank and getting ready to do it again.

 

If you're upset at them and their pricing, yell at AMD.



24 minutes ago, jasonvp said:

I didn't miss what you were talking about at all. Your point is neither valid nor backed up by any data. The cards clearly continue to sell, so "bad decision" doesn't really fit.

 

But you and others will continue to bitch about them, creating your mini tempests in teapots.  NVidia's laughing all the way to the bank and getting ready to do it again.

 

If you're upset at them and their pricing, yell at AMD.

1. Uh, no, you clearly did.

2. I'm not bitching; I bought more 2080 Tis than most. That doesn't mean I don't think it's overpriced for the ray tracing performance.

3. I still wouldn't buy an AMD GPU regardless of the price. And regardless, I doubt competition would change the pricing enough that a good number of people wouldn't still consider it overpriced for what you get.



On 4/30/2020 at 2:14 AM, BiG StroOnZ said:

Finally, a somewhat decent leak has come out (after such a long while) regarding these GPUs:

 

 

Source: https://www.tweaktown.com/news/72152/nvidia-geforce-rtx-3080-ti-leaked-specs-teases-an-absolute-monster-gpu/index.html

Source: https://www.guru3d.com/news-story/rumor-nvidia-geforce-rtx-3070-and-3080-coming-q3-2020-specs.html

 

Also, related news (for those that don't really need the latest and greatest GPU tech):

 

 

Source: https://hothardware.com/news/nvidias-ampere-ramp-geforce-rtx-turing-sales

Maybe I'm dumb, but to me the number of CUDA cores on that 3080 Ti seems way too high compared to the 3080. I know the clock speed is a little lower, but even then, that's nearly double. The 80 Ti cards have had pretty sizable gaps over the 80 cards, but that seems absurd.

 

 


12 minutes ago, Inkz said:

Maybe I'm dumb, but to me the number of CUDA cores on that 3080 Ti seems way too high compared to the 3080. I know the clock speed is a little lower, but even then, that's nearly double. The 80 Ti cards have had pretty sizable gaps over the 80 cards, but that seems absurd.

 

I'm not saying this is guaranteed fact (or even a factoid, if you will), but this leak from a while back seems to clarify that subject (from our German friends at 3DCenter.org):

 

Quote

[Images: two leaked Ampere lineup spec tables from 3DCenter.org]

 

SE (Shader-Einheiten) translates to shader units (in this case, NVIDIA CUDA cores). It appears they are suggesting that both the Tesla and Titan cards will be based on the same silicon, and that both will use HBM2, which is very intriguing. They don't specify memory setups on the other (consumer) cards, but I would imagine sticking with GDDR6 is still best for those applications.

 

Source

Source

 


3 hours ago, MadyTehWolfie said:

They complain because it's a similar situation to when the Xbox One released: no one asked for Kinect, and no one asked for RTX; we asked for more power.

They did deliver more power; they just also delivered RTX along with it. And no, that is not what people are complaining about; people are clearly complaining about the price.


3 hours ago, JZStudios said:

Yeah, probably. Also seems unlikely they'll have a new ~$200 card.

 

This is a slippery slope fallacy too. There's not much preventing Nvidia from just inflating the price of the cards with little increase in performance; look at Intel and their CPUs. The RTX cores aren't strong enough to be marketed and sold as a mass-market feature; they should've stayed in the professional cards until the cost came down enough to really utilize them fully, which is how everything else they do works.

Are you kidding me? Do you even know what a slippery slope fallacy is? Anyway, they didn't overprice the product; it cost more to make, and they adjusted the price accordingly. There wasn't a revolutionary change in how the new architecture handled rendering, so they didn't get much improvement in non-ray-tracing performance from the architecture itself; instead, they had to make the dies much bigger and add more CUDA cores and faster memory. That is why it ended up costing more. The next gen will likely bring a much bigger improvement in performance with the die shrink, so I see no reason why they would increase the price at the top end again. There is zero evidence supporting your claim that they would do something like that.


2 minutes ago, Brooksie359 said:

Are you kidding me? Do you even know what a slippery slope fallacy is? Anyway, they didn't overprice the product; it cost more to make, and they adjusted the price accordingly. There wasn't a revolutionary change in how the new architecture handled rendering, so they didn't get much improvement in non-ray-tracing performance from the architecture itself; instead, they had to make the dies much bigger and add more CUDA cores and faster memory. That is why it ended up costing more. The next gen will likely bring a much bigger improvement in performance with the die shrink, so I see no reason why they would increase the price at the top end again. There is zero evidence supporting your claim that they would do something like that.

Okay. We'll just wait.



5 hours ago, JZStudios said:

Yeah, probably. Also seems unlikely they'll have a new ~$200 card.

 

This is a slippery slope fallacy too. There's not much preventing Nvidia from just inflating the price of the cards with little increase in performance; look at Intel and their CPUs. The RTX cores aren't strong enough to be marketed and sold as a mass-market feature; they should've stayed in the professional cards until the cost came down enough to really utilize them fully, which is how everything else they do works.

They'll probably just rehash some older GPU, probably something from the Turing GTX series.

 

The biggest shame is that they don't utilize the Tensor and RT cores to accelerate things we already have and know. For example, push the SSAO/HBAO work of existing games through RT cores instead of regular shaders, or use Tensor cores to accelerate an edge-smoothing algorithm: something like SMAA, but more advanced, accurate, and taxing, that could be used in all games and would benefit from Tensor cores for finding edges and such. Because why waste all this good compute real estate until RT actually becomes any kind of widely used standard...
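
For a sense of what the edge-finding step he's describing involves, here's a minimal luma edge-detection sketch in Python, roughly the first pass of an SMAA-style filter (purely illustrative; not any vendor's actual pipeline, and the function name is made up):

```python
import numpy as np

def luma_edges(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """First pass of an SMAA-style filter: mark pixels whose luma differs
    from the left or top neighbour by more than `threshold`.
    rgb: HxWx3 float array in [0, 1]. Returns an HxW boolean edge mask."""
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luma weights
    dx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))  # horizontal deltas
    dy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))  # vertical deltas
    return (dx > threshold) | (dy > threshold)

# Example: edge mask for a random frame standing in for a game frame.
frame = np.random.rand(720, 1280, 3)
edges = luma_edges(frame)
```

The later passes (pattern classification and blending) are the expensive part, which is where the idea of throwing Tensor-core matrix math at it would come in.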

