
NVIDIA GeForce 3070/3080/3080 Ti (Ampere): RTX Has No Perf Hit & x80 Ti Card 50% Faster in 4K! (Update 6 ~ Specs / Overview / Details)

25 minutes ago, Hilltrot said:

I get 60+ FPS at 1080p with RTX on the 2060.

 

Is 300+ FPS at 4K usable for you?

I don't have one, hence the term "heard". The 2060 is generally considered to be a card that does 1440p quite well, though. I have a card that does 60+ @ 1080p and cost me $135. How much did your 2060, which apparently does the same, cost?

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


11 hours ago, williamcll said:

Here's the motherboard of the new DGX; seems like Jensen learned a thing or two from Linus:

 

 

Came across this just now: STH's Inspur NF5488M5 Review.

 

Patrick does a pretty good job of explaining the overall networking structure, what the DGX/HGX-2 layout is, and how NVSwitch connects all 8 cards more effectively than NVLink alone (8:05 to 13:35):

 

https://www.servethehome.com/inspur-nf5488m5-review-a-unique-8x-nvidia-tesla-v100-server/

https://de.inspur.com/de/2464830/2464866/2507544/index.html

 

I kinda hope that GA100- or GA102-based Titan cards will support the aforementioned NVSwitch.
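
For anyone curious what that all-to-all connectivity looks like from software, here's a minimal CUDA runtime sketch (my own illustration, not something from the STH review) that prints the GPU peer-access matrix. On an NVSwitch box every pair should report yes, while plain PCIe/NVLink systems usually come back sparser:

```cpp
// Quick sketch: print the GPU peer-access matrix with the CUDA runtime.
// On an NVSwitch system every GPU should be able to reach every other GPU;
// on plain PCIe/NVLink boxes the matrix is usually sparser.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    printf("GPUs found: %d\n", count);

    for (int src = 0; src < count; ++src) {
        for (int dst = 0; dst < count; ++dst) {
            if (src == dst) continue;
            int canAccess = 0;
            cudaDeviceCanAccessPeer(&canAccess, src, dst);
            printf("GPU %d -> GPU %d : peer access %s\n",
                   src, dst, canAccess ? "yes" : "no");
        }
    }
    return 0;
}
```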


On 5/12/2020 at 5:26 AM, Bombastinator said:

It sounds like what is needed then is a tensor-core APU or iGPU: something that uses system RAM, the amount of which is variable. IIRC there are other types of graphics solutions that use RAM from elsewhere as well. With one of those the RAM wouldn't be as fast, but one could assign as much as the whole machine could hold. There are motherboards that will take 256GB.

That's an unlikely thing. The ability to use system RAM has always been there when a standard API is in use, since the OS can delegate that memory. However, the RT features and CUDA have always been proprietary, and it's unlikely that using system memory would ever be a possibility, since it would impair those proprietary features. When an OpenGL or DirectX game requests more VRAM than exists, it just pulls from system memory, including the page file. That's why you should max out the system memory on a new desktop PC build before anything else, as it's the weakest link.
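
To put a number on the VRAM pool the driver has to work with before any spill-over happens, here's a minimal CUDA runtime sketch (just an illustration, not something a game would actually run):

```cpp
// Minimal sketch: report the dedicated VRAM pool via the CUDA runtime.
// Once a standard-API game asks for more than this, the driver starts
// spilling resources to system memory (and eventually the page file).
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    size_t freeBytes = 0, totalBytes = 0;
    if (cudaMemGetInfo(&freeBytes, &totalBytes) != cudaSuccess) {
        printf("No CUDA-capable GPU found.\n");
        return 1;
    }
    const double GiB = 1024.0 * 1024.0 * 1024.0;
    printf("VRAM: %.1f GiB free of %.1f GiB total\n",
           freeBytes / GiB, totalBytes / GiB);
    return 0;
}
```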


  • 4 weeks later...

This looks legit to me. But what do I know?

CPU: Intel Core i9-9900K | Motherboard: Asus Maximus Code XI | Graphics Card: RTX 3090 FE | RAM: 16GB Corsair Vengeance LPX 2666 MHz | 

Storage: LOADS of drives: SSD + HDD | PSU: be quiet! Dark Power Pro 11 850 W | 

Case: Fractal Design Define R5 Blackout (window) | Cooling: CRYORIG H5 Universal 

PCPartPicker List


2 hours ago, VegetableStu said:

current rumors point to "no" 🤔

Maybe later... or not; both seem possible imo. But they definitely won't cannibalize the RTX lineup from the start by saying there will be GTX cards as well, so I guess my question was probably dumb anyway lol...

Folding stats

Vigilo Confido

 


2 minutes ago, Nicnac said:

Maybe later... or not; both seem possible imo. But they definitely won't cannibalize the RTX lineup from the start by saying there will be GTX cards as well, so I guess my question was probably dumb anyway lol...

The last thing I heard was that RTX on the new cards was something like 4 times bigger and would be useful for more things. Then I saw that maybe get walked back a bit. Got no idea at this point.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


17 minutes ago, VegetableStu said:

at this point any card without raytracing accelerators would feel like they belong to the xx50 stack ._. at least that's how i'd feel

 

if i'm spending even xx70 amounts of money, i'd expect to have some fraction of the bleeding edge, not just to top the previous generation xx80(ti?) card

Yea, but I think the 50 and 60 range will maybe get non-RTX alternatives?

Folding stats

Vigilo Confido

 


56 minutes ago, VegetableStu said:

yeah fair 🤔 although i kinda expect the xx60 would at least inherit the 2070's raytracing performance

 

gonna have the problem of having 3 or 4 xx60 cards again if the xx60 stack turned out like the 2060/1660

Yea I was getting confused this gen already...

Folding stats

Vigilo Confido

 


On 5/12/2020 at 10:42 AM, TetraSky said:

Cool. Will still be waaaay out of my budget. Even the 2060 Super costs over $500 (Canada)... I miss the days I could get a high-end GPU for $350~$400... Now I can't even get a low-end one for that price.

 

Well you can get used GTX 1060s and RX 580s in decent condition right now on OfferUp, LetGo, and Mercari for as low as $100 ($130 CAD). I wouldn't be surprised if used listings pop up next year consisting of GTX 1660 Supers & RX 5600/5600XTs priced at $100-$125 ($150 CAD) and RTX 2060S/2070Ss & RX 5700/5700XTs at $150-$200 ($250 CAD max).

 

Of course this is assuming that both RTX and Radeon cards arriving this year will have a 60-70% performance jump no matter the price point.

 

Oh man, the thought of an RTX 3070 being as powerful as a Titan RTX...

 

Even if you do end up only being able to afford a "3050 Ti", it should be able to match an RTX 2070 Super or 2080.


While all this sounds really nice, some of it is enough to make me pare back my expectations a bit.

 

Don't wanna get too hyped. 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


  • 2 weeks later...

RTX 3060 will still be 6 GB; that's all you need to know about shitnvidia.


23 minutes ago, yian88 said:

RTX 3060 will still be 6 GB; that's all you need to know about shitnvidia.

There does seem to be an indication that Nvidia is treating Navi 21 like it's not going to be much and is pulling the standard "10-20% more of the same old garbage because we know you have to eat it" thing.


Time will tell. I may be waiting for Navi 21 or the consoles, or even to see what devs do with the console hardware once it is in hand.

 

Duopoly sucks. The only thing it's better than is a monopoly. I miss functional antitrust laws.

 

I'm mad at the judge who killed 3-point antitrust. It clearly doesn't work well.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


3 hours ago, yian88 said:

RTX 3060 will still be 6 GB; that's all you need to know about shitnvidia.

I would agree with that statement, but I'm thinking that as long as they have better compression algorithms and higher-speed memory, the overall bandwidth should be a step up, which in turn would balance out the need for a larger overall amount of memory. Obviously more memory would be better, but I'll reserve my judgement until the reviews come.
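
For reference, the napkin math on bandwidth looks like this (the 192-bit bus and 14-16 Gbps rates are my own assumptions for illustration, not confirmed 3060 specs):

```cpp
// Napkin math: GDDR6 bandwidth = (bus width in bits / 8) * per-pin data rate.
// The 192-bit bus and 14-16 Gbps rates below are assumptions for illustration,
// not confirmed RTX 3060 specs.
#include <cstdio>

double gddr_bandwidth_gb_s(int bus_width_bits, double data_rate_gbps) {
    return bus_width_bits / 8.0 * data_rate_gbps;  // GB/s
}

int main() {
    printf("192-bit @ 14 Gbps: %.0f GB/s\n", gddr_bandwidth_gb_s(192, 14.0));  // 336 GB/s (2060-class)
    printf("192-bit @ 16 Gbps: %.0f GB/s\n", gddr_bandwidth_gb_s(192, 16.0));  // 384 GB/s
    return 0;
}
```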


I think there are some reasons why we might expect the 3000-series to be priced much better than the 2000-series was:

 

- RTX didn't turn out to be as big a deal and a card-seller as Nvidia had hoped. It doesn't help that its performance is terrible to the point that many people simply don't want to use it.

 

- Nvidia's BS about their large surge in GPU sales not being from crypto-miners blew up in their faces, with Nvidia left sitting on a mountain of unsold 2000-series stock.

 

- The new consoles are going to release this autumn and every sale of a console is profit for Nvidia's main competitor, AMD. Nvidia will surely aim to stymie console purchases by both releasing their 3000-series ahead of them, and offering a more attractive price on them. Otherwise, the new consoles are going to eat into Nvidia's potential profits.

 

- Due to covid-19, many people are low on funds and can't afford the obscene and abusive prices the 2000-series was listed at. If Nvidia increase prices further at this time, or even if they don't decrease them a bit, it's likely to be seen as callous and offensive by potential customers.

 

 

It's also possible that Nvidia upcharged the 2000-series as a last-chance cash-grab, knowing they'd have to drop prices down again with the 3000-series due to the arrival of new consoles.

 

So, I think there's a chance that we'll see lower prices for the 3000 series. And we should see them because the 2000-series prices are unjustifiable and pure greed.

 

But, offsetting all these sound logical reasons to lower prices for the 3000 series is the fact that the company in question is Nvidia, which has shown itself to be divorced from rationality and unaware of its surroundings.

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


4 hours ago, Delicieuxz said:

I think there are some reasons why we might expect the 3000-series to be priced much better than the 2000-series was:

 

- RTX didn't turn out to be as big a deal and a card-seller as Nvidia had hoped. It doesn't help that its performance is terrible to the point that many people simply don't want to use it.

 

- Nvidia's BS about their large surge in GPU sales not being from crypto-miners blew up in their faces, with Nvidia left sitting on a mountain of unsold 2000-series stock.

 

- The new consoles are going to release this autumn and every sale of a console is profit for Nvidia's main competitor, AMD. Nvidia will surely aim to stymie console purchases by both releasing their 3000-series ahead of them, and offering a more attractive price on them. Otherwise, the new consoles are going to eat into Nvidia's potential profits.

 

- Due to covid-19, many people are low on funds and can't afford the obscene and abusive prices the 2000-series was listed at. If Nvidia increase prices further at this time, or even if they don't decrease them a bit, it's likely to be seen as callous and offensive by potential customers.

 

 

It's also possible that Nvidia upcharged the 2000-series as a last-chance cash-grab, knowing they'd have to drop prices down again with the 3000-series due to the arrival of new consoles.

 

So, I think there's a chance that we'll see lower prices for the 3000 series. And we should see them because the 2000-series prices are unjustifiable and pure greed.

 

But, offsetting all these sound logical reasons to lower prices for the 3000 series is the fact that the company in question is Nvidia, which has shown itself to be divorced from rationality and unaware of its surroundings.

There were pretty good reasons the non-reference 5700 XTs were supposed to be cheaper than the reference ones. Didn't happen though. They went up in price.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Just saying..........

 

If the raw hardware IPC increase of the A100 compared to the V100 at 1410MHz is 35%, then a "TITAN A" or RTX 3090 with 22-24% fewer CUDA cores (5248-5376 vs 6912) clocked 45% higher at 2045MHz should match or exceed the A100 in games by 10-13% ~ and that's a conservative estimate.

 

Based on data from these sources:

 

Feel free to nitpick or counter.
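
Here's that math written out, using the rumored figures above (none of which are confirmed):

```cpp
// Napkin math behind the estimate above: same Ampere architecture, so the
// relative throughput vs. the A100 is roughly (core ratio) * (clock ratio).
// All inputs are the rumored figures from the post, not confirmed specs.
#include <cstdio>

int main() {
    const double a100_cores  = 6912.0;
    const double a100_clock  = 1410.0;   // MHz
    const double titan_clock = 2045.0;   // MHz, ~45% higher

    const double core_options[] = {5248.0, 5376.0};
    for (double cores : core_options) {
        double rel = (cores / a100_cores) * (titan_clock / a100_clock);
        printf("%.0f cores @ %.0f MHz -> %+.0f%% vs. A100\n",
               cores, titan_clock, (rel - 1.0) * 100.0);
    }
    return 0;
}
```

That comes out to roughly +10% and +13% for the two core counts, which is where the estimate above comes from.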


3 hours ago, Results45 said:

More rumor hype: 


TDP of 320W. Why and how? Something is not going right unless I'm missing something.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


Hasn't it been rumored for a while now that the TDP was going to be higher?


3 hours ago, CTR640 said:

TDP of 320W. Why and how? Something is not going right unless I'm missing something.

 

I wouldn't deem any of the following to be true just yet: exact TDP, floating point compute performance, total memory bus bandwidth, exact die size, exact boost frequency, and final consumer product branding.

 

The specs that do seem remotely legit are die variant number, # of CUDA cores, process node, GDDR6, and amount of VRAM.

 

According to Tom from Moore's Law is Dead and a few other leak analysts, both AMD and Nvidia seem to be participating in misinformation efforts (aka "leaking" false specs or skewed, partially-true info) to confuse the wider public but, more importantly, rival GPU companies.

 

Until multiple "official sources" or "legit leakers" all confirm the same info, specifics on the type of specs I initially outlined should be treated as rumors.


I really hope at least some of these things turn out to be true; I am holding off on building my next PC for this and Zen 3!!

