
Nvidia 30 Series unveiled - RTX 3080 2x faster than 2080 for $699

illegalwater
3 minutes ago, Rune said:

Was planning on getting a 3090. Didn't see it there either. Am I just blind?

Edit: Aaaah new nvlink thingy. Coolio.

It is mentioned on the Nvidia Specs page of the 30 series:

https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/

Click "Specs" in the top navigation bar, then click "full specs" and scroll down a bit.

It says:

"NVIDIA NVLink™ (SLI-Ready):    Yes"


Just now, Derkoli said:

r/pcmasterrace - Rip non aio CPUs I guess

So what you're saying is it produces less heat than any other GPU that isn't a blower style?


2 minutes ago, Exty said:

So what you're saying is it produces less heat than any other GPU that isn't a blower style?

It's a meme, don't get your knickers in a twist :) 

LTT's Resident Porsche fanboy and nutjob Audiophile.

 

Main speaker setup is now;

 

Mini DSP SHD Studio -> 2x Mola Mola Tambaqui DAC's (fed by AES/EBU, one feeds the left sub and main, the other feeds the right side) -> 2x Neumann KH420 + 2x Neumann KH870

 

(Having a totally seperate DAC for each channel is game changing for sound quality)


7 hours ago, LAwLz said:

Sadly only decoding from what I can tell.

The 8Mbps number is for viewers watching Twitch, assuming Twitch encodes to AV1.

I wonder if there's any way for them to let us use the tensor cores for encoding.


It's great to have something worth being excited about in graphics hardware after such slow progression and over-inflated prices for so many years.

 

I sort of called it that Nvidia was going to price the 3000 series better to c*ck-block the upcoming consoles, due to the crypto-crash and because of covid-19 impacting people's finances:

On 6/22/2020 at 6:56 AM, Delicieuxz said:

I think there are some reasons why we might expect the 3000-series to be priced much better than the 2000-series was:

 

- RTX didn't turn-out to be as big a deal and a card-seller as Nvidia had hoped. It doesn't help that its performance is terrible to the point that many people simply don't want to use it.

 

- Nvidia's BS about their large surge in GPU sales not being from crypto-miners blew-up in their faces, with Nvidia being left sitting on a mountain of unsold 2000-series stock.

 

- The new consoles are going to release this autumn and every sale of a console is profit for Nvidia's main competitor, AMD. Nvidia will surely aim to stymie console purchases by both releasing their 3000-series ahead of them, and offering a more attractive price on them. Otherwise, the new consoles are going to eat into Nvidia's potential profits.

 

- Due to covid-19, many people are low on funds and can't afford the obscene and abusive prices the 2000-series was listed at. If Nvidia increase prices further at this time, or even if they don't decrease them a bit, it's likely to be seen as callous and offensive by potential customers.

 

 

It's also possible that Nvidia upcharged the 2000-series as a last-chance cash-grab, knowing they'd have to drop prices down again with the 3000-series due to the arrival of new consoles.

 

So, I think there's a chance that we'll see lower prices for the 3000 series. And we should see them because the 2000-series prices are unjustifiable and pure greed.

 

But, offsetting all these sound logical reasons to lower prices for the 3000 series is the fact that the company in question is Nvidia, which has shown itself to be divorced from rationality and unaware of its surroundings.

A couple more reasons to add to that list:

 

- AMD are soon to be releasing their more powerful Navi GPUs, and of course Nvidia want to direct as many sales as possible towards themselves while flaunting their status as the lead graphics card developer.

 

- Intel are entering the GPU market (and I hope they're going to stay in the market) and the 3000 series is Nvidia's last-chance to face a market with only 1 competitor. Nvidia will also want to make it as difficult as they can for Intel to get a foothold in the market, which is best done by increasing the gap Intel has to close in order to be competitive.

 

 

The 3000 series isn't priced lower than the RTX 2000 series, but its price-to-performance ratio blows the RTX 2000 series away. That accomplishes the same thing, but with a higher entry fee.

 

It seems that Nvidia are sucker-punching the competition with the 3000 series and making the upcoming consoles look weak before they launch. I think Nvidia were stingy with the 2000 series knowing that they were going to pull this move with the 3000 just before the new consoles release. And by making the 2000 series performance-value offering meagre, they enabled the 3000 series to offer a huge jump over the 2000 series.

 

 

A long time ago, I opined that AMD and Nvidia can turn-up the performance from one generation to another on a whim, like increasing the water flow from a tap:

On 6/10/2016 at 2:37 AM, Delicieuxz said:

Nvidia and AMD would have gotten more money from me if they'd been making meaningful performance increases to their GPU hardware throughout the last 5 years, instead of rehashing the same tech and performance. No one should think the slow pace of GPU performance over the recent years was natural - Nvidia and AMD turn their performance increases up and down like a tap to coincide with market influences, like the PS3 and 360 consoles bringing game graphics progression to a near halt for 5 years, and like Nvidia making large strides once VR and 2k / 4k / 144 Hz monitors came out. Sure, they spend time and money developing new tech, but any time they want to make a leap in graphics performance, they already know how to go about accomplishing it, and it'll take them only as long as they feel will best allow them to capitalize on it.

For AMD, that hasn't proven to be the case. But I think Nvidia have shown that they have reserved performance up their sleeves which they're simply holding-back until an opportune time when its unleashing will clobber their rivals.

 

 

I might have also gotten this one right, regarding the RTX 3090's $2000 and then $1400 price rumours before the price was confirmed:

On 8/19/2020 at 3:46 AM, Delicieuxz said:

I think that Nvidia, or whatever company pricing rumours are about, might deliberately leak rumours of different prices to gauge how people respond to them, and then decide the pricing based on what they think they can get away with. So, if a lot of people say they're OK paying a higher price for a part, rather than that the price is unreasonable, then the manufacturer may decide to price the part higher in the end.

 

On 8/19/2020 at 6:00 AM, Delicieuxz said:

That would be assuming that a company would only release actual price candidates and not minimum and maximum along a spectrum to see what people gravitate towards, how differently people's reactions are to the minimum versus the maximum, or to see which pricing people find more believable or at which price people start getting angry.

 

There are a lot more ways to get feedback on customer willingness and tolerance than by putting out a solid price candidate.

 

On 8/19/2020 at 9:26 AM, Delicieuxz said:

Another way that a company might use two price rumours like $2000 and $1400 to their benefit, if they plan their final release price to be unpalatable to most people, is by first releasing a fake figure that is extremely unreasonable and letting people express revulsion at it, and then later release the much lower price, which is still pretty unreasonable, but because it's so much lower than the first price they released, people accept it a lot easier and feel relieved, rather than angry, after having first been exposed to the much larger price rumour.

 

Doing this can condition people into accepting higher prices, as rather than be critical of the high price, they feel relieved that they're not being charged the higher price rumour that they were first exposed to.

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


2 minutes ago, gabrielcarvfer said:

I don't see why this wouldn't be the case. Making use of the tensors to encode is the real issue. Not really sure which part of the encoding pipeline would be faster by using it.

From what I've heard, AV1 encoding is very intensive, so they'd probably need something like the tensor cores to encode AV1 in real time.


4 hours ago, Mark Kaine said:

really like them  

Ah no well I was talking about Asus's AIB cards lol but the thread was edited so now my post doesn't make sense lol. 

PC: Motherboard: ASUS B550M TUF-Plus, CPU: Ryzen 3 3100, CPU Cooler: Arctic Freezer 34, GPU: GIGABYTE WindForce GTX1650S, RAM: HyperX Fury RGB 2x8GB 3200 CL16, Case: Cooler Master MB311L ARGB, Boot Drive: 250GB MX500, Game Drive: WD Blue 1TB 7200RPM HDD.

 

Peripherals: GK61 (Optical Gateron Red) with Mistel White/Orange keycaps, Logitech G102 (Purple), BitWit Ensemble Grey Deskpad. 

 

Audio: Logitech G432, Moondrop Starfield, Mic: Razer Siren Mini (White).

 

Phone: Pixel 3a (Purple-ish).

 

Build Log: 


42 minutes ago, spartaman64 said:

From what I've heard, AV1 encoding is very intensive, so they'd probably need something like the tensor cores to encode AV1 in real time.

In 2019 the fastest AV1 encoder was 45 times slower than x264, so yeah, maybe. The 2020 report is coming out in the next month and we'll see how the gap has shifted, because back in 2017, AV1 encoding was 50 times slower than it is right now.

¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 | Fractal Design Meshify C | Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11


30 minutes ago, TofuHaroto said:

Ah no well I was talking about Asus's AIB cards lol but the thread was edited so now my post doesn't make sense lol. 

Oh I see 😛

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


Those CUDA core counts are very suspicious; they don't fall in line with the expected performance out of these cards. Between the RTX 2080 and the RTX 3080 there should be a 3x increase in performance at a minimum, but instead we're seeing a 70-80% increase. They've split the FP32 cores in two, but without optimizing the rest of the pipeline? I'd take the insane TFLOPS numbers with a huge grain of salt; the performance expectations fall more in line when you halve the CUDA core count or shader TFLOPS. It just seems extremely misleading.
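To put rough numbers on that: a back-of-the-envelope check using Nvidia's published boost clocks and CUDA core counts (treat these figures as approximations) shows why the paper ratio and the benchmarked gap diverge so much.

```python
# Theoretical FP32 TFLOPS vs. the observed speedup, using Nvidia's
# published specs (boost clock, CUDA core count) as approximate inputs.

def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    # 2 FLOPs per core per clock (one fused multiply-add)
    return cuda_cores * 2 * boost_ghz / 1000

rtx_2080 = fp32_tflops(2944, 1.71)   # ~10.1 TFLOPS
rtx_3080 = fp32_tflops(8704, 1.71)   # ~29.8 TFLOPS

print(f"RTX 2080: {rtx_2080:.1f} TFLOPS")
print(f"RTX 3080: {rtx_3080:.1f} TFLOPS")
print(f"Theoretical ratio: {rtx_3080 / rtx_2080:.2f}x")  # ~2.96x on paper
# Early gaming numbers suggest ~1.7-1.8x, i.e. well short of the paper ratio,
# which is exactly the gap the post above is pointing at.
```

The shortfall is consistent with the doubled FP32 datapaths not being fed at full rate by the rest of the pipeline, which is the concern raised above.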

Quote or Tag people so they know that you've replied.


I think we all know what really matters...

 

When are the sub-$500 AMD and Nvidia GPUs releasing? Because damn.

😕


Also based on the performance we've seen, I'd expect the RTX 3070 to be slightly faster or on par with an RTX 2080S but not quite reaching the RTX 2080 Ti.

Quote or Tag people so they know that you've replied.


Waiting for reviews of course, but it looks like the 3080 is my next GPU - right on my 1k CAD budget.

Pizza is the best food group


I guess this is appropriate here:

[image: dgtryr4.jpg]

EDIT: Not mine. Just found it on Reddit. LOL

CPU: Sempron 2500+ / P4 2.8E / P4 2.6C / A64 x2 4000+ / E6420 / E8500 / i5-3470 / i7-3770
GPU: TNT2 M64 / Radeon 9000 / MX 440-SE / 7300GT / Radeon 4670 / GTS 250 / Radeon 7950 / 660 Ti


35 minutes ago, Syn. said:

Also based on the performance we've seen, I'd expect the RTX 3070 to be slightly faster or on par with an RTX 2080S but not quite reaching the RTX 2080 Ti.

The 3080 was shown by Digital Foundry to be 70-90% faster in certain non-RTX games. I doubt Nvidia would blatantly say on their slides that the 3070 was faster than a 2080 Ti if it wasn't true. It's a comparison between two of their own products, and they presumably make a lot more money off a 2080 Ti than they will off a 3070.


Nice, I can finally upgrade from my 2080 Ti.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Not to be on the hype train, but upgrading from a GTX 760 to a build in a PC-011D (a case that's tighter than it would appear), all I can say is:

 

YESSS!!! The PCB on the new cards is small!! That was a serious question holding up progress on the build.

ENCRYPTION IS NOT A CRIME


It has AV1 hardware decoding, but what about encoding? I'd really prefer to have that.


5 hours ago, GoodBytes said:

True, but in Nvidia's defense, based on the Steam hardware survey, most people have a 1080p display (~65.5%).

https://store.steampowered.com/hwsurvey

 

 

There is this high refresh rate thing going on too. Most people settle on 60 fps as some sort of golden standard, but after trying 144 Hz, there's no going back. To consistently have that, sticking with 1080p means you can keep one graphics card and hit the target even in the very latest games - games that would require a GPU upgrade at much higher resolutions, but not at 1080p. And with GPU scheduling slowly becoming a thing, I don't think CPU bottlenecks will be that much of a problem at the insane framerates lower resolutions allow. I don't mind 1080p from a quality perspective, and I've enjoyed the massive performance benefits in games, letting me run anything I desire at insane framerates. I always turn all the image quality settings up to the max - sans the RTX stuff, for obvious reasons.
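The arithmetic behind that trade-off is worth spelling out: the per-frame time budget shrinks fast as refresh rate rises, which is why sustaining 144 Hz is so much harder than sustaining 60 Hz. A quick illustration:

```python
# Frame-time budget at common refresh rates: the time the GPU has to
# finish each frame to sustain the target refresh without drops.

def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.2f} ms per frame")

# 60 Hz gives a 16.67 ms budget; 144 Hz gives only 6.94 ms - less than
# half the time to render each frame, hence the appeal of dropping to
# 1080p to stay inside that window.
```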


1 hour ago, Syn. said:

Those CUDA core counts are very suspicious; they don't fall in line with the expected performance out of these cards. Between the RTX 2080 and the RTX 3080 there should be a 3x increase in performance at a minimum, but instead we're seeing a 70-80% increase. They've split the FP32 cores in two, but without optimizing the rest of the pipeline? I'd take the insane TFLOPS numbers with a huge grain of salt; the performance expectations fall more in line when you halve the CUDA core count or shader TFLOPS. It just seems extremely misleading.

TFLOPS is a flawed metric anyway; never believe it. Just because something has more CUDA cores doesn't mean it has that much extra performance. AMD beat Nvidia on paper specs with their last card, but it performed significantly worse despite the superior specs. We can't expect a linear increase in performance, and expecting one is stupid. Diminishing returns will happen, and how they optimize the cores and the other parts of the card will determine performance, not raw specs alone. You could have a card with massive amounts of everything in it and have it run horribly all the same.


I'm looking at cooler designs and I'm scratching my head wondering why vendors cut a hole behind the GPU instead of adding a thermal pad there and making contact with a metal backplate. The back of the GPU also gets hot, and while most heat dissipates from the top, removing some at the back can be beneficial.

 

The AORUS GTX 1080 Ti that I have has a copper chunk on the back of the GPU, and it's always really hot. I don't even expect that much; just make the back of the GPU contact the aluminium backplate through a thermal pad and call it a day. It should make a difference.
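For a sense of scale, here is a rough conduction estimate for such a pad. Every number below is an illustrative assumption (pad conductivity, contact area, temperature drop), not a measurement from any real card:

```python
# Illustrative 1-D conduction estimate for a thermal pad between the back
# of a GPU package and a metal backplate. All inputs are assumptions.

def conduction_watts(k_w_per_mk: float, area_m2: float,
                     delta_t_k: float, thickness_m: float) -> float:
    # Fourier's law for steady-state conduction: Q = k * A * dT / t
    return k_w_per_mk * area_m2 * delta_t_k / thickness_m

pad_k = 3.0          # W/(m*K), a typical mid-grade thermal pad
area = 0.03 * 0.03   # assumed 30 mm x 30 mm contact patch
dT = 20.0            # assumed temperature drop across the pad, K
thickness = 1e-3     # 1 mm pad

q = conduction_watts(pad_k, area, dT, thickness)
print(f"Heat conducted through the pad: {q:.0f} W")  # 54 W

# The pad itself is not the bottleneck: the limit is how fast the
# backplate can shed heat to the surrounding air, so the real-world
# benefit is a few watts rather than the full conduction capacity.
```

Under these assumptions the pad can move heat easily; the backplate-to-air side is what caps the benefit, which is consistent with backplates helping modestly rather than dramatically.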


1 hour ago, AdmiralMeowmix said:

TFLOPS is a flawed metric anyway; never believe it. Just because something has more CUDA cores doesn't mean it has that much extra performance. AMD beat Nvidia on paper specs with their last card, but it performed significantly worse despite the superior specs. We can't expect a linear increase in performance, and expecting one is stupid. Diminishing returns will happen, and how they optimize the cores and the other parts of the card will determine performance, not raw specs alone. You could have a card with massive amounts of everything in it and have it run horribly all the same.

This is true enough.

But the second generation of a new technology is also generally the one with the largest relative performance increase.

In this case, we are seeing the second generation of hardware support for two technologies, DLSS and RTX.

There is also another factor: In consoles, AMD graphics is king, and has been for a while. But in the desktop space, Nvidia is generally king, and has been for a while... But now, we are seeing the beginning of a very large change in how realtime rendering works, and Nvidia knows that in order to maintain their sizable lead in the PC graphics market they will have to be completely better than their competition from the very beginning. They are now a generation ahead of AMD, and will almost invariably have better performance than the new AMD cards. Beyond that, I don't see how AMD will be able to be price competitive this go around while producing profit on the cards. Keep in mind, both brands will have hardware supported DXR (what Nvidia calls RTX).

I fully suspect that this new line of cards will fully realize Nvidia's business goals.

ENCRYPTION IS NOT A CRIME


5 hours ago, Delicieuxz said:

The 3000 series isn't priced lower than the RTX 2000 series, but its price-to-performance ratio blows the RTX 2000 series away. That accomplishes the same thing, but with a higher entry fee.

Nvidia's CEO kinda admitted that Pascal had a higher performance jump than Turing.

 

1 hour ago, Nicnac said:

3080ti when?

I think Nvidia wants to get rid of the -Ti naming and they will replace it with Super, because Super sounds much cooler than Ti.

Edited by Drama Lama

Hi


