Nvidia 30 Series unveiled - RTX 3080 2x faster than 2080 for $699

illegalwater
3 hours ago, porina said:

I spent over 3x the cash on my TV than my G-sync gaming monitor.

 

On a similar note, how many people NEED 3 DP ports? Sure, using three displays is a thing, but do they all need to be DP? All the monitors I have with DP also have HDMI. There's always a question of which version, so with high resolutions and refresh rates there may be limitations either way. Native output is best; adapters are a mess. Are the GPU outputs DP++? Intel's are, I think, but I don't know about AMD's or Nvidia's offerings. Active adapters are costly, and I'm not convinced of their compatibility with advanced features.

 

Anyway, Asus and Gigabyte appear to be offering two HDMI 2.1 ports, so they will likely get my money over those that don't.

HDMI is not free; it has royalty costs associated with it. That's why you don't see 20 of them on a TV.

DisplayPort (in its dual-mode, DP++ form) is backward compatible down to HDMI and single-link DVI. HDMI can't be upgraded to DP.

These video modes are part of the dual-mode DisplayPort standard, which avoids the adapter mess: you can get a passive cable with DP on one end and HDMI on the other, with no conversion circuitry and therefore no latency penalty (the same goes for single-link DVI). HDMI, however, cannot be converted to DisplayPort; the HDMI standard has no notion of DP.
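To make the "which version has limitations at high res and Hz" question concrete, here is a rough back-of-the-envelope sketch (my own numbers, not from the thread): it compares a mode's uncompressed pixel data rate against approximate effective link rates, ignoring blanking intervals and DSC, so treat the results as ballpark only.

```python
# Rough bandwidth sanity check: does a given mode fit a link's data rate?
# Link rates below are approximate usable (post-encoding) data rates; the
# mode estimate is uncompressed pixel data with no blanking or DSC.

LINK_GBPS = {
    "HDMI 2.0": 14.4,   # 18 Gb/s raw, 8b/10b encoding
    "HDMI 2.1": 42.7,   # 48 Gb/s raw, 16b/18b encoding
    "DP 1.4":   25.9,   # 32.4 Gb/s raw (HBR3), 8b/10b encoding
}

def mode_gbps(width, height, hz, bits_per_channel=8):
    """Uncompressed pixel data rate in Gb/s (3 color channels, no blanking)."""
    return width * height * hz * 3 * bits_per_channel / 1e9

# Example: 4K at 120 Hz, 8-bit color
need = mode_gbps(3840, 2160, 120)
for name, rate in LINK_GBPS.items():
    verdict = "OK" if need <= rate else "too slow"
    print(f"{name}: need ~{need:.1f} Gb/s, have {rate} Gb/s -> {verdict}")
```

With real signals the blanking overhead matters, so a mode that "just fits" here (like 4K 120 Hz on DP 1.4) typically needs chroma subsampling or DSC in practice.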

 

HDMI also has underscan and overscan issues, a problem DP doesn't have. Yes, I am tired of people complaining that they can't see their taskbar, or that the image isn't full screen despite the native resolution being set (or the screen resolution not being detected at all — another problem). DP, like DVI, just works.

 

You have people with 3 monitors on their desk, especially at work (although that might change for home, with working from home being a thing now).


3 hours ago, Tedny said:

this is the most important slide of the whole presentation: Nvidia fixing a bottleneck of the PC

 

Yes, but keep in mind that the game needs to support it (from what we know so far).

 


29 minutes ago, HumdrumPenguin said:

Why does the power consumption matter to you? Worried it will generate more heat than the cooler can dissipate?

Not necessarily, it just makes my new 750W PSU seem insufficient.

CPU -AMD R5 2600X @ 4.15 GHz / RAM - 2x8Gb GSkill Ripjaws 3000 MHz/ MB- Asus Crosshair VII Hero X470/  GPU- MSI Gaming X GTX 1080/ CPU Cooler - Be Quiet! Dark Rock 3/ PSU - Seasonic G-series 550W/ Case - NZXT H440 (Black/Red)/ SSD - Crucial MX300 500GB/ Storage - WD Caviar Blue 1TB/ Keyboard - Corsair Vengeance K70 w/ Red switches/ Mouse - Logitech g900/ Display - 27" Benq GW2765 1440p display/ Audio - Sennheiser HD 558 and Logitech z323 speakers


How much of an issue, if any, is having 24GB of VRAM if I only have 16GB of RAM? Really eyeing the 3090 but that's one of the concerns I have, other than the enormous price tag on it.

 

CPU: Ryzen 9 3900X | Cooler: Noctua NH-D15S | MB: Gigabyte X570 Aorus Elite | RAM: G.SKILL Ripjaws V 32GB 3600MHz | GPU: EVGA RTX 3080 FTW3 Ultra | Case: Fractal Design Define R6 Blackout | SSD1: Samsung 840 Pro 256GB | SSD2: Samsung 840 EVO 500GB | HDD1: Seagate Barracuda 2TB | HDD2: Seagate Barracuda 4TB | Monitors: Dell S2716DG + Asus MX259H  | Keyboard: Ducky Shine 5 (Cherry MX Brown) | PSU: Corsair RMx 850W


2 hours ago, FaxedForward said:

Is it just me, or does it appear that their seemingly clever cooling solution is going to exhaust hot air straight into any tower-style CPU air cooler?

That is correct. However, a system with specs appropriate for this GPU will typically have either a large CPU air cooler or a water-cooling solution with two or more fans. So yes, I can only speak from my perspective until we get actual data (I guess we need to wait for Gamers Nexus), but so far I don't see this being a problem. The CPU will run a bit warmer, but your CPU shouldn't be near 99°C in the first place for it to matter.


1 minute ago, Just Monika said:

How much of an issue, if any, is having 24GB of VRAM if I only have 16GB of RAM? Really eyeing the 3090 but that's one of the concerns I have, other than the enormous price tag on it.

 

I don't think these are related in the way you suggest. VRAM is MUCH faster than system RAM, and if you run out of VRAM, your system falls back on whatever RAM is available to handle the overflow. You can go for the 3090 without a problem.
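As a rough illustration of how quickly high-resolution assets can fill VRAM (my own back-of-the-envelope numbers, not anything from Nvidia or the game), the sketch below estimates texture memory for uncompressed RGBA8 versus a BC7-style block-compressed format, assuming the mipmap chain adds about one third on top of the base image:

```python
# Back-of-the-envelope texture memory estimate.
# Assumes a full mipmap chain adds ~1/3 over the base level; real engines
# also use block compression (BC7 is ~1 byte/pixel), shown here for contrast.

def texture_mib(width, height, bytes_per_pixel, mipmaps=True):
    """Approximate size of one texture in MiB."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base   # mip chain adds ~1/3
    return total / 2**20

# One hundred 4K textures:
uncompressed = 100 * texture_mib(4096, 4096, 4)    # RGBA8, 4 bytes/pixel
compressed   = 100 * texture_mib(4096, 4096, 1)    # BC7-style, ~1 byte/pixel
print(f"uncompressed: ~{uncompressed / 1024:.1f} GiB")
print(f"BC7-style:    ~{compressed / 1024:.1f} GiB")
```

Even with compression, a high-resolution texture pack can plausibly occupy several GiB on its own, which is why 8GB cards fill up at max settings.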


Press F for all those who paid a premium for the RTX 2080 Ti.

"Tolerance is the lube that helps the dildo of dysfunction slip into the ass of a civilized society" - Plato 427-347 BC

"Tolerance and apathy are the last virtues of a dying society" - Aristotle 384-322 BC

"Hope is the first step on the road to disappointment" - Lebiniz 1st of July 1646 - 14th of November 1716


2 hours ago, thechinchinsong said:

RTX-3070 price reaction: 🤨

RTX-3080 price reaction: 😮

RTX-3090 price reaction: 😐

There I fixed it for you.

 

Keep in mind that a few years ago the GeForce GTX 980 launched at $549, and people were shocked at the high price.

So all Nvidia did was play the game: massively overprice the 2000 series, then overprice the 3000 series a bit less, making you forget what the real prices used to be and think this is a fantastic deal. It's not.
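For what it's worth, the drift is easy to quantify with just the MSRPs mentioned in this thread (GTX 980 at $549, RTX 3080 at $699 — US launch prices, not adjusted for inflation):

```python
# Launch-price comparison using only the figures cited in the thread.
prices = {"GTX 980 (2014)": 549, "RTX 3080 (2020)": 699}

base = prices["GTX 980 (2014)"]
for card, msrp in prices.items():
    pct = (msrp - base) / base * 100
    print(f"{card}: ${msrp} ({pct:+.0f}% vs GTX 980)")
```

So even the "bargain" 3080 sits roughly a quarter above the xx80 flagship price people once found shocking.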

 


35 minutes ago, Mark Kaine said:

I really like them  I also liked the previous FE designs,  but the 3070 looks awful lol wtf were they thinking. 🤣

 

Also 8GB, oof. 

True, but in Nvidia's defense, based on the Steam hardware survey, most people have a 1080p display (~65.5%).

https://store.steampowered.com/hwsurvey

 

 


Well, guess I'm getting a new GPU. Thank god they aren't the price of a new Peugeot.

LTT's Resident Porsche fanboy and nutjob Audiophile.

 

Main speaker setup is now;

 

Mini DSP SHD Studio -> 2x Mola Mola Tambaqui DAC's (fed by AES/EBU, one feeds the left sub and main, the other feeds the right side) -> 2x Neumann KH420 + 2x Neumann KH870

 

(Having a totally separate DAC for each channel is a game changer for sound quality)


36 minutes ago, GoodBytes said:

True, but in Nvidia's defense, based on the Steam hardware survey, most people have a 1080p display (~65.5%).

https://store.steampowered.com/hwsurvey

 

 

Which makes sense, since the vast majority seem to have 1050 Ti / 1060 cards or below... Sometimes I wonder who those $500+ cards are made for. Enthusiasts, I know, but that still means 8GB just isn't all that much for such a card. The 3060 would then likely have 6GB again, just like the 1060 and 2060... So Nvidia seems to think VRAM isn't that important? Three generations with no gain in VRAM capacity is a bit weird (although it's faster, I guess).

 

Maybe I'm overestimating how important VRAM is, but on the other hand, requirements will only go up, so I can definitely see 6 or 8GB becoming a "bottleneck" of sorts going forward.

 

^ I'm also biased, because the game I play most constantly maxes out my VRAM. And if you give it more, it maxes that out too!

 

[Screenshot: in-game VRAM usage at its limit]

 

(Monster Hunter World btw)

https://store.steampowered.com/app/960781/Monster_Hunter_World__High_Resolution_Texture_Pack/

 

You can get away with less but definitely not at max settings,  not even close. 

 

And I can't see a theoretical MHW2 using any less either :o

 

 

 

 

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


16 minutes ago, Derkoli said:

Well, guess I'm getting a new GPU. Thank god they aren't the price of a new Peugeot.

Probably will last longer than a Peugeot 

Dirty Windows Peasants :P ?


15 minutes ago, Mark Kaine said:

-snip-

 

Well, it is clear that they are making serious profits on these cards. I think Nvidia is happy with 8GB, as it gives people a reason to upgrade next year, hopefully to a higher-end GPU (more profit). Sucks, but that is what happens when there is no real competitor.


Yup... basically. 



1 hour ago, porina said:

 

 

I've got the B9 and will almost certainly get the 3080. The thing to keep in mind is that you don't always have to run at the upper end of the refresh range; as long as the minimums are good enough for you, you're fine. After all, the CX is G-Sync compatible.

 

It is probably far enough off, given that the rumours haven't hinted at anything in that space, although anything is possible.

I've got one as well; about time to use my Best Buy anything-warranty and upgrade to either a C9 or CX 65".

 

The rumors of a 3080 20GB from AIB partners hint to me that a 3080 Ti is ready to go.

 

4GB less VRAM and a 3090 die cut down 5-10 percent, for $999, would really round out the lineup.

 

Just like they did with Pascal: launch the 1080, 1070 and 1060, then the Titan X a few months later, and the 1080 Ti in the spring.

 

I just don't want to wait 6 months for a 3080 Ti. Though I did sell my 2080 Ti almost 3 weeks ago for $1,200 CAD (about $920 USD), lol. Man, I feel bad for that guy; he paid $1,200 for a 3070. 🤣

 

CPU | Intel i9-10850K | GPU | EVGA 3080ti FTW3 HYBRID  | CASE | Phanteks Enthoo Evolv ATX | PSU | Corsair HX850i | RAM | 2x8GB G.skill Trident RGB 3000MHz | MOTHERBOARD | Asus Z490E Strix | STORAGE | Adata XPG 256GB NVME + Adata XPG 1T + WD Blue 1TB + Adata 480GB SSD | COOLING | Evga CLC280 | MONITOR | Acer Predator XB271HU | OS | Windows 10 |

                                   

                                   


1 hour ago, F.E.A.R. said:

Press F for all those who paid a premium for the RTX 2080 Ti.

Honestly, if you bought the 2080 Ti when it first came out, you had top-of-the-line performance for a whole two years. Is it the best value ever? No, but if you care about value you aren't a flagship-card buyer anyway, and you probably don't have a monitor that would require the 2080 Ti over a more reasonably priced GPU. I bought the 2080 Ti for my 4K 144Hz monitor; it did its job beautifully and will likely be replaced by the 3090.


10 hours ago, illegalwater said:

-snip-

I'm just glad that it seemed like engineers wrote the presentation, not marketing. A good sign that the green team is on board with consumer friendliness. Maybe they've seen how Intel has ended up, so they want to keep the favour of the people while pushing the industry further.


My last two graphics cards were mid-range and I paid less than 200 euros for each, but I feel like those days really are over. I just checked my mail: my last graphics card purchase was a Sapphire Radeon HD 6870 1GB OC for 175 euros in June 2012, and before that a HIS Radeon HD 5770 1GB IceQ for 165 euros in May 2010, so it's been a while for sure.

 

But if I check right now for a graphics card between 150 and 200 euros, it's the GTX 1050 Ti, GTX 650, RX 570 and RX 5500 XT on the site I usually use. That doesn't look like mid-range; it's more like last gen. Or do I have my release dates wrong for those GPUs?


12 minutes ago, thorpyworpy said:

I'm just glad that it seemed like engineers wrote the presentation, not marketing. A good sign that the green team is on board with consumer friendliness. Maybe they've seen how Intel has ended up, so they want to keep the favour of the people while pushing the industry further.

No, they are not. None of the charts shown are quantifiable. Even claims like "3x quieter"... compared to what, exactly? How were the decibels measured? Or is it perceived loudness (which is completely subjective)? The performance chart comparing GPUs doesn't make sense either. I'll give them credit for not bothering with FPS charts, which is always good, as those are BS.
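On the "3x quieter" claim specifically: the phrase only maps to decibels if you pick a loudness model, which Nvidia didn't state. Under the common rule of thumb that +10 dB is perceived as roughly twice as loud, a 3x drop in perceived loudness works out as follows (a hypothetical reading, not what Nvidia necessarily measured):

```python
import math

# Rule of thumb: +10 dB is perceived as roughly 2x as loud.
# So an n-fold drop in perceived loudness implies a 10*log2(n) dB reduction.
def db_drop_for_loudness_ratio(n):
    return 10 * math.log2(n)

print(f"2x quieter ~ {db_drop_for_loudness_ratio(2):.1f} dB")
print(f"3x quieter ~ {db_drop_for_loudness_ratio(3):.1f} dB")
```

So "3x quieter" could mean anything from a ~5 dB drop (if they meant sound pressure) to ~16 dB (perceived loudness), which is exactly why the claim is unquantifiable without a stated method.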


1 minute ago, Rune said:

No SLI fingers :(

Correct. You'll need to get a 3090 to have it.

SLI is dead. It has been dead for a long time now.


1 minute ago, GoodBytes said:

Correct. You'll need to get a 3090 to have it.

SLI is dead.

Was planning on getting a 3090. Didn't see it there either. Am I just blind?

Edit: Aaaah, the new NVLink thingy. Coolio.


r/pcmasterrace - RIP non-AIO CPU coolers, I guess


