
Making Nvidia’s CEO mad

So much for getting one for my workstation! Got stuck checking out, and then when I finally got to the payment details, the page reset and said out of stock. :(

[Screenshots: checkout page resetting to an out-of-stock message]

5950X | NH D15S | 64GB 3200MHz | RTX 3090 | ASUS PG348Q+MG278Q

 


8K with DLSS... so it can't really do native 8K then? That'll be a strong pass from me, dog. That's like giving grandma coke and saying she speeds around everywhere all the time, when it only happens after she partakes of the devil's powder lol

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


7 hours ago, GabenJr said:

But is it good enough to fork over $1500?

The gist I got from the video is a hard no.


I think the takeaway from this is that unless you are doing workstation-type jobs, the 3080 is the card to get. The 3090 is just overpriced, and the majority of games simply DON'T NEED 24GB VRAM. Let's be realistic, "8K gaming" is nothing more than a TECH DEMO right now, but I'd love to get some of what nVidia's marketing people are smoking xD

 

Yes, if you want to make Jensen mad, it's that you are angry he is milking gamers for things they don't need. A cheaper 3090 with half the memory (let's call it a 3080 Ti) would have been an easier sell and a more attractive product. Of course, you cannot even buy the card anyway, because of a combination of limited availability and bots.

 

Also, considering that many 2080 Ti cards were north of $1200, the $1500 asking price isn't all that crazy. The real losers were those who paid $2500 for an RTX Titan... and then $200 more for a waterblock and backplate. Let's also not forget that this "price inflation" began with the 1080 Ti and the mining craze from two years ago. I certainly remember the day I could have bought four EVGA HydroCopper 1080 Tis for $800 a pop and sold them for DOUBLE just a few months later.

 

It would be interesting to test the performance of two 3090s with the NVLink connector attached, but since nVidia has SAID NOTHING about the bridge needed to get this to work, it has to make you wonder how serious they are about the card's capabilities. SLI has always been their performance crown, and it would be stupid of them to simply abandon NVLink given that it is still relatively new and natively included on all Quadro cards.

 

As for testing, it would be nice to see 3440x1440 results, since that falls right between 4K and normal 1440p. I also think that at that resolution this card would work fine with an older CPU like an 8-core 5960X.

 

Finally, when it comes to using that 24GB of VRAM, you guys should fire up X-PLANE and turn all the dials to 11. It's the only sim/game I know of that can bring even a high-end system to its knees, and on paper the 3090 would be the right card for it.


10 minutes ago, Luscious said:

"8K gaming" is nothing more than a TECH DEMO right now

Ya, I have told lots of friends something like this as well.

 

Not only is an 8K TV or monitor basically only affordable to the upper 0.5% of the population... but even if you got something to play it on, you are literally limited to medium (maybe high at most) settings with DLSS (upscaled from 1440p!). And yes, I'm assuming that still looks SICK, but this would be the equivalent of trying to game at 4K back when 1080p TVs cost $2,000; it just isn't really a thing that can happen for anyone save the very, very few at the top in terms of hardware and display costs.
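
For perspective, here's a quick back-of-the-envelope in Python on the pixel counts involved (assuming the usual 2560x1440 for "1440p"; this is just arithmetic, not anything from the video):

# Raw pixel counts: what "upscaled from 1440p to 8K" actually means.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "4K (3840x2160)": 3840 * 2160,
    "8K (7680x4320)": 7680 * 4320,
}
base = resolutions["1440p (2560x1440)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 1440p")
# 8K is 4x the pixels of 4K and 9x the pixels of 1440p, so in that "8K DLSS"
# mode roughly 8 of every 9 pixels on screen are reconstructed, not rendered.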

 

I am all for the push forward in tech; I would love to play on a console on an 8K 70-inch TV at 30-60 FPS, and this is certainly a step in that direction...

 

And I will also concede that if you want to game at 8K, this is truly the only card that can do it (the cores + VRAM on any other card wouldn't cut it). So they aren't wrong that it is THE 8K GPU... but I also don't want to play games at 15-30 FPS (which is what you would be playing the majority of games at without DLSS).

 

Good on them for pushing the envelope, but not many are going to want to pay that price for it.

El Zoido:  9900k + RTX 4090 / 32 GB 3600MHz RAM / Z390 Aorus Master 

 

The Box:  3900x + RTX 3080 / 32 GB 3000MHz RAM / B550 MSI Mortar 


@Valentyn I think you should wait until they release the real Titan of this generation.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


Damn, Linus wanted to talk about the card so much, the intro was skipped, and this is the fastest I've ever heard him go through a sponsor. 11 seconds. World Record! :P

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


1 hour ago, Luscious said:

the majority of games simply DON'T NEED 24GB VRAM

Based on the top 10 best sellers on Steam: https://store.steampowered.com/search/?filter=topsellers

No Man's Sky might benefit from the extra VRAM (I haven't really played it since it was released, and now it has VR), but #10 (Satisfactory) most definitely would. However, I do agree with you that 24GB is overkill, and IMO spending $1500 on a $40 game or two is simply not worth it. Top sellers don't equal top played though, and if the survey is correct the most popular card on Steam is a 1060, so if the 1070 eats 💩, what would the 1060 do 🤣 My GPU hits 6GB of its 8GB of VRAM just playing the game, though I do have other stuff open that isn't usually actively running.
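
Side note: if anyone wants to sanity-check their own usage, this is roughly how I watch VRAM while a game is running. A minimal Python sketch that just polls nvidia-smi; it assumes the tool is on your PATH and a single GPU.

# Minimal VRAM monitor: poll nvidia-smi once per second (single-GPU assumed).
import subprocess, time

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total,utilization.gpu",
         "--format=csv,noheader,nounits"]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    used, total, load = [field.strip() for field in out.split(",")]
    print(f"VRAM: {used}/{total} MiB   GPU load: {load}%")
    time.sleep(1)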

 

1 hour ago, NumLock21 said:

Damn, Linus wanted to talk about the card so much, this is the fastest I've ever heard him go through a sponsor. 11 seconds. World Record! :P

He also technically skipped the intro.

Though this happens every year with Nvidia, and everyone will forget about it in 4-6 weeks' time.


Buy an RTX 3080:
On Amazon (PAID LINK): sorry
On Newegg (PAID LINK): none left
On B&H (PAID LINK): RIP

 

This was gold.

MSI X399 SLI Plus | AMD Threadripper 2990WX all-core 3GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128GB 3000MHz | Corsair RM1200i | 150TB | ASUS TUF Gaming mid tower | 10Gb NIC


Since there is no thread for the Nvidia-sponsored 8K video, I will say this here:

 

At the beginning of the video Linus says he is getting an LG ZX, but the box clearly shows it is the Z9. That's all. Carry on.


7 hours ago, Fatih19 said:

@Valentyn I think you should wait until they release the real Titan of this generation.


Now I want both!

5950X | NH D15S | 64GB 3200MHz | RTX 3090 | ASUS PG348Q+MG278Q

 


@GabenJr

Hi, in the video there was a mention of the test bench needing to upgrade from its 850W power supply.

 

Did the system have shutdown issues, and is that why you guys increased the wattage of the PSU?

 

Which PSU was the 850W unit?

It would be nice to know, as the RTX 3080 is known to be quite hard on transient-sensitive PSUs, causing shutdowns.

 

And what unit did you guys upgrade to? I looked for the PSU in the description and in the test bench specs, but there was no mention.


10 hours ago, Zberg said:

Not only is an 8K TV or monitor basically only affordable to the upper 0.5% of the population...

but but but how can I game if I can't game at 8K on a 32-inch monitor? I need more pixels :(

To me this falls under ultrawide syndrome, where people get Xbox-mad over games not having full ultrawide support, or over a game not being able to do ultrawide + 144 FPS on their monitor, and if the game can't do that then the devs are big meany poopy heads who want to kill their children and are specifically targeting them.

 

To 99% of people, 1080p/1440p is still going to be perfect (one of the reasons I might pick up an Xbox Series S is that I know I don't need 4K-ish performance; I'm still fine with 1080p). 4K will be it for the rest. If you absolutely need to play at 8K then, well... you're gonna have to wait lol. It was extremely naïve to expect to go from 4K to 8K native from the 2000 series to the 3000 series, no matter what Nvidia said.


17 hours ago, Quinnell said:

What makes you say that?

Because there is no space for it in the product stack. 

The gap between the 3080 and the 3090 is 7-10%. It can't be faster than the 3080 without totally killing off the 3090. Also, while the 3090 is a good workstation card, it's not a great one, so you can't really use productivity as a differentiator. And if the 20GB 3080 is real after all, releasing a 12GB version of the 3090 makes no sense either.

Ex-EX build: Liquidfy C+... R.I.P.

Ex-build:

Meshify C – sold

Ryzen 5 1600x @4.0 GHz/1.4V – sold

Gigabyte X370 Aorus Gaming K7 – sold

Corsair Vengeance LPX 2x8 GB @3200 Mhz – sold

Alpenfoehn Brocken 3 Black Edition – it's somewhere

Sapphire Vega 56 Pulse – ded

Intel SSD 660p 1TB – sold

be Quiet! Straight Power 11 750w – sold


How to get Jensen mad? I still use a 710 card (and only because the 210 I had went haywire) :P Silent, adequate for my non-gaming needs and cheap!

"You don't need eyes to see, you need vision"

 

(Faithless, 'Reverence' from the 1996 Reverence album)


In the video Linus mentions that even their 850W PSU struggled. How?? The GPU draws at most 450W peak, so there's at least 300-350W of headroom before even maxing it out, and even a 10900K won't touch that power draw. I'm thoroughly conflicted. No total system power draw chart this time round.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


2 hours ago, Dutch_Master said:

How to get Jensen mad? I still use a 710 card (and only because the 210 I had went haywire) :P Silent, adequate for my non-gaming needs and cheap!

No, if you actually want to get Jensen mad, just use an APU from AMD.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


2 hours ago, Quadriplegic said:

Because there is no space for it in the product stack. 

The gap between the 3080 and the 3090 is 7-10%. It can't be faster than the 3080 without totally killing off the 3090. Also, while the 3090 is a good workstation card, it's not a great one, so you can't really use productivity as a differentiator. And if the 20GB 3080 is real after all, releasing a 12GB version of the 3090 makes no sense either.

Maybe a price drop for the 3090? But that would be a spit in the face of early adopters who waited in line for days at a brick-and-mortar store.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


On 9/24/2020 at 4:00 PM, GabenJr said:

Nvidia’s not done yet – Their brand new, fastest-ever GPU just got smoked by their own RTX 3090. But is it good enough to fork over $1500?

 

 

Buy an RTX 3090:
On Amazon (PAID LINK): TBD
On Newegg (PAID LINK): TBD
On B&H (PAID LINK): TBD

 

Buy an RTX 3080:
On Amazon (PAID LINK): sorry
On Newegg (PAID LINK): none left
On B&H (PAID LINK): RIP

"bringing 4k to somewhere in the future TODAY!

 

Forgetting that we have had 4K monitors for years now, and Nvidia sandbagging their GPUs and delaying new generations has led us to spend so many $$$$ over so many years and yet still not be able to play on 4K monitors!

 

Also, while 4K is quite playable now in most games, it still struggles to support all 4K monitors (there are 4K monitors with refresh rates above 60Hz, you know... those barely get saturated even with the $1500 3090 that came years after the advent of 4K monitors lol).

 

"rated at 350 watts power consumption"

 

So now power-hungry GPUs are not something we are booing? :P

 

"the cut down PCB that makes nvidia's NEW flow through cooler POSSIBLE"

 

But wait, there is mooooore!! :P So yeah, it is a FEATURE, not a bug, hahahaha, as if conventional cooling couldn't cool a 3090. Presenting an obvious flaw (in terms of power delivery etc.) of a smaller PCB with fewer, more densely packed passive components as something good, and not sleazy Jensen trying to make extra profit by crippling the PCB...

 

"But wait double the video memory but why? "

 

Well, the damn thing has a cut-down PCB and still costs $1500, and with memory chip prices sinking there is plenty of room to add DOUBLE the memory (which is useless, since it's already more than enough for 4K, which the card can't even play above 120 FPS so that we could make use of high-refresh-rate 4K monitors), all in order to have one more marketing trick up their sleeve to push children to steal their parents' cc and buy one lol

 

"gaming at 8k also demands this much video memory"... a) it does not at least in most cases with nowadays titles b) the damn thing cant play 8k games not in a decent framerate at least despite costing 1500$ (the budget of an entire gaming PC or at least one you could buy not so many years back before nvidia started to inflate its prices) 

 

"1440p->8k via dlss delivers significantly more detail than 4k does"

 

Did Linus even look at the clips he put in the video? I seriously doubt he did; he just read the autocue that was copy-pasted from Nvidia's pamphlet that came with the 3090 lol... I mean, the DLSS cut is blurrier than the 4K footage; you'd have to be blind not to be able to tell the difference lol...

 

And at this point I'll stop, we get the picture... Nvidia is a nice friend to have, let's become their cheerleader lol...


1 hour ago, Samfisher said:

In the video Linus mentions that even their 850W PSU struggled. How?

Given that the RTX 3080 has terrible transients. Worse than Vega, which had notably bad transients.

 

It has to do with the specific PSU they are running. Given other reports of 750W and 850W Seasonic Prime PSUs tripping on the RTX 3080, it should be fairly obvious what unit they were running.

 

It would also explain why Seasonic wasn't sponsoring the video.

 

Though I'm still waiting for confirmation from @GabenJr on what unit it is. And Seasonic not sponsoring the video could just be entirely coincidental.

 

It would be nice to have a list of units that trip on Ampere for no other reason than transients.
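
To put rough numbers on why "headroom on paper" doesn't save you, here's a small illustrative Python budget. The CPU/system figures and the spike multiplier are assumptions for illustration only, not measurements of any specific card or of LMG's bench.

# Illustrative only: why a transient spike can trip an "adequate" PSU.
PSU_RATED_W = 850        # continuous rating of the unit
CPU_PEAK_W = 250         # assumed worst-case CPU package power
REST_OF_SYSTEM_W = 75    # assumed board, RAM, fans, drives
GPU_NOMINAL_W = 450      # roughly the 3090's peak board power
SPIKE_FACTOR = 1.6       # assumed millisecond-scale transient multiplier

steady = CPU_PEAK_W + REST_OF_SYSTEM_W + GPU_NOMINAL_W
spike = CPU_PEAK_W + REST_OF_SYSTEM_W + GPU_NOMINAL_W * SPIKE_FACTOR

print(f"Steady state: {steady} W ({PSU_RATED_W - steady} W of headroom)")
print(f"During a GPU transient: {spike:.0f} W ({PSU_RATED_W - spike:+.0f} W vs the rating)")
# Fast over-power/over-current protection reacts to the spike, not the average,
# which is how an "850 W" build can still shut down.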


4 minutes ago, GoldenLag said:

Given that the RTX 3080 has terrible transients. Worse than Vega, which had notably bad transients.

 

It has to do with the specific PSU they are running. Given other reports of 750W and 850W Seasonic Prime PSUs tripping on the RTX 3080, it should be fairly obvious what unit they were running.

 

It would also explain why Seasonic wasn't sponsoring the video.

 

Though I'm still waiting for confirmation from @GabenJr on what unit it is. And Seasonic not sponsoring the video could just be entirely coincidental.

 

It would be nice to have a list of units that trip on Ampere for no other reason than transients.

I doubt there is a correlation between this and Seasonic not wanting to sponsor the video; if anything it gives them a reason to push their even more expensive 1000W+ units lol


2 minutes ago, papajo said:

I doubt there is a correlation between this and Seasonic not wanting to sponsor the video; if anything it gives them a reason to push their even more expensive 1000W+ units lol

They didn't include the Seasonic 12-pin Micro-Fit cable that they included in the RTX 3080 review.

 

So pardon my conspiracy theory. 

 

Still, it would be nice to know what PSUs they used, because an 850W unit that trips should be noted down on a list of transient-sensitive PSUs.


24 minutes ago, GoldenLag said:

They didn't include the Seasonic 12-pin Micro-Fit cable that they included in the RTX 3080 review.

 

So pardon my conspiracy theory. 

 

Still, it would be nice to know what PSUs they used, because an 850W unit that trips should be noted down on a list of transient-sensitive PSUs.

The 850W unit was a SeaSonic PRIME Platinum SSR-850PD. I can confirm that it was a coincidence that SeaSonic didn't sponsor the video; sponsors don't work that way, and we didn't say anything to them either way (the unit is older-gen).

 

As for the rest of this thread, I don't know why anybody actually cares about 8K gaming, and it was only included because Nvidia marketed the card that way (similar to GamersNexus' focus on 8K gaming benchmarks because that's what Nvidia was yelling about). Our conclusion was that nobody outside of the niche enthusiast and maybe very complex 3D modeling space should even consider the card, so I'm not sure why people think we're trying to market the thing. You can look at something's strengths without crapping on it at every turn. That's all I have to say about it.

Emily @ LINUS MEDIA GROUP                                  

congratulations on breaking absolutely zero stereotypes - @cs_deathmatch

