
We wanted to review the new RTX 5060 and RTX 5060 Ti. The problem is Nvidia didn't send us any, which makes them kind of tough to review. So instead we decided to rant about the ENTIRE 50-series launch and how it has frustrated not just consumers, but reviewers as well.


So this marks the 2nd time we get a non-review on a card launch, right? First one being the RX 590.

 

Good that you guys kept the consistency on this part.

Press quote to get a response from someone! | Check people's edited posts! | Be specific! | Trans Rights

 


Video cards have been such a shitshow the last few years that I can't say Nvidia paper-launching multiple SKUs is a surprise. Everything from manufacturers doing nothing to prevent retail scalping, to feature-locking capable hardware because it's a previous generation, to cutting PCIe lanes on lower-end cards so they're bandwidth-starved in the older computers most people will be buying them for.
 

Yeah there have been improvements, it’s just that much of the computer-building experience has become too tedious when it absolutely doesn’t need to be.

AMD Ryzen 5900X

T-Force Vulcan Z 3200 MHz 2x32GB

EVGA RTX 3060 Ti XC

MSI B450 Gaming Plus

Cooler Master Hyper 212 Evo

Samsung 970 EVO Plus 2TB

WD 5400RPM 2TB

EVGA G3 750W

Corsair Carbide 300R

Arctic fans: 4x 140mm, 1x 120mm

 


I wonder if it'll cause problems if reviewers just say:

"Until the card is in our hand, tested, and proven otherwise, just consider it as shit, don't buy"

There is approximately 99% chance I edited my post

Refresh before you reply

__________________________________________

ENGLISH IS NOT MY NATIVE LANGUAGE, NOT EVEN 2ND LANGUAGE. PLEASE FORGIVE ME FOR ANY CONFUSION AND/OR MISUNDERSTANDING THAT MAY HAPPEN BECAUSE OF IT.


Damn, Linus is on a burn roll.

MSI X399 SLI Plus | AMD Threadripper 2990WX, all-core 3 GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128GB 3200 MHz | Corsair RM1200i | 200TB raw | Asus TUF Gaming mid tower | 10Gb NIC


1 hour ago, Poinkachu said:

I wonder if it'll cause problems if reviewers just say 

Simply compare <nothing> against its most likely competitors and declare them winners in everything by default.

Remember to either quote or @mention others, so they are notified of your reply


3 hours ago, Bitter said:

They don't want reviews of the 8GB models, simple as that. That's why they didn't release 8GB samples for reviews.

Everyone should just review the 8GB versions and ignore the 16GB (but that means reviewers having to buy the useless stuff themselves, I see the "dilemma"! 🤨)

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


Great approach. Don't play their games and instead use launch day to reveal their BS. I hope other reviewers do the same. Maybe someday they'll get enough bad publicity that Jensen personally drops by marketing and goes "Who did this to me?!"

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


6 hours ago, Bitter said:

They don't want reviews of the 8GB models, ...

Obviously. It's a safe bet reviews would shred them for releasing yet another 8GB card. Except now they get shredded for hiding it instead. Not sure that's better for their PR.

 

Probably still enough people who will buy it regardless.

 

2 hours ago, Mark Kaine said:

(but that means reviewers having to buy the useless stuff themselves, I see the "dilemma"! 🤨)

I'm sure most of them (at least the bigger channels) have no issue spending the money.

 

The actual issue is that buying the card means the review can't be published until the card is already on retail shelves, and that's assuming they even manage to get one early enough for the review to still matter.

Remember to either quote or @mention others, so they are notified of your reply


I was looking forward to reviewers having a field day showing how the 8GB model tanks performance compared to the same card with 16GB at realistic settings.

 

Unfortunately, even the RX 9060 is rumored to have 8GB of VRAM. Only Battlemage has entry-level cards with 10GB and 12GB VRAM buffers, but it's really hard to get them at MSRP. It feels like Intel isn't making nearly enough of them.
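Out of curiosity, a back-of-envelope VRAM budget shows why 8GB is tight at 1080p these days. All the numbers below are made-up round figures for illustration, not measurements from any real game:

```python
# Rough, illustrative VRAM budget for a modern game at 1080p.
# Every figure is a hypothetical round number, not a measurement.

def render_targets_mb(width, height, bytes_per_pixel, target_count):
    """Memory for framebuffers: color, depth, G-buffer, etc."""
    return width * height * bytes_per_pixel * target_count / 1024**2

targets_mb = render_targets_mb(1920, 1080, 4, 8)  # ~63 MB: surprisingly small
textures_mb = 5500   # the high-res texture pool is what dominates
geometry_mb = 800    # meshes, plus BVH structures if ray tracing is on
misc_mb = 700        # shaders, staging buffers, driver overhead

total_gb = (targets_mb + textures_mb + geometry_mb + misc_mb) / 1024
print(f"estimated VRAM use: {total_gb:.1f} GB")  # ~6.9 GB, already brushing an 8GB card
```

The point the arithmetic makes: the framebuffers themselves are tiny, it's the assets that blow the budget, which is why dropping resolution alone doesn't rescue an 8GB card.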


I've felt for a while that we were approaching the end of Nvidia dGPUs in the consumer market. Nvidia has been telegraphing that it wants out of consumer for years, but it took a full decade for the CUDA server parts to go from "95% profit margin! (ignore that we don't sell any!)" to being the core of their business. I thought the 2030 cycle would be their last in consumer (and I should probably find where I said that a number of years ago), but it might honestly be the next cycle, probably in 2028. The 6090 could be the last high-end consumer card from Nvidia.

 

The business case for staying in the market is going to get interesting. They probably will stay, from a revenue point of view, but frankly every die they send to consumer is worth more in compute, and every wafer they have to use for low-end cards could be worth 5x more as compute cards.
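The wafer argument is just arithmetic. A toy sketch (every number here is an invented placeholder, not real Nvidia pricing or yield data):

```python
# Toy wafer-economics comparison: consumer dies vs compute dies.
# All inputs are invented placeholders, not real Nvidia numbers.

def wafer_revenue(dies_per_wafer, yield_rate, asp_usd):
    """Revenue from one wafer: good dies times average selling price."""
    return dies_per_wafer * yield_rate * asp_usd

# Hypothetical: many small consumer dies vs a few big compute dies.
consumer_usd = wafer_revenue(dies_per_wafer=300, yield_rate=0.90, asp_usd=120)
compute_usd = wafer_revenue(dies_per_wafer=60, yield_rate=0.70, asp_usd=8000)

print(f"consumer wafer: ${consumer_usd:,.0f}")
print(f"compute wafer:  ${compute_usd:,.0f}")
print(f"ratio: {compute_usd / consumer_usd:.1f}x")
```

With placeholder inputs like these the gap comes out around 10x; the exact multiple depends entirely on the inputs, but the shape of the incentive is the same.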


3 minutes ago, Taf the Ghost said:

but it took a full decade for the CUDA server parts to go from "95% profit margin! (ignore we don't sell any!)" to it being the core of their business now.

It's pure speculation on my side, but I believe this AI bubble is going to pop, as there is no way for VCs to get an ROI on current or near-future models. It's a race to the bottom, and local models are nearly on par with proprietary ones. After that, Nvidia will have to return to its core business of gaming GPUs.

 

It's unfortunate Intel and AMD are having such a hard time catching up to Nvidia. The mid and low end of the market are up for the taking...


11 hours ago, SorryBella said:

So this marks the 2nd time we get a non-review on a card launch, right? First one being RX590. 

 

Good that you guys kept the consistency on this part.

The review will come later on Short Circuit, as Linus mentioned in the recent LTT Short on YouTube.

AMD Ryzen™ 5 5600g w/ Radeon Graphics | 16GB DDR4-3200 RAM | 256GB NVME SSD + 2TB HDD | Amazon Basics 2.0 Speakers

 

I'M JUST A REAL-LIFE TOM SAWYER


2 hours ago, 05032-Mendicant-Bias said:

It's pure speculation on my side, but I believe this AI bubble is going to pop, ...

I don't disagree on the AI bubble, which is probably why Nvidia won't actually leave Consumer. However, Jensen has been trying everything to get out of consumer since at least 2001.


9 hours ago, Bitter said:

To be clear, I'm not against 8GB cards, I'm against overpriced 8GB cards. I'm using a 6GB 1660S in a system for non-gaming and it's fine. 8GB is fine if you're just after the media engine or compute power.

8GB is also fine if this were, like, a $200 or at most $250 1080p card.
I think we kind of lose the plot over how much VRAM you need, especially at 1080p. For a budget card 8GB is fine, but $300, even with inflation, shouldn't be considered budget, and the lineup should have been more like a 10-12GB 5060 and a 16GB-only 5060 Ti.


21 minutes ago, DJ_Jay125 said:

8GB is also fine if this were, like, a $200 or at most $250 1080p card.
I think we kind of lose the plot over how much VRAM you need, especially at 1080p. For a budget card 8GB is fine, but $300, even with inflation, shouldn't be considered budget, and the lineup should have been more like a 10-12GB 5060 and a 16GB-only 5060 Ti.

100%, you get it. Budget cards have their place, but the important thing is that they're budget cards. $300 isn't budget card territory, that's lower-mid-range. The 5060 Ti 16GB should be $350ish, the Ti 8GB $275-300ish, and the non-Ti 8GB $225-250ish.


11 hours ago, Taf the Ghost said:

I don't disagree on the AI bubble, which is probably why Nvidia won't actually leave Consumer. However, Jensen has been trying everything to get out of consumer since at least 2001.

I think this is interesting considering Silicon Graphics kind of died by ignoring the consumer space.


23 hours ago, Bitter said:

They don't want reviews of the 8GB models, simple as that. That's why they didn't release 8GB samples for reviews.

Nah. 

 

Nvidia rushed the most expensive cards out before the tariffs could hit; consequently, drivers have been trash for the last 4 months, because they've clearly skipped some QA steps (see the missing ROPs).

 


6 hours ago, Brian McKee said:

I think this is interesting considering Silicon Graphics kind of died by ignoring the consumer space.

 

While the 90s GPU market was a wildly different place, withdrawing from the consumer space still comes with marketing and PR consequences. I feel like Nvidia probably ends up in a spot where they just don't prioritize consumer, so volume is whatever is still profitable on a per-unit basis, and they spend the 2030s basically riding the coattails of their previous highs. dGPUs will never be a top priority for either AMD or Intel, so the logic ends up revolving around it not actually being a competitive space.

 

Nvidia absolutely will keep making GPUs; it's just that everything will be server-first and then squeezed into a consumer dGPU along the way. At least I think they'll try to do that, then Jensen will see the results and fire a bunch of people. Jensen has kept Nvidia structured in such a way that it can change course extremely rapidly, so even long-term plans can shift in ways other companies simply don't.


57 minutes ago, Taf the Ghost said:

 

While the 90s GPU market was a wildly different place, withdrawing from the consumer space still comes with marketing and PR consequences. ...

With how VLSI works, there's no reason not to do a dGPU. You design a new SM every two years; that SM then becomes the backbone of every GPU-acceleratable ASIC across all sectors.

The engineering that goes into the SM is amortized. Every die is a different mix of SMs, cache, and analog IP blocks that all have to communicate with each other, but it's not like that knowledge is useless; all the engineering know-how and skills gained go into the next ASIC for a Jetson or a professional GPU.

Nvidia has zero reason to leave the consumer gaming dGPU market when the investment is so "low", given it doesn't use any IP blocks unique to itself.

This was the whole point of CUDA and GPGPU.

The GB100/GB200 still uses... Blackwell SMs.

Intel wants to be in dGPUs because it forces them to improve the Xe cores, for example, which go into all their CPUs.

AMD bought ATI for Fusion (APUs), so they invest in dGPUs so they can have a robust custom SoC business, which has paid off for them with the consoles and the Steam Deck (way later than they had hoped).

The gaming GPUs feed the pro environment, and vice versa. Tensor cores were on Volta first, for example, not on the RTX cards.
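The amortization point can be sketched in code: one SM design reused across several die configurations. All the names, counts, and cache sizes below are invented for illustration, not real die specs:

```python
# Sketch of IP-block reuse: one SM design amortized across many dies.
# Names, counts, and cache sizes are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class SM:
    generation: str   # designed once per ~2-year cycle
    fp32_lanes: int
    tensor_cores: int

@dataclass
class Die:
    name: str
    sm: SM            # the same shared IP block in every product
    sm_count: int
    l2_cache_mb: int

blackwell_sm = SM(generation="Blackwell", fp32_lanes=128, tensor_cores=4)

# Different mixes of the same SM, cache, and analog blocks per market:
lineup = [
    Die("server-compute", blackwell_sm, sm_count=144, l2_cache_mb=96),
    Die("consumer-flagship", blackwell_sm, sm_count=170, l2_cache_mb=88),
    Die("consumer-entry", blackwell_sm, sm_count=36, l2_cache_mb=32),
]

# One SM design, N products: per-die engineering is mostly integration work.
assert all(d.sm is blackwell_sm for d in lineup)
for d in lineup:
    print(f"{d.name}: {d.sm_count} x {d.sm.generation} SM, {d.l2_cache_mb} MB L2")
```

The design choice the sketch mirrors: the expensive artifact (the SM) is designed once, and every product, server or consumer, is mostly a question of how many you stamp down next to how much cache.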

