
UPDATE: NVIDIA backtracks - Hardware Unboxed blacklisted from receiving GeForce FE review samples over “focus on rasterization over ray-tracing”

Message added by WkdPaul: Reminder to follow the Community Standards when interacting with others.

6 minutes ago, SpaceGhostC2C said:

DLSS is not a gimmick, just as anti-aliasing is not a gimmick. I also see no coverage of anti-aliasing in graphics card reviews.

Nvidia is not marketing the cards on the anti-aliasing feature, the same way no one is making shadow or occlusion benchmarks. What an awful comparison; are those gimmicks to you? Jesus lol

 



IIRC, Hardware Unboxed has never dismissed DLSS since 2.0, even praising it wholeheartedly in a really long piece Tim made when DLSS 2.0 was released.

 

It's just RT that they're not too focused on as of now.


Just now, Rohith_Kumar_Sp said:

Nvidia is not marketing the cards on the anti-aliasing feature, the same way no one is making shadow or occlusion benchmarks. What an awful comparison; are those gimmicks to you? Jesus lol

Are you reading backwards? I said kind of explicitly that those aren't gimmicks, the same way I said DLSS is not "a gimmick". They do prove that not all features get coverage; certainly not all features that make low resolutions look better on large displays get that coverage.

But you provide the best evidence of my point I could wish for:

Quote

Nvidia is not marketing the cards on the anti-aliasing feature,

Exactly, that's the only difference: that they are choosing to market their cards on it. Which is fine, but trying to make others implement their marketing strategy instead of producing independent reviews is what's wrong. And the fact that you defend that what is and is not worth allocating time to in a review should be based on the companies' marketing... well, yes, "jesus" indeed.


2 hours ago, Shorty88jr said:

I don't fully get the hate? Hardware Unboxed is pretty AMD-biased in his reviews. He repeatedly mentioned AMD and their 16GB of RAM of all things while basically glossing over ray tracing. Also, on top of that, ray tracing is the future. I didn't believe it a few years ago, but looking into it more, there's literally no point in buying AMD without good ray tracing. The technology that is there is beyond what most people can even fathom. Video game graphics are being completely overhauled on a level not seen since the jump from 2D to 3D.

 

I wanted to highlight this type of post because of discussions I've had around here about what marketing manipulation looks like. This post is just specific enough, while being just ignorant enough, that it's practically impossible to tell whether it's a real user that's clueless or marketing manipulation by a bot agency. It's fascinating to behold because you can't really tell, but this is what it looks like. It's also what happens on every social media platform.

 

As for the topic at hand, no, HUB isn't AMD-biased. They just spent a week ripping AMD over the marketing around the 6800 launch. Nvidia just barely avoided paper-launching a new generation of cards and is trying to do something to make everyone look away.

 

As for ray tracing, sorry to burst anyone's bubble, but the current generation of RT performance is just a down payment on what it'll eventually be. RTX Gen 1 really was a tech demo. The barely improved RTX Gen 2 is only not a tech demo when you throw a massive number of units at it. AMD's implementation appears to be significantly better in terms of silicon space for performance, but we need to wait for optimization passes to get a sense of whether it's just a down payment or whether there's something useful in this generation. The reality is that it's just a fun feature for at least another 2 generations. That's the way it is. This is something of a chicken-and-egg issue, which is why it was Microsoft that kicked off the slow transition. And it's going to be really slow.

 

There are only 3 things that matter in GPU sales for the next several years: raster performance per price tier, hardware-accelerated media encoders, and hardware-based streaming. Nvidia can easily leverage their advantages in the latter two without being knuckleheads about things. There's some deep irony that a lot of the negative angles we've seen on the Navi 21 launch stem from about a decade of AMD's North American PR departments pissing off reviewers, and now it's suddenly Nvidia that's in foot-shooting mode.


5 minutes ago, Taf the Ghost said:

There's some deep irony that a lot of the negative angles we've seen on the Navi 21 launch stem from about a decade of AMD's North American PR departments pissing off reviewers, and now it's suddenly Nvidia that's in foot-shooting mode.

NVIDIA's PR department saw AMD's Radeon PR department pissing off a fair number of reviewers and thought "We need to one-up them".

 

So this happened. :P


I don't understand why anyone's really mad.

I'm disappointed, but at what, I couldn't tell ya yet.

If any AIB wanted them to focus on a feature, thermals, or having longer boost times, are you going to comply?

Sent a review sample and asked to focus on a feature; isn't that why you get them from them for free?

Ofc.

And Nvidia blacklisting them? Is this the first time they didn't comply?

If it's a first-time infraction, then I'm disappointed in Nvidia here.


1 hour ago, SpaceGhostC2C said:

DLSS is not a gimmick, just as anti-aliasing is not a gimmick. I also see no coverage of anti-aliasing in graphics card reviews.

What is kind of dishonest is to present DLSS as a performance boost for higher resolutions: just like AA, DLSS is a smoothness boost for lower resolutions (I also don't see any coverage of "Radeon Boost", btw, which is also a trick to lower resolutions when less perceptible by the player). Showing "4K without DLSS, 4K with DLSS" is misleading, just like no one would take seriously a graph that shows "1440p without anti-aliasing, 1080p with anti-aliasing" as if they were the same resolution and therefore similar measures of performance.

 

Ultimately, whether DLSS and RT are worth covering should be the decision of independent reviewers, and we can choose whether to give our viewing time, and even monetary support, to those making a big deal of DLSS, like LTT, to those RT evangelists, like the dude seeing his life flash before his eyes, or to the most dismissive ones, like HUB. Or to several of them, in order to have different perspectives. We know brands will try to influence the narrative through their marketing, but there is a limit to how much they can try to fully control it before turning unethical.

 

 

DLSS is definitely a gimmick. But it does work. Somewhat. If the developers put in the work (i.e. get paid to) to make it work decently. The best implementation is still Control, which used "DLSS 1.9", which is somehow better than the current 2.0 implementation. It's also a gimmick that's missing half of its original promise. "DLSS 2X" was going to be a thing, lol. But it also isn't really superior to downscaling the resolution by 30% and adding sharpening. Which is pretty much what we can expect AMD to do at the driver level with their upcoming method. (You downscale and supersample, then have some sort of context-aware sharpening function.) Though maybe it'll require a little work from developers to implement. We'll just have to see.
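
As an aside, here is a minimal Python sketch (using numpy and Pillow) of the kind of driver-level pipeline described above: render at a reduced internal resolution, upscale to the display resolution, then sharpen. The 70% scale mirrors the "downscaling by 30%" mentioned above, and a plain unsharp mask stands in for a real context-aware sharpener (AMD's CAS, for example); treat every number here as an illustrative assumption.

# Sketch of "render low, upscale, sharpen" (illustrative assumptions throughout).
import numpy as np
from PIL import Image

OUTPUT = (3840, 2160)  # display resolution (width, height)
SCALE = 0.7            # ~70% per-axis internal render scale ("30% downscale")

internal = (int(OUTPUT[0] * SCALE), int(OUTPUT[1] * SCALE))
ratio = (internal[0] * internal[1]) / (OUTPUT[0] * OUTPUT[1])
print(f"internal render {internal[0]}x{internal[1]} = {ratio:.0%} of output pixels")

# Placeholder for a frame rendered at the internal resolution.
frame = (np.random.rand(internal[1], internal[0], 3) * 255).astype(np.uint8)

# Upscale to display resolution (bilinear here; real pipelines use better filters).
up = np.asarray(Image.fromarray(frame).resize(OUTPUT, Image.BILINEAR), dtype=np.float32)

def box_blur(img):
    # 3x3 box blur built from shifted slices of an edge-padded copy.
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    return sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0

# Unsharp mask: boost the difference between the image and its blur.
# A genuinely "context-aware" sharpener adapts this strength per pixel.
sharp = np.clip(up + 0.5 * (up - box_blur(up)), 0, 255).astype(np.uint8)
final = Image.fromarray(sharp)  # the frame that would actually be presented

Note the printout: at a 0.7 scale you only shade about half the output pixels, which is also why "native 4K vs 4K with upscaling" bar charts show such large gains; they compare different internal rendering resolutions.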

 

The main reason to push DLSS is so that "RTX" doesn't become attached to "joke" or "doesn't work" or "look at the stutter, bro!". We had almost meme'd it out of existence after the first year. Without downscaling, RT performance just isn't there. It's a huge cost for very little. In the case of Cyberpunk, you can see the purpose from videos, but it's at a huge cost still. That'll always be true of the current hardware. It's only several generations from now where that isn't the case. It's a lot like the early days of different AA approaches. SMAA used to be impossible to run with, even if it was supported in hardware. This is the nature of the "it works but it isn't useful" tech in GPUs. You have to build an ecosystem to support something before things can move over.


18 minutes ago, D13H4RD said:

NVIDIA's PR department saw AMD's Radeon PR department pissing off a fair number of reviewers and thought "We need to one-up them".

 

So this happened. :P

I was joking with someone that AMD might have shipped their terrible PR off to their competitors to ruin them. Considering some of the stunts Intel and Nvidia have pulled lately, it really might not have been a joke. We're so used to AMD having the incompetent PR that it's strange to think it's not AMD doing some of these stupid stunts lately.


3 minutes ago, Taf the Ghost said:

DLSS is definitely a gimmick. But it does work. Somewhat. If the developers put in the work (i.e. get paid to) to make it work decently. The best implementation is still Control, which used "DLSS 1.9", which is somehow better than the current 2.0 implementation. It's also a gimmick that's missing half of its original promise. "DLSS 2X" was going to be a thing, lol. But it also isn't really superior to downscaling the resolution by 30% and adding sharpening. Which is pretty much what we can expect AMD to do at the driver level with their upcoming method. (You downscale and supersample, then have some sort of context-aware sharpening function.) Though maybe it'll require a little work from developers to implement. We'll just have to see.

 

The main reason to push DLSS is so that "RTX" doesn't become attached to "joke" or "doesn't work" or "look at the stutter, bro!". We had almost meme'd it out of existence after the first year. Without downscaling, RT performance just isn't there. It's a huge cost for very little. In the case of Cyberpunk, you can see the purpose from videos, but it's at a huge cost still. That'll always be true of the current hardware. It's only several generations from now where that isn't the case. It's a lot like the early days of different AA approaches. SMAA used to be impossible to run with, even if it was supported in hardware. This is the nature of the "it works but it isn't useful" tech in GPUs. You have to build an ecosystem to support something before things can move over.

After seeing it in person, I know it's not a gimmick.

Thing I hate is how every review compares it with everything else maxed. Have you tried it with other settings dialed down?

It becomes subjective, but it's still fun to play around with for your own taste on visuals.


2 minutes ago, pas008 said:

After seeing it in person, I know it's not a gimmick.

Thing I hate is how every review compares it with everything else maxed. Have you tried it with other settings dialed down?

https://www.merriam-webster.com/dictionary/gimmick

 

See definitions 2.b and 2.c.

 

It's a "gimmick" in the proper sense, actually. Just not a trick. It is working technology, after several iterations of significant failures at doing its job. But it is pushed in marketing to confuse situations and make things "viable" that really aren't yet.


9 minutes ago, pas008 said:

After seeing it in person, I know it's not a gimmick.

Thing I hate is how every review compares it with everything else maxed. Have you tried it with other settings dialed down?

It becomes subjective, but it's still fun to play around with for your own taste on visuals.

Would you consider PhysX a gimmick? I do. There are about 40 games that supported it versus the 20 or so that support DLSS. Unless you consider PhysX not a gimmick, there is no way DLSS isn't one.


1 hour ago, SpaceGhostC2C said:

Are you reading backwards? I said kind of explicitly that those aren't gimmicks, the same way I said DLSS is not "a gimmick". They do prove that not all features get coverage; certainly not all features that make low resolutions look better on large displays get that coverage.

My bad, English ain't my first language. But HWUB has been shitting on ray tracing for the better part of the last year; idk what he has against RTX.

 


A gimmick is defined as "a trick or device intended to attract attention, publicity, or trade".

 

By that definition, DLSS is indeed a gimmick as it is one of NVIDIA's big sales pitches and has undeniably attracted a lot of attention. Gimmicks do not automatically mean "useless" or "bad".


2 minutes ago, BabaGanuche said:

Would you consider PhysX a gimmick? I do. There are about 40 games that supported it versus the 20 or so that support DLSS. Unless you consider PhysX not a gimmick, there is no way DLSS isn't one.

No, just like Glide on 3dfx or virtual surround sound. Just because it's not everywhere doesn't mean it doesn't work and isn't noticeable.

Is it always better? Yes and no; subjectivity comes into play then, because sometimes other things are different that A or B might dislike or like.

 


3 minutes ago, D13H4RD said:

A gimmick is defined as "a trick or device intended to attract attention, publicity, or trade".

 

By that definition, DLSS is indeed a gimmick as it is one of NVIDIA's big sales pitches and has undeniably attracted a lot of attention. Gimmicks do not automatically mean "useless" or "bad".

Doesn't it use hardware and software to achieve this?

Wouldn't that make GPUs a gimmick, period?

It's only changing visuals.


2 minutes ago, BabaGanuche said:

Would you consider PhysX a gimmick? I do. There are about 40 games that supported it versus the 20 or so that support DLSS. Unless you consider PhysX not a gimmick, there is no way DLSS isn't one.

It's not like most of those games have a broken version of DLSS.


Just now, pas008 said:

Doesn't it use hardware and software to achieve this?

Wouldn't that make GPUs a gimmick, period?

When we call something a gimmick, we mostly just refer to a feature set that's heavily marketed by the manufacturer to the point where it has garnered a lot of attention.

 

Every product has a gimmick of its own. NVIDIA has DLSS. The PS5's DualSense feature can be considered a gimmick. Some novel phone features can also be considered gimmicks.


1 minute ago, pas008 said:

I don't understand why anyone's really mad.

I'm disappointed, but at what, I couldn't tell ya yet.

If any AIB wanted them to focus on a feature, thermals, or having longer boost times, are you going to comply?

Sent a review sample and asked to focus on a feature; isn't that why you get them from them for free?

Ofc.

And Nvidia blacklisting them? Is this the first time they didn't comply?

If it's a first-time infraction, then I'm disappointed in Nvidia here.

This wouldn't be a review, this would be buying a narrative in return for a free product. Anyone that willingly accepts money or a product and agrees to change their opinion of it as a result of that transaction should be scrutinized and considered unworthy of trust. This is not the case for HWUB; it was the opposite. They were provided hardware and a reviewer's guide (standard for all reviewers, tech outlets, and even retailers that advertise product pages) and decided to review the cards using their own methodology. Nothing in those reviewer's guides states you cannot deviate from them in your testing. I know this because I've been given my fair share of them for my lab testing, as they are standard in blanket NDAs too. The issue people have with this is that it is a targeted reaction from Nvidia because they disliked the coverage of their product and are punishing the reviewer for not reviewing it as they would like to see their product reviewed. If Nvidia wanted certain features highlighted, they should have paid for a promotional showcase as a sponsor spot.

 

Now, with that being said, it's time for the unpopular part of my opinion on this situation. As scummy a practice as it is to throw a tantrum for not getting your way, Nvidia has every right to do so. They are not obligated to provide anyone, press included, early/review access to their hardware. This would likely be to their detriment, given how cheap this form of advertisement is (even when it's less than favorable at times). I personally would rather see reviewers not receive free product at all and buy their hardware like the rest of us, as it adds more weight to the reviews when there is an actual dollar amount attached that comes from one's own pocket. It's extremely easy to forget just how expensive hardware is when you have stockpiles of it on shelves and in bins. Some of you will likely ask, "How can you get it early if you have to pay like the rest of us?", to which I'll say: the answer is quite simple, as my employer does it on a daily basis for hardware such as this, lol. We buy our test samples before I destroy them. Granted, we are offered a discount, so it's not really "retail pricing", but it's still somewhat relevant. Given the symbiotic relationship these people have, I imagine the same could be offered to them.

 

At the end of the day, this likely won't hurt Nvidia. GPU demand is still too high right now and people will continue to buy their product. Even if you don't buy directly from Nvidia, their board partners do, and you'll continue to buy from them. I also don't see this hurting HWUB much at all. There are plenty of avenues one can use to their benefit to receive hardware before embargo, and their connections will make this fairly easy for them.


11 minutes ago, MageTank said:

This wouldn't be a review, this would be buying a narrative in return for a free product. Anyone that willingly accepts money or a product and agrees to change their opinion of it as a result of that transaction should be scrutinized and considered unworthy of trust. This is not the case for HWUB; it was the opposite. They were provided hardware and a reviewer's guide (standard for all reviewers, tech outlets, and even retailers that advertise product pages) and decided to review the cards using their own methodology. Nothing in those reviewer's guides states you cannot deviate from them in your testing. I know this because I've been given my fair share of them for my lab testing, as they are standard in blanket NDAs too. The issue people have with this is that it is a targeted reaction from Nvidia because they disliked the coverage of their product and are punishing the reviewer for not reviewing it as they would like to see their product reviewed. If Nvidia wanted certain features highlighted, they should have paid for a promotional showcase as a sponsor spot.

 

Now, with that being said, it's time for the unpopular part of my opinion on this situation. As scummy a practice as it is to throw a tantrum for not getting your way, Nvidia has every right to do so. They are not obligated to provide anyone, press included, early/review access to their hardware. This would likely be to their detriment, given how cheap this form of advertisement is (even when it's less than favorable at times). I personally would rather see reviewers not receive free product at all and buy their hardware like the rest of us, as it adds more weight to the reviews when there is an actual dollar amount attached that comes from one's own pocket. It's extremely easy to forget just how expensive hardware is when you have stockpiles of it on shelves and in bins. Some of you will likely ask, "How can you get it early if you have to pay like the rest of us?", to which I'll say: the answer is quite simple, as my employer does it on a daily basis for hardware such as this, lol. We buy our test samples before I destroy them. Granted, we are offered a discount, so it's not really "retail pricing", but it's still somewhat relevant. Given the symbiotic relationship these people have, I imagine the same could be offered to them.

 

At the end of the day, this likely won't hurt Nvidia. GPU demand is still too high right now and people will continue to buy their product. Even if you don't buy directly from Nvidia, their board partners do, and you'll continue to buy from them. I also don't see this hurting HWUB much at all. There are plenty of avenues one can use to their benefit to receive hardware before embargo, and their connections will make this fairly easy for them.

No one said change opinions or facts.

I said focus on a feature.

If EVGA wanted thermals as part of your review, claiming they're the best and can boost longer, are you going to include it?


Hardware Unboxed's tweet has 10.1k likes.

The r/nvidia post regarding HUB has 21.3k upvotes and is currently #9 on reddit's popular feed.

 

If Nvidia's plan was to quietly brush this under the rug... well, their plan isn't working out that well.

 



Hardware Unboxed has always been dismissive of ray tracing, and 4K for that matter, because of their Patreon surveys and Steve's opinion.

 

They also push value like it is a drug and don't mind insulting people like me that buy high-end hardware. That does not stop me from subscribing to them, but I can understand Nvidia getting tired of them.

 

Unlike Gamers Nexus and Cooler Master, where Gamers Nexus pointed out and mocked the flaws in CM's case designs, the issue between Nvidia and Hardware Unboxed is more about Nvidia's philosophy and not the quality of their product.

 

I am a 4K gamer that likes eye-candy features like ray tracing, so Nvidia is going in the right direction for me, but it does go against the core audience of Hardware Unboxed.

 

 


5 minutes ago, pas008 said:

No one said change opinions or facts.

I said focus on a feature.

If EVGA wanted thermals as part of your review, claiming they're the best and can boost longer, are you going to include it?

EVGA can say whatever they want to say; what they want has no relevance to what will actually be published. You cannot tell someone to "focus" on one of the stronger features. That implies that you want it to be more prominent, potentially taking time away from less favorable aspects of your product. Like I mentioned earlier, they already provide this in reviewer's guides, but in those very guides they do not consider their testing mandatory, nor do they list requirements for what must be tested. That's simply not how a review works. You cannot tell someone that they can only have an opinion as long as their opinion is derived from a very specific set of standards controlled by you.

 

Also, if you are going to make an analogy, stop trying to make it seem so obvious by using things like thermals to bolster your point. You know full well that if I (MageTank) do a review, thermals are absolutely going to be included, regardless of whether a manufacturer says so. Your analogy would have been far more accurate had you asked if I would refrain from publishing thermal results if a manufacturer asked me to do so simply because I received free hardware. The answer would still be no; they'd be roasted as much as their product was roasted during the testing.


2 minutes ago, MageTank said:

EVGA can say whatever they want to say; what they want has no relevance to what will actually be published. You cannot tell someone to "focus" on one of the stronger features. That implies that you want it to be more prominent, potentially taking time away from less favorable aspects of your product. Like I mentioned earlier, they already provide this in reviewer's guides, but in those very guides they do not consider their testing mandatory, nor do they list requirements for what must be tested. That's simply not how a review works. You cannot tell someone that they can only have an opinion as long as their opinion is derived from a very specific set of standards controlled by you.

 

Also, if you are going to make an analogy, stop trying to make it seem so obvious by using things like thermals to bolster your point. You know full well that if I (MageTank) do a review, thermals are absolutely going to be included, regardless of whether a manufacturer says so. Your analogy would have been far more accurate had you asked if I would refrain from publishing thermal results if a manufacturer asked me to do so simply because I received free hardware. The answer would still be no; they'd be roasted as much as their product was roasted during the testing.

So why would I send a product out to be reviewed if they aren't going to focus on a feature I want included?

Hence the blacklist.

 

 


Did NVIDIA tell them or how were they "notified" exactly?

 

Also, for anyone looking for ray tracing, just use Marty McFly's RTGI shader for ReShade (costs 5 American shillings a month if you want it up to date, or a single download for that price if you then cancel the Patreon). It's not plug and play like flipping the RTX switch, as it requires some fiddling to get it working (depth buffer access), but when you get it going, it makes ridiculously realistic lighting and shadowing. It's half faked because it's screen space based, but within the screen space, it creates stunning depth and global illumination with jaw-dropping light bounce. In basically any game out there! Especially for people with cards like RTX 3000 and RX 6000, as it does RT through regular shaders and those cards have enough raw rasterization compute power anyway.

 

There is also a free RTGI version from BlueSkyDefender, but it's a lot more subtle and doesn't seem to do SSAO like Marty's does.
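
To make the "screen space based" caveat concrete, here is a toy Python/numpy sketch; it is an illustration with made-up numbers, not Marty McFly's actual shader. It estimates ambient occlusion purely from a depth buffer, so it can only account for geometry that is already on screen, which is exactly why such effects are "half faked".

# Toy screen-space ambient occlusion from a depth buffer (illustrative only).
import numpy as np

H, W = 240, 320
depth = np.full((H, W), 1.0, dtype=np.float32)  # a flat floor far from the camera
depth[100:180, 120:220] = 0.5                   # a box sitting closer to the camera

rng = np.random.default_rng(0)
SAMPLES = 16
occluded = np.zeros((H, W), dtype=np.float32)
for _ in range(SAMPLES):
    # Look at a random nearby pixel within a small screen-space radius.
    dy, dx = rng.integers(-6, 7, size=2)
    neighbor = np.roll(depth, (dy, dx), axis=(0, 1))  # wraps at edges; a real shader clamps
    # A neighbor noticeably closer to the camera partially occludes this pixel.
    occluded += (depth - neighbor > 0.05).astype(np.float32)

ao = 1.0 - occluded / SAMPLES  # 1.0 = fully open, lower = darkened by nearby geometry
print("lowest visibility (around the box edges):", ao.min())
# Geometry outside the frame, or hidden behind the box, contributes nothing.
# That blind spot is the fundamental limit of all screen-space lighting tricks.

The same blind spot applies to the GI and light bounce: the shader can only bounce light that is visible in the current frame.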


3 minutes ago, RejZoR said:

Did NVIDIA tell them or how were they "notified" exactly?

Notified via an email from Nvidia.

