
Hardware Unboxed Goes on Twitter Rant in Response to LTT Labs Potshots

Skipple

TL;DR: Tim from engineering at LMG makes an off-the-cuff comment during an LTX tour attempting to differentiate Labs/LTT from other reviewers, saying that LMG reruns every test for every review whereas GamersNexus and Hardware Unboxed don't. Hardware Unboxed takes exception to this and goes on a 24-hour Twitter rant. It's still unclear whether Hardware Unboxed reruns every test for every review.

 

 

 

BACKGROUND

 

Small YouTuber MurfsGaming recently uploaded a video of a Studio and Labs Tour at LTX 2023. During the video, LTT Labs Engineer Tim Holowachuk said the following:

Quote

...the difference between us and somebody like Gamers Nexus or Hardware Unboxed is we test new components, new tests, every time. Every project that we do has new data. (timestamp)

 

 

 

HARDWARE UNBOXED RESPONSE

 

Well, apparently Hardware Unboxed took some exception to this and responded in a series of tweets:

Quote

LTT Labs taking shots 😅 I've re-run more benchmarks than Linus has spent $ on testing equipment & staff.
...
If you have to tell everyone how good your testing is, in my 20+ years of experience that generally means your testing isn't very good. Your work speaks for itself guys, that's how you build credibility. (link)

 

They also criticized the results and data in LTT's videos:

Quote

I didn't want to say it, but yeah LTT Labs should look to refine their data, not throw it out and end up with more results that often don't make sense. Not to be salty, but we've all seen it way more than we should. (link)

...It's frustrating to see reviews with obviously bad data from the biggest outlet in the game (link)

 

Hardware Unboxed was unclear about whether or not they rerun tests for every review:

 

Quote

Hardware Unboxed (HUB): I think he's claiming they update all their comparative data with each review, which we almost do anyway. Ever(y) review is unnecessary though.

Other User: But what it is almost or always? Because this makes the point entirely different imho.

HUB: As in almost always there is data that needs to be updated, BUT NOT ALWAYS! The point is GN and HUB have been doing a significantly better job of day one review data than LTT for years now, do you see either of us pointing that out.... outside of this tweet? NO!

Other User: The guy never said that they did it better. He just pointed out how they do it differently, right?

HUB: Yeah it was never inferred that LTT Labs data is superior to HUB and GN, of course not. He was just saying that to stress the fact that they're different. FFS give me a break.

(link)

 

 

TIM EXPLAINS HIS COMMENTS

 

Tim Holowachuk responded in a series of tweets: 

 

Quote

I said something on a tour for LTX attendees last week that's gotten some traction in this thread. I'd like to clarify its context.

 

I said that LTT Labs creates new data for every project that we start and I drew a distinction between our approach and that of some of our industry peers. Everyone is looking for a way to differentiate in this space, and we determined very early that our approach would be to parallelize our testing as much as possible so we could produce fresh data for every video. This was important to our team for two main reasons:

  1. We're revising our testing methodology frequently and rapidly as we start up. You’ve seen this in the form of significant changes in parameters. We were trying out different games, settings, measurement methods, test variable controls. If you look at our early Labs-tested reviews, such as the RTX 4090 & 4080, you'll see a lot of mistakes. Believe me, I know. These are lessons that I am glad we learned in public, because we strive for the best, and the feedback we've gotten from all of you has been excellent.
  2. If we don’t re-test for every release, the driver version + OS patch + game version + Markbench harness updates all wombo combo into datasets that aren't comparable.

The long & short of it is that our project's datasets need to stay separate from each other because each has different variables. We're striving to become better in every aspect of what we do as a Lab, and that means that we need the space to make mistakes and learn from them.

 

I read as many of your comments on our videos as I can, and the criticism goes into my post-project summaries. The criticism in this thread is, I think, meant for me personally, and I'm glad for the chance to read it. I'm not camera-trained, I'm not a public spokesperson, but I have a ton of respect for our industry peers, and the hard work that THEY also do to differentiate their content from ours. All of this is in service of bringing better data to our viewers. I hope they would agree with this.

 

The bottom line is that I am excited by the tools our team is building, and I am always stoked to talk about them. We've got a video coming out soon about some of the statistics behind our benchmarking strategy, and I hope that you'll stick around to see what we're working on.

In response to Tim acknowledging the mistakes in the 4090 and 4080 reviews, Hardware Unboxed responded:

 

Quote

This is probably a good example that's still very fresh. I recon you'd want to show a reasonable track record of class leading day one reviews before you try and elevate yourself above the likes of @GamersNexus for example 😉

 

 

MY OPINION

 

While I think Tim could have avoided calling other YouTubers / reviewers out by name, I understand that he was attempting to differentiate LTT Labs from the rest of the pack with their testing methodology. It was something said in a small room with a handful of people. The video was uploaded by a fairly small channel, and the comment likely wouldn't have gained much traction on its own.

 

As for Hardware Unboxed's response, I view it as utterly unprofessional. They have been tweeting, and continuing to reply to tweets, for over 24 hours now. It's an issue that's been blown way out of proportion on their end and probably should have been handled privately.

 

With all that it's still unclear to me if Hardware Unboxed gathers new data for each review or not. 



9 minutes ago, Skipple said:

With all that it's still unclear to me if Hardware Unboxed gathers new data for each review or not. 

I would assume that results between different videos are usually not comparable, unless explicitly mentioned. We don't really know what software versions (e.g. drivers, Windows, games) changed in between.

 

My guess would be they re-use results from previous videos only if they know that nothing has changed other than the hardware component being tested.



LTT Tim was very unprofessional, and honestly really should keep his mouth shut on that type of thing because he does not know how GN or HUB works behind the scenes. LTT has proven time and time again they are NOT a reliable source of data for testing, benchmark numbers, amongst other things. Labs was supposed to help mend that, but honestly, with how it's launched and the numbers they've had, there's been about the same lack of trust in their numbers.

 

LTT has to go out of its way to constantly show that it IS a reliable set of data, and can be trusted, not just talk about it. GN has a proven track record at this point for doing their utmost in keeping data up to date, and when things change to rerun it. HUB generally does the same for patches and such that are known about. 

 

In his response, not once did LTT Tim apologize or even say "Hey, I probably shouldn't have said that; the statement wasn't true given that I don't know how they work". Instead he goes on about how hard they have worked and a bunch of nonsense about how LTT Labs will be different! In all honesty, until they get back on track and focus on having a reputable testing and data platform to show data, it's just gonna come off as people who say they do a bunch of stuff but almost always come up short.

 

Every channel has its own way of doing this; there is no one way of doing it. That's why it helps to watch a few channels to get a rough idea of what performance should be. LTT has not been one of those channels for years due to how bad their data has been.

 

HUB was mostly poking fun at the idea, since it comes off as saying HUB won't put in the same effort and time that LTT Labs will, and thus Labs will be better, more accurate, and more up to date, when HUB Steve has done 1,000s of hours of benchmarks in the past year. When someone calls out your credibility or your testing methods and data like LTT Tim did, to try and make their own solution sound better, you can bet you are going to hear a staunch defense and a bit of that Aussie snarkiness.

 

 


Eh, I noticed that remark but didn't pay that much attention to it.

My first reaction was "he's tired, he probably did the tour several times by now and he's running in auto-mode and not paying that much attention to what he's saying and probably means something else".

I assumed he means when they get a card that needs specific driver versions to work, they retest the other cards again with the same new driver, for consistency. 

 


Alternative title: HardwareUnboxed makes a valid argument on Twitter after LTT employee makes uncalled-for comments on record.


 


The Hardware Unboxed response is far from unprofessional; if someone starts taking shots at your entire business model (they mostly do reviews; if the reliability is in question, so is their livelihood) they have every right to go off on Twitter. If we wanted to be more serious about it, comments like that in a public setting could probably also be considered slander and LTT could face legal action. Interestingly, defamation can also carry criminal charges leading to jail time.

 

LTT has time and time again had incorrect information in their reviews, or their review data is so mediocre that most people don't pay attention to it. It's not a secret that LTT is more of an entertainment channel and less of an informative channel.

 

IMO they really need to do better.



Let's be brutally honest here, whether you do one run or two runs, the data you gather is anecdotal. To do a proper hardware test you need an adequately calculated sample size. If you are testing a CPU, you need n number of those CPUs and you need to test them all. This might be something like 30 CPUs (Central Limit Theorem), and likely higher for more variable hardware such as monitors. That allows you to display your test result with a confidence interval, usually 95%.

Of course, none of this is economically viable. So all we get is anecdotal evidence. No matter how good your testing methodology, if your test has no statistical power then the result is anecdotal. Tests with a single run on a single device would be called "case studies" (aka a coin toss). Tests with two runs on a single device could be called tests (statistics is possible), but the confidence interval is likely around 50% (aka tossing a coin twice).

Then there's the whole Bayes' theorem, which applies if you are testing the probability of, say, a PSU failure given a certain system condition. 

LTT labs NEEDS a statistician if they want to provide statistically robust data at the lowest possible cost. I am not a statistician. 
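
For illustration, here is a minimal sketch of the confidence-interval arithmetic described above, assuming run-to-run variation is roughly normal (real benchmarks may not satisfy this) and using made-up numbers:

```python
# Minimal sketch: 95% confidence interval for a benchmark's mean FPS
# from repeated runs. Assumes run-to-run variation is roughly normal,
# which real benchmarks may not satisfy. All numbers are made up.
from statistics import mean, stdev
from scipy.stats import t

runs = [143.2, 141.8, 144.5, 142.0, 143.9]  # hypothetical FPS results

n = len(runs)
m = mean(runs)
sem = stdev(runs) / n ** 0.5            # standard error of the mean
half = t.ppf(0.975, df=n - 1) * sem     # two-sided 95% half-width

print(f"mean = {m:.1f} FPS, 95% CI = [{m - half:.1f}, {m + half:.1f}]")
```

With only a handful of runs the interval stays wide, which is the low-statistical-power point being made above.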


I'm very sure that HUB reruns their entire benchmark suite when they release new content. Steve mentions it from time to time in both reviews and their Q&As.



I feel like some people are focusing on the wrong thing. 

Rerunning tests doesn't automatically make results more valid. In some cases rerunning is good, like with new drivers for graphics cards. In some cases, it doesn't matter. The write speed of an SSD won't change in any meaningful way except if there is some drastic firmware update, which happens very rarely. Cinebench results for CPUs probably won't change in any meaningful way either. 

 

So it's not that rerunning is good and reusing results is inherently bad. It depends on the situation. That's why I think the question of "do HUB rerun every single time" misses the point. If they rerun it when it makes sense, then that's good enough for me. 

 

We shouldn't boil things down to a single binary thing. There are plenty of other important things to consider. I also don't think it's a competition. There can be multiple valid ways of doing things, which is why it's often a good idea to use multiple sources and understand why the results might differ. 

 

 

 

 

My two cents on LTT Labs:

Not sure if I've said it before, but I feel like LTT Labs is a way for Linus to try and feel serious and scientific, because he has been criticized for not being that before.

 

Calling it labs, showing how much money he is spending on testing equipment that may or may not be wasted, all that seems like him trying to compensate. 

"we sent this to the lab" adds a feeling of importance, even though it might be the same thing they did before in someone's garage. 

 

I don't watch their videos anymore, but from what I've seen on this forum their testing results are still questionable at times, and so are some of their decisions with the lab, for example having a PSU manufacturer build their PSU testing facility. Imagine if Intel sponsored AMD CPU reviews and paid for the software used for testing...

 

I think it's an honorable goal to try and make a database of properly done test results, but so far I feel like LTT are a long way from that. I am worried that their fans will be impressed by the words "labs" and "engineering" and then that will be it. That they will be more impressed by labels and titles than by the results.


5 hours ago, Shimejii said:

LTT has to go out of its way to constantly show that it IS a reliable set of data, and can be trusted, not just talk about it. GN has a proven track record at this point for doing their utmost in keeping data up to date, and when things change to rerun it. HUB generally does the same for patches and such that are known about. 

 

 

There are situations where "new" tests can be designed, but the amount of data you can generate is limited simply by time.

 

Like, I'd LOVE to see a benchmark run on every GPU released since the TNT/3DFX/Rage cards, but for practical reasons you cannot even get those old pieces of hardware and their operating systems into a state where you could run the same benchmark on a 1999 PC that you can on a 2022 PC. At a certain point you draw a line in the sand and go "anything dating from before X hardware innovation" can be discarded.

 

Like right now, if the hardware can't run on Windows 11, then you can discard it, because nobody is going to be buying hardware that isn't capable of running the current OS. But there is a pile of hardware that was released BEFORE Windows 11 that will run it. So every PCIe GPU with Windows 7/8/10/11 drivers should be tested, which pretty much means going back to the GTX 700 parts at worst. But should we go check every Intel CPU going back to Sandy Bridge? No, because Windows 11 only supports 8th gen or later CPUs.

 

For the purposes of "should I buy this thing", you only need the highest-end part, the lowest "reasonably available" part, and whatever part is the best value for the dollar. Cause the highest-end parts are never the best value, and neither is the cheapest part. If a part is not available, then why are we testing it, except to give a reference point of "should I upgrade from it?"

 


To me it seems like the lab is more about "how can we make more videos" and less about "how can we test manufacturers' claims", despite what Linus keeps saying (he repeatedly claims he wants to debunk marketing speak and do real tests). I know it takes time to set up a lab, but I feel that he/the team are spreading way too thin and trying to do too much at once. How many multi-part videos get dropped after the first part is released?


23 hours ago, dizmo said:

The Hardware Unboxed response is far from unprofessional; if someone starts taking shots at your entire business model (they mostly do reviews; if the reliability is in question, so is their livelihood) they have every right to go off on Twitter. If we wanted to be more serious about it, comments like that in a public setting could probably also be considered slander and LTT could face legal action. Interestingly, defamation can also carry criminal charges leading to jail time.

 

LTT has time and time again had incorrect information in their reviews, or their review data is so mediocre that most people don't pay attention to it. It's not a secret that LTT is more of an entertainment channel and less of an informative channel.

 

IMO they really need to do better.

Found the not-lawyer! Knew you'd show up somewhere lol, yeah they're gonna throw the guy in jail, perp walk tomorrow. LOL holy shit that's amazing.

 

 

15 hours ago, Blue4130 said:

To me it seems like the lab is more about "how can we make more videos" and less about "how can we test manufacturers' claims", despite what Linus keeps saying (he repeatedly claims he wants to debunk marketing speak and do real tests). I know it takes time to set up a lab, but I feel that he/the team are spreading way too thin and trying to do too much at once. How many multi-part videos get dropped after the first part is released?


Far more likely with regards to multi-part videos is that the first part doesn't do well, and they drop it. LTT seems extremely driven by literally just how well a video does; if it does well you get more (look at the tech upgrades), and if it doesn't do well they're pretty happy to just drop it entirely.

 

 

21 hours ago, LAwLz said:

 

I don't watch their videos anymore, but from what I've seen on this forum their testing results are still questionable at times, and so are some of their decisions with the lab, for example having a PSU manufacturer build their PSU testing facility. Imagine if Intel sponsored AMD CPU reviews and paid for the software used for testing...

 

I think it's an honorable goal to try and make a database of properly done test results, but so far I feel like LTT are a long way from that. I am worried that their fans will be impressed by the words "labs" and "engineering" and then that will be it. That they will be more impressed by labels and titles than by the results.


I don't think Seasonic actually built their tester; providing input is certainly not building. You're saying you want it designed by a group completely disconnected from the power supply industry? Do you think that would generate useful data?

They've posted their power supply tester in multiple videos; it would be pretty simple to look at what it's made from and track down the individual pieces. Can you tell me how its sourcing would in some way benefit Seasonic when it comes to power supply testing? Or are you just as uninformed when it comes to the science of power supply tests as I am and just want to say something about media you no longer engage with?

Gamers Nexus just posted about their $250,000 new testing chamber… Will you start not watching their content (assuming you do) because now they're talking about how much money they're spending on testing equipment that might be a waste of money? Is all of their previous data compromised because their testing conditions were so poor that they needed to spend a quarter of a million dollars to make it more acceptable? Maybe they should rerun all their tests so we can be 100% sure the testing environment didn't affect the data.

If people are impressed by labels then they are people who don't listen to Linus… ever. His primary message is to never listen to a single review; go watch multiple things and evaluate everything based on multiple sets of data.

You know what, most of the time, all those reviewers come up with extremely similar data, recommendations, and overall results.  So maybe all of them are just wasting their money just like the people who buy the new Mac Pro for the occasional 1% increase over the Mac Studio and we as consumers should just check out all three if that purchase is a relevant one and hey! We benefit from all of it. Look at that


 
 


11 hours ago, Vilacom said:

You’re saying you want it designed by a group completely disconnected from the power supply industry?

Yes

 

11 hours ago, Vilacom said:

Do you think that would generate useful data?

Yes. It's not like testing power supplies is some dark art that only PSU manufacturers understand. How electricity works is a pretty well-understood subject outside of the PSU industry too.

 

11 hours ago, Vilacom said:

They've posted their power supply tester in multiple videos; it would be pretty simple to look at what it's made from and track down the individual pieces. Can you tell me how its sourcing would in some way benefit Seasonic when it comes to power supply testing?

There was a time when Linus said that he would never let a company sponsor the review of a competitor because it makes the review untrustworthy. It introduces a conflict of interest. That's what is happening here. 

Even if Seasonic ended up doing a good job and the testing doesn't favor them in any way, it is still very bad optics to have one manufacturer be the one who paid for and potentially designed the tests their competitors will be tested using.

It's like having Intel decide and pay for the software that AMD CPUs will be tested using. Even if they choose good software that doesn't favor themselves, it still undermines the legitimacy of the tests because there is a conflict of interest there.

 

 

11 hours ago, Vilacom said:

Or are you just as uninformed when it comes to the science of power supply tests as I am and just want to say something about media you no longer engage with?

Calm down dude. You're being pretty hostile towards me for no reason.

 

 

 

11 hours ago, Vilacom said:

Gamers Nexus just posted about their $250,000 new testing chamber… Will you start not watching their content (assuming you do) because now they're talking about how much money they're spending on testing equipment that might be a waste of money?

I don't watch Gamers Nexus, but I wouldn't stop watching them just because they posted a video about their testing equipment.

I do however think that in the case of Linus he comes across as someone trying to compensate. The reasons why are more nuanced than just "post video = bad. Not post video = good".

 

 

11 hours ago, Vilacom said:

Is all of their previous data compromised because their testing conditions were so poor that they needed to spend a quarter of a million dollars to make it more acceptable?

No it's not. When did I say it was? I feel like you're attacking a strawman right now, but your response is so weird that I find it hard to even pinpoint what I said that made you respond in this way.

 

 

11 hours ago, Vilacom said:

Maybe they should rerun all their tests so we can be 100% sure the testing environment didn't affect the data.

That might be a good idea, but depending on the test, I think it might only be necessary to do it once. I think rerunning all tests every single time you test a new product, regardless of the category or test, sounds excessive and probably a waste of time. The type of test and some other factors matter a lot in whether or not a test needs to be rerun.

 

 

11 hours ago, Vilacom said:

If people are impressed by labels then they are people who don't listen to Linus… ever. His primary message is to never listen to a single review; go watch multiple things and evaluate everything based on multiple sets of data.

That's good advice.

 

 

11 hours ago, Vilacom said:

You know what, most of the time, all those reviewers come up with extremely similar data, recommendations, and overall results.  So maybe all of them are just wasting their money just like the people who buy the new Mac Pro for the occasional 1% increase over the Mac Studio and we as consumers should just check out all three if that purchase is a relevant one and hey! We benefit from all of it. Look at that

What are you trying to say here?

When did I say reviewers are wasting their money?

Why are you bringing up some Mac Pro vs Mac Studio? I never mentioned any of those.

 

I really don't get what you are trying to say. I feel like at least half of your post was a response to me, but you were responding to things I never said or even implied.


1 hour ago, LAwLz said:

Yes. It's not like testing power supplies is some dark art that only PSU manufacturers understand.

 

1 hour ago, LAwLz said:

There was a time when Linus said that he would never let a company sponsor the review of a competitor because it makes the review untrustworthy. It introduces a conflict of interest. That's what is happening here. 

Even if Seasonic ended up doing a good job and the testing doesn't favor them in any way, it is still very bad optics to have one manufacturer be the one who paid for and potentially designed the tests their competitors will be tested using.

It's like having Intel decide and pay for the software that AMD CPUs will be tested using. Even if they choose good software that doesn't favor themselves, it still undermines the legitimacy of the tests because there is a conflict of interest there.

You're 100% correct that electricity and power supplies are pretty well-known factors, and probably extremely straightforward to the people who work with the components and design/build them. But at the same time, there are apparently enough things that go into them that are enough of "the dark arts" that there is a large market for deceptive marketing and shady practices, to where there is a need to test and verify power supplies beyond just standard price/performance. With that in mind, I would say people who are merely in the electrical industry but have nothing to do with consumer PC power supply production probably don't actually know everything to look for, or the various testing situations that apply to that particular industry. So instead of starting from the ground up, it is beneficial to have a starting point as to what data and tests an industry leader thinks are important.

Your example with Intel here is a completely different situation. If Seasonic had purchased the power supply tester for LMG and designed the test suite that was being used, then that would absolutely be a massive conflict of interest. 100% agree, and clearly so does LMG. Unless Linus is just blatantly lying about how they went about their purchase, they went to a company that actually makes these things, and Seasonic offered some input. No money from Seasonic had anything to do with the purchase or testing of the products. Also, optics can be important, but ultimately they don't matter. LMG has incredible motivation to keep things as above-board as humanly possible, because investing millions into a massive testing lab and then letting the data you're producing be completely inconsistent with the many other excellent review channels out there because of bias is just stupid. It'll be discovered extremely quickly, called out, and then all of the credibility for both the reviewer and the product goes out the window.

 

1 hour ago, LAwLz said:

Calm down dude. You're being pretty hostile towards me for no reason.

 

Sorry if I'm being hostile; you are just the type of person that is particularly triggering to me, and found almost everywhere in the gaming/tech sphere: the person who no longer has anything to do with a game/media/etc but feels the need to constantly show up and engage negatively with the thing that is apparently no longer worth their time. It leads to your comparison of Intel paying for AMD tests: either you have no idea what actually happened with Seasonic and you just want to say something negative, or you do actually understand what happened and you just want to make things up to portray something in a bad light. People cry about things like developers/publishers/movie studios/X thing that's ruining gaming/movies/youtube/PCs, but at the same time people like yourself, who claim to be done with whatever content but then constantly show up to engage with it negatively, never get brought up. So yes, I'm hostile towards you, but for a reason.


 

 

1 hour ago, LAwLz said:

I don't watch Gamers Nexus, but I wouldn't stop watching them just because they posted a video about their testing equipment.

I do however think that in the case of Linus he comes across as someone trying to compensate. The reasons why are more nuanced than just "post video = bad. Not post video = good".

1 hour ago, LAwLz said:

No it's not. When did I say it was? I feel like you're attacking a strawman right now, but your response is so weird that I find it hard to even pinpoint what I said that made you respond in this way.

Here, let me quote you to show you what you said:

 

 

On 8/4/2023 at 4:56 AM, LAwLz said:

Not sure if I've said it before, but I feel like LTT Labs is a way for Linus to try and feel serious and scientific, because he has been criticized for not being that before.

 

Calling it labs, showing how much money he is spending on testing equipment that may or may not be wasted, all that seems like him trying to compensate. 

"we sent this to the lab" adds a feeling of importance, even though it might be the same thing they did before in someone's garage. 


Now, you don't watch LTT content, so I'm not 100% sure where you're getting this impression (and this plays into a question you have later), but the labs project is literally the way LMG actually can be "serious and scientific". I brought up GN because that's often the channel LMG is compared to when it comes to being far more focused on deep dives into data, and they don't get this kind of response when they drop a video about their super expensive new test suite, which if anything brings less to their channel than something like labs brings to LTT. And if you found out that a source of information you relied on was suddenly spending a huge amount of money to be able to do all their testing much more effectively, I would think you'd say "wait, what was wrong with the testing before that you needed to do this to make it work so much better?"

To your last point, it's almost exactly the same as what they did before in someone's garage, just like a Google data center is almost exactly the same as you saving your photos to a hard drive. The difference is just scale and consistency, which means costs go astronomical.

 

 

2 hours ago, LAwLz said:

What are you trying to say here?

When did I say reviewers are wasting their money?

Why are you bringing up some Mac Pro vs Mac Studio? I never mentioned any of those.

 

I really don't get what you are trying to say. I feel like at least half of your post was a response to me, but you were responding to things I never said or even implied.

You didn't say they're wasting their money, but you did imply that the labs are mostly a vanity project for Linus to give him more of a sense of legitimacy; I don't think interpreting that as "wasting massive amounts of money to feel fancy" is too much of a leap. Ultimately though, they are wasting their money, and the Mac Pro vs Studio is the example of that. I forget how much that power supply tester costs, but the reality is if I need a 750 watt power supply and I go to Newegg and sort by best selling, I would end up with a Corsair 80+ Gold modular power supply for $100. Am I going to have a horrible experience with that power supply if I just order that one? Is my experience going to be far better if it turns out there was another one that provided similar power delivery for $10 cheaper? They could probably just put a power meter on an outlet and leave the PC on for like a week with some kind of standardized load running, and so long as it doesn't turn off... hey, probably a decent enough purchase.

The fact is most of these reviews by any large tech channel could be accomplished to 95% effectiveness by someone with a test bench in their garage; anything they're choosing to do to exceed that 95% will be almost universally useless to an average consumer. The system for testing keyboards is really amazing, but regardless of what it spits out, my wife is still gonna go to Micro Center, play with the keys on all the ones they have on display, and leave with the one that feels best to her. But if you want the data to exist at all, you need people who are willing to just piss away money to provide it to the 5% of people that can actually get some amount of value from it.



 


There are so many bad takes here that it ain't even funny.

 

On 8/4/2023 at 9:15 AM, Electricity Taster said:

Let's be brutally honest here, whether you do one run or two runs, the data you gather is anecdotal. To do a proper hardware test you need an adequately calculated sample size. If you are testing a CPU, you need n number of those CPUs and you need to test them all. This might be something like 30 CPUs (Central Limit Theorem), and likely higher for more variable hardware such as monitors. That allows you to display your test result with a confidence interval, usually 95%.

Of course, none of this is economically viable. So all we get is anecdotal evidence. No matter how good your testing methodology, if your test has no statistical power then the result is anecdotal. Tests with a single run on a single device would be called "case studies" (aka a coin toss). Tests with two runs on a single device could be called tests (statistics is possible), but the confidence interval is likely around 50% (aka tossing a coin twice).

Then there's the whole Bayes' theorem, which applies if you are testing the probability of, say, a PSU failure given a certain system condition. 

LTT labs NEEDS a statistician if they want to provide statistically robust data at the lowest possible cost. I am not a statistician. 

I find it hilarious that you're speaking in favour of statistical validation of measurements and then use Veritasium as a source, arguably one of the worst offenders on YouTube when it comes to grossly misrepresenting things in favour of making "wholesome" clickbaity content.

 

In any case, I don't think what you propose is realistic, for other reasons: there's no actual guarantee that the quantities you're attempting to measure follow a known distribution unless you go to very large sample sizes, because of the way we tend to bin dies during the manufacturing process. So you'd need to get quite a few devices per manufacturing lot to even stand a chance of a statistically meaningful measurement; otherwise the odds are that what you're actually measuring is the variability of your test method.
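
To make that last point concrete, here's a minimal sketch (all numbers made up, three hypothetical CPU samples) of comparing run-to-run noise against unit-to-unit spread:

```python
# Sketch: separating run-to-run noise from unit-to-unit variation.
# Data is made up: three hypothetical CPU samples, four runs each.
from statistics import mean, stdev

units = {
    "cpu_a": [101.2, 100.8, 101.5, 100.9],
    "cpu_b": [103.1, 102.7, 103.4, 102.9],
    "cpu_c": [99.6, 99.9, 99.4, 99.8],
}

within = mean(stdev(runs) for runs in units.values())    # test-method noise
between = stdev(mean(runs) for runs in units.values())   # unit-to-unit spread

# If 'within' dominates 'between', a single-unit review is mostly
# measuring the variability of the test method, not the product.
print(f"run-to-run sd = {within:.2f}, unit-to-unit sd = {between:.2f}")
```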

 

On 8/4/2023 at 6:41 AM, Shimejii said:

LTT has to go out of its way to constantly show that it IS a reliable set of data, and can be trusted, not just talk about it. GN has a proven track record at this point for doing their utmost in keeping data up to date, and when things change to rerun it. HUB generally does the same for patches and such that are known about. 

GN has made plenty of stupid remarks that demonstrate they know jackshit about the electronics industry and power supply design. They got so much wrong in that entire Gigabyte PSU saga that it ain't even funny, and when industry folks pointed out that what they were saying was just plain wrong, they just went "no, we know better". Meanwhile, I haven't actually heard anyone from LTT say anything outrageously stupid on video; they know when they're in over their head and don't attempt to represent themselves as experts.

 

4 hours ago, LAwLz said:

Yes. It's not like testing power supplies is some dark art that only PSU manufacturers understand.

This is particularly hilarious to me, because many electronics books refer to switch-mode power supply design as a dark art for those blessed with long grey beards. You need an exceptional understanding of the behaviour of magnetic components, and you need to be an absolute wizard at systems theory, to design a robust SMPS. In much the same way, actually verifying the unconditional stability and guaranteeing the performance of one of these is nigh on impossible unless you damn well know what you're doing. The end result is that even companies the likes of Keysight go to TDK when they need a high-power PSU, which should tell you something about how bloody difficult it is to actually design and test one of these.

 

So, from my point of view as an EE, I see the following:

  • LTT wants to do power supply testing but quickly realises that it's not nearly as easy as folks make it out to be.
  • LTT asks industry contacts for advice on how to test power supplies. 
  • Seasonic tells LTT what they're using, forwards them to Chroma and then helps them specify a system that can outperform what they're using themselves.
  • LTT then actually went to Chroma, which is the absolute gold standard in PSU testing, and asked them for such a system. And knowing Chroma's application engineers, if you tell them you want to absolutely wreck computer power supplies and you want something better than the listed system, they're going to give you exactly that and tell you how to use it. This is literally their core business; it's what made them famous and respected in the industry.

Also, do you really think we don't talk to the engineers at our competitors about such things? These folks are often former classmates or colleagues, acquaintances from meeting at industry events, folks you've talked to about articles they wrote, etc. So, if something sketchy were going on, we'd have heard about it by now through the grapevine. Open hostility, or trying to screw each other over by adjusting test standards in specific ways, and other such moves are quite rare within our industry, and I'd be seriously surprised if Seasonic were to do so.

 

In any case, I hope Tim doesn't take this flak too seriously. He made a remark in person, in front of a small group of people, in response to a direct question, and he gave an honest answer from his point of view based on his knowledge at the time; nothing wrong with that.


What’s the reason to not rerun tests?



2 hours ago, RoseLuck462 said:

What’s the reason to not rerun tests?

Time. Testing eats a lot of time.


So much drama in the LBC.

 

LTT needs to grow up and get professional.

 

If you are going to exploit your users/viewers for clicks, at least do something good with the money, instead of being a "me too" channel.

 

 



14 hours ago, ImorallySourcedElectrons said:

I haven't actually heard anyone from LTT say anything outrageously stupid on video

Back when I watched them, they constantly did just that.

Like in this video, which is like 90% incorrect information and shows a complete lack of understanding of the fundamentals.

 

 

Maybe they have gotten better since then, but the number of topics I see pop up here where someone quotes some really strange things from LTT videos indicates to me that they haven't.

 

 

 

24 minutes ago, Sakuriru said:

Did you see the power supply video? It's hard to get a solid PSU review after the age of johnnyguru.

Yes, I know it's hard to find good reviews. But I don't think that's a reason to lower people's standards or get super defensive when someone brings up criticism.

I've said time and time again that I really hope LTT Labs pans out and we get a really good source of information, but so far the number of red flags I see regarding the lab makes me not trust it, and I don't think people should trust it either until we've had some drama-free years without a bunch of weird results. Like in this thread. And that's just what I find even when I don't watch their content. I have several times in the past found other issues that I don't see others bring up, so the number of errors is probably higher than what I know about at this point.

 

With the amount of money being spent and the amount of hype around labs, I do expect close to perfection. When I don't get that, I should at the very least be able to raise concerns and valid criticism (like potential conflicts of interest) without being attacked by angry fanboys who will defend LTT whatever they do.


20 hours ago, LAwLz said:

Yes. It's not like testing power supplies is some dark art that only PSU manufacturers understand. How electricity works is a pretty well-understood subject outside of the PSU industry too.

Building a computer PSU that conforms to and reacts correctly to all the ATX specifications isn't actually well understood by a company that makes, for example, variable-output DC power supplies not intended for computers. The fundamentals are very similar in terms of power circuitry, but what is the correct order of 3.3 V, 5 V and 12 V bring-up, on which rails, and what is the time spacing?
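
As a toy illustration of what checking bring-up order and time spacing looks like, here's a sketch that validates a hypothetical power-on capture against a placeholder sequencing rule; none of the voltages, timestamps, or thresholds below are real ATX spec values:

```python
# Illustrative only: checking a logged power-on capture against a
# rail-sequencing rule of the kind the ATX spec defines. Every voltage,
# timestamp, and threshold here is a placeholder, not a real spec value.
samples = [  # (time_ms, rail, volts) from a hypothetical capture
    (0.5, "12V", 2.1), (1.0, "12V", 11.9),
    (1.2, "5V", 4.95), (1.5, "3.3V", 3.31),
]

REG_MIN = {"12V": 11.4, "5V": 4.75, "3.3V": 3.14}  # placeholder minima

def t_in_regulation(rail):
    """First timestamp (ms) at which a rail reaches its minimum voltage."""
    return min((t for t, r, v in samples if r == rail and v >= REG_MIN[r]),
               default=None)

t12, t5, t33 = (t_in_regulation(r) for r in ("12V", "5V", "3.3V"))
print(f"in regulation at: 12V={t12} ms, 5V={t5} ms, 3.3V={t33} ms")

# Placeholder rule: 12V must reach regulation no later than 3.3V.
ok = t12 is not None and t33 is not None and t12 <= t33
print("sequencing OK" if ok else "sequencing violation")
```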

 

There's actually quite a lot that goes into computer PSU design and manufacturing, as well as testing, and there isn't anything inherently wrong with consulting the industry on it; it certainly will not compromise you forever. It's not as if Labs is using any actual Seasonic equipment at all; in fact, the equipment has nothing to do with Seasonic, and everything to do with getting the necessary information on how to use it for testing computer PSUs rather than some other type of power supply for a different use case.

 

It's not necessarily wrong to have concerns about an industry vendor being involved at some level, however if you want to properly critique what is going on then that also requires making sure you know what is going on. If not, then you risk jumping to ill-informed conclusions, which becomes more of a problem, for others, if you start to voice them.

 

34 minutes ago, LAwLz said:

When I don't get that, I should at the very least be able to raise concerns and valid criticism (like potential conflicts of interest) without being attacked by angry fanboys who will defend LTT whatever they do.

Is it really valid if the core of it comes from being jaded and generally hostile towards LTT, and you openly state that you are commenting in ignorance because "you don't watch their videos"?

 

Please don't take the above the wrong way, but be careful in how much you personally believe that you are raising valid criticism, because others may not see it as valid, and it really does not help when you say some of the things you have. And let's be frank, you're known for making a lot of criticisms towards LMG/LTT, founded and unfounded. So many of us here won't just treat something you say as valid criticism even if you personally think it was.

 

Don't go stepping into the realm of mistakes like Tim from Labs.


4 minutes ago, leadeater said:

Building a computer PSU that conforms to and reacts correctly to all the ATX specifications isn't actually well understood by a company that makes, for example, variable-output DC power supplies not intended for computers. The fundamentals are very similar in terms of power circuitry, but what is the correct order of 3.3 V, 5 V and 12 V bring-up, on which rails, and what is the time spacing?

 

There's actually quite a lot that goes into computer PSU design and manufacturing, as well as testing, and there isn't anything inherently wrong with consulting the industry on it; it certainly will not compromise you forever. It's not as if Labs is using any actual Seasonic equipment at all; in fact, the equipment has nothing to do with Seasonic, and everything to do with getting the necessary information on how to use it for testing computer PSUs rather than some other type of power supply for a different use case.

 

It's not necessarily wrong to have concerns about an industry vendor being involved at some level, however if you want to properly critique what is going on then that also requires making sure you know what is going on. If not, then you risk jumping to ill-informed conclusions, which becomes more of a problem, for others, if you start to voice them.

All I know is that the testing equipment they will use to test power supplies was "specced by Seasonic" (their words, not mine), and the video where they show off their equipment has a "sponsored by Seasonic" banner in it.

Even if it turns out that their testing is good, doing those things is bad optics and potentially a conflict of interest.

 

Again, I would be very suspicious of an AMD review where Intel was the one paying for the video to be made, and Intel decided which software to use during the tests. I would have much preferred if they had gotten impartial sponsors and help.


4 minutes ago, LAwLz said:

doing those things is bad optics and potentially a conflict of interest.

Maybe for you, but for others that could also give confidence that they are consulting the right people, getting the right equipment, and being instructed on how to use it correctly. Optics is all about how you see it; your eyes are not the same as everyone else's 😉

 

I would be more worried if they just went off and bought a million dollars of equipment with no idea how to use it and tried to fumble their way through it; that doesn't give me any confidence either.


I think we are in a period of economic downturn that is going to result in some people whipping up drama for clicks and views.  After the burst of popularity during COVID, the return to normal for the tech/content industry is going to be hard for some/many.


Just now, leadeater said:

Maybe for you, but for others that could also give confidence that they are consulting the right people, getting the right equipment, and being instructed on how to use it correctly. Optics is all about how you see it; your eyes are not the same as everyone else's 😉

That is true. But in order to inspire me with such confidence, they would have had to drop the sponsorship (so that Seasonic isn't giving them money or other compensation), and preferably have talked to others as well. Not just, in their own words, bought something "specced by Seasonic".

 

2 minutes ago, leadeater said:

I would be more worried if they just went off and bought a million dollars of equipment with no idea how to use it and tried to fumble their way through it; that doesn't give me any confidence either.

I am not entirely convinced that isn't exactly what they are doing, and I am worried that if that happens, people will still believe their numbers are accurate because "they spent so much money and it is called 'the lab', so it must be true!". That's why we get threads like the one I posted earlier, where the results are very strange and we have no idea how or what they actually tested (because they don't/didn't disclose that).

Maybe that won't happen again, but they have a very big uphill battle to fight if they want me to take them seriously. The reason for that large hill is their countless fuckups in the past.


The power supply testing equipment is industry standard, Chroma units etc. If Seasonic told them anything, it's probably what current/power rating the modules should have and how many modules (these are basically programmable electronic loads sold as modular units): how many modules they use to test 12 V in their own labs, how many are used for 5 V or 3.3 V, and so on, plus maybe guidance on how to configure the loads.

But the testing method is more or less standardized, and the ATX specification is easily accessible and can be used to configure the testing equipment.
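
For a rough sense of what configuring such equipment against a spec involves, here's a hypothetical cross-load matrix of the kind a programmable load would step through; the current values are placeholders, not drawn from the ATX spec or any real test plan:

```python
# Sketch of a cross-load test matrix of the kind a programmable load
# (e.g. a Chroma mainframe with per-rail modules) would step through.
# The current values are placeholders, not taken from the ATX spec or
# from any real test plan.
LOAD_POINTS = [
    # (amps_12v, amps_5v, amps_3v3) -- hypothetical steps
    (5.0, 1.0, 1.0),    # light load
    (25.0, 5.0, 3.0),   # typical load
    (50.0, 2.0, 1.0),   # 12V-heavy load (modern GPU-bound system)
    (10.0, 15.0, 8.0),  # minor-rail-heavy load (legacy worst case)
]

for a12, a5, a33 in LOAD_POINTS:
    # A real rig would program the load modules here, then log the
    # measured rail voltages, ripple, and input power for efficiency.
    print(f"step: 12V@{a12}A, 5V@{a5}A, 3.3V@{a33}A")
```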

 

I think the Jonnyguru guy also gave them advice; he now works at Corsair, and unless I'm mixing things up horribly, he also helped Gamers Nexus as a sort of consultant, so it's not a big deal.

 

Personally I'm not worried about the electronic loads and all that, but I'd be more worried about properly using those super expensive oscilloscopes that often have fancy expensive probes and for which you really have to know your shit to get things right... there's lots of "gotchas" with scopes and high frequency testing.

 

Anyway... all this lab thing is pretty much pointless when the end user who watches the videos only gets something like "We have tested this video card in our labs and the results are these".

You can be proud you have invested a lot in the lab and bought equipment, but if the regular Joe watching the video doesn't actually see the hardware, or at least some B-roll of the card being tested... the "lab" could very well be a computer in the corner of the room for all they know; it's pointless.

 

In pretty much all the latest videos I've seen, it's something like "We gave the card to x at the labs and we got these results" or "he tested it and he says"... that doesn't mean anything to most people watching, and doesn't make the results more trustworthy.

 

