
Labs Transparency Thread

LMGcommunity

I would like to see LMG get ISO 9001 certified (or even just the labs) and have the standard work pertaining to video creation and testing methodology made public.

 

 

Here is a link describing ISO 9001:2015

https://www.iso.org/files/live/sites/isoorg/files/archive/pdf/en/documented_information.pdf




6 minutes ago, TylerD321 said:

I would like to see LMG get ISO 9001 certified (or even just the labs) and have the standard work pertaining to video creation and testing methodology made public.

THIS would be a solid card to have in their wallet even if it was just labs. I think if the ideology of labs is to challenge the industry on a technical level, they need to be certified AND calibrated like the industry, with openly published records.


4 minutes ago, qbergeron648 said:

THIS would be a solid card to have in their wallet even if it was just labs. I think if the ideology of labs is to challenge the industry on a technical level, they need to be certified AND calibrated like the industry, with openly published records.

Wouldn't the engineering tools also need to be calibrated and tested frequently? The anechoic chamber, PSU tester, thermal chamber, etc.?

 


8 minutes ago, Toakan said:

Wouldn't the engineering tools also need to be calibrated and tested frequently? The anechoic chamber, PSU tester, thermal chamber, etc.?

 

ISO 9001 is more about process and documenting what you do.


There absolutely needs to be a much, much longer list of hardware with much more extensive graphs with all that hardware in it. It doesn't matter if the text is small and people might need to pause the video... They will if they are making a purchase decision partially based on the data. 

 

But we need at least 3 generations of GPUs included, and 4 generations of CPUs. I want to see 10th Gen Intel included with the 14th Gen launch, covering i3, i5, i7, and i9 in all of them, K and non-K. People need a full layout of what they are looking at over time for perspective, not just this gen vs. last gen with 3 products each. That isn't valuable and doesn't give anyone enough information. Trust that your audience is smart enough to deal with the large swath of data.


18 minutes ago, Toakan said:

Wouldn't the engineering tools also need to be calibrated and tested frequently? The anechoic chamber, PSU tester, thermal chamber, etc.?

 

Not for ISO 9001. But to add to the suggestion that they get ISO 9001: they should also have their machines and equipment regularly load tested/calibrated and publish those cert documents, since that adds to the credibility.


2 hours ago, Daniel White said:

Post on GitHub the practices and methodologies in an editable format. Take PRs from the community to improve or add clarity 

Posting it on GitHub or a similar platform that allows diffing with previous versions, so you can see exactly what changed and when, is an excellent idea.
EDIT: with good commit comments ("this change was made to address an issue we ran into when testing motherboard X, where without the addition of test Y we were getting inconsistent results; test Y addresses it in way Z...", etc.) it would also be a great demonstration of the work the team is putting into making these methodologies work.
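The diff-over-time idea doesn't need anything GitHub-specific; any versioned text gets you the same auditability. A toy sketch using Python's difflib, with the methodology snippets invented purely for illustration:

```python
import difflib

# Two hypothetical revisions of a testing methodology document
v1 = """Thermal test:
Run Cinebench for 10 minutes.
Record peak temperature.
""".splitlines()

v2 = """Thermal test:
Run Cinebench for 30 minutes.
Record peak and sustained temperature.
""".splitlines()

# unified_diff shows exactly which lines changed between revisions,
# which is what a public GitHub history would give readers for free
for line in difflib.unified_diff(v1, v2,
                                 fromfile="methodology_v1",
                                 tofile="methodology_v2",
                                 lineterm=""):
    print(line)
```

On GitHub, the commit message attached to each such diff is where the "why" from the EDIT above would live.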


22 hours ago, LMGcommunity said:

As promised, here is the thread where you can post your transparency suggestions for Labs. Here are some examples of things we'd like to see feedback about:

  • Ideas in our benchmarking.
  • What you'd like to see regarding test system information.
  • Benchmark details.
  • Testbed variations.

And, of course, any other transparency-related feedback is welcome, but similar to the Sponsor Complaints subforum, any off-topic comments will be subject to moderation.

 

Thank you for wanting to help us improve the way we do things here!

Hi,

 

LTT Labs truly excites me, but there are a few challenges that (for the sake of the sustainability of LTT Labs, at least in my opinion) LTT needs to clearly and publicly solve that I'd like to highlight. It's free advice, so take it for what it's worth.

 

Challenge #1: Establish clear guidelines, independent of the product class, for why something should be measured.

Challenge #2: Codify what about that class of thing can/should be measured, and what shouldn't.

Challenge #3: Quantify the one-time (equipment) and recurring (manpower, consumables, opportunity) costs of a given suite of measurements, and clearly label who is bearing that cost.

 

When LTT Labs was announced, Linus positioned it as a Swiss Army knife to ensure incentive alignment within LTT as a whole, eliminating the need for sponsorships (either fiscal or general rapport) from the companies whose products LTT makes content about. This is wonderfully aspirational, while still being achievable and potentially profitable.

 

Painful thought:


As a viewer, it looks like what happened instead is a shift of content generation responsibility from the writing teams to the Labs... which (if true) would be both dysfunctional and soul crushing for both the writers and technicians in LTT Labs. In my opinion, writers at LTT should not be allowed to dictate needs to LTT Labs in order to make good content. Writers must be empowered/required to have qualitative opinions that are freestanding. A video idea that boils down to a question for Labs to answer or a problem for Labs to solve... doesn't cut it. Writers will be simultaneously too busy mismanaging Labs' projects while appearing unproductive in producing content around Labs' results to have time or desire for creative freedom. 


Excessively high DPI flick shots:

  • LTT Labs must use (whenever possible) measurement standards as ratified by IEEE (or the corresponding professional governing body) in order to have any true credibility. If the standard doesn't exist, you can't just make up your own. You CAN draft a proposed standard (one for each data class for each product class), solicit feedback on the standard here (or at the conventions of the appropriate organizations)... LTX would be a great place for other independent labs and product manufacturers alike to ratify new standards as needed.
  • As has been demonstrated with the keyboard pressing abomination (sic.) project, automation is expensive, has long lead times, and doesn't guarantee repeatability. Anything more precise than is repeatable is a waste of time and resources. On the flip side, an automated test that shows repeatability is indicative of actual data as well as expertise in Labs. Ascribing the repeatable results to the wrong causes WILL happen, but that's not a matter of integrity, just time.
  • YouTube rewards negativity (just look at the hordes of scat fetishism swarming over this current LTT drama), so weaponize that by making a Labs-driven Hall of Shame. Labs gets to pick a product category and quantitative measure, implements the standards, and automates the process. If the measurement isn't weaponized, then the results shouldn't be considered reliable enough to nuke a brand. A sample size of one doesn't count either. Getting manufacturers to continue seeding products will depend on Labs' position as "the final say" on a statistical basis, with unnecessary bridge-burning parties as a secondary metric.
  • Video series idea: record weekly arguments/interactions between writers and Labs breaking down misconceptions and redefining expectations (it goes both ways - see literally any audio product for reference). No scripting allowed.
  • If content -> views -> sponsorships/merch sales/ad revenue -> more content, there is no endgame besides siphoning off money at some point. There needs to be a publicly displayed goal for LTT employees and viewers to buy into together, and Labs needs to be a part of that vision. Marketing integrity, better tech products, more open source, more recycling, better documentation, consumer education, just entertainment, etc.?
  • If Labs will ever be anything other than a Colton/Dennis-sized expense, there needs to be solid traceability for Lab expenditures. If manufacturers are the funding source, there must be a separation from LTT videos so that manufacturers can expect confidentiality and collaboration. If LTT merch sales are the funding source, there should be a way for viewers to vote with their wallets to shift the focus of Labs to their collective desires (*cough* preorders *cough*). If ad revenue/sponsorships/traditional viewership metrics are the funding source, Labs will fail catastrophically due to public perceptions of conflict of interest. The long lag time between strategic direction provided to Labs and the resulting functional output will guarantee failure if the measure of success is decoupled from the strategic direction. Every company makes mistakes, but nobody can afford to make mistakes that take too long to become obvious.
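The repeatability point in the list above is easy to make concrete: if the spread of repeated automated runs is larger than the precision being reported, the extra digits are noise, not data. A minimal sketch, where the measurements and the "standard deviation must not exceed claimed precision" rule of thumb are made up for illustration:

```python
import statistics

def repeatability_check(measurements, claimed_precision):
    """Return True if repeated runs support the claimed precision.

    Rule of thumb (hypothetical): the sample standard deviation of
    repeated measurements should be no larger than the precision you
    report; otherwise the test is claiming more than it can repeat.
    """
    return statistics.stdev(measurements) <= claimed_precision

# Ten hypothetical actuation-force readings (grams) from an automated key-press rig
runs = [52.1, 51.9, 52.3, 52.0, 52.2, 51.8, 52.1, 52.0, 52.2, 51.9]
print(repeatability_check(runs, claimed_precision=0.5))   # spread supports this claim
print(repeatability_check(runs, claimed_precision=0.05))  # claiming more than the rig can repeat
```

Publishing this kind of spread alongside each result would show, not just assert, that the automation is producing data rather than noise.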

 

Labs will be a hard vision to execute on, but it's going to be worth it and I am still hopeful that it's possible.

 

Good luck, y'all.


Would it be possible to semi-regularly run the most up-to-date testing harness on a known configuration as a canary to detect configuration issues in the testing methodology? That is, if you know the expected performance of that known configuration, the canary run would help catch any new issues that have popped up.
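The canary idea boils down to comparing a fresh run on the reference machine against its stored baseline and flagging anything that drifts. A minimal sketch, where the benchmark names, scores, and 3% tolerance are all invented for illustration:

```python
def canary_check(measured, baseline, tolerance=0.03):
    """Flag the test bench if a known-good configuration drifts from baseline.

    `baseline` maps benchmark name -> expected score for the reference
    machine; `measured` is a fresh run on that same machine. Any result
    more than `tolerance` (fractional) away from baseline suggests a
    methodology or configuration problem, not a product difference.
    """
    drifted = {}
    for name, expected in baseline.items():
        if abs(measured[name] - expected) / expected > tolerance:
            drifted[name] = (expected, measured[name])
    return drifted

baseline = {"cinebench_mt": 14500, "game_avg_fps": 144.0}
fresh = {"cinebench_mt": 14420, "game_avg_fps": 131.0}  # fps drifted ~9%
print(canary_check(fresh, baseline))  # only game_avg_fps is flagged
```

An empty result means the bench still behaves as expected; any flagged entry is a cue to audit the harness before trusting new product numbers.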

