I would like to see
1) better graphs - Steve can do box plots, why not Labs? Where it makes sense, it would be great to see run-to-run variation and how large it actually is
2) ask other specialists in the industry for feedback in each area of testing - maybe even publish how you changed your testing based on that feedback. I am assuming Labs people work to their own standards, and it's good to have another set of eyes review the processes. I am not talking about fixing mistakes in a single review, but rather: what kinds of tests are important, how the current ones can be improved, how they should be run, and how the results should be visualized for the public
3) certifications of labs where it makes sense - some ISO standards are full of crap just like anything else, but if a standard is well regarded and used in the industry, use it. Maybe educate viewers on why companies follow it, why it is good to have, and what going through certification involves. Invent your own standards only when there is a good reason to.
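To make point 1 concrete: a box plot is just a visual summary of the median, quartiles, and spread across repeated runs. Here is a minimal sketch, using made-up average-FPS numbers from five hypothetical runs of the same benchmark scene, of the quantities such a plot would show:

```python
import statistics

# Hypothetical average-FPS results from five repeated runs of the same
# benchmark scene (made-up numbers, for illustration only).
runs = [141.2, 138.7, 143.5, 139.9, 140.8]

mean = statistics.mean(runs)
stdev = statistics.stdev(runs)                 # sample standard deviation
q1, median, q3 = statistics.quantiles(runs, n=4)

# These are exactly the quantities a box plot visualizes: the box spans
# Q1..Q3, the line inside it is the median, and the whiskers show the spread.
print(f"mean={mean:.1f} fps, stdev={stdev:.2f}, "
      f"Q1={q1:.1f}, median={median:.1f}, Q3={q3:.1f}")
```

Publishing numbers like the standard deviation alongside the plot would show readers whether a 2 fps gap between cards is signal or just run-to-run noise.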
Documentation - can we get documentation on:
1) the OS setup being used - the RTX 4060 review mentions the Windows version, patch level, and driver version, which is all good. But are there any tweaks to the operating system? What about Windows Updates? Are scheduled tasks controlled? Which apps are running in the background that could skew results? Things of that nature. Same for Linux or macOS.
2) with UEFI - what about SAM/ReBAR, is it enabled? Which other settings are controlled, or is everything stock as of the specified firmware version? Can we compare the data across reviews?
3) not just how you test each game, but also how you prepare your test benches
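One way to publish the setup details above would be a machine-readable manifest alongside each review. A minimal sketch, where every field name and value is an illustrative assumption rather than an actual Labs format:

```python
import json

# Hypothetical test-bench manifest covering the points above: OS tweaks,
# driver version, and UEFI settings. All names and values are made up
# for illustration, not taken from any real review.
bench_manifest = {
    "os": {
        "name": "Windows 11 Pro",
        "build": "22621.1702",          # example build number
        "updates": "paused during test runs",
        "background_apps": "clean install, non-essential tasks disabled",
    },
    "gpu_driver": {"vendor": "NVIDIA", "version": "531.42"},  # example version
    "uefi": {
        "firmware_version": "F12",      # example motherboard firmware
        "resizable_bar": True,          # SAM/ReBAR state, explicitly recorded
        "xmp_profile": "enabled",
        "other_settings": "stock defaults unless listed here",
    },
}

# Emitting this as JSON makes setups diff-able across reviews.
print(json.dumps(bench_manifest, indent=2))
```

With a file like this per review, anyone could diff two setups and see exactly which settings changed between benchmark runs.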
This would be great not just for transparency, but also because it would make it possible to replicate a specific result.