AdoredTV's history on NVIDIA

Mira Yurizaki

AdoredTV takes a critical look at the history of NVIDIA's GPUs.

If you don't feel like sitting through the roughly 45 minutes of video, here's the tl;dr summary from him:

  • Pre-2010 performance jumps were on average at least 60-75% between generations
  • Post-2010 performance jumps were on average around 30% between generations (actual GPU generations, not model numbers)
  • At some point, the x80 series became a "midrange GPU", on the basis that NVIDIA always has a GPU with more stuff in it and that, if you go by code names, the x80 GPUs sit in the middle of the stack. This leads to the point that we're paying more for effectively less.
  • Also claims that because you have quite a bit of overclocking headroom on at least Maxwell and Pascal, NVIDIA was deliberately shipping GPUs underclocked to hide mediocre performance gains
    • Personal note: I find this claim absurd at best. Yes, Maxwell and Pascal do have a lot of overclocking potential, but if you ship from the factory at those speeds, wouldn't you also be lowering the remaining overclocking headroom? You're in a damned-if-you-do, damned-if-you-don't situation. And if you're going to make that claim, I hope you're prepared to make it across the board: why doesn't Intel ship the i7-7700K at 4.8GHz? Why doesn't AMD ship Ryzen at whatever the usual overclocking speed is?

      My friend made an analogy with cars and engines: sure, an engine can be tuned for more horsepower, but it wasn't designed to take that strain for the length of time most people want to keep their cars.
       
  • NVIDIA only gave us their best cards when ATI/AMD gave NVIDIA something to worry about.
  • AMD is no longer able to compete in the high end and everyone should stop hoping otherwise. As such, NVIDIA will just continue to give us more and more mediocrity, and he wants PC hardware journalists to point this out.
    • Personal note: I also don't see how this will improve anything. Look at ISPs: complaints out the ass, but if they're the only provider in town, yeah, sure, they'll make improvements. Uh-huh.

Also, for a lot of this I want to know what was more of a marketing decision and what was more of an engineering decision. Yes, it's easy to claim that NVIDIA was purely driven by business decisions and kept releasing iterations of GPUs that make you go "why didn't they just release that in the beginning?" But I also wonder: did they release those iterations because they had a pile of GPUs sitting around that were better than an x80 but not quite a Titan? What do you do with those GPUs? Similarly with what happened with the GTX 970: it was an engineering decision to partition the memory as a compromise between shipping with one less memory controller and less memory total (or worse, more memory and making everything confusing as hell), or running into availability issues from needing a full complement of working memory controllers to pass QA.

 

Also setting flame shields up to full.


I haven't watched the vids yet, but I can tell that this guy is a nerd beyond repair. And most of the stuff he says makes sense. I genuinely trust most of what he's saying.

Now, let me go watch, and I'll edit this comment afterwards if I have something to add.


4 minutes ago, Light-Yagami said:

I haven't watched the vids yet, but I can tell that this guy is a nerd beyond repair. And most of the stuff he says makes sense. I genuinely trust most of what he's saying.

Now, let me go watch, and I'll edit this comment afterwards if I have something to add.

Check out his vid on Intel. Found that very interesting.


Just now, TheOriginalHero said:

Check out his vid on Intel. Found that very interesting.

Maybe I came across as if I don't watch him on a regular basis. I do xD

 


Tinfoil hats on, folks.

 

(And yeah, I believe Nvidia is purposefully slowing down how quickly GPUs get better, just because they have no competition.)



I wish people would stop giving him views. I think it's incredible that anyone takes him seriously after all the bullshit he has been spewing.

Cherry picking benchmarks like crazy. Obviously being a massive AMD fanboy who hates Intel and Nvidia. Making ridiculous claims about unreleased products.

He is like the embodiment of /r/AMD.

 

I honestly would not be surprised if it was revealed that he was a legitimate shill.


26 minutes ago, M.Yurizaki said:

 

  • Also claims that because you have quite a bit of overclocking headroom on at least Maxwell and Pascal, NVIDIA was deliberately shipping GPUs underclocked to hide mediocre performance gains
    • Personal note: I find this claim absurd at best. Yes, Maxwell and Pascal do have a lot of overclocking potential, but if you ship from the factory at those speeds, wouldn't you also be lowering the remaining overclocking headroom? You're in a damned-if-you-do, damned-if-you-don't situation. And if you're going to make that claim, I hope you're prepared to make it across the board: why doesn't Intel ship the i7-7700K at 4.8GHz? Why doesn't AMD ship Ryzen at whatever the usual overclocking speed is?
       

I don't understand how underclocking would hide limited gains across generations. Gains in what? Are we talking about performance per clock, like IPC in CPUs? Because if I don't have much of an improvement in IPC, and I want to make it look like I have a better product, I would clock it higher than the previous generation. That way, performance numbers would go up, even though clock for clock there is no improvement.

But if I have limited IPC gains, and I set the clocks lower... wouldn't that just make it look more "meh" at stock values, compared to older generations?

So, bottom line: how does underclocking make performance gains look better?

 

Maybe I got it wrong and he's talking about comparisons between models within a generation. In that case, yes, selling products that could be clocked similarly with very different clocks to try and price-discriminate consumers with different preferences is a standard practice for GPUs and CPUs (and not only limited to clocks, it applies to all sorts of characteristics).
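As a back-of-the-envelope sketch of that logic in Python (all numbers below are invented purely for illustration, not taken from any real GPU):

    # Toy model: perceived performance ~ per-clock throughput x clock speed.
    def perf(per_clock, clock_ghz):
        return per_clock * clock_ghz

    old_gen = perf(per_clock=1.0, clock_ghz=1.0)       # last generation at stock
    new_gen = perf(per_clock=1.1, clock_ghz=1.4)       # small per-clock gain, big clock bump
    print(f"stock vs stock: {new_gen / old_gen - 1:.0%}")      # +54%

    new_gen_low = perf(per_clock=1.1, clock_ghz=1.1)   # same chip shipped at a lower clock
    print(f"underclocked:   {new_gen_low / old_gen - 1:.0%}")  # +21%, i.e. it looks worse at stock

Which is the bottom line above: shipping the new generation at lower clocks would make the stock-vs-stock gain look smaller, not hide anything.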


22 minutes ago, LAwLz said:

I wish people would stop giving him views. I think it's incredible that anyone takes him seriously after all the bullshit he has been spewing.

Cherry picking benchmarks like crazy. Obviously being a massive AMD fanboy who hates Intel and Nvidia. Making ridiculous claims about unreleased products.

He is like the embodiment of /r/AMD.

 

I honestly would not be surprised if it was revealed that he was a legitimate shill.

Yeah, he's such a fanboy that he gave Vega nothing but praise. Oh no wait, he thought the Vega 64 they sent him was overpriced trash, and at the time of his review he couldn't recommend it or any other Vega card. He thought the Vega 56 was "ok", but that the pricing was poor. Not exactly the ringing endorsement of a fanboy.

 

The sheer projection on display here.


7 minutes ago, SpaceGhostC2C said:

I don't understand how underclocking would hide limited gains across generations. Gains in what? Are we talking about performance per clock, like IPC in CPUs? Because if I don't have much of an improvement in IPC, and I want to make it look like I have a better product, I would clock it higher than the previous generation. That way, performance numbers would go up, even though clock for clock there is no improvement.

But if I have limited IPC gains, and I set the clocks lower... wouldn't that just make it look more "meh" at stock values, compared to older generations?

So, bottom line: how does underclocking make performance gains look better?

 

Maybe I got it wrong and he's talking about comparisons between models within a generation. In that case, yes, selling products that could be clocked similarly with very different clocks to try and price-discriminate consumers with different preferences is a standard practice for GPUs and CPUs (and not only limited to clocks, it applies to all sorts of characteristics).

Maybe I misinterpreted it, but I took it to mean NVIDIA could've released Maxwell and Pascal at higher stock speeds but didn't, the claim being that an overclocked-the-snot-out-of 1080 is only about 30% better than a similarly overclocked (as best it can be) 980, rather than the 50% we see elsewhere.
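As a rough illustration in Python of how that framing shrinks the gap (toy numbers for the sake of argument, not benchmark results):

    old_stock, new_stock = 100, 150         # stock vs stock: +50%
    old_oc = old_stock * 1.25               # older card with lots of headroom left -> 125
    new_oc = new_stock * 1.08               # newer card already near its limit     -> 162

    print(f"stock vs stock:   {new_stock / old_stock - 1:.0%}")   # 50%
    print(f"max OC vs max OC: {new_oc / old_oc - 1:.0%}")         # ~30%

So if the old card has more headroom than the new one, comparing both at their maximum overclocks makes the generational gain look much smaller than the stock comparison does.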

 

But that could also just be the result of cherry picking. He does seem to have an anti-NVIDIA slant towards the end, especially when he used the Founders Edition price as the price point, completely ignoring the actual MSRP NVIDIA set for the GeForce 10 series.


1 hour ago, ImadKnight said:

Tinfoil hats on, folks.

 

(And yeah, I believe Nvidia is purposefully slowing down how quickly GPUs get better, just because they have no competition.)

I mean, yeah, why would you bother spending so much on R&D if you have nothing to compete against?



1 hour ago, LAwLz said:

I wish people would stop giving him views. I think it's incredible that anyone takes him seriously after all the bullshit he has been spewing.

Cherry picking benchmarks like crazy. Obviously being a massive AMD fanboy who hates Intel and Nvidia. Making ridiculous claims about unreleased products.

He is like the embodiment of /r/AMD.

 

I honestly would not be surprised if it was revealed that he was a legitimate shill.

Excuse me, but this guy hammers everybody when they do something stupid. He trashed the 5xx series and the whole AMD GPU department, as a matter of fact. He gives credit where it's due and points out stupid decision-making by these companies. He prefers AMD as a company because of their fair play and better approach to the average consumer. I mean, if you watched his videos you'd see that this guy is far from your usual AMD fanboy. I have to agree with @Majestic on this one. Check facts before spewing out biased opinions.

 


49 minutes ago, L.Lawliet said:

I was hoping Vega would make NVIDIA release Volta, but it turns out it's a huge disappointment.

After seeing Adored's video, I'm not even sure Volta would be worth it if we're looking at a 20-30% improvement.

