
Having confusing desktop GPUs wasn't enough - Nvidia removing Max-Q branding

CephDigital

First time doing a news post so lemme know if I've done something wrong.

 

Summary

Nvidia are removing the Max-Q designation for laptop GPUs, making it a whole lot harder to know what performance you can expect.

 

Quotes


 No longer will NVIDIA or OEMs mention whether a particular Ampere Mobile GPU is Max-Q or Max-P. In fact, NVIDIA tells us that every Ampere Mobile part can offer third gen Max-Q features

 

For instance, the flagship RTX 3080 "Laptop GPU", as NVIDIA would like to call it, offers a configurable TGP ranging from 80 W to 150+ W. The RTX 3070 Mobile offers an 80 W to 125 W range while the RTX 3060 Mobile can be tuned between 60 W and 115 W. These TGP ranges correspond to actual clocks. The RTX 3080 Mobile, for example, can offer up to 37% increased clocks at the higher-end of the TDP range compared to the base 80 W variant.

 

What this means is that a 115 W RTX 3060 Mobile can potentially outperform an 80 W RTX 3080 Mobile depending on the given workload. The lack of explicit power details in the laptop specs can potentially complicate matters for the end user, who would generally assume that mention of an RTX 3080 Mobile automatically implies higher performance.

 

We also get to learn that implementing Max-Q features will be at the OEM's discretion. If, say, the GPU is already at the lower end of the TDP range, like an RTX 3060 Mobile at 60 W, the OEM can choose not to implement Dynamic Boost 2.0 and instead offer it for a higher-TDP variant. The problem here is that there is no way for end users to know how their prospective purchase is limited by merely glancing at the spec sheet.
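Side note: since those TGP ranges are now the only reliable signal, one way to check what you actually got (after purchase, sadly) is to ask the driver. A rough sketch in Python, assuming the NVIDIA driver and its nvidia-smi tool are installed; the field names come from nvidia-smi's own query interface:

```python
import subprocess

# Ask the NVIDIA driver for the configured board power limits.
# power.limit is the TGP being enforced right now; power.min_limit /
# power.max_limit are the bounds the OEM's vBIOS allows.
FIELDS = "name,power.limit,power.default_limit,power.min_limit,power.max_limit"

result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)

for line in result.stdout.strip().splitlines():
    name, limit, default, lo, hi = (v.strip() for v in line.split(", "))
    print(f"{name}: running at {limit} (default {default}, range {lo}-{hi})")
```

On an 80 W RTX 3080 Laptop you'd expect power.limit to read around 80 W, even though the same GPU name in another chassis could report far more.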

My thoughts

Stop this please, Nvidia. When it comes to laptops, knowing whether something is Max-Q or not is a massive thing. For instance, a Max-Q 2080 might only be on par with or slightly better than a Max-P 2070. I seriously think they're doing this on purpose so they can charge unknowing customers for the "better" 3080 when it might perform worse than a cheaper 3060 running at a higher wattage. What makes this worse is that the Max-Q features have to be implemented by the OEM, meaning you could theoretically get a GPU that is underpowered and has none of the benefits that supposedly come with Max-Q.

 

Sources

Notebook Check


Doesn't seem to be too different from mobile CPUs; they too can be had in different configurations.

While it might be a little backhanded, and it'd probably be better if they didn't, I don't fault them fully.

If you're a consumer, and you're spending over $500 on something, you should do your due diligence and put in an hour or two of research.


3 minutes ago, dizmo said:

Doesn't seem to be too different from mobile CPUs; they too can be had in different configurations.

While it might be a little backhanded, and it'd probably be better if they didn't, I don't fault them fully.

If you're a consumer, and you're spending over $500 on something, you should do your due diligence and put in an hour or two of research.

The main issue is that a 3060 in laptop A could be wildly different from the one in laptop B, despite both having a "3060". You can't tell unless the maker specifies it. At least with CPUs you can see that CPU A has a higher boost clock than CPU B and so is most likely to be faster.


2 minutes ago, CephDigital said:

At least with CPUs you can see that CPU A has a higher boost clock than CPU B and so is most likely to be faster.

Not necessarily. The same CPU can be configured with a 15 W or a 25 W TDP, for example, depending on the manufacturer. That's the case for both Intel and AMD CPUs.
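On Linux you can even read the limits the OEM picked straight out of the kernel's RAPL interface. A quick sketch, Intel-only, assuming the intel_rapl driver is loaded (the intel-rapl:0 domain is typically the CPU package); AMD exposes this differently:

```python
from pathlib import Path

# Read the package power limits the OEM configured (Intel, Linux).
# constraint_0 is normally the long-term limit (PL1, the "15 W vs 25 W"
# figure) and constraint_1 the short-term boost limit (PL2).
RAPL = Path("/sys/class/powercap/intel-rapl:0")

for n in (0, 1):
    name_file = RAPL / f"constraint_{n}_name"
    limit_file = RAPL / f"constraint_{n}_power_limit_uw"
    if name_file.exists() and limit_file.exists():
        name = name_file.read_text().strip()       # e.g. "long_term"
        watts = int(limit_file.read_text()) / 1e6  # microwatts -> watts
        print(f"{name}: {watts:.1f} W")
```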


11 minutes ago, CephDigital said:

The main issue is that a 3060 in laptop A could be wildly different from the one in laptop B, despite both having a "3060". You can't tell unless the maker specifies it. At least with CPUs you can see that CPU A has a higher boost clock than CPU B and so is most likely to be faster.

Not entirely true, as @IAmAndre said.

It still doesn't change the fact that anyone buying something that expensive should do their due diligence and research what they're buying.


imo this entire 30-series laptop generation is a joke. After two generations of fairly synced-up laptop and desktop GPUs, it's now back to the 980M bullshit, this time without the "M" designation to mislead the consumer.

 

Here is my take:

Tightass nvidia fucked everything by going with a hot and under-performing Samsung 8nm process.

IMO the 3080 desktop was shifted to the GA102 (the titan class die) to compete with AMD RDNA2.

As a result of the above there is no fucking way a "102" die (which they have never used in a laptop before) is going to be feasible.

GDDR6X also runs hot, so we won't see that in a laptop.

Even the GA104 (3070 die) runs fairly hot and pulls a good chunk of power (close to the 2080, if I'm not mistaken).

 

Thanks to all of this we are now stuck with the biggest clusterfuck of underclocked GA104 laptops: misleading 3080 laptop marketing, confused consumers, and drastic performance differences between laptops running the same 30-series GPU.


1 hour ago, dizmo said:

It still doesn't change the fact that anyone buying something that expensive should do their due diligence and research what they're buying.

What if the product doesn't have any reviews or videos that specify whether it's Max-P or Max-Q, and the manufacturer doesn't specify it either? Do you just completely disregard it? There are many laptops out there that don't have any kind of reviews; do you just ignore them, or hope it turns out to be the one you want? Regardless of whether you do research or not, this is just unnecessarily making things more difficult than they should be.

Take my Asus ZX553VD-DM969T: it's a solid laptop, but there are absolutely no reviews for it online.


Why is it so hard for Intel, AMD, Nvidia and others... to have clear product branding and stick to it...


5 minutes ago, TetraSky said:

Why is it so hard for Intel, AMD, Nvidia and others... to have clear product branding and stick to it...

Probably because the number of people who purchase a laptop based on the nuances of the GPU/CPU is niche compared to those buying on form factor and aesthetics.

 

Don't hate me, I'm just telling it like it is. People prefer form over function.


10 minutes ago, StDragon said:

Probably because the number of people who purchase a laptop based on the nuances of the GPU/CPU is niche compared to those buying on form factor and aesthetics.

 

Don't hate me, I'm just telling it like it is. People prefer form over function.

This sort of thing is not only happening on the laptop side of things though.


43 minutes ago, AndreiArgeanu said:

What if the product doesn't have any reviews or videos that specify whether it's Max-P or Max-Q, and the manufacturer doesn't specify it either? Do you just completely disregard it? There are many laptops out there that don't have any kind of reviews; do you just ignore them, or hope it turns out to be the one you want? Regardless of whether you do research or not, this is just unnecessarily making things more difficult than they should be.

Take my Asus ZX553VD-DM969T: it's a solid laptop, but there are absolutely no reviews for it online.

If they don't specify it, ask. Simple. If they can't provide that information, then no, don't buy the product.

It all goes back to proper research.


Lol reminds me of the 2060 implementation in laptops

 

90 W, 120 W or something; a ton of BS

 

But that said, Nvidia GPUs, from my experience, run at about 95-99% of full performance on ~80% of the power limit, so I'm not too offended by it

 

But if it's a lower wattage part, I want to know about it, not find it out myself.
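If you want to put a number on that for your own machine, something like the sketch below works. Sketch only: nvidia-smi -pl needs admin rights and only works on boards that allow changing the limit (a lot of laptops don't), and the benchmark command here is a placeholder for whatever workload you actually care about.

```python
import subprocess
import time

# Sweep the GPU power limit and time a fixed workload at each step to
# see how little performance is lost below the stock TGP.
# Requires root/admin; `nvidia-smi -pl` is not supported on every board.
BENCH_CMD = ["./my_benchmark"]   # placeholder workload, swap in your own
LIMITS_W = [115, 100, 90, 80]    # watts to test, highest first

for watts in LIMITS_W:
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)
    start = time.perf_counter()
    subprocess.run(BENCH_CMD, check=True)
    print(f"{watts} W -> {time.perf_counter() - start:.1f} s")
```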

-sigh- feeling like I'm being too negative lately


7 hours ago, IAmAndre said:

Not necessarily. The same CPU can be configured with a 15 W or a 25 W TDP, for example, depending on the manufacturer. That's the case for both Intel and AMD CPUs.

Just because something exists a certain way doesn't mean it's the right thing to do. The whole configurable TDP thing in CPUs is super anti-consumer, and we shouldn't be greeting the same change for GPUs with open arms.


It doesn't help when laptop spec sheets don't even mention the power limit configuration for either the CPU or the GPU.

 

Another reason why we need actual reviews to point out these metrics (and please, clearer labelling next time)


2 hours ago, schwellmo92 said:

Just because something exists a certain way doesn't mean it's the right thing to do. The whole configurable TDP thing in CPUs is super anti-consumer, and we shouldn't be greeting the same change for GPUs with open arms.

I'm not saying that it's good news; it's just nothing new, so there's no need to bash Nvidia for things all major companies are already doing. The situation is also relatively similar with desktop and mobile parts having the same name but different performance.


1 hour ago, IAmAndre said:

I'm not saying that it's good news; it's just nothing new, so there's no need to bash Nvidia for things all major companies are already doing.

Yes, but NVIDIA in this case is regressing. They had a perfectly fine solution for distinguishing between low-power and high-power GPUs, and now they are getting rid of it.

 

1 hour ago, IAmAndre said:

The situation is also relatively similar with desktop and mobile parts having the same name but different performance.

Not true at all. What desktop CPU has the same name in mobile form?


Also, in case people haven't noticed:

 

The RTX 3080 and 3080 Laptop use completely different dies. The desktop 3080 uses GA102, whilst the 3080 Laptop uses a fully unlocked GA104 (the same die the desktop 3070 uses in a lower bin, with a lower-binned variant again on the desktop 3060 Ti).

 

It’s why I’ve taken to calling the laptop variant the RTX 3080M.


2 hours ago, schwellmo92 said:

Not true at all. What desktop CPU has the same name in mobile form?

It's more rare than not, but low-end desktops and high-end laptops do cross over. For example (I don't know if it's still true), some Dell Precision laptops in the past used full Intel socketed Xeons whereas all other laptops used mobile CPUs. In the case of Dell OptiPlex desktops, they all use desktop CPUs, whereas the OptiPlex Micro series uses a mobile CPU. The difference is often a combination of core count, speed, and/or lower power consumption.

 

Desktop example: the Dell OptiPlex 7080 uses the Core i5-10500 whereas the 7080 Micro uses the Core i5-10500T. Compare their spec sheets to see the differences.

 

And yes, they are distinguished by the "T", but I doubt most consumers pay attention.


12 hours ago, TrigrH said:

imo this entire 30-series laptop generation is a joke. After two generations of fairly synced-up laptop and desktop GPUs, it's now back to the 980M bullshit, this time without the "M" designation to mislead the consumer.

 

Here is my take:

Tightass nvidia fucked everything by going with a hot and under-performing Samsung 8nm process.

IMO the 3080 desktop was shifted to the GA102 (the titan class die) to compete with AMD RDNA2.

As a result of the above there is no fucking way a "102" die (which they have never used in a laptop before) is going to be feasible.

GDDR6X also runs hot, so we won't see that in a laptop.

Even the GA104 (3070 die) runs fairly hot and pulls a good chunk of power (close to the 2080, if I'm not mistaken).

 

Thanks to all of this we are now stuck with the biggest clusterfuck of underclocked GA104 laptops: misleading 3080 laptop marketing, confused consumers, and drastic performance differences between laptops running the same 30-series GPU.

The RTX 3080 isn't as hot or power-hungry as everyone is screeching about, though. I have an RTX 3080 GamingPro from Palit that has one of the slimmest coolers (it's exactly as wide as the PCI bracket) and really isn't a thick boy, and yet it's surprisingly quiet. Palit made good use of the backplate as a cooling element, though: it's scorching hot during operation, and the holes at the end of the card are among the biggest I've seen on any card, so it can really pump a lot of air through the backplate. I was expecting it to be obnoxiously hot and loud, but it really isn't.


3 hours ago, schwellmo92 said:

Not true at all. What desktop CPU has the same name in mobile form?

I was referring to GPU names. For instance, the desktop GTX 1060 6GB vs 3GB, which differed by much more than memory size, and then the mobile version was a stripped-down version of the 3GB model.

 

That aside, in general the desktop and mobile versions of a GPU have the same name while the mobile version performs worse. And some manufacturers include desktop variants in some laptop models, so yes, that's confusing as well.


Oh no...
I already hated it enough with the Max-Q/P branding.
Well, now we are going to have laptops with 3070s outperforming 3080s, with no way to tell besides looking at the TDP, probably hidden somewhere deep in the detailed specs. I can't wait to try to find anything second hand, where you're lucky if the listing even tells you whether it's Max-Q or Max-P; good luck getting them to say what the TDP is...


9 hours ago, RejZoR said:

The RTX 3080 isn't as hot or power-hungry as everyone is screeching about, though. I have an RTX 3080 GamingPro from Palit that has one of the slimmest coolers (it's exactly as wide as the PCI bracket) and really isn't a thick boy, and yet it's surprisingly quiet. Palit made good use of the backplate as a cooling element, though: it's scorching hot during operation, and the holes at the end of the card are among the biggest I've seen on any card, so it can really pump a lot of air through the backplate. I was expecting it to be obnoxiously hot and loud, but it really isn't.

A good cooler doesn't mean the GPU doesn't dump a bucketload of heat into the case. I literally had to change cases, it was that bad; I couldn't exhaust the heat fast enough.


On 1/21/2021 at 5:46 PM, CephDigital said:

Nvidia are removing the Max-Q designation for laptop GPUs, making it a whole lot harder to know what performance you can expect.

I mean, like, since the RTX 3000 series is super power-hungry, it wouldn't surprise me if the Max-Q SKUs just performed too terribly to possibly release. They didn't want people seeing Max-Q benchmarks for 3000-series laptops and thinking the mobile chips are underpowered, seeing as every company just couldn't stop talking about mobile at CES. Maybe this means we'll be seeing AMD cards in laptops yet!


On 1/21/2021 at 5:46 PM, CephDigital said:

What makes this worse is that the Max-Q features have to be implemented by the OEM, meaning you could theoretically get a GPU that is underpowered and has none of the benefits that supposedly come with Max-Q.

What do you mean "theoretically"? That has ALWAYS been the case. If you buy a laptop with an "X" GPU in it, it is going to be a ton slower than that GPU as a desktop version. The whole problem is calling laptop GPUs by desktop GPU designations in the first place.

They used to not even have the same GPU chip; they would just "call" a given laptop GPU by a desktop GPU name to designate its pricing level. They had nothing to do with each other except marketing shenanigans. Lately they've actually been putting the same chip in the laptops but running it at a lower wattage. A 2080 or whatever in a laptop never behaved at a desktop 2080 level. This is why the "M" was a big deal: the mobile version is ALWAYS a much lower-performance version.

If you've got a chip that runs at 200 W and needs 3 pounds of aluminum fins and heat pipes to cool it, thinking that putting the same GPU in a laptop, where it runs under 20 W with maybe a vapor plate and a tiny little fan for cooling, will net the same performance is stupid. People seem to do it though. A Max-Q 2080 or Max-P 2080 won't run at normal 2080 speeds. You could call it an 8GB TU104 at some power level or another, because that's what it is, but it was never a 2080. The Q in Max-Q stood for "quiet" IIRC, whereas the P in Max-P stood for "power"; it was never "Max" though. It's all just stuff tarted up with marketing.

