NVIDIA GeForce RTX 4090 Laptop GPU Review: 29-58% Faster on Average than 3080 Ti Laptop GPU, Comparable to RTX 3090 Desktop GPU

Summary

Reviewers have published their benchmarking results featuring NVIDIA's most powerful mobile GPUs to date: the 4090 and 4080 Laptop GPUs. Today we will be looking at a review by Hardware Unboxed/TechSpot, which focuses primarily on the 4090 Laptop GPU itself rather than the laptop as a whole.

 

Quotes

Quote

Overall, there is no doubting that the RTX 4090 Laptop GPU is extremely powerful and opens up many new possibilities for laptop gaming.

 

With 50-60% performance improvements over the RTX 3080 Ti Laptop GPU in GPU-limited scenarios, the new 4090 Laptop GPU is perfect for 1440p high refresh rate gaming, as well as ray tracing, and even 4K gaming at times.

 

This level of performance gain compared to the previous generation is hugely impressive. It's one of the largest I can remember for laptops and it's doubly so when these gains were seen without any increase to GPU power.

 

Going from RTX 20 to RTX 30 was a 30% uplift at the same power limit and similar for GTX 10 series to RTX 20 series. The RTX 4090 Laptop in a best versus best comparison doubled that gen on gen increase, which is incredible news for laptop gamers.

 

It's actually pretty crazy to think that a modern, flagship laptop in 2023 will have roughly the performance of an RTX 3090, which by today's standards is still a very fast graphics card. The RTX 3090 desktop was a 350W GPU; just two and a half years later we're talking about that performance inside 150-175W. That's just another way to think about the massive performance-per-watt increase we're seeing this generation.

 

The final issue I have is a big one and an inescapable roadblock and that's the price. RTX 4090 laptops are disgustingly expensive. The GT77 used in this video is a $5000 US laptop. The absolute cheapest model currently listed on NewEgg is the Gigabyte AORUS 17X for $3500 US, which is still just insanely expensive for a gaming laptop and makes it hard to justify under any condition.
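The power figures in those quotes can be turned into a rough performance-per-watt estimate. A minimal sketch, using only the numbers quoted above and assuming roughly equal performance between the 350W RTX 3090 desktop and the 150-175W RTX 4090 Laptop GPU:

```python
# Back-of-the-envelope check on the quoted performance-per-watt claim:
# roughly RTX 3090 desktop performance (a 350W GPU) delivered at 150-175W.
desktop_watts = 350
laptop_watts_low, laptop_watts_high = 150, 175

# If performance is roughly equal, perf/W scales inversely with power draw.
best_case = desktop_watts / laptop_watts_low    # lowest laptop power target
worst_case = desktop_watts / laptop_watts_high  # highest laptop power target
print(f"perf/W improvement: {worst_case:.2f}x to {best_case:.2f}x")
# prints: perf/W improvement: 2.00x to 2.33x
```

So the quote's framing works out to roughly a 2x-2.3x efficiency gain in two and a half years, before accounting for any performance difference between the two parts.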

 

My thoughts

I think this 4090 Laptop GPU is an incredible piece of engineering, providing near 4070 Ti Desktop performance, or 3090/3090 Ti Desktop performance, in a mobile platform. I agree with Tim on many of his negatives, though. One thing he points out is that building a standard 4090/13900K desktop PC on PCPartPicker, without even choosing the best price-to-performance parts, comes out to about $3500. That's the same price as the cheapest laptop with a 4090 Laptop GPU.

There's additionally the misleading branding. While this is currently common in the laptop space, that doesn't necessarily make it right. Bringing back the M suffix (signifying Mobile) would make more sense. This card has only 60% of the CUDA cores seen in the desktop 4090, which is very important to consider: it's called a 4090 when it's closer in specs to a desktop 4080. Another reason Tim mentions behind the 4090 nomenclature is to create price parity with OEMs' similar desktop configurations. A buyer who hasn't done all their due diligence might see a desktop 4090 with an i9 and a laptop 4090 with an i9 both for $3500 and think they are the same, except with the laptop you get a screen and it's portable.

Lastly, and most importantly, there's the power target: this 4090 Laptop GPU exists in both 80W and 150W configurations, and the 80W version will obviously be substantially slower. If the reviews you see cover the 150W version and you then purchase a laptop with the 80W version, you might be left disappointed when it performs around 30% slower than the benchmarks you saw. This is why a more logical and practical naming scheme should have been established from the start.
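The core-count gap behind the naming complaint is easy to quantify. A quick sketch using NVIDIA's published CUDA core counts (desktop 4090 on AD102: 16384; desktop 4080 and 4090 Laptop, both on AD103: 9728):

```python
# Published CUDA core counts (per NVIDIA's spec pages):
cores = {
    "RTX 4090 Desktop (AD102)": 16384,
    "RTX 4080 Desktop (AD103)": 9728,
    "RTX 4090 Laptop (AD103)": 9728,
}

baseline = cores["RTX 4090 Desktop (AD102)"]
for name, count in cores.items():
    print(f"{name}: {count} cores ({count / baseline:.0%} of the desktop 4090)")
# The laptop 4090 lands at ~59% of the desktop 4090's cores,
# identical to the desktop 4080 -- which is the point about the name.
```

Core count alone doesn't determine performance (clocks and power targets matter too), but it shows why "closer in specs to a desktop 4080" is a fair characterization.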

 

Sources

https://videocardz.com/149342/nvidia-geforce-rtx-4080-4090-gaming-laptop-review-roundup

https://www.techspot.com/review/2624-nvidia-geforce-rtx-4090-laptop-gpu/


15 minutes ago, BiG StroOnZ said:

This 4090 Laptop GPU exists in both 80W and 150W configurations. Obviously, the 80W will be substantially slower.

Might as well be a different gpu.

 

We saw this with a 2060 at 120W vs a 70W 2080: the 2060 was performing the same as the 2080 in a much more expensive laptop. This is really, really difficult and scummy for the average consumer.


Generally an Nvidia laptop GPU is one tier lower than marketed, so a 4090 laptop is generally a desktop 4080 and a 3080 laptop is generally a desktop 3070, if I understand what I've been reading correctly. I'm no laptop user, but I can't say I think very highly of this practice from Nvidia, as it's going to fool potential customers. I don't know anything about AMD laptop GPUs, but they may use a similar practice.

28 minutes ago, BiG StroOnZ said:

calling it a 4090 when it's closer in specs to a desktop 4080

Pretty much. They can call it what they want but in my eyes it's just lies.


31 minutes ago, jaslion said:

Might as well be a different gpu.

 

We saw this with a 2060 at 120W vs a 70W 2080: the 2060 was performing the same as the 2080 in a much more expensive laptop. This is really, really difficult and scummy for the average consumer.

 

The laptop parts trailing far behind the desktop parts with the same model numbers is nothing new.

 

Having two different laptop parts where one has a power limit that's almost double the other and calling them both "4090," if I'm understanding this right, is way scummier. 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


39 minutes ago, aDoomGuy said:

Generally an Nvidia laptop GPU is one tier lower than marketed, so a 4090 laptop is generally a desktop 4080 and a 3080 laptop is generally a desktop 3070, if I understand what I've been reading correctly. I'm no laptop user, but I can't say I think very highly of this practice from Nvidia, as it's going to fool potential customers. I don't know anything about AMD laptop GPUs, but they may use a similar practice.

Pretty much. They can call it what they want but in my eyes it's just lies.

1 tier? It is more like a 4070, possibly a 4060 Ti.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


Is it actually 3090 level? Their results were vs the 3080 Ti Laptop, not the 3090 desktop. I'm hesitant to attempt a relative performance comparison using that as a reference. I see HUB/TS commentary hasn't improved at all. Are they really that out of touch with reality, or are they playing some kind of YouTube meta?

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


5 minutes ago, porina said:

I see HUB/TS commentary hasn't improved at all. Are they really that out of touch with reality, or are they playing some kind of YouTube meta?

What are you referring to? Their complaint about the naming or something else?


1 minute ago, BabaGanuche said:

What are you referring to? Their complaint about the naming or something else?

I'd take the thread way off topic and/or reopen too many arguments that have already happened, multiple times, if I were to try. As a single example the following line particularly rubbed me the wrong way:

Quote

The new 4090 Laptop GPU is perfect for 1440p high refresh rate gaming, as well as Ray Tracing, and even 4K gaming at times

Was it just poorly chosen words, or are they seriously implying this won't be great for 4K?



2 minutes ago, porina said:

I'd take the thread way off topic and/or reopen too many arguments that have already happened, multiple times, if I were to try. As a single example the following line particularly rubbed me the wrong way:

That answers my question.


And the most useless statement goes to "29-58% Faster on Average than 3080 Ti Laptop GPU".


13 minutes ago, ouroesa said:

And the most useless statement goes to "29-58% Faster on Average than 3080 Ti Laptop GPU".

It's not really that bad. That's the range of averages for the different resolutions. 

 

Also weak, the GPU needs to use more power. So much wasted potential performance 🙃


4 hours ago, leadeater said:

It's not really that bad. That's the range of averages for the different resolutions. 

 

Also weak, the GPU needs to use more power. So much wasted potential performance 🙃

What I meant is that there is no such thing as an average between x% and y%; an average is calculated as the sum of the items divided by the number of items and can't really be a range. Also, I was just taking the piss.

 


39 minutes ago, ouroesa said:

What I meant is that there is no such thing as an average between x% and y%; an average is calculated as the sum of the items divided by the number of items and can't really be a range. Also, I was just taking the piss.

I know you were, but that's literally what they are: an average of X% at 1080p, 1440p, and 4K. So the range of averages is X%-Y%. It's not incorrect 😉

 

What would be incorrect would be (29 + 48 + 58) / 3 = 45%. There is such a thing as dirty data and worthless outcomes.
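The point about a range of averages versus one pooled number can be illustrated with made-up per-game uplifts, chosen only so the per-resolution averages reproduce the headline 29/48/58 figures (these are not the review's actual data):

```python
# Hypothetical per-game uplift percentages at each resolution,
# chosen only to reproduce the 29/48/58 averages -- not real review data.
uplifts = {
    "1080p": [25, 29, 33],  # typically CPU-limited, so gains are capped
    "1440p": [44, 48, 52],
    "4K":    [54, 58, 62],
}

# Correct: one average per resolution, reported as a range of averages.
per_res = {res: sum(vals) / len(vals) for res, vals in uplifts.items()}
print(per_res)  # {'1080p': 29.0, '1440p': 48.0, '4K': 58.0}

# Misleading: pooling everything into a single number hides the fact
# that the 1080p results say more about the CPU than the GPU.
all_vals = [v for vals in uplifts.values() for v in vals]
print(sum(all_vals) / len(all_vals))  # 45.0
```

Both computations are valid arithmetic; the range of per-resolution averages just carries information that the single pooled figure throws away.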


On 2/7/2023 at 12:42 PM, aDoomGuy said:

Pretty much. They can call it what they want but in my eyes it's just lies.

 

It's pretty common in this space, at least for NVIDIA, and people have become complacent about the naming schemes for mobile GPUs. It's obviously a marketing ploy. Although, it should be noted that low to midrange laptop GPUs do perform more similarly to their desktop counterparts.

 

On 2/7/2023 at 1:22 PM, ewitte said:

1 tier?  It is like a 4070, possibly 4060ti.

 

Well, this 4090 Laptop GPU is only about 10% slower on average than a 4070 Ti Desktop; it trades blows depending on the game and resolution. This is probably why it was compared to a 3090 and not a 3090 Ti, since depending on the resolution a 4070 Ti is equivalent to either a 3090 or a 3090 Ti.

 

Therefore, unless the 4070 is only 10% slower than the 4070 Ti, comparing it to a 4070 Ti makes more sense. The 4060 Ti will most likely only be as fast as a 3080, and the 4070 as fast as a 3080 Ti, if we are to use previous GPUs as an example (and some recent leaks/rumors). This already performs like a 3090, so somewhere between a 4070 and a 4070 Ti is more accurate.

 

On 2/7/2023 at 2:13 PM, porina said:

Is it actually 3090 level? Their results were vs the 3080 Ti Laptop, not the 3090 desktop. I'm hesitant to attempt a relative performance comparison using that as a reference.

 

Yeah, it can even be at 3090 Ti/4070 Ti levels at times. If you watched the video, Hardware Unboxed is planning to compare this 4090 Laptop GPU to various desktop GPUs in an upcoming video, so we will get a better idea of which desktop GPUs it's similar to. However, it only seems to be 10-15% slower on average than the 4070 Ti (depending on the game and resolution). Thus, saying its relative performance is around a 3090/3090 Ti is not a bad reference.

 

On 2/7/2023 at 2:30 PM, porina said:

Was it just poorly chosen words, or are they seriously implying this won't be great for 4K?

 

What they are implying is that how well it performs at 4K depends on the game, hence "at times". They aren't implying it won't be great at 4K, just that some games perform better than others at that resolution. Whether that counts as poorly chosen words depends on who you ask.


Really wish they would just call it a 4070M, or whatever the desktop card with equivalent fps is. Or at least be consistent between core count and name.

 

When the desktop GPU with the same name has 66% more fps at 4K, it's not a real 4090.

The best gaming PC is the PC you like to game on, how you like to game on it


I found the following interesting to give some perspective on the historic Nvidia "drops" going from the desktop 80-class part to its mobile equivalent. We've only had two generations where desktop more or less equalled mobile.

 

[chart: desktop 80-class vs mobile-equivalent performance across Nvidia generations]

(I just found the chart to link, you might want to back up a bit for context)



7 minutes ago, 05032-Mendicant-Bias said:

So, 60% more performance for a 60% higher price. Well done, Nvidia, you went exactly nowhere!

 

And Nvidia still misleads consumers. That's not an RTX 4090. Nvidia could have called it an RTX 4090M.

Considering how the wattage impacts performance, it should also be included in the name: RTX 4090M-150, RTX 4090M-175, etc.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

15 hours ago, GhostRoadieBL said:

Really wish they would just call it a 4070M, or whatever the desktop card with equivalent fps is. Or at least be consistent between core count and name.

 

When the desktop GPU with the same name has 66% more fps at 4K, it's not a real 4090.

I'd rather they keep the stack naming the same; the M modifier communicates well enough that it's power-sipping and therefore way slower.
4080M or 4090M means the best mobile chip you can get that gen,
even if the 4090M performs like a desktop 4070.
4070M makes me think there are faster mobile chips available, and, as a retailer, that I'm not selling them the best they can get for a $4000 laptop.
I like Vishera's idea of modifying the name even further with the configured wattage.

I find the recent trend of calling the mobile chip line the same as desktop confusing, especially because OEMs HAVE put full desktop versions of chips into laptops for various reasons.

People are too hung up on "what is a real 4090". An xx90 is just the BEST of a stack;
nothing to do with what die is used, or what wattage is used, just that it is the best comparatively.
There is no AD100 used in consumer graphics cards. xx100 used to be the chip used in the x80-class GPUs; that does not mean an AD102 can't be a 4090, because at the end of the day it is the top of the consumer stack.

 


7 hours ago, starsmine said:

I'd rather they keep the stack naming the same; the M modifier communicates well enough that it's power-sipping and therefore way slower.
4080M or 4090M means the best mobile chip you can get that gen.

People are too hung up on "what is a real 4090". An xx90 is just the BEST of a stack.

I default to "what would the general public expect from a product named X?" A person buying a laptop with an xx90 doesn't expect it to perform like an xx70 card. Even adding the 'M' designation implies the chip has the same core count as other xx90 chips, just lower clocks and wattage, but this isn't the case. Enthusiasts know better and check specs, but the average consumer isn't an enthusiast.

 

This is more of an issue with lower-priced chips, which attract more consumers than halo-tier chips, but the naming starts from the top of the stack. There was backlash against the 4080 12GB being completely different from the 4080 16GB, and every laptop generation since before the 20 series has had vastly different core counts and memory compared to the same-named desktop chips, let alone the muddy wattage labeling that lets 3060 and 3080 chips deliver the same fps at wildly different prices.

 

If the labeling were based on actual performance rather than seemingly random model numbers, the general public would at least have a chance of picking the right chip for what they want.



42 minutes ago, GhostRoadieBL said:

I default to "what would the general public expect from a product named X?" A person buying a laptop with an xx90 doesn't expect it to perform like an xx70 card. Even adding the 'M' designation implies the chip has the same core count as other xx90 chips, just lower clocks and wattage, but this isn't the case. Enthusiasts know better and check specs, but the average consumer isn't an enthusiast.

 

This is more of an issue with lower-priced chips, which attract more consumers than halo-tier chips, but the naming starts from the top of the stack. There was backlash against the 4080 12GB being completely different from the 4080 16GB, and every laptop generation since before the 20 series has had vastly different core counts and memory compared to the same-named desktop chips, let alone the muddy wattage labeling that lets 3060 and 3080 chips deliver the same fps at wildly different prices.

 

If the labeling were based on actual performance rather than seemingly random model numbers, the general public would at least have a chance of picking the right chip for what they want.

I would argue that only an enthusiast would know how a desktop 4090 performs to that specificity (it's not even a buyer-beware situation). I would also argue that no one in the general public expects a laptop to ever perform the same as a desktop, especially if it is specified to not be a desktop part. It's a pure wattage question.

No one expects a Chevy Tahoe to be as fast as a Corvette when they both have 6.2L LS engines.


2 hours ago, starsmine said:

no person in the general public expects a laptop to ever perform the same as a desktop, especially if it is specified to not be a desktop part. It's a pure wattage question.

You actually gave the reason why the general public would think a laptop 4090 is like any other 4090: the general public doesn't even know how many watts their laptop or desktop uses.

 

Enthusiasts do, people who research and compare, but the general public buys based on marketing and what they heard is the best. If the desktop 4090 is talked about as being the fastest and they see 4090 on the product tag, they assume it's the same thing.

 

It was a problem over a decade ago when I worked at Geek Squad. It was very common for people to bring in their new laptop and complain it must be broken: xyz game wasn't smooth, or they weren't seeing the same smoothness as their friend's desktop even though "it has the same stuff in it, that's why I bought this". I hear the same thing from people now with the same story: they bought based on the numbers on the card, it never performs as well as their desktop, and these are engineers who follow model numbers.

 

I would rightly expect two engines from the same company, in the same model year, with the same LS model title to have similar horsepower and torque. Not one LS 6.2L with 500hp and another LS 6.2L with 300hp and 3/4 the cylinders.



36 minutes ago, GhostRoadieBL said:

You actually gave the reason why the general public would think a laptop 4090 is like any other 4090: the general public doesn't even know how many watts their laptop or desktop uses.

 

I would rightly expect two engines from the same company, in the same model year, with the same LS model title to have similar horsepower and torque. Not one LS 6.2L with 500hp and another LS 6.2L with 300hp and 3/4 the cylinders.

People know what watts are; they pay the bill every month. They are able to figure out that a desktop consumes a lot more from the fact that the UPS battery is just as big as the laptop, yet the desktop churns through it in 20 minutes.

 

A 420hp Tahoe with an LS won't perform close to the same as a 490hp Corvette, and not just because of the HP but because of weight and drag, never mind the ZR1 with the same engine producing 755hp.
They are all the Gen 5 6.2L LS.

People know that the form factor and how much gas you can dump in change how fast you can go; watts are like that for a computer.

As for people bringing in their laptop saying their 680M does not perform as well as a 680: you don't have to spend time educating them that the M modifier means fewer watts and less performance, but it's still the flagship laptop GPU.


6 minutes ago, starsmine said:

People know what watts are; they pay the bill every month. They are able to figure out that a desktop consumes a lot more from the fact that the UPS battery is just as big as the laptop, yet the desktop churns through it in 20 minutes.

UPS battery? People using a UPS is hyper ultra rare. Also, knowing what watts are and knowing how much something actually uses are very different things. For some things it's easy, like a 100W light bulb or a 5W LED. Some products spell it out; many do not, and have variable power usage.

 

Ask anyone how many kWh it takes to roast a chicken, without using Google. Yeah, nobody knows much about the power usage of most things.

