
More details about the Qualcomm Snapdragon 1000 for Windows 10 laptops

Source: Winfuture.de (Google Translate) via Ars Technica

 

Quote

Details of the SDM1000, tentatively named Snapdragon 1000, a new Qualcomm chip built for Windows 10 laptops, have started to trickle out.

 

Microsoft's development of Windows 10 for ARM has seen the company partner with chip company Qualcomm. The first Windows 10 on ARM machines use the Snapdragon 835 processor, with designs based on the Snapdragon 850 (a higher clocked Snapdragon 845 intended for laptops) expected later this year. Snapdragon 1000 will be the follow-up to the 850.

 

The Snapdragon 1000 is believed to be an even more powerful laptop chip intended to go head to head with Intel's Y- and U-series Core processors. These have a 4.5W and 15W power envelope, respectively, and are used in a wide range of tablets and Ultrabook-type laptops. The Snapdragon 1000 is reported to have a 6.5W power draw for the CPU itself, with a total power draw of 12W for the entire SoC. The Snapdragon 1000 test platform has 16GB of LPDDR4X RAM and two 128GB UFS flash drives. It also has 802.11ad gigabit Wi-Fi, gigabit LTE, and a new power management controller.

Looks like Intel should be a bit worried. If the TDP of the entire SoC is just 12W, that's only slightly lower than a U-series Intel processor like the i5-8350U (15W TDP, 4c/8t). And as the German site reports, it will be the first Qualcomm chip to sit in a socket instead of being soldered to the board.

Quote

First socketed ARM chip in the works?

However, one detail is particularly piquant: according to the information available to us, the SDM1000 could, for the first time, be a "socketed" chip. On Qualcomm's test platform, at least, the chip is not soldered firmly to the motherboard but sits in a special socket, like desktop processors from Intel and AMD. In notebooks the chip should still land directly on the motherboard; after all, that is common in mobile devices with x86 SoCs.

The size of the chip also makes clear that completely new performance spheres are being targeted. While the Snapdragon 835, 845 and the upcoming Snapdragon 855, with their 12.4x12.4 millimeter package size, are still comparatively "small", the SDM1000, whose final name will probably be different, […]

I'm guessing the socket will be something like a BGA-style test socket, since most laptop processors use BGA packages, rather than a desktop-style LGA socket.

[Image: PGA vs. BGA vs. LGA package comparison]

 

Add to that the fact that Apple is rumored to be working on its own ARM chips for its lower-tier laptops, namely the MacBook Air and MacBook, and Intel must wish it had developed its own ARM chips back in the day.

They tried to make Atom happen, first with netbooks, then with phones and tablets, but it never succeeded. Now I hope Microsoft figures out how to run x86 Win32 applications without emulation and its performance penalties, or else someone will beat them to it. Just saying, since reviews of Windows 10 S devices are rather lackluster; even someone like Michael Fisher, who liked Windows Phone, can't recommend something like the Asus NovaGo with Windows 10 S and a Snapdragon 835.

 


This is going to be a really good competitor to Apple's ARM chips in MacBooks, and hopefully they will have enough oomph to run x86 apps without too much trouble, unlike the 835.


You mean it's slightly lower than the U series. The Y series is 4.5W, whereas this Snapdragon would be 12W. Granted, Intel is kinda cheating, considering that 4.5W figure is at base clock.


7 minutes ago, Trixanity said:

You mean it's slightly lower than the U series. The Y series is 4.5W, whereas this Snapdragon would be 12W. Granted, Intel is kinda cheating, considering that 4.5W figure is at base clock.

How is that considered cheating? 


9 minutes ago, Trixanity said:

You mean it's slightly lower than the U series. The Y series is 4.5W, whereas this Snapdragon would be 12W. Granted, Intel is kinda cheating, considering that 4.5W figure is at base clock.

 I just checked and yeah, but it depends on the U series processor. The i5-8350U (4c/8t) has a TDP of 15W but the i7-8559U (4c/8t) has a 28W TDP.


I doubt Qualcomm has the performance metrics down to the point where anyone would want to use one of these. If anyone is going to make an x86 operating system run well on an ARM processor, it's going to be Apple with their A-series SoCs.

 

Or I guess Apple could make a separate line that are higher clocked and have bigger, more powerful GPUs like the iPads have been in the past. 

 

Shit the more I think of it the more I realize that Apple is a way more experienced SoC designer than Qualcomm is....


7 minutes ago, DrMacintosh said:

-snip-

I wouldn't put the blame on Qualcomm immediately since a lot of their high-end chips are amazing (except the SD 801 and 810). I'd rather blame Microsoft for forcing something as heavy as Windows 10 S onto a passively cooled, low-TDP chip like the SD835. I mean, this morning the Photos app, running in the background, was chewing through 500MB of RAM, which I think is unacceptable.


4 minutes ago, mynameisjuan said:

How is that considered cheating? 

Because most of the time it will be running at boost clock, unless it's hammered hard enough to throttle or the OEM implemented a shitty cooling system, so the 4.5W won't be representative of the average thermal output at load. Hence it's kinda cheating, even if it's not necessarily wrong. It'd be like saying a chip only draws X mW of power because you measured it in a sleep state: accurate, but not very useful and kinda misleading. Intel at least advertises that all its TDP ratings are at base clock.
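The point about base-clock TDP understating real draw can be put in rough numbers. This is a back-of-the-envelope sketch; the boost-state draw and the fraction of time spent boosting are illustrative assumptions, not measured values:

```python
# Why a base-clock TDP understates typical draw under load.
# All numbers below are illustrative assumptions.
base_w, boost_w = 4.5, 12.0   # rated base-clock TDP vs. rough boost-state draw
boost_residency = 0.6         # assumed fraction of load time spent boosting

# Weighted average over the load period:
avg_w = boost_residency * boost_w + (1 - boost_residency) * base_w
print(f"average draw under load: {avg_w:.1f} W")  # well above the 4.5 W rating
```

With those assumptions the average comes out at 9W, double the sticker figure, which is the sense in which the 4.5W rating is "accurate but misleading".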

 

8 minutes ago, captain_to_fire said:

 I just checked and yeah, but it depends on the U series processor. The i5-8350U (4c/8t) has a TDP of 15W but the i7-8559U (4c/8t) has a 28W TDP.

The latter ones aren't common at all. They're pretty much only used in MacBooks, and those aren't out yet (updated Coffee Lake models, that is), so 15W is the most representative figure to use.


I am happy to see Intel starting to get competition. That company has been asleep for ages. It is quickly becoming RIM/BlackBerry: once at the top, it stopped innovating and pushing the industry forward, Android devices and Apple's iPhone quickly took over, and the company woke up too late. We will see whether the same happens to Intel, or whether it not only wakes up in time but brings serious competition, which would finally mean bigger performance gains between CPU generations and lower prices.

 

4 hours ago, captain_to_fire said:

Now I hope Microsoft figures out how to run x86 Win32 applications without emulation and its performance penalties, or else someone else will beat them to it. Just saying, since reviews of Windows 10 S devices are rather lackluster; even someone like Michael Fisher, who liked Windows Phone, can't recommend something like the Asus NovaGo with Windows 10 S and a Snapdragon 835.

This is actually a pretty bad system. You can see that ASUS wanted to take zero risk, and what is essentially a $500 laptop is priced over the top.

HP has made a serious attempt with the HP Envy x2. What is funny with this laptop is that there are both Intel and Snapdragon 835 models of the same system.

While performance between the two models is close when it comes to native ARM64 and UWP apps (Intel being slightly faster), x86 Win32 emulation of course makes the 835 fare worse, as expected.

 

However, and that is the funny bit:

  • the Intel model costs more
  • has a worse trackpad by a considerable margin (the 835 model uses a large trackpad with a Microsoft Precision driver and excellent tracking, while the Intel model uses essentially the cheapest trackpad money can buy)
  • has a terrible keyboard, while the 835 model's keyboard is pretty good, closer to the Surface Pro's; the Intel one is closer to a $500 laptop's
  • can only be positioned at two angles via its kickstand cover, iPad-style, while the 835 model is more like the Surface Pro, where you can pick any angle
  • has horrible battery life in comparison (less than half that of the 835 model)
  • comes with Windows 10 Home instead of Windows 10 S, which can be upgraded to 10 Pro for free
  • has horrible LTE connectivity, getting a fraction of the speed of the 835

You can see that despite all the cutting HP did to keep the prices of both systems similar, the Intel model still ends up more expensive. And keep in mind that the Qualcomm 835 model is in limited production, so its cost is higher to start with.


2 minutes ago, Trixanity said:

Because most of the time it will be running at boost clock unless hammered significantly enough to throttle or if the OEM implemented a shitty cooling system, so the 4.5W won't be representative of the average thermal output at load hence it's kinda cheating even if it's not necessarily wrong. It'd be like saying it only draws X mW of power because you measured in a sleep state: it's accurate but not very useful and kinda misleading but Intel at least advertises that all their TDP ratings are at base clock.

Cool story... so why does this matter again? Real-world use is more important than stats.

Link to comment
Share on other sites

Link to post
Share on other sites

1 minute ago, captain_to_fire said:

I wouldn't put the blame on Qualcomm immediately since a lot of their high-end chips are amazing (except the SD 801 and 810). I'd rather blame Microsoft for forcing something as heavy as Windows 10 S onto a passively cooled, low-TDP chip like the SD835.

I just don't know if Qualcomm has the experience to make SoCs that aren't in phones. They don't specialize in chips designed to be real powerhouses; they design cheap and accessible SoCs for OEMs to buy and put in their products.

 

We'll just have to wait and see what comes of these devices, but my money is on Apple knocking Qualcomm out of the water in terms of comparative performance with x86 emulation (which is going to be bad anyway, so who really gives a crap? Where's my 1700X?)


Why should the market care this time around? The sacrifices in performance for a measly (supposed) extra 5 hours of battery and a "constant" internet connection haven't been worth it to any market in the past.


3 minutes ago, mynameisjuan said:

Cool story....so why again does this matter? Real time use is more important than stats.

Because you'll pretty much never see it staying at 4.5W then. So tell me why it doesn't matter? 

1 minute ago, DrMacintosh said:

-snip-

They've made server SoCs, so they kinda do have the experience. You don't seem to know what Qualcomm can or cannot do.

It's a question of money more than anything else. Qualcomm doesn't want to spend vast sums on R&D and large chips if their customers aren't interested.


1 minute ago, Trixanity said:

Because you'll pretty much never see it staying at 4.5W then. So tell me why it doesn't matter? 

Because it doesn't. Wattage shouldn't matter to consumers. If it says it runs at 4.5 but runs at 12, then you will notice that in real-world performance and battery life.

 

Wattage just doesn't matter for consumers.


2 minutes ago, Trixanity said:

Qualcomm doesn't want to spend vast sums of money on R&D and large chips if their customers aren't interested.

And that is why they will fail. 

 

Customers won't be interested; the only way people will be interested is if manufacturers can make the products interesting. If that is Qualcomm's corporate position, they are going to get pushed out of the market.


32 minutes ago, Kherm said:

This is going to be a really good competitor to Apple's ARM chips in MacBooks, and hopefully they will have enough oomph to run x86 apps without too much trouble, unlike the 835

There are no Macs with ARM (yet), just the iPads and iPhones (and peripherals such as the Apple TV, Watch and HomePod), though ;)

 

And the A-series SoCs run at much lower power than 12W, so we'll see what they can do with a higher TDP than they have now.


10 hours ago, DrMacintosh said:

Apple knocking Qualcomm out of the water in terms of comparative performance with x86 emulation

If Apple can bring back universal binary macOS and push devs to make universal binary applications for both ARM and Intel, either by rewriting them or by bringing iOS apps to the Mac, then yeah, sure, but that is not due until 2019.

10 hours ago, GoodBytes said:

You can see that despite all the cutting HP did to keep the price of both systems similar, the Intel model still ends up more expensive, AND keep in mind that the Qualcomm 835 model is in limited production, so the cost is higher to start with.

Well, even the SD835 version is still very expensive, not to mention its screen is 200 to 300 nits dimmer than competing tablets'. That brings back the question, "how long will Qualcomm provide driver updates?" One of the reasons Android is fragmented is that Qualcomm only releases updated drivers for two years, so an Android phone gets 2 to 3 major Android releases over 24 months and security updates for 36 months. What's the assurance that Windows on ARM devices won't suffer the same fate?

 

10 hours ago, DrMacintosh said:

And that is why they will fail. 

 

Customers won't be interested; the only way people will be interested is if manufacturers can make the products interesting. If that is Qualcomm's corporate position, they are going to get pushed out of the market.

Just like everything else, I'll believe it when I see it.


3 minutes ago, captain_to_fire said:

If Apple can bring back universal binary macOS and push devs to make universal binary applications for both ARM and Intel, either by rewriting them or by bringing iOS apps to the Mac, then yeah, sure, but that is not due until 2019

They will be doing the former, as stated at WWDC 2018. Apple is not merging iOS and macOS; they are making macOS able to run iOS apps.

 

And so what if it's due at a later date? Apple is rarely first to a market. They look at a market, see what everyone else is doing wrong, then make their own product that corrects those mistakes.


13 minutes ago, mynameisjuan said:

Because it doesn't. Wattage shouldn't matter to consumers. If it says it runs at 4.5 but runs at 12, then you will notice that in real-world performance and battery life.

Wattage just doesn't matter for consumers.

But it does matter; otherwise we wouldn't have the rating to begin with.

Intel does segment its processors according to TDP in a consumer facing manner.

If TDP never mattered we wouldn't discuss it so often.

A misleading figure that will rarely be accurate in any meaningful scenario isn't good.

So I'm going to assume apathy or ignorance on your part. You didn't really post any meaningful reason not to care, except saying "it shouldn't matter" and "if it says 4.5 but is in fact 12 you'll notice on battery life". Actually, that latter quote just confirms that it does matter: if you get 3 hours less battery because the chip is consistently pulling too much power, you'll get complaints from multiple parties. So I think that's the end of that discussion.
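The battery-life argument is simple division: runtime is capacity over average draw. A quick sketch with a hypothetical 40Wh pack (the capacity and draw figures are assumptions for illustration, not specs of any real device):

```python
# Illustrative battery math: runtime (h) = capacity (Wh) / average draw (W).
# The 40 Wh pack is a hypothetical figure, not a real device's spec.
battery_wh = 40.0
runtime_h = {draw_w: battery_wh / draw_w for draw_w in (4.5, 12.0)}

for draw_w, hours in runtime_h.items():
    print(f"{draw_w:>4} W -> {hours:.1f} h")
```

On those assumptions a chip averaging 12W instead of 4.5W cuts runtime from roughly 8.9 to 3.3 hours, which is exactly the kind of gap users notice and complain about.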

12 minutes ago, DrMacintosh said:

And that is why they will fail. 

 

Customers won't be interested; the only way people will be interested is if manufacturers can make the products interesting. If that is Qualcomm's corporate position, they are going to get pushed out of the market.

They won't fail. Depending on the market you're referring to, the notion is ridiculous.

Qualcomm has downplayed their CPU performance for a while and doesn't focus on it anymore, because smartphone SoCs are about so much more than CPU performance, which Qualcomm has exploited to the nth degree. Qualcomm owns the market and will do so for the foreseeable future.

Consumers have very little knowledge of what's inside their phones so it isn't really a selling point to post good benchmarks. As long as it does what they need they don't care.

I personally wish Qualcomm would go back to exploring performance but they don't have to in the current market.

Link to comment
Share on other sites

Link to post
Share on other sites

1 minute ago, Trixanity said:

-snip-

So basically the TDP fluctuation on U-series processors is kind of like turbo boosting, then throttling later when it gets hot?


9 minutes ago, Trixanity said:

Qualcomm owns the market and will do so for the foreseeable future.

They have market share, but they don't lead in performance or profits, aka the important stuff.

 

It's like Android and iOS from a developer's standpoint: Android has a larger user base, but nobody on Android buys anything.


19 minutes ago, Trixanity said:

-snip-

I am saying that the specific wattage doesn't matter to anyone but the manufacturer, who has to make sure enough power is supplied and cooling is sufficient. Does TDP matter with custom machines? Sure, you need to make sure you have enough cooling. In a fucking laptop? No, it doesn't matter.

 

Again, this is not "cheating"; in the end people will choose between performance and battery life and judge for themselves.


1 minute ago, captain_to_fire said:

So basically the TDP fluctuation on U-series processors is kind of like turbo boosting, then throttling later when it gets hot?

In theory, yes. It'll have a single/multi-core boost algorithm, but at the same time it's constrained by power and thermal limits, so it can be either. Note that while the two are correlated, they're different measurements: sometimes you can have plenty of cooling but still get throttling because of power limits. I'm not entirely sure it's accurate, but I've heard the 15W U processors have a maximum power state of around 41W, though they'll throttle eventually, fluctuating to maintain proper thermals and power.

This has actually produced some interesting benchmarks, especially in gaming, and it happens with both AMD and Intel: you'll sometimes see a laptop tuned for CPU power, so the CPU maintains decent clocks but the GPU throttles heavily in a gaming session. Since GPU clocks matter more for gaming, you get massive framerate drops when it throttles. I've seen people try workarounds to give the GPU a bigger slice of the power budget, resulting in much better framerates. That's why TDP matters, and why proper tuning of processor behavior is so important for reliable power and performance; tuning processors is an arduous task that very few get right.

 

Unless you mean the difference between the 15W and 28W processors, which is a bit more complicated. The 28W processors have a much better GPU, and the bigger TDP allows better overall performance due to the bigger headroom to maintain clocks. If they weren't so expensive, we'd see more OEMs than Apple use them.

 

But the TL;DR on TDP is that if the cooling is there, the chip will maintain its boost unless other limits prevent it. You can see this on AMD's U processors: if the cooling allows it, the platform steps the TDP up from 15W to 25W.
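The CPU-versus-GPU power-budget tug-of-war described above can be sketched as a toy allocator. This is a deliberately simplified model (the budget and request figures are made up; real firmware is far more dynamic), but it shows why favoring the CPU starves the GPU and tanks framerates:

```python
# Toy model of a shared SoC power budget. Whichever block has priority
# gets its full request first; the other is throttled to whatever remains.
# All wattage figures are illustrative assumptions.
def split_budget(total_w, cpu_request_w, gpu_request_w, cpu_priority=True):
    """Return (cpu_w, gpu_w) granted out of a shared package budget."""
    if cpu_priority:
        cpu_w = min(cpu_request_w, total_w)
        gpu_w = min(gpu_request_w, total_w - cpu_w)
    else:
        gpu_w = min(gpu_request_w, total_w)
        cpu_w = min(cpu_request_w, total_w - gpu_w)
    return cpu_w, gpu_w

# Hypothetical 15 W package where the CPU wants 10 W and the GPU wants 12 W:
print(split_budget(15, 10, 12, cpu_priority=True))   # CPU keeps clocks, GPU starved
print(split_budget(15, 10, 12, cpu_priority=False))  # GPU keeps clocks, CPU throttled
```

In a game the second allocation is usually the better trade, which is what the workarounds mentioned above effectively do.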


9 minutes ago, DrMacintosh said:

It's like Android and iOS from a developer's standpoint: Android has a larger user base, but nobody on Android buys anything.

Why, when there are plenty of free, well-made apps?


2 minutes ago, mynameisjuan said:

Why, when there are plenty of free, well-made apps?

Well, Android users generally are in lower income brackets, so they are more likely to seek out free alternatives rather than pay for the content that devs need to sell to sustain their products.

Also, LOL, free well-made apps on Android? Have you looked around the Play Store recently?

