
Exynos Galaxy S9 significantly underperforms compared to Snapdragon models - AnandTech

D13H4RD

Given the impressive benchmark scores, one would expect the Exynos to crush the Snapdragon 845 in the S9. While it certainly does so in synthetic tests, the real world tells a completely different story.

 

In their in-depth review, AnandTech found that the Exynos 9810 had significantly worse real-world performance than the SD845 variant in their testing, and was only marginally faster overall than the Exynos 8895 in its predecessor.

Quote

AnandTech is usually data-driven when making claims about performance, so the stark contrast between the Exynos’ synthetic performance and the system benchmarks more than ever questions the validity of both. There are two questions to answer here: are the benchmarks still working as intended and representative, and if they are, what happened to the Exynos 9810’s raw performance?

 

For the first question, I haven’t seen any evidence to contradict the results of our system benchmarks. The Exynos 9810 variant of the Galaxy S9 simply isn’t any faster in most workloads, and in one-on-one comparisons against the Snapdragon 845 variant it was indeed the less consistent performer, losing out in terms of responsiveness, even if that difference in absolute terms is very minor.

 

As to why this is happening on the Exynos, it's something I attribute to the scheduler and DVFS. Samsung’s new scheduler and DVFS just seem absolutely atrociously tuned for performance. I tested an interactive workload on both Snapdragon and Exynos devices and the contrast couldn’t be any greater. On the Snapdragon 845 Galaxy S9, a steady-state workload thread will seemingly migrate from a full idle state on the little CPUs onto the big CPUs after 65ms. At the moment of migration the big CPUs kick into full gear at 2803MHz and will maintain that frequency for as long as the workload demands it.

 

On the Exynos 9810 Galaxy S9 the same workload will also migrate at around the 60ms mark from the little cores up to the big cores; however, once on the big cores the thread starts at the lowest frequencies of the big cluster – 650-741MHz. It takes the workload a whole 370ms until it reaches the 2314MHz state of the M3 cores – which, according to the SPEC benchmarks, is around the maximum single-threaded performance of the Snapdragon 845’s performance cores. To reach the full 2703MHz of the M3 cores, the workload needs to have been active for a staggering 410ms before the DVFS mechanism starts switching to that frequency state.

 

UI workloads are highly transactional and very rarely is there something which takes longer than a few frames. The fact that the Exynos 9810 takes over 5x longer to reach the maximum performance state of the Snapdragon 845 basically invalidates absolutely everything about the performance of its cores. For workloads which are shorter than 400ms (which is a *lot* of time in computing terms) the Snapdragon will have already finished the race before the Exynos warms up. Only at higher workload durations would the Exynos then finally catch up. Acceleration vs maximum speed being the key aspects here. This is Samsung’s first EAS based scheduler for Exynos devices, and the way the schedutil governor is tuned here is a great disappointment for performance.

 

Beyond the Exynos’ overzealous “slow-and-steady” DVFS approach, I’m also not happy with how the core count/maximum frequency mechanism is implemented. This is a simple HR timer task that checks the CPU runqueues and, based on a threshold of heavy threads, simply offlines or onlines the CPUs. The fixed interval here is 15ms when in a quad-core state and 30ms in dual- and single-core states. Beyond the fact that the whole offlining/onlining of the cores is extremely inefficient as a scheduler mechanism, it’s worrisome that when the SoC is in dual- or single-core mode and there’s suddenly a burst of threads, the CPUs will be highly underprovisioned in terms of capacity for up to 30ms until the mechanism turns the other cores back on.

 

The fact that the DVFS mechanism is so slow completely invalidates the benefit of such a mechanism in the first place, as the one use case where single-threaded performance trumps everything is web browsing and heavy JavaScript workloads, which by nature are short and bursty. Samsung should have simply ignored frequencies above 2.1-2.3GHz (matching the Snapdragon in ST performance), ignored this whole variable maximum frequency mechanism, and instead concentrated on getting performance through scheduler and DVFS response time, something which Qualcomm seems to have mastered. In the end, S.LSI's investment in a performant custom CPU core is sabotaged by very questionable software, and the Exynos’ CPU performance goals go largely unfulfilled in real interactive workloads.

Source: https://www.anandtech.com/show/12520/the-galaxy-s9-review/5
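The ramp-up numbers quoted above can be turned into a quick back-of-the-envelope model. This is a toy sketch in Python: the migration delays and clock states come from the quote, while the relative-performance scaling (M3 at 2314MHz ≈ SD845 at peak) and the zero-performance idle phase are my own simplifying assumptions, not Samsung's actual governor behaviour.

```python
# Toy model of the DVFS ramp-up behaviour described in the review.
# Each ramp is a list of (start_ms, relative_perf) steps; perf is
# normalised so the SD845's peak single-thread performance = 1.0
# (the quote says the M3 at 2314MHz is roughly equivalent to that).

def work_done(duration_ms, ramp):
    """Integrate relative performance over a burst of the given length."""
    total = 0.0
    for i, (start, perf) in enumerate(ramp):
        end = ramp[i + 1][0] if i + 1 < len(ramp) else float("inf")
        total += max(0.0, min(duration_ms, end) - start) * perf
    return total

# SD845: idles, then jumps straight to peak ~65ms in.
sd845 = [(0, 0.0), (65, 1.0)]
# Exynos 9810: on the big cores at ~60ms but near 700MHz (~0.3x),
# SD845-equivalent perf at 370ms, full 2703MHz (~1.17x) at 410ms.
e9810 = [(0, 0.0), (60, 0.3), (370, 1.0), (410, 1.17)]

for burst in (100, 200, 400, 1000, 2000, 5000):
    lead = "SD845" if work_done(burst, sd845) > work_done(burst, e9810) else "E9810"
    print(f"{burst:>5}ms burst -> {lead} ahead")
```

In this toy model the Snapdragon finishes first for every burst shorter than roughly 1.7 seconds, an eternity in UI terms, and the Exynos' higher peak clock only pays off on long sustained workloads, which matches the review's "acceleration vs maximum speed" framing.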

 

Even more damning is that when the Exynos processor is allowed to flex its muscles, it seems to consume much more power than its predecessor.

Quote

The Exynos 9810 Galaxy S9 absolutely fell flat on its face in this test and posted the worst results among our tracking of the latest generation devices, lasting 3 hours less than the Exynos 8895 Galaxy S8. This was such a terrible run that I redid the test, and it still resulted in the same runtime.

 

I investigated the matter further to try to see if this was caused by the high energy usage of the M3 cores – and it seems it is. Enabling the “CPU limiter” (S9 PS result in the graphs), which is found in the battery optimisation options of Samsung’s firmware, greatly throttles the M3 cores down to 1469MHz, the memory controller to half speed, and also seemingly changes some scheduler settings to make them more conservative. This results in peak performance equal to the Exynos 8895; however, the scheduler alterations also noticeably slow down UI responsiveness, so it’s actually a worse experience. Nevertheless, backing off on performance results in regaining almost 3 hours.

 

This is such terrible battery performance from the Exynos 9810 variant that it casts even more of a cloud over the new SoC. My theory as to why this happens is that not only do the higher frequency states require more energy per unit of work than competing SoCs – because this is a big CPU complex, there’s also a lot of leakage at play. The DVFS system being so slow might actually be bad for energy here, as we might be seeing the opposite of race-to-sleep: walk-to-waste. The fact that Apple’s SoCs don’t have any issues with battery life in this test showcases that it’s not an inherent problem of having a high-power microarchitecture, but rather something specific to the Exynos 9810.

Source: https://www.anandtech.com/show/12520/the-galaxy-s9-review/8
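The "walk-to-waste" idea can be illustrated with a similarly rough energy model. Every number below (voltages, the leakage figure, the folded-in capacitance constant) is an invented illustrative value rather than measured Exynos 9810 data; the point is only that a leaky core complex can burn more total energy finishing a job slowly than by sprinting and then sleeping.

```python
# Toy "race-to-sleep" vs "walk-to-waste" energy comparison.
# Dynamic power scales ~ f * V^2; leakage is a fixed drain while the
# core complex is awake; a perfect 0W sleep state is assumed afterwards.

def energy_mj(work, freq_ghz, volts, leak_w):
    """Energy (mJ) to finish `work` (normalised cycles) at a fixed
    operating point, including leakage for the whole active window."""
    time_s = work / freq_ghz              # higher clock -> shorter window
    dyn_w = freq_ghz * volts ** 2         # capacitance factor folded in
    return (dyn_w + leak_w) * time_s * 1000.0

WORK = 1.0   # fixed job size
LEAK = 0.8   # watts of leakage for a big, leaky CPU complex (assumed)

race = energy_mj(WORK, freq_ghz=2.7, volts=1.05, leak_w=LEAK)
walk = energy_mj(WORK, freq_ghz=0.7, volts=0.75, leak_w=LEAK)
print(f"race-to-sleep: {race:.0f} mJ, walk-to-waste: {walk:.0f} mJ")
```

Despite the much lower voltage, the slow run costs more energy here because the leakage floor is integrated over a window almost four times longer. With a small leakage value the ordering flips, which is why race-to-sleep vs walk-to-waste is so sensitive to the silicon's leakage characteristics.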

 

Granted, it should be noted that the reviewer's own experience isn't that negative, but it still paints a pretty cloudy picture. 

Quote

In my personal every-day usage I can’t say that I noticed a massive disadvantage in battery life on the Galaxy S9; however, my everyday usage is relatively light and I haven’t had enough time with the phone yet as a daily driver to make a final judgment. I did notice that the Exynos 9810 does show signs of suffering in heavy tasks. Instances of Gmail syncing my inbox with a new account did once result in a warm phone, while the Snapdragon 845 Galaxy S9 did not showcase this characteristic.

Source: https://www.anandtech.com/show/12520/the-galaxy-s9-review/8

 

It's entirely possible that these issues can be improved with a future software update, but for now, things aren't looking too bright for the Exynos 9810 SoC. Which is a shame, because the last two managed to outperform some of Qualcomm's offerings. This one doesn't seem to be able to carry that torch.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Wasn't there some news a while back saying that Samsung was going to artificially limit their CPUs to not cannibalize their Snapdragon models? That would explain why, but the power consumption factor IS worrying.

Use this guide to fix text problems in your postGo here and here for all your power supply needs

 

New Build Currently Under Construction! See here!!!! -----> 

 

Spoiler

Deathwatch:[CPU I7 4790K @ 4.5GHz][RAM TEAM VULCAN 16 GB 1600][MB ASRock Z97 Anniversary][GPU XFX Radeon RX 480 8GB][STORAGE 250GB SAMSUNG EVO SSD Samsung 2TB HDD 2TB WD External Drive][COOLER Cooler Master Hyper 212 Evo][PSU Cooler Master 650M][Case Thermaltake Core V31]

Spoiler

Cupid:[CPU Core 2 Duo E8600 3.33GHz][RAM 3 GB DDR2][750GB Samsung 2.5" HDD/HDD Seagate 80GB SATA/Samsung 80GB IDE/WD 325GB IDE][MB Acer M1641][CASE Antec][[PSU Altec 425 Watt][GPU Radeon HD 4890 1GB][TP-Link 54MBps Wireless Card]

Spoiler

Carlile: [CPU 2x Pentium 3 1.4GHz][MB ASUS TR-DLS][RAM 2x 512MB DDR ECC Registered][GPU Nvidia TNT2 Pro][PSU Enermax][HDD 1 IDE 160GB, 4 SCSI 70GB][RAID CARD Dell Perc 3]

Spoiler

Zeonnight [CPU AMD Athlon x2 4400][GPU Sapphire Radeon 4650 1GB][RAM 2GB DDR2]

Spoiler

Server [CPU 2x Xeon L5630][PSU Dell Poweredge 850w][HDD 1 SATA 160GB, 3 SAS 146GB][RAID CARD Dell Perc 6i]

Spoiler

Kero [CPU Pentium 1 133Mhz] [GPU Cirrus Logic LCD 1MB Graphics Controller] [Ram 48MB ][HDD 1.4GB Hitachi IDE]

Spoiler

Mining Rig: [CPU Athlon 64 X2 4400+][GPUS 9 RX 560s, 2 RX 570][HDD 160GB something][RAM 8GBs DDR3][PSUs 1 Thermaltake 700w, 2 Delta 900w 120v Server modded]

RAINBOWS!!!

 

 QUOTE ME SO I CAN SEE YOUR REPLYS!!!!


14 minutes ago, 8uhbbhu8 said:

Wasn't there some news a while back saying that Samsung was going to artificially limit their CPUs to not cannibalize their Snapdragon models? That would explain why, but the power consumption factor IS worrying.

That's exactly what I was worried about: that the Exynos was made to be super powerful at the expense of efficiency, rather than powerful while efficiently managing that sort of power.



48 minutes ago, D13H4RD2L1V3 said:

It's entirely possible that these issues can be improved with a future software update, but for now, things aren't looking too bright for the Exynos 9810 SoC. Which is a shame, because the last two managed to outperform some of Qualcomm's offerings. This one doesn't seem to be able to carry that torch.

If it's mostly scheduler issues, it shouldn't be that hard to fix. A change like that could even potentially ship as part of a security patch for most phones (not sure if Knox has issues with that).

 

The past couple Exynos chips have performed well in benchmarks but been taken down in real world performance. This is much of the same. I really hope they can get a handle on this, because it should legitimately be a lot more powerful and efficient than the Snapdragon offerings for the first time, outside of benchmarks.


5 minutes ago, Sniperfox47 said:

The past couple Exynos chips have performed well in benchmarks but been taken down in real world performance. This is much of the same. I really hope they can get a handle on this, because it should legitimately be a lot more powerful and efficient than the Snapdragon offerings for the first time, outside of benchmarks.

The 8890 and 8895 actually performed better in the real world compared to their Snapdragon counterparts AFAIK except in GPU.

 

Especially the 8890 in the S7. I believe that was when they started going custom. The gap is mostly closed with the 8895, and smaller still with Oreo. 



I've got quite bad luck; I recently got my first Samsung Galaxy phone, which is the Galaxy S9 (Exynos model, I'm from the EU), and now this happens, despite early benchmarks stating the opposite :/

 

Though I found this video uploaded yesterday:

In general they trade blows and it's really hard to notice any difference in the real world. Scheduler issues on the Exynos will probably be partially or fully fixed by firmware updates, so I'm not too worried; this phone is blistering-fast anyway, so I'd actually prefer to have more power "under the hood" for the future. Especially if the difference is this insignificant.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


This seems awfully reminiscent of Nvidia's Denver CPU, where the giant cores performed exceptionally well in benchmarks but were otherwise lackluster. Granted, one runs native ARM code while the other runs some form of code morphing, so it's probably not directly comparable, but it's still interesting to see the similarities.

My eyes see the past…

My camera lens sees the present…


7 hours ago, D13H4RD2L1V3 said:

The 8890 and 8895 actually performed better in the real world compared to their Snapdragon counterparts AFAIK except in GPU.

 

Especially the 8890 in the S7. I believe that was when they started going custom. The gap is mostly closed with the 8895, and smaller still with Oreo. 

The S6 didn't come with Snapdragon chips at all; it was 100% Samsung chips. It was the only Android phone worth buying that year due to the 810 being terrible.

From the S7 through the S9 they have offered both flavors, with the Samsung chips in past years sort of being the better option overall by most people's standards, but they haven't had the same lead as the year Samsung fully dropped the SD lineup.


25 minutes ago, michaelocarroll007 said:

The S6 didn't come with Snapdragon chips at all; it was 100% Samsung chips. It was the only Android phone worth buying that year due to the 810 being terrible.

From the S7 through the S9 they have offered both flavors, with the Samsung chips in past years sort of being the better option overall by most people's standards, but they haven't had the same lead as the year Samsung fully dropped the SD lineup.

I think that has a lot to do with how bad the 810 was



Yeah, I even remember reading before that it's because of the scheduler and the latency to ramp clocks up to max. Very odd, yes, though they also shouldn't worry so much about staying close to the SD version that they nerf the Exynos one.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


WWHHYY CANT WE GET THE 845 IN THE UK SAMSUNG, WHHHHYYYY!?

If you want to reply back to me or someone else USE THE QUOTE BUTTON!                                                      
Pascal laptops guide


17 hours ago, D13H4RD2L1V3 said:

The 8890 and 8895 actually performed better in the real world compared to their Snapdragon counterparts AFAIK except in GPU.

 

Especially the 8890 in the S7. I believe that was when they started going custom. The gap is mostly closed with the 8895, and smaller still with Oreo. 

True of 8890, but IIRC the 835 did better than the 8895.

16 hours ago, Morgan MLGman said:

I've got quite bad luck; I recently got my first Samsung Galaxy phone, which is the Galaxy S9 (Exynos model, I'm from the EU), and now this happens, despite early benchmarks stating the opposite :/

The only benchmark I've seen that put the Exynos ahead is Geekbench. Unfortunately, tech outlets LOVE to use Geekbench and absolutely nothing else.

16 hours ago, Morgan MLGman said:

In general they trade blows and it's really hard to notice any difference in the real world. Scheduler issues on the Exynos will probably be partially or fully fixed by firmware updates, so I'm not too worried; this phone is blistering-fast anyway, so I'd actually prefer to have more power "under the hood" for the future. Especially if the difference is this insignificant.

Speed tests are much more RAM- and NAND-bound, not CPU or GPU dependent. The places you might notice it more are animations, web browsing, games, etc.

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Segate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


9 hours ago, DocSwag said:

The only benchmark I've seen that put the Exynos ahead is Geekbench. Unfortunately, tech outlets LOVE to use Geekbench and absolutely nothing else.

Speed tests are much more RAM- and NAND-bound, not CPU or GPU dependent. The places you might notice it more are animations, web browsing, games, etc.

Hmm, from the videos that I watched comparing different phones to both the Exynos and Snapdragon models it doesn't seem like it:

The Exynos version was not only faster on average than the iPhone X but also had better results in his testing methodology than the Snapdragon S9+ model that he tested previously.

It was a lot faster when opening multiple different apps for the first time (2:27 vs 3:04), probably due to a more powerful chip, but was a bit slower when re-opening them again later (0:58 vs 0:44), so that might be caused by the scheduler issue.

 

I'm not sure how reliable results like that are, but they do paint a rough picture of how those two CPUs stack up to one another in terms of real-world performance.

 

Link to the Snapdragon model comparison to the iPhone X from the same channel: https://www.youtube.com/watch?v=nxFTgOfIEqw



14 hours ago, Morgan MLGman said:

Hmm, from the videos that I watched comparing different phones to both the Exynos and Snapdragon models it doesn't seem like it:

The Exynos version was not only faster on average than the iPhone X but also had better results in his testing methodology than the Snapdragon S9+ model that he tested previously.

It was a lot faster when opening multiple different apps for the first time (2:27 vs 3:04), probably due to a more powerful chip, but was a bit slower when re-opening them again later (0:58 vs 0:44), so that might be caused by the scheduler issue.

 

I'm not sure how reliable results like that are, but they do paint a rough picture of how those two CPUs stack up to one another in terms of real-world performance.

 

Link to the Snapdragon model comparison to the iPhone X from the same channel: https://www.youtube.com/watch?v=nxFTgOfIEqw

AFAIK those tests don't test raw CPU performance though. Rather, they test NAND and DRAM performance. The differences are more likely to be due to better/worse I/O and a better/worse IMC than the CPU cores.

 

It's especially evident if you look at all the real-world tests AnandTech did that the Exynos just doesn't do well. It takes the Exynos way too long to turbo up, making it really fail at the short, bursty workloads that honestly are the main reason single-threaded performance is useful at all in smartphones.


 


Apple should just start selling the A series of SoCs to Android manufacturers. 

 

Though Android would only benefit from the raw power, not the many optimizations proprietary to the OS.

 

The S9 is wicked fast, but mostly because of how the skin is designed to have fairly quick animations. The S9 also has more RAM, so if you hit the RAM limit on the iPhone X in a side-by-side test, the iPhone should lose.

 

Apple still is leading the industry with graphics and raw multi core power. 

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

12 minutes ago, DrMacintosh said:

Apple should just start selling the A series of SoCs to Android manufacturers. 

 

Though Android would only benefit from the raw power, not the many optimizations proprietary to the OS.

 

The S9 is wicked fast, but mostly because of how the skin is designed to have fairly quick animations. The S9 also has more RAM, so if you hit the RAM limit on the iPhone X in a side-by-side test, the iPhone should lose.

 

Apple still is leading the industry with graphics and raw multi core power. 

On the board level, I tend to see a lot more in the way of capacitors and other such components on Apple's devices than on many Android devices. Beyond just the cost of the chip, I suspect robust power delivery is required to feed the big CPU cores, which is probably more than many OEMs would be willing to pay for.



15 minutes ago, Zodiark1593 said:

On the board level, I tend to see a lot more in the way of capacitors and other such components on Apple's devices than on many Android devices. Beyond just the cost of the chip, I suspect robust power delivery is required to feed the big CPU cores, which is probably more than many OEMs would be willing to pay for.

That’s true, Apple’s PCBs are works of art. 

 

Black logic boards are so cool, despite almost nobody getting to look at them.


1 hour ago, DrMacintosh said:

Apple should just start selling the A series of SoCs to Android manufacturers. 

NO! Apple would lose what makes the iPhone and iPad special: very fast SoCs. Besides, that's like saying Apple should license macOS to PC OEMs, which I hope they don't consider.

There is more that meets the eye
I see the soul that is inside

 

 


Just now, hey_yo_ said:

NO! Apple would lose what makes the iPhone and iPad special: very fast SoCs. Besides, that's like saying Apple should license macOS to PC OEMs, which I hope they don't consider.

Kinda

 

The A series is just so much better than what Qualcomm is doing that it would be interesting to see if Apple could market a sub-series of A SoCs for use in Android products.

 

Now of course this means that Apple could tweak the Android variants and make them slower than their iOS counterparts to still give the iPhone an advantage.

 

It's really more of a thought experiment than it is an actual suggestion.

 

And yeah, don’t worry, macOS is staying on the Mac xD 


26 minutes ago, DrMacintosh said:

Kinda

 

The A series is just so much better than what Qualcomm is doing that it would be interesting to see if Apple could market a sub-series of A SoCs for use in Android products.

 

Now of course this means that Apple could tweak the Android variants and make them slower than their iOS counterparts to still give the iPhone an advantage.

 

It's really more of a thought experiment than it is an actual suggestion. 

 

And yeah, don’t worry, macOS is staying on the Mac xD 

Die shrunk Apple A9 with reduced power requirements for simpler board designs. :P

 

 

Since when does Apple ever willingly share, though? One would have far better chances yanking teeth from a wide-awake crocodile or shark. Neither of which I'm keen on getting close to, mind you.

My eyes see the past…

My camera lens sees the present…


12 minutes ago, Zodiark1593 said:

Shark.

It's actually really easy, just get bit. :P


2 hours ago, DrMacintosh said:

Apple should just start selling the A series of SoCs to Android manufacturers. 

 

Though Android would only benefit from the raw power, not many of the optimizations proprietary to the OS. 

 

The S9 is wicked fast, but mostly because of how the skin is designed to have fairly quick animations. The S9 also has more RAM, so if you hit the RAM limit on the iPhone X in a side-by-side test, the iPhone should lose. 

 

Apple still is leading the industry with graphics and raw multi core power. 

I don’t think they would

 

Because then they’re throwing away one of their competitive advantages 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


7 hours ago, DrMacintosh said:

Apple should just start selling the A series of SoCs to Android manufacturers. 

 

Though Android would only benefit from the raw power, not many of the optimizations proprietary to the OS. 

 

The S9 is wicked fast, but mostly because of how the skin is designed to have fairly quick animations. The S9 also has more RAM, so if you hit the RAM limit on the iPhone X in a side-by-side test, the iPhone should lose. 

 

Apple still is leading the industry with graphics and raw multi core power. 

False. Anandtech has shown time and time again how efficient Adreno graphics are. Sometimes Apple's graphics win in benchmarks (often because they use Metal while Adreno is still forced to run on OpenGL ES), but they do so while consuming more power, and the die is also bigger. Apple doesn't allow power measurements, but their chips run into thermal constraints; that's proof enough.


2 hours ago, Trixanity said:

False. Anandtech has shown time and time again how efficient Adreno graphics are. Sometimes Apple's graphics win in benchmarks (often because they use Metal while Adreno is still forced to run on OpenGL ES), but they do so while consuming more power, and the die is also bigger. Apple doesn't allow power measurements, but their chips run into thermal constraints; that's proof enough.

Efficient =/= better

He who asks is stupid for 5 minutes. He who does not ask, remains stupid. -Chinese proverb. 

Those who know much are aware that they know little. - Slick roasting me

Spoiler

AXIOM

CPU- Intel i5-6500 GPU- EVGA 1060 6GB Motherboard- Gigabyte GA-H170-D3H RAM- 8GB HyperX DDR4-2133 PSU- EVGA GQ 650w HDD- OEM 750GB Seagate Case- NZXT S340 Mouse- Logitech Gaming g402 Keyboard-  Azio MGK1 Headset- HyperX Cloud Core

Official first poster LTT V2.0

 


2 minutes ago, Clanscorpia said:

Efficient =/= better

It is better though. Adreno wins in the vast majority of metrics including most absolute performance benchmarks. 

I'll give Apple that they're still new to the GPU game so it may improve but it doesn't appear like Qualcomm is twiddling their thumbs either.

 

It's well documented that Adreno is ahead of everyone else. And consistently too. 

 

Efficiency has proven to be a very important metric anyway, especially in a battery-powered device.
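To show what I mean, here's a toy sketch (all numbers hypothetical, not measured data) of why perf-per-watt decides sustained performance once a phone hits its thermal limit:

```python
# Illustrative only: hypothetical figures, not real benchmark data.
# A phone can only dissipate so much heat, so after a few minutes the GPU
# is capped by a thermal power budget and perf-per-watt sets the score you keep.

THERMAL_BUDGET_W = 3.5  # hypothetical sustained power a phone chassis can shed


def sustained_fps(peak_fps: float, peak_power_w: float) -> float:
    """Scale peak performance down to what the thermal budget allows."""
    efficiency = peak_fps / peak_power_w           # fps per watt
    usable_power = min(peak_power_w, THERMAL_BUDGET_W)
    return efficiency * usable_power


# Hypothetical GPUs: one faster but hungrier, one slower but more efficient.
fast_hot = sustained_fps(peak_fps=60.0, peak_power_w=6.0)   # -> 35.0
slow_cool = sustained_fps(peak_fps=50.0, peak_power_w=4.0)  # -> 43.75
```

The "slower" chip ends up faster in a long session, which is the whole point of caring about efficiency on battery.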

