
Tom's Guide: A11 Bionic is the real deal (iPhone 8 review)

TheReal1980
33 minutes ago, Verne Arase said:

Not sure how Android apps work nowadays. Is it binary, OpenJava, or some JIT cross-compiled mess? (Dear God, think of the opportunity for memory leaks).

 

That's one problem Android has ... it was architected in the old-style mid-2000s feature/smartphone way - like the offerings of Danger, Andy Rubin's previous company - with a Linux kernel and code running in Mobile Java (even if you pronounced it Dalvik to avoid licensing). That's why it's not as instantaneously buttery-smooth on launch: you have to instantiate a virtual machine environment, or JIT-compile bytecode, before execution can begin. (And Lord knows what that does to the processor cache.)

 

Unfortunately, when the top-secret iPhone launched in 2007, the Android team had to totally redo their UI, and had no time to analyze (copy) the iPhone's biggest advantage - that it was a totally binary, down-on-the-metal platform.

Android used to be JIT-compiled, and it used to use Dalvik. That hasn't been the case for years, though.

These days it uses the Android Runtime (ART), which compiles everything at install time (ahead of time). For all intents and purposes, Android too is a "totally binary down-on-the-metal platform", and has been for several years.
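A quick way to check which runtime a given device is actually on - a minimal Kotlin sketch; Google's ART documentation states that java.vm.version reports 2.0.0 or higher when ART is in use. (Worth noting: since Android 7.0, ART actually mixes JIT back in with profile-guided AOT, rather than compiling everything at install.)

fun isRunningArt(): Boolean {
    // Per Google's ART/Dalvik docs, ART reports a VM version of 2.0.0 or higher.
    val vmVersion = System.getProperty("java.vm.version") ?: return false
    val major = vmVersion.substringBefore('.').toIntOrNull() ?: return false
    return major >= 2
}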


On 9/20/2017 at 5:56 PM, Inkz said:

Just wondering, was there anything else they upgraded besides the CPU/GPU?

Let's see: the A11 Bionic SoC has a CPU with two high-power cores, each 25% faster than the A10's, and four high-efficiency cores, each 70% faster (and twice as many of them). There's an improved M11 motion co-processor, which means better gyros and motion sensing; a new neural engine capable of 600 billion AI ops/sec; and a new GPU that's 30% faster while (like the above) using less energy. There's an improved ISP with hardware noise reduction, and a video processor that splits the image into 2 million tiles and monitors for motion, edge detection and the like. There's also a new performance controller that dispatches the CPU cores from light use all the way up to full blazing speed - and, unlike the A10, which could only dispatch the two high-power cores _or_ the two high-efficiency cores, it can dispatch all of them at once.

 

All models now have Qi wireless charging and fast charging, as long as you've got a USB-C to Lightning cable and a USB-C power source. The primary wide-angle camera is an f/1.8 OIS camera with deeper pixels, and it shoots in the DCI-P3 color space.

 

The 8 has 2 GB of RAM and a 4.7" 750p True Tone IPS display with DCI-P3 color and 3D Touch, plus HEIF and HEVC support (half the space at better quality), slow-mo at 240 fps @ 1080p, video up to 60 fps @ 4K, IP67 water resistance, 25% louder speakers with deeper bass, and TouchID.

 

The Plus has everything above plus 3 GB of RAM, a 2x f/2.8 telephoto (non-stabilized), a bigger battery, a 5.5" 1080p display, and portrait mode and portrait lighting on the rear cameras.

 

The X has everything above plus a 5.8" 2436x1125 True Tone DCI-P3 OLED HDR display, an f/2.4 OIS 2x telephoto, a larger battery (I think), no TouchID, and a TrueDepth FaceID front-facing camera (essentially a miniaturized Xbox Kinect). It offers portrait mode and portrait lighting on both front and rear cameras, using stereoscopic depth on the rear camera and TrueDepth on the front.

 

Whew!


On 9/21/2017 at 10:52 AM, Sauron said:

In this case I would tend to blame Adobe for optimizing their software specifically for Apple's chip and not bothering with the more varied competitors - it seems clear to me from the other tests that the difference in performance on that one test is unjustifiable from a raw power standpoint. Plus, there is something they didn't take into account.

Adobe optimizing their stuff for Apple hardware? Heh heh ROFL.

 

Adobe doesn't optimize their stuff for anyone nowadays.

 

As for storage, I believe Apple does use NVMe storage. What do the Android handsets use?

 

Storage access time is a *huge* part of performance, and you absolutely have to measure storage access time as part of your benchmark.

 

Come to think of it, what kind of storage is that memory card you stick in your phone? Surely it's not USB storage ...
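For what it's worth, a crude Kotlin sketch of how storage throughput can be measured as part of a benchmark - illustrative only; a real benchmark would also control for page-cache effects, thermal throttling and random-access patterns:

import java.io.File
import kotlin.system.measureNanoTime

fun main() {
    val file = File.createTempFile("iobench", ".bin")
    val chunk = ByteArray(1 shl 20) // 1 MiB buffer
    val chunks = 64                 // 64 MiB total

    val writeNs = measureNanoTime {
        file.outputStream().use { out ->
            repeat(chunks) { out.write(chunk) }
            out.fd.sync() // force the data to storage, not just the page cache
        }
    }
    val readNs = measureNanoTime {
        file.inputStream().use { ins ->
            while (ins.read(chunk) != -1) { /* drain the file */ }
        }
    }
    println("write: %.1f MB/s".format(chunks * 1e9 / writeNs))
    println("read:  %.1f MB/s".format(chunks * 1e9 / readNs))
    file.delete()
}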


On 9/21/2017 at 11:52 AM, Sauron said:

First of all, performance (which is what is being discussed here) and user-friendliness are two completely separate things. Secondly, OS optimization can only account for a small part of any performance edge. In this case I would tend to blame Adobe for optimizing their software specifically for Apple's chip and not bothering with the more varied competitors - it seems clear to me from the other tests that the difference in performance on that one test is unjustifiable from a raw power standpoint. Plus, there is something they didn't take into account.

 

I see a lot of flawed methodology: firstly, using Geekbench at all and doing the usual, stupid comparison with Intel hardware. Secondly, the test pool in the other areas is way too small. Third, and the most glaringly obvious mistake in my opinion: they didn't take storage speed into account at all. Loading-time speedups are much more dependent on how fast the storage is, and we know the iPhone probably has the fastest of the bunch, by no small margin. This would also help explain the massive difference in editing, because guess what: video editing programs have to write their output somewhere while they render, and the storage is often slower than the CPU, particularly for modest edits like what Premiere Clip allows you to do. This difference becomes more obvious as you go up in resolution (and bitrate, as well as file size).

 

With all that said, I don't think any of this matters. Phones have been fast enough for 99% of use cases for years now. If you really need to edit mediocre 4K footage, which you took with your phone, on your phone, then I guess the iPhone has something to offer; otherwise that performance will go completely to waste and your money straight into the trash, because aside from that there are plenty of much cheaper phones that offer pretty much the same features, if not better.

 

My suggestion is that if you're serious about video making, you should save money on your phone and use it to buy a real camera, and do the editing on a PC.

Jesus Christ...thank you!!!! Finally, someone with half a brain on this thread!!! 



12 hours ago, Verne Arase said:

As for storage, I believe Apple does use NVMe storage. What do the Android handsets use?

 

Storage access time is a *huge* part of performance, and you absolutely have to measure storage access time as part of your benchmark.

 

Come to think of it, what kind of storage is that memory card you stick in your phone? Surely it's not USB storage ...

UFS for premium devices and eMMC for the rest. UFS is not necessarily worse than NVMe on mobile devices, and it seems the industry is moving toward UFS entirely, although I have seen NVMe solutions on SanDisk's roadmap at least. Apple is exempt because they make proprietary stuff on a whim if need be, and they made their own NVMe storage controller before UFS took off, so it might not be feasible to veer from that path.

 

Memory cards are microSD. Technically similar to eMMC and pretty slow for anything but mass storage. That's why Samsung is pushing to replace that aging tech with UFS memory cards.


On 9/20/2017 at 5:20 PM, ARikozuM said:

I guess I'm upgrading the SE and Note 5 to the 8+... 

A few days late, but I got an S8+ last month, and seriously, I have no regrets. The battery life is amazing. Sure, Apple has a good CPU and GPU in their phone now, but unless you're playing a lot of games or crap like that, I don't see the point in spending $800+ when you could get an S8 or S8+ for less than that. My S8+ was $816 originally, but Best Buy was having a sale taking $300 off, so my S8+ is only going to cost me $516.



On 9/27/2017 at 8:00 AM, LAwLz said:

Android used to be JIT-compiled, and it used to use Dalvik. That hasn't been the case for years, though.

These days it uses the Android Runtime (ART), which compiles everything at install time (ahead of time). For all intents and purposes, Android too is a "totally binary down-on-the-metal platform", and has been for several years.

Huh. I wonder why I still often read about it suffering from initial stuttering then ... or is that only on older systems which can't get the latest OS?


On 9/28/2017 at 7:21 AM, Trixanity said:

UFS for premium devices and eMMC for the rest. UFS is not necessarily worse than NVMe on mobile devices, and it seems the industry is moving toward UFS entirely, although I have seen NVMe solutions on SanDisk's roadmap at least. Apple is exempt because they make proprietary stuff on a whim if need be, and they made their own NVMe storage controller before UFS took off, so it might not be feasible to veer from that path.

 

Memory cards are microSD. Technically similar to eMMC and pretty slow for anything but mass storage. That's why Samsung is pushing to replace that aging tech with UFS memory cards.

I was watching some guy review one of the new iMacs on YouTube recently; he had a fully decked-out iMac 5K with 2TB of NVMe storage (from his workplace) and was chortling about the obscene price of the device (> $5000).

 

I think what he didn't realize is that the 2TB NVMe SSD benchmarks at over 2 GB/sec (that's gigabytes, not bits) - which blows away our enterprise EMC and IBM flash in our SAN arrays - and made up a substantial portion of that cost.

 

If you remove that SSD and the obscenely priced 64 GB of Apple RAM - and note I did use the word obscene for Apple RAM prices - the iMac 5K is actually quite economical considering what you're getting.


On 9/21/2017 at 9:54 PM, Sniperfox47 said:

It's more a matter of how little they were allowed to do. It's licensed as one of ARM's new "semi-custom" chip designs, which limits how far they can take their modifications of it. So they mostly tweaked power and cache to make it suitable for both core types.

I thought Qualcomm was an ARM Architecture licensee, which means they could pretty much do what they want.

 

Just look at Apple (who admittedly was one of the founding partners).


On 9/20/2017 at 5:19 PM, Septimus said:

 

I want to see Huawei's take on this "bionic" design with a smartphone SoC. Either way, they made performance better for the user, so I guess that's a success.

 

Considering the snark they used announcing the Kirin 970, that better be some hot SoC.


13 minutes ago, Verne Arase said:

I was watching some guy review one of the new iMacs on YouTube recently; he had a fully decked-out iMac 5K with 2TB of NVMe storage (from his workplace) and was chortling about the obscene price of the device (> $5000).

 

I think what he didn't realize is that the 2TB NVMe SSD benchmarks at over 2 GB/sec (that's gigabytes, not bits) - which blows away our enterprise EMC and IBM flash in our SAN arrays - and made up a substantial portion of that cost.

 

If you remove that SSD and the obscenely priced 64 GB of Apple RAM - and note I did use the word obscene for Apple RAM prices - the iMac 5K is actually quite economical considering what you're getting.

I didn't mention anything about pricing but as a response: 

I don't think a workstation with mostly standard off-the-shelf components (with some custom board design, etc.) can be compared to specialized and proprietary solutions for what I assume is hardware serving large-scale enterprises. It's just a different ball game, even if they're technically similar. We've seen time and time again that a product aimed at enterprise gets a massively inflated price tag for multiple reasons (warranty, reliability, QA, custom/proprietary/specialized hardware or software, etc.).

 

Mostly, Mac products aren't that terribly priced - well, that or competing products get inflated to match Apple's prices. It varies between product lines, though. For example, the MacBook Pro 13 usually doesn't have any direct competition (they shipped with 28W processors exclusively until the Touch Bar model arrived). They ruined that with the introduction of 15W processors in the lineup. The price and performance were great, although few outside the Apple ecosystem would admit that.


On 9/20/2017 at 6:47 PM, Dan Castellaneta said:

Why are we comparing desktop Geekbench to mobile Geekbench again?

Outside of that, god damn. iPhone 8 is the real deal, and it seems like the extra $50 went somewhere else besides the storage.

This specific Geekbench version is said to have been designed to be pretty accurate across platforms (Android, PC, iOS). The iPhone scores more points than a base 13" MacBook Pro. Crazy. And they almost doubled the iPhone 7's overall multicore performance.


15 minutes ago, Verne Arase said:

I thought Qualcomm was an ARM Architecture licensee, which means they could pretty much do what they want.

 

Just look at Apple (who admittedly was one of the founding partners).

The terminology makes it confusing. ARM has three different types of licensing: custom, semi-custom, and off-the-shelf. Apple has a custom license, meaning they use ARM's v8 instruction set but design their own microarchitecture from scratch. Off-the-shelf (not an official name) is essentially ARM-designed cores with ARM's microarchitecture. Semi-custom is mostly off-the-shelf, but with room for tweaking: you can change uncore blocks as you wish, and you can re-brand the core. Qualcomm is the first (and the only one I know of) to use the semi-custom license, as it's a relatively new initiative.

 

Apple's involvement is uncertain. ARM was originally started without any Apple involvement; then Apple collaborated for a bit, and then it was spun off as a separate entity. I'm sure they have some influence today, but how much is uncertain.

7 minutes ago, Verne Arase said:

Considering the snark they used announcing the Kirin 970, that better be some hot SoC.

The Kirin 970 has the same cores as the 960; the only difference is 10 nm instead of 16. Anything Huawei could brag about would have to pertain to the NPU (Neural Processing Unit), which supposedly has a bigger flop count than Apple's, but that remains to be seen. There is no doubt that the A11 is superior to the Kirin 970 in every regard but neural processing. I have seen some argue that today it's everything but the CPU itself that is important - that the more you can off-load from the CPU, the better.


1 hour ago, Verne Arase said:

I thought Qualcomm was an ARM Architecture licensee, which means they could pretty much do what they want.

 

Just look at Apple (who admittedly was one of the founding partners).

They are. They have licensed the ARMv8 ISA, which they used for their custom Kryo CPU microarchitecture.

 

But then Kryo was a massive flop - super power-hungry for little gain over the A53 - and it was outperformed by the A73, so they dropped Kryo and settled on a modified version of the A73, using the new license type (semi-custom) that ARM conveniently decided to announce in April/May of last year.

 

1 hour ago, Trixanity said:

The terminology makes it confusing. ARM has three different types of licensing: custom, semi-custom, and off-the-shelf. Apple has a custom license, meaning they use ARM's v8 instruction set but design their own microarchitecture from scratch. Off-the-shelf (not an official name) is essentially ARM-designed cores with ARM's microarchitecture. Semi-custom is mostly off-the-shelf, but with room for tweaking: you can change uncore blocks as you wish, and you can re-brand the core. Qualcomm is the first (and the only one I know of) to use the semi-custom license, as it's a relatively new initiative.

Samsung is as well. They used semi-custom A73s on the Exynos 8895, although I have no idea what they changed, since it looks pretty much like a stock-standard A73 as far as I can see.


On 20/09/2017 at 6:26 PM, themctipers said:

A9 still is good until it starts lagging in iOS 11.

 

Who gives a shit about MORE PERFORMANCE if it doesn't lag in the OS or 99.98% of apps on a phone.

I've been using iOS 11 on my iPhone 6s (A9 SoC) since its public launch a week or two ago. I've had no slowdown of any kind. It feels exactly as fast as iOS 10 did.

On 20/09/2017 at 9:03 PM, djdwosk97 said:

I've only been running iOS11 for a day, but I haven't noticed any issues with it running on my 6s Plus.

A large portion of people who overclock their CPUs. 

Same - no issues w/ my 6s.

On 21/09/2017 at 12:53 PM, AlwaysFSX said:

For when you're trying to be obnoxious because not being obnoxious on the internet is too hard.

 

"Pyo" is a word that an adult female who frequents forums, constantly, uses in place of "Stop" when typing a message. It's considered a Very Annoying "Catch Phrase" anywhere the user goes.

Piyoko: Mine was a hundred bucks, it could have been cheaper, but I didn't feel like waiting for shipping, and wanted a nicer one, pyo. 

Piyoko: Oh, and thanks, pyo.

God... that was painful to read.

On 27/09/2017 at 7:06 PM, Verne Arase said:

Let's see: the A11 Bionic SoC has a CPU with two high-power cores, each 25% faster than the A10's, and four high-efficiency cores, each 70% faster (and twice as many of them). There's an improved M11 motion co-processor, which means better gyros and motion sensing; a new neural engine capable of 600 billion AI ops/sec; and a new GPU that's 30% faster while (like the above) using less energy. There's an improved ISP with hardware noise reduction, and a video processor that splits the image into 2 million tiles and monitors for motion, edge detection and the like. There's also a new performance controller that dispatches the CPU cores from light use all the way up to full blazing speed - and, unlike the A10, which could only dispatch the two high-power cores _or_ the two high-efficiency cores, it can dispatch all of them at once.

 

All models now have Qi wireless charging and fast charging, as long as you've got a USB-C to Lightning cable and a USB-C power source. The primary wide-angle camera is an f/1.8 OIS camera with deeper pixels, and it shoots in the DCI-P3 color space.

 

The 8 has 2 GB of RAM and a 4.7" 750p True Tone IPS display with DCI-P3 color and 3D Touch, plus HEIF and HEVC support (half the space at better quality), slow-mo at 240 fps @ 1080p, video up to 60 fps @ 4K, IP67 water resistance, 25% louder speakers with deeper bass, and TouchID.

 

The Plus has everything above plus 3 GB of RAM, a 2x f/2.8 telephoto (non-stabilized), a bigger battery, a 5.5" 1080p display, and portrait mode and portrait lighting on the rear cameras.

 

The X has everything above plus a 5.8" 2436x1125 True Tone DCI-P3 OLED HDR display, an f/2.4 OIS 2x telephoto, a larger battery (I think), no TouchID, and a TrueDepth FaceID front-facing camera (essentially a miniaturized Xbox Kinect). It offers portrait mode and portrait lighting on both front and rear cameras, using stereoscopic depth on the rear camera and TrueDepth on the front.

 

Whew!

Man, the iPhone X sounds awesome... I wish they would make a version that was physically the same size as the 8, with the smaller screen and TouchID. I like the idea of OLED HDR and the 2x OIS lens. And the bigger battery doesn't hurt. (Seriously, just make it thicker...)

 

I just don't want to lose TouchID, despite FaceID supposedly being better. And I don't want a phablet-sized phone. I've gotten used to the 4.5" to 5" device size segment.



On 9/29/2017 at 8:23 AM, Sniperfox47 said:

They are. They have licensed the ARMv8 ISA, which they used for their custom Kryo CPU microarchitecture.

 

But then Kryo was a massive flop - super power-hungry for little gain over the A53 - and it was outperformed by the A73, so they dropped Kryo and settled on a modified version of the A73, using the new license type (semi-custom) that ARM conveniently decided to announce in April/May of last year.

 

Samsung is as well. They used semi-custom A73s on the Exynos 8895, although I have no idea what they changed, since it looks pretty much like a stock-standard A73 as far as I can see.

Ehh, Kryo has similar integer performance to the Cortex A57, but drastically better floating-point performance than even the A72 - very nearly competitive, on a per-core basis, with Apple's Twister core. The A73 actually regresses (by quite a lot) on floating point, but it gains a lot on integer performance and power efficiency - essentially a much leaner architecture than the A57, A72 or Kryo.

 

Most workloads on mobile that would otherwise require floating-point performance are already accelerated by dedicated hardware (video decode, for instance); integer performance is always in demand by comparison. Kryo seems to be a strong architecture on its own merits, but it focuses too heavily on areas where mobile workloads are not bottlenecked, and it suffered from heightened power consumption, so it was axed in favor of the more focused A73. Small, lean cores are really Android's forte, and I don't think we'll see anything like Apple's efforts on the Android side as a result.
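To make the integer-versus-floating-point distinction concrete, here's a toy Kotlin microbenchmark along those lines - purely illustrative, since JIT warm-up, DVFS and big.LITTLE scheduling make microbenchmarks on phones notoriously noisy:

import kotlin.system.measureNanoTime

fun main() {
    val n = 100_000_000
    var intAcc = 1
    val intNs = measureNanoTime {
        for (i in 1..n) intAcc = intAcc * 31 + i // dependent integer multiply-add chain
    }
    var fpAcc = 1.0f
    val fpNs = measureNanoTime {
        for (i in 1..n) fpAcc = fpAcc * 0.999999f + i // dependent float multiply-add chain
    }
    // Print the accumulators so the compiler can't dead-code-eliminate the loops.
    println("int: ${intNs / 1e6} ms, float: ${fpNs / 1e6} ms ($intAcc, $fpAcc)")
}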

 

As for the semi-custom bit on the Cortex A73 series, I'd wager it has to do with power management, cache, and other such details aside from the core itself. The licensee can probably even change certain components, such as branch prediction, without altering the overall layout. I'm not a chip designer or engineer by any means, but given the information at hand, this seems reasonable.



On 9/29/2017 at 8:42 AM, Trixanity said:

We've seen time and time again that a product aimed at enterprise gets a massively inflated price tag for multiple reasons (warranty, reliability, QA, custom/proprietary/specialized hardware or software, etc.).

If we're talking EMC here, it's greed. Definitely greed.

 

I have a dedup box at work using variable-length frames co-developed between EMC and Quantum, and while the Quantum box was around $40K (granted, an old, almost-EOL model), the EMC version was a quarter mil.


On 9/29/2017 at 8:54 AM, Trixanity said:

The Kirin 970 has the same cores as the 960; the only difference is 10 nm instead of 16. Anything Huawei could brag about would have to pertain to the NPU (Neural Processing Unit), which supposedly has a bigger flop count than Apple's, but that remains to be seen. There is no doubt that the A11 is superior to the Kirin 970 in every regard but neural processing. I have seen some argue that today it's everything but the CPU itself that is important - that the more you can off-load from the CPU, the better.

The A11 contains a neural processor capable of (according to Apple) 600 billion operations/second.

 

Now, what that means I have no idea - the metric used to be inferences per second, but they're using new knowledge models, which are apparently trained blobs of weights distilled down from standardized training models.
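As a back-of-envelope check on what a number like that could mean (the model figures below are made up purely for illustration): if an "op" is half of a multiply-accumulate, 600 billion ops/sec is roughly 300 GMAC/s, so a hypothetical network costing 300 million MACs per inference would top out near 1,000 inferences per second:

fun main() {
    val opsPerSec = 600e9           // Apple's headline figure
    val macsPerSec = opsPerSec / 2  // assuming 1 MAC = 2 ops (multiply + add)
    val macsPerInference = 300e6    // hypothetical model cost, purely illustrative
    println("${macsPerSec / macsPerInference} inferences/sec") // ≈ 1,000
}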

 

AI makes my head hurt.


On 9/30/2017 at 10:12 AM, dalekphalm said:

Man the iPhone X sounds awesome... I wish they would make a version that was the same size physically as the 8, with the smaller screen, and TouchID. I like the idea of OLED HDR, and the 2x OIS lens. And the bigger battery doesn't hurt (Seriously, just make it thicker...)

I may be wrong on the battery ... I think it's bigger than the 8, but the 8 plus may have the larger battery.

 

The X is pretty much the same size as the 8, but taller (I think). Unfortunately, that means 16:9 content leaves black bars on both sides, and the actual physical image height in landscape is less than on the 8 Plus (with a higher pixel density, which you need with a PenTile display).

 

In addition, I've been hearing things about OLED that don't thrill me - supposedly great for displaying nicely saturated stills, but not so nice for fast-action video, which can show artifacts due to pixel cool-down latency. (IPS is a shutter allowing light to show through [quick shut-off], whereas OLED is Christmas lights with filament cool-down time.) That, and the fact that the blue pixels tend to burn out faster - I don't know if they've fixed that one - so the display may, over time, start to show half-life-induced color aberrations.

 

Still, I'm willing to give it a try - and I have annoying plans for the poop animoji ... ;-)


9 hours ago, Verne Arase said:

I may be wrong on the battery ... I think it's bigger than the 8, but the 8 plus may have the larger battery.

 

The X is pretty much the same size as the 8, but taller (I think). Unfortunately, that means 16:9 content leaves black bars on both sides, and the actual physical image height in landscape is less than on the 8 Plus (with a higher pixel density, which you need with a PenTile display).

 

In addition, I've been hearing things about OLED that don't thrill me - supposedly great for displaying nicely saturated stills, but not so nice for fast-action video, which can show artifacts due to pixel cool-down latency. (IPS is a shutter allowing light to show through [quick shut-off], whereas OLED is Christmas lights with filament cool-down time.) That, and the fact that the blue pixels tend to burn out faster - I don't know if they've fixed that one - so the display may, over time, start to show half-life-induced color aberrations.

 

Still, I'm willing to give it a try - and I have annoying plans for the poop animoji ... ;-)

OLED screens have been around on smartphones for quite a while. It's a pretty reliable technology at this point.


On 9/27/2017 at 9:51 PM, Chaos_Sorcerer said:

Jesus Christ...thank you!!!! Finally, someone with half a brain on this thread!!! 

The things that will chew on your CPU are games, computational photography (including photo/video editing/transcoding), and AR/VR.

 

If you don't do any of those, you probably don't need a super-strong CPU.

 

I know that my 7+ gets quite warm when doing any of the above, and the 7+ is still quite competitive.


11 minutes ago, Verne Arase said:

The things that will chew on your CPU are games, computational photography (including photo/video editing/transcoding), and AR/VR.

 

If you don't do any of those, you probably don't need a super-strong CPU.

 

I know that my 7+ gets quite warm when doing any of the above, and the 7+ is still quite competitive.

Yup. My point exactly. Mobile CPUs aren't powerful enough to run half-decent games, anyway...so I've never even bothered with phone games. 

 

Who the hell does any sort of content creation on their phone so much that they would be limited by its performance?

 

Also...VR is still shit on phones. 



1 hour ago, Chaos_Sorcerer said:

Yup. My point exactly. Mobile CPUs aren't powerful enough to run half-decent games, anyway...so I've never even bothered with phone games. 

 

Who the hell does any sort of content creation on their phone so much that they would be limited by its performance?

 

Also...VR is still shit on phones. 

Actually, mobile hardware has come quite a long way. Most high-end SoC GPUs today (and last year's, for that matter) hand the PS3/360 their respective butts pretty easily. High-end ARM CPUs are also not far off from the PS4/One CPUs on a per-core basis, though the consoles' GPU hardware blows away anything in mobile as of this writing.

 

Will mobile gaming blow minds in graphics? Probably not. But since mobile hardware has the shader power to be quite flexible, it's also pretty far up from the poop tier. Even effects like ambient occlusion (formerly quite expensive) should be feasible on today's mobile hardware.



24 minutes ago, Zodiark1593 said:

Actually, mobile hardware has come quite a long way. Most high-end SoC GPUs today (and last year's, for that matter) hand the PS3/360 their respective butts pretty easily. High-end ARM CPUs are also not far off from the PS4/One CPUs on a per-core basis, though the consoles' GPU hardware blows away anything in mobile as of this writing.

 

Will mobile gaming blow minds in graphics? Probably not. But since mobile hardware has the shader power to be quite flexible, it's also pretty far up from the poop tier. Even effects like ambient occlusion (formerly quite expensive) should be feasible on today's mobile hardware.

By mobile CPUs, I meant SoCs. 

 

Yes, of course these chips have come a long way. They are surprisingly powerful. Phone games still don't come near the graphical prowess of modern PC/console titles, however. The complexity/depth of the gameplay is also quite lacking, and the controls are still bound to the small touchscreen. Most of the popular games (games that are practical on a phone, i.e. just time-burners) can run on just about anything.

 

Saying that the CPUs weren't powerful probably wasn't right... my bad. I guess it's more tied to the way the games are designed, and the fact that they're meant to be played on a 5" touch panel rather than with a controller or KB+M.

 

My main point is that mobile games are still shit - nothing more than time-burners - and because of that there is little point in making an SoC that is, say, 20% more powerful than the competition. A super-fast chip will only be practical if docking phones and using them as PC replacements becomes a thing in the future... until then, better performance than the competition isn't really a selling point, imo. Most modern phones are already incredibly fast in day-to-day use anyway.

 

 


