Qualcomm Snapdragon 835 Performance Previewed

3 hours ago, That Norwegian Guy said:

Don't care about the 835, I want to see the 660.

 

With how good the 650 was and the phones it was in costing roughly $200, there's no shitting reason to be excited for an 835.

It was good but deliberately crippled by the 28 nm process and GPU. I expect the 660 will be the same.

3 hours ago, wcreek said:

Skimming through this I can't tell: was it going to be exclusive to Samsung as it seemed, or does Samsung just get first dibs before other OEMs can get it too? What's the deal with that? The LG G6 is still on pre-order, so could LG revise it with a Snapdragon 835?

Yields are bad on 10 nm, so there is limited supply (it's getting better though). Samsung has placed a large order worth a lot of money, so they naturally get theirs first. There's no exclusivity.

 

It's unlikely for LG to launch a new G6 with the 835. It's an entirely different platform, so it would require more work. Expect it in the V30.

 

LG should have had the G6 out now globally to really benefit from the absence of Samsung but they are apparently incapable of executing their strategy properly.

8 minutes ago, Trixanity said:

LG should have had the G6 out now globally to really benefit from the absence of Samsung but they are apparently incapable of executing their strategy properly.

LG seems pretty incapable of executing any good smartphone strategy properly.

Which is a shame because they are one of the few manufacturers who could rival Samsung if they tried.

7 hours ago, wcreek said:

Skimming through this I can't tell: was it going to be exclusive to Samsung as it seemed, or does Samsung just get first dibs before other OEMs can get it too? What's the deal with that? The LG G6 is still on pre-order, so could LG revise it with a Snapdragon 835?

 

6 hours ago, LAwLz said:

According to rumors, Samsung gets first dibs. LG could have announced the G6 with a Snapdragon 835 if they wanted, but then they'd have to delay it for quite some time and they did not want that.

And they won't change from the 821 to the 835 in the middle of production.

If you order the G6, you will get an 821.

Sony announced a phone at MWC with the 835 (only announced, I think it'll be on sale sometime in Spring), so I don't think Samsung is getting first dibs or that wouldn't have happened.

2 hours ago, hey_yo_ said:

Shots fired 

Yo hey

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Segate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 

50 minutes ago, DocSwag said:

Sony announced a phone at MWC with the 835 (only announced, I think it'll be on sale sometime in Spring), so I don't think Samsung is getting first dibs or that wouldn't have happened.

I think it's like Trixanity said.

Yields are low, so they can't match demand. Samsung put in a big order and, since they're the largest customer and also the manufacturer, they got prioritized.

1 hour ago, DocSwag said:

 

Sony announced a phone at MWC with the 835 (only announced, I think it'll be on sale sometime in Spring), so I don't think Samsung is getting first dibs or that wouldn't have happened.

Yo hey

It's expected to launch in June, so the 835 will be (or should be) broadly available by then.

 

Samsung has stated recently that 10 nm production is on track now (and Galaxy S8 launch dates are rumored to have been pushed back a week to deal with the previous delays on 10 nm production).

Keep in mind that Samsung also produces their own Exynos 8895 on 10 nm, so they have at least two chips taking up all their capacity. They're expected to have ~15 million devices ready at launch, but Samsung expects to sell 60 million S8 devices in 2017. Contrast that with how many chips Samsung can produce and whatever numbers other devices manage to sell: we're probably looking at something like 80-100 million 10 nm chips for the year. That's a lot. TSMC is still stumbling around with delays on their equivalent manufacturing process, which will prove interesting for Apple and Huawei (among others) later this year.
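The supply math above can be sanity-checked with a quick back-of-the-envelope sketch. The S8 figures are the thread's own estimates; the range for non-S8 10 nm devices is my assumption, chosen to show how the 80-100 million ballpark falls out:

```python
# Rough 10 nm supply/demand check using the estimates quoted above.
# The 20-40 million range for non-S8 10 nm devices is an assumption,
# not a reported number.

s8_launch_stock = 15_000_000   # S8 devices reportedly ready at launch
s8_2017_target = 60_000_000    # Samsung's expected S8 sales for 2017
other_10nm_low, other_10nm_high = 20_000_000, 40_000_000  # assumed

total_low = s8_2017_target + other_10nm_low
total_high = s8_2017_target + other_10nm_high
print(f"Estimated 10 nm chips needed in 2017: "
      f"{total_low / 1e6:.0f}-{total_high / 1e6:.0f} million")
```

Even the low end is several times the rumored launch stock, which is consistent with the yield concerns discussed earlier in the thread.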

6 hours ago, Trixanity said:

It was good but deliberately crippled by the 28 nm process and GPU. I expect the 660 will be the same.

How is it crippled by a GPU good enough for the heaviest games, using less battery while doing it? Xiaomi Mi Max can game for 10 hours straight before needing a charge. (Asphalt for reference)

On the 14nm 660 it'll last even longer.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.

7 hours ago, Trixanity said:

It was good but deliberately crippled by the 28 nm process and GPU. I expect the 660 will be the same.

Still way better than 20nm though xD.

 

It was a lower volume and less performance oriented product than the 820 and 821, though, so going for something new like 14nm woulda cost Qualcomm more than it was worth compared to going for 28nm. It is possible the 660 could be based on 14nm instead, though, which would be interesting.

2 hours ago, That Norwegian Guy said:

How is it crippled by a GPU good enough for the heaviest games, using less battery while doing it? Xiaomi Mi Max can game for 10 hours straight before needing a charge. (Asphalt for reference)

On the 14nm 660 it'll last even longer.

The GPU (Adreno 510) is significantly slower than the Adreno 530 or 540; we're not just talking a 50% gap. Qualcomm could at least make something in between for their 600 series. It seems like their flagship chips have disproportionately faster GPUs than all their other chips, probably to save costs, but I find it cheap. I'm not saying you can't game on the GPU, but Qualcomm are holding back. They're only giving you 14 nm now because they're moving to 10 nm on the flagship, because the price has probably come down a fair bit, and because the power efficiency does wonders for the mid range, giving an incentive to pick up their chips. They're under pressure from MediaTek in this segment.
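To make the tiering claim concrete, here is a small sketch. The scores are hypothetical placeholders, not measured benchmark results; they only encode the shape of the claim that the flagship GPUs sit far more than 50% above the 510:

```python
# Illustrative GPU tiering as described above.
# Scores are HYPOTHETICAL placeholders, not real benchmark numbers;
# they only express the claimed size of the gap.

tiers = {
    "Adreno 510 (SD 650/652)": 100,  # mid-range baseline
    "Adreno 530 (SD 820/821)": 250,  # flagship, prior gen
    "Adreno 540 (SD 835)": 310,      # flagship, current gen
}

baseline = tiers["Adreno 510 (SD 650/652)"]
for gpu, score in tiers.items():
    print(f"{gpu}: {score / baseline:.1f}x the 510")
```

With placeholder numbers like these, the flagships land at 2.5x and 3.1x the mid-range part, i.e. a much wider gap than a mere 50% uplift would produce.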

 

2 hours ago, DocSwag said:

Still way better than 20nm though xD.

 

It was a lower volume and less performance oriented product than the 820 and 821, though, so going for something new like 14nm woulda cost Qualcomm more than it was worth compared to going for 28nm. It is possible the 660 could be based on 14nm instead, though, which would be interesting.

Of course. 20 nm was awful :)

I think another reason was the fact that the 650 and 652 have A72 cores. Look at benchmarks: other than floating point, the A72 cores were a fair bit faster than Qualcomm's custom Kryo cores. If they didn't hold back on the 652, it would make the 820/821 look bad. Cost is of course also a reason, but think about it: a 14 nm A72 beats a 14 nm Kryo core in most benchmarks. How would they justify the R&D dollars spent on the Kryo core, and how would they justify selling you a premium product with slower performance than their mid tier? Yeah, I know: they did just change their branding so it's called a platform now, to highlight what Qualcomm actually does better than their competitors (which isn't CPU performance and with good reason: everyone except Apple uses ARM designed cores).

1 hour ago, Trixanity said:

Of course. 20 nm was awful :)

I think another reason was the fact that the 650 and 652 have A72 cores. Look at benchmarks: other than floating point, the A72 cores were a fair bit faster than Qualcomm's custom Kryo cores. If they didn't hold back on the 652, it would make the 820/821 look bad. Cost is of course also a reason, but think about it: a 14 nm A72 beats a 14 nm Kryo core in most benchmarks. How would they justify the R&D dollars spent on the Kryo core, and how would they justify selling you a premium product with slower performance than their mid tier? Yeah, I know: they did just change their branding so it's called a platform now, to highlight what Qualcomm actually does better than their competitors (which isn't CPU performance and with good reason: everyone except Apple uses ARM designed cores).

Yeah, seems like Kryo wasn't that great of a core design. It probably explains why Qualcomm switched to A73 and A53 for the 835. Qualcomm still beats pretty much everyone else except Apple and Nvidia in GPU performance thanks to the acquisition of Adreno from AMD, but CPU cores don't seem to be their strong point (though Krait was a pretty decent design).

1 hour ago, Trixanity said:

(which isn't CPU performance and with good reason: everyone except Apple uses ARM designed cores).

Samsung has started making their own CPU cores too.

 

41 minutes ago, DocSwag said:

Qualcomm still beats pretty much everyone else except Apple and Nvidia in GPU performance

Mali isn't that far behind. At least not with enough cores and high enough clocks.

The Snapdragon 820 and Exynos 8890 are trading blows in terms of GPU performance (and the 8890 wins in terms of CPU performance). Anandtech has some benchmarks.

8 minutes ago, LAwLz said:

Samsung has started making their own CPU cores too.

 

Mali isn't that far behind. At least not with enough cores and high enough clocks.

The Snapdragon 820 and Exynos 8890 are trading blows in terms of GPU performance (and the 8890 wins in terms of CPU performance). Anandtech has some benchmarks.

True. I forgot the Mongoose cores.

 

And also true regarding Mali. Most Mali designs use few cores, so they look very bad next to Adreno. Huawei uses an MP8 configuration of the latest G71 GPU, which performs pretty well; however, it's clocked at over 1 GHz, which makes power consumption spike. It's clocked too high, and Huawei cheaped out by going with TSMC's cheaper FFC process instead of FF+, which probably doesn't handle those clocks well either.

Leaks say the upcoming Exynos 8895 will jump from a T880-MP12 to a G71-MP20 design, and on top of that we can expect a ~100 MHz frequency boost (I think). If true, it will be a beast of an implementation; I'm not sure, but it might be over twice as fast. Huawei increased performance by 180% by going from T880-MP4 to G71-MP8 with roughly a 150 MHz boost.
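A first-order way to estimate that uplift is to assume throughput scales with shader cores times clock. This ignores per-core gains going from T880 to G71, so it understates the real jump; the clock figures below are illustrative guesses, not confirmed specs:

```python
# First-order GPU uplift estimate: throughput ~ cores * clock.
# Ignores per-core architectural gains from T880 -> G71, so this
# is a floor. The 650 -> 750 MHz clocks are assumptions.

def uplift(cores_old: int, mhz_old: int, cores_new: int, mhz_new: int) -> float:
    return (cores_new * mhz_new) / (cores_old * mhz_old)

# Rumored Exynos 8895: T880-MP12 -> G71-MP20 with a ~100 MHz bump
print(f"8895 core*clock uplift: {uplift(12, 650, 20, 750):.2f}x")  # ~1.92x
```

Even this conservative core-times-clock floor lands near 2x, so with the G71's per-core improvements on top, "over twice as fast" is plausible.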

29 minutes ago, LAwLz said:

Samsung has started making their own CPU cores too.

 

Mali isn't that far behind. At least not with enough cores and high enough clocks.

The Snapdragon 820 and Exynos 8890 are trading blows in terms of GPU performance (and the 8890 wins in terms of CPU performance). Anandtech has some benchmarks.

I think the Mongoose cores are still only semi-custom A57s or A72s.

5 hours ago, Trixanity said:

The GPU (Adreno 510) is significantly slower than the Adreno 530 or 540. We're not just talking 50%

And so what? If there aren't any stutters in the most graphically demanding game (which there aren't) on Android why should I care about some numbers on a page?

 

Hit me with your best "but it's future proof" and again, you're just going to buy the immediate next phone with a slightly higher number so you can be "future proof" again.

 

You're just leaving a trail of early adopter second hand pickups on the used market for people smarter than you to take off your hands, lol


Honestly I've had my Galaxy Note III for like three and a half years now and it still works wonderfully, no reason at all to upgrade.

 

This was a truly future-proof phone. I think it was the first, if not one of the first, with 4K recording, a Full HD AMOLED screen, 32 GB storage, 150 Mbps LTE, and a great camera, and it's built like a fucking tank. I've dropped it dozens of times on pavement, concrete, etc. with no case on and it just got some scratches on the plastic casing. No shattered glass or even a broken frame. If anything I'd just take it to a repair shop to switch out the worn-out plastic casing; the rest of the phone is 100% as new.

Corsair 600T | Intel Core i7-4770K @ 4.5GHz | Samsung SSD Evo 970 1TB | MS Windows 10 | Samsung CF791 34" | 16GB 1600 MHz Kingston DDR3 HyperX | ASUS Formula VI | Corsair H110  Corsair AX1200i | ASUS Strix Vega 56 8GB Internet http://beta.speedtest.net/result/4365368180

1 hour ago, Terodius said:

Honestly I've had my Galaxy Note III for like three and a half years now and it still works wonderfully, no reason at all to upgrade.

 

This was a truly future-proof phone. I think it was the first, if not one of the first, with 4K recording, a Full HD AMOLED screen, 32 GB storage, 150 Mbps LTE, and a great camera, and it's built like a fucking tank. I've dropped it dozens of times on pavement, concrete, etc. with no case on and it just got some scratches on the plastic casing. No shattered glass or even a broken frame. If anything I'd just take it to a repair shop to switch out the worn-out plastic casing; the rest of the phone is 100% as new.

so true, then Samsung had to go with the double glass, easy shatter, glued in battery design and fuck everything good about the note line

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

11 hours ago, That Norwegian Guy said:

And so what? If there aren't any stutters in the most graphically demanding game (which there aren't) on Android why should I care about some numbers on a page?

 

Hit me with your best "but it's future proof" and again, you're just going to buy the immediate next phone with a slightly higher number so you can be "future proof" again.

 

You're just leaving a trail of early adopter second hand pickups on the used market for people smarter than you to take off your hands, lol

What you're saying is essentially "we don't need more graphical horsepower". You limit what applications can do with that mindset. If it can run everything you throw at it with less than mid-tier graphics, that tells you developers either target the lowest common denominator (in which case they're lazy and not pushing the industry forward), or you're not running at 60 fps, at which point I assume you subscribe to the 'cinematic' experience.

 

Either case is awful.

20 hours ago, suicidalfranco said:

so true, then Samsung had to go with the double glass, easy shatter, glued in battery design and fuck everything good about the note line

They tried to out-apple Apple.
