Intel May Be Entering the Smartphone, Tablet and Wearable Chip Markets

Just now, xriqn said:

Qualcomm uses ARM, you know. An SD 410, for example, is really an ARM Cortex-A53 based system.

I know... not sure what your point is.

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Segate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


3 minutes ago, DocSwag said:

I know... not sure what your point is.

They just need to execute and convince OEMs to go with their SoCs over Qualcomm's.

Qualcomm = ARM-based systems. My Qualcomm Snapdragon 410 based S4 Mini is really an ARM Cortex-A53 system.

Intel is basically working with the same company whose designs Qualcomm's chips are built on.

There are 10 types of people in this world. Those that understand binary and those that don't.

Current Rig (Dominator II): 8GB Corsair Vengeance LPX DDR4 3133 C15, AMD Ryzen 3 1200 at 4GHz, Coolermaster MasterLiquid Lite 120, ASRock B450M Pro4, AMD R9 280X, 120GB TCSunBow SSD, 3TB Seagate ST3000DM001-9YN166 HSD, Corsair CX750M Grey Label, Windows 10 Pro, 2x CoolerMaster MasterFan Pro 120, Thermaltake Versa H18 Tempered Glass.

 

Previous Rig (Black Magic): 8GB DDR3 1600, AMD FX6300 OC'd to 4.5GHz, Zalman CNPS5X Performa, Asus M5A78L-M PLUS /USB3, GTX 950 SC (former, it blew my PCIe lane so now on mobo graphics which is Radeon HD 3000 Series), 1TB Samsung Spinpoint F3 7200RPM HDD, 3TB Seagate ST3000DM001-9YN166 HDD (secondary), Corsair CX750M, Windows 8.1 Pro, 2x 120mm Red LED fans, Deepcool SMARTER case

 

My secondary rig (The Oldie): 4GB DDR2 800, Intel Core 2 Duo E8400 @ 3GHz, Stock Dell Cooler, Foxconn 0RY007, AMD Radeon HD 5450, 250GB Samsung Spinpoint 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management. UPDATE: SPECS UPGRADED DUE TO CASEMOD, 8GB DDR2 800, AMD Phenom X4 9650, Zalman CNPS5X Performa, Biostar GF8200C M2+, AMD Radeon HD 7450 GDDR5 edition, Samsung Spinpoint 250GB 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management and support for non Dell boards.

 

Retired/Dead Rigs: The OG (retired) (First ever PC I used at 3 years old back in 2005) Current Specs: 2GB DDR2, Pentium M 770 @ 2.13GHz, 60GB IDE laptop HDD, ZorinOS 12 Ultimate x86. Originally 512mb DDR2, Pentium M 740 @ 1.73GHzm 60GB IDE laptop HDD and single boot XP Pro. The Craptop (dead), 2gb DDR3, Celeron n2840 @ 2.1GHz, 50GB eMMC chip, Windows 10 Pro. Nightrider (dead and cannibalized for Dominator II): Ryzen 3 1200, Gigabyte A320M HD2, 8GB DDR4, XFX Ghost Core Radeon HD 7770, 1TB Samsung Spinpoint F3 (2010), 3TB Seagate Barracuda, Corsair CX750M Green, Deepcool SMARTER, Windows 10 Home.


Just now, xriqn said:

Qualcomm = ARM-based systems. My Qualcomm Snapdragon 410 based S4 Mini is really an ARM Cortex-A53 system.

I know... But they're still different based on the actual core architecture and such.

 

I just don't get how the fact that Qualcomm uses ARM changes my initial statement, which was that Intel switching to ARM over x86 might help them better compete against Qualcomm.


3 minutes ago, DocSwag said:

I know... But they're still different based on the actual core architecture and such.

I just don't get how the fact that Qualcomm uses ARM changes my initial statement, which was that Intel switching to ARM over x86 might help them better compete against Qualcomm.

It really won't, not at this stage anyway. Most smartphones use either Qualcomm or MediaTek; if you're Huawei or Samsung you'll have a Kirin variant or an Exynos variant respectively. Intel got laughed out of the mobile game last time around, and most OEMs WILL NOT choose somebody who's been laughed out of the mobile game before over a well-established SoC provider such as Qualcomm.


Just now, xriqn said:

It really won't, not at this stage anyway. Most smartphones use either Qualcomm or MediaTek; if you're Huawei or Samsung you'll have a Kirin variant or an Exynos variant respectively. Intel got laughed out of the mobile game last time around, and most OEMs WILL NOT choose somebody who's been laughed out of the mobile game before over a well-established SoC provider such as Qualcomm.

I honestly think Intel could potentially deliver an SoC that out-classes everyone else except Apple. And if they do manage that, I don't see why it wouldn't be possible for some people to go Intel.

 

It's definitely not going to be easy when everyone's already using other SoCs, but it's still possible. Intel just needs to deliver on the performance side.


7 hours ago, xriqn said:

Well, isn't this interesting? Just a day after hearing of Intel's shutdown of its final wearable hardware department, they're entering the smartphone, tablet and WEARABLE chip markets, according to fool.com of course. This is really interesting considering they failed at making their own wearable devices and have barely even dabbled in the smartphone or tablet (not 2-in-1) market before. Perhaps the reason they failed is that the hardware they put inside their wearable devices was terrible, so WHY ON EARTH are they entering the chip market for wearables and other mobile hardware? Well, here's fool.com's take on it all:

Their last foray into the smartphone market was catastrophic: their smartphone chips failed and didn't live up to the performance, or even the vendor adoption, that we were promised. My dad actually had an older Intel-based smartphone and it was terribly unstable and regularly overheated; it ended up becoming my plaything when I was 11, and I got sick of its horrible performance and smashed it to smithereens roughly a year later. But it seems the instability and performance issues weren't just something my father and I were affected by; it was a widespread thing. I do think Intel working with ARM this time round is a good thing, as they primarily produce mobile chips and I've NEVER encountered poor performance or stability with an ARM-based system.

But it's been mere DAYS since they exited the wearable market altogether; it is EVIDENT that Intel were NOT cut out for any kind of wearable hardware. Again though, ARM working with Intel really could help them overcome any issues they may have producing these new wearable chips; I believe the wearables Intel themselves released failed due to poor chips. So when will we see these chips enter the market? fool.com reveals all:

Next year is quite a long time to wait, I agree, but with Intel's 10nm process and ARM's expertise in the mobile chip sector, we could see wonderful things arise from these new chipsets, so it could be well worth the wait. Then again, Intel could be heading for their next big flop in the mobile and wearable sector. If these chips are a success, I genuinely think Intel is more than capable of becoming a big player in the mobile chip market; if they're a flop, we obviously won't see them as a big player, and even more people will lose hope in them. Anyways, do you guys find this rather interesting too? Let me know down in the comments below!

Source: fool.com https://www.fool.com/investing/2018/04/19/intel-corp-may-be-entering-the-smartphone-tablet-a.aspx

They already tried once; it didn't go well.


1 hour ago, Razor01 said:

Not in graphics; they were close with the A5 if I remember correctly, but it depends on what they were doing.

 

https://techcrunch.com/2011/12/01/first-nvidia-tegra-3-benchmarks-score-the-quad-core-chip-just-slightly-faster-than-apples-a5/

 

Yeah, they were very close, and Tegra 3 was delayed for at least 6 months; I think it was closer to 9. That's a no-no in the cell phone market, can't have that.

 

https://www.technobuffalo.com/2012/04/11/benchmarked-nvidia-tegra-3-vs-qualcomm-snapdragon-s4/

 

Tegra 3 and 2 both had advantages over their ARM Android counterparts. Tegra 2 didn't have NEON, so it did lose in some benchmarks, but for the most part it did better.

 

https://www.anandtech.com/show/4144/lg-optimus-2x-nvidia-tegra-2-review-the-first-dual-core-smartphone/8

I had a Tegra 2 phone (the LG Optimus 2X). I'm a bit of an anime watcher, so decent hardware video decode is a nice thing to have. This is where Tegra 2 really fell short, however, as it lacked hardware decode for H.264 High Profile (it could only do Baseline), and even overclocked to 1.5 GHz, without NEON, it didn't do too well in software decoding either. Overall, I felt it was a pretty rushed (even though it was still very late) product.

 

Decent looking spec sheet, but overall a fairly lackluster result.

 

Given Nvidia's expertise in graphics, I really expected much better out of Tegra 3, let alone Tegra 4. It took them until the K1 to finally move to a unified shader architecture, while their competitors (Adreno, PowerVR) had done so for several years by that point.

My eyes see the past…

My camera lens sees the present…


Didn't Intel license ARM at one point in time? I clearly remember reading that somewhere. I wonder if they will jump into ARM chips as well.

I just want to sit back and watch the world burn. 


1 hour ago, Zodiark1593 said:

I had a Tegra 2 phone (the LG Optimus 2X). I'm a bit of an anime watcher, so decent hardware video decode is a nice thing to have. This is where Tegra 2 really fell short, however, as it lacked hardware decode for H.264 High Profile (it could only do Baseline), and even overclocked to 1.5 GHz, without NEON, it didn't do too well in software decoding either. Overall, I felt it was a pretty rushed (even though it was still very late) product.

 

Decent looking spec sheet, but overall a fairly lackluster result.

 

Given Nvidia's expertise in graphics, I really expected much better out of Tegra 3, let alone Tegra 4. It took them until the K1 to finally move to a unified shader architecture, while their competitors (Adreno, PowerVR) had done so for several years by that point.

 

 

Well, there were some weaknesses in the first few Tegras, but nV had to make design choices to reduce die space and keep margins. They can't compete with Qualcomm's sheer volume lol.

 

I had the Atrix for a short time; I wouldn't say lackluster, it was faster than any other Android phone out at the time for what I used it for. I didn't like how warm it got to the touch while gaming, though.

 

Yeah, it did take them a while to go from VLIW to unified. I don't think they were ready for how fast the cell phone market changes. It moves much faster than the graphics industry, and the lead times of creating a chip from others' IP are another problem there too.

 

Once Tegra 3 was delayed, that was all the reason Qualcomm needed to push nV out; it made it easy for them. It wasn't the base technology that nV couldn't compete with. Who knows, with the way the Switch was received, maybe they'll find inroads that way again? I doubt it though; the cell phone market is ruthless and the risks are too high.

 

Hey, I like anime too. If you could PM me some good anime shows that came out recently or are coming out, that would be great! I don't like the childish stuff; I like anime like Akira and stuff like that lol.


22 minutes ago, Donut417 said:

Didn't Intel license ARM at one point in time? I clearly remember reading that somewhere. I wonder if they will jump into ARM chips as well.

They did that for their fabs I think.


9 hours ago, Razor01 said:

It's extremely difficult, and companies that don't want to hurt their margins can't get into it at all. nV absolutely got trounced by Qualcomm and others even though Tegra was overall better; yeah, its CPU wasn't as good, but its graphics were way better.

Intel's Atom just has issues with power.

I haven't found evidence for Tegra chips actually beating Adreno across the board. Especially on later chips it seems to me they're trading blows in performance and I have a feeling Adreno uses less power.

8 hours ago, xriqn said:

Most Qualcomm devices don't use their own graphics anymore; they use Adreno. Fun fact: Adreno is an anagram of Radeon, and believe it or not, AMD actually produces the Adreno GPUs we see in most ARM and Qualcomm based devices today. My S4 Mini VE for example has a Qualcomm SD410 chipset, ARM Cortex-A53 based, with an Adreno 306.

Not exactly true. AMD sold that business to Qualcomm in 2009, and AMD has had nothing to do with it since. It's AMD tech at its core, but it's been almost a decade, so I doubt it's been recognizable for many years. Qualcomm has been moving fairly fast ever since, although they may be starting to slip if Adreno 630 is any indication.

 

Also, I haven't seen Adreno in anything but Qualcomm chips and with good reason. Qualcomm is trying to keep a monopoly going and Adreno is one of their biggest assets.


8 hours ago, xriqn said:

Most Qualcomm devices don't use their own graphics anymore; they use Adreno. Fun fact: Adreno is an anagram of Radeon, and believe it or not, AMD actually produces the Adreno GPUs we see in most ARM and Qualcomm based devices today. My S4 Mini VE for example has a Qualcomm SD410 chipset, ARM Cortex-A53 based, with an Adreno 306.

Well, not really. AMD doesn't have anything to do with Adreno anymore. Back in the day, ATI had a mobile division which produced their Imageon brand of mobile GPUs. ATI was then acquired by AMD in 2006, and they sold off that branch in 2009 to Qualcomm. Qualcomm then called the products of that branch Adreno, as an homage of sorts. But AMD isn't involved with Adreno.
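Side note: the anagram part of that quote does check out; "Adreno" really does rearrange into "Radeon". A quick Python sketch (the `is_anagram` helper is just for illustration):

```python
def is_anagram(a: str, b: str) -> bool:
    """Two words are anagrams if they contain exactly the same letters."""
    return sorted(a.lower()) == sorted(b.lower())

print(is_anagram("Adreno", "Radeon"))  # prints True
print(is_anagram("Adreno", "Mali"))    # prints False
```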

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


5 hours ago, Trixanity said:

I haven't found evidence for Tegra chips actually beating Adreno across the board. Especially on later chips it seems to me they're trading blows in performance and I have a feeling Adreno uses less power.

Not exactly true. AMD sold that business to Qualcomm in 2009, and AMD has had nothing to do with it since. It's AMD tech at its core, but it's been almost a decade, so I doubt it's been recognizable for many years. Qualcomm has been moving fairly fast ever since, although they may be starting to slip if Adreno 630 is any indication.

 

Also, I haven't seen Adreno in anything but Qualcomm chips and with good reason. Qualcomm is trying to keep a monopoly going and Adreno is one of their biggest assets.

 

Check AnandTech; they did two reviews, one for cellphones and one for tablets.

 

https://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update

 

https://www.anandtech.com/show/4054/first-look-viewsonic-gtablet-and-tegra-2-performance-preview/2

 

Tegra 2 beat Snapdragon and Hummingbird hands down in some tests; there were some tests where they pulled ahead, but those were few and far between, and many were close.

 

Tegra 3, on the other hand, would have been the top mobile chip if it had come out on time, but because of its 8- or 9-month delay, Qualcomm and Samsung were able to catch up. Pretty much a full generation's delay for that chip.


5 minutes ago, Razor01 said:

 

Check AnandTech; they did two reviews, one for cellphones and one for tablets.

 

https://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update

 

https://www.anandtech.com/show/4054/first-look-viewsonic-gtablet-and-tegra-2-performance-preview/2

 

Tegra 2 beat Snapdragon and Hummingbird hands down in some tests; there were some tests where they pulled ahead, but those were few and far between, and many were close.

 

Tegra 3, on the other hand, would have been the top mobile chip if it had come out on time, but because of its 8- or 9-month delay, Qualcomm and Samsung were able to catch up. Pretty much a full generation's delay for that chip.

Those are very old though. And the comparisons in graphics seem to be against Exynos chips with ARM cores and PowerVR graphics.

With that being said, I'm sure it's possible that Nvidia had an advantage early on, especially considering that Qualcomm only bought AMD's mobile graphics business in 2009. It would take them some years to really get going, and by then Tegra, as you say, had started to slip up in a big way.

 

Tegra just never really panned out the way Nvidia wanted it to, and part of that is definitely due to the various power consumption issues that I seem to recall being prevalent at the time. I think Nvidia today could do some real good with their custom ARM cores and a Pascal- or Volta-based GPU. Their biggest problem is the need for modem integration; I know not everyone has that (including Apple), but it really gives you an edge. However, I don't think Nvidia has any intention of trying again even if Intel does. A pity. It would be interesting if we suddenly had a full-scale war going in mobile SoCs. We, as consumers, would win every time.


18 minutes ago, Trixanity said:

Those are very old though. And the comparisons in graphics seem to be against Exynos chips with ARM cores and PowerVR graphics.

With that being said, I'm sure it's possible that Nvidia had an advantage early on, especially considering that Qualcomm only bought AMD's mobile graphics business in 2009. It would take them some years to really get going, and by then Tegra, as you say, had started to slip up in a big way.

 

Tegra just never really panned out the way Nvidia wanted it to, and part of that is definitely due to the various power consumption issues that I seem to recall being prevalent at the time. I think Nvidia today could do some real good with their custom ARM cores and a Pascal- or Volta-based GPU. Their biggest problem is the need for modem integration; I know not everyone has that (including Apple), but it really gives you an edge. However, I don't think Nvidia has any intention of trying again even if Intel does. A pity. It would be interesting if we suddenly had a full-scale war going in mobile SoCs. We, as consumers, would win every time.

As stated earlier though, nV was using 6-year-old tech in their graphics at that time, and they kept doing that ;)

 

nV bought a modem company around the Tegra 3 timeframe... I think it was Icera or something like that.


1 minute ago, Razor01 said:

As stated earlier though, nV was using 6-year-old tech in their graphics at that time, and they kept doing that ;)

True that, but it's hard to deny the progress Qualcomm has made with Adreno. I mean, Mali is still far behind. PowerVR hasn't seen widespread use in Android country for a while, but Apple used PowerVR until last year and it was competitive in many aspects, though from what I know those GPUs were bigger than the biggest Adreno configuration. I think there's an argument to be made that Qualcomm could go wider and maybe dial the frequency back a little to regain some of the efficiency they seem to have lost on the 600 series; however, I doubt they will, due to die space concerns. Everything Qualcomm's done so far indicates a refusal to increase die size.


32 minutes ago, Trixanity said:

True that, but it's hard to deny the progress Qualcomm has made with Adreno. I mean, Mali is still far behind. PowerVR hasn't seen widespread use in Android country for a while, but Apple used PowerVR until last year and it was competitive in many aspects, though from what I know those GPUs were bigger than the biggest Adreno configuration. I think there's an argument to be made that Qualcomm could go wider and maybe dial the frequency back a little to regain some of the efficiency they seem to have lost on the 600 series; however, I doubt they will, due to die space concerns. Everything Qualcomm's done so far indicates a refusal to increase die size.

It's all about trying to create the cheapest chip that does the bare minimum while staying ahead. In the Android market that's really hard to do. Qualcomm has the IP that ATI/AMD created, so they have enough to keep going; for newcomers it's very hard to break into the market. Intel might be able to do it if they go ARM, because of Raja; prior to that, Atom just wasn't capable of it.

 

nV doesn't have the size to cut margins down to razor thin; they would hurt their other segments to do that, which really wouldn't be wise. We don't want to see another disaster like we saw with AMD: fighting on two fronts against bigger companies is not something they can do, and even one is bad enough. In the coming years we will see how the weakness on AMD's graphics side pans out, but I think it will hold them back even if they were to put more money into graphics.

 

nV's R&D is split among many areas of software and hardware development; AMD's is not, it's strictly hardware. They don't have the resources to push software, and that is where they are really losing, or have actually lost, the battle in some market segments.


2 hours ago, Trixanity said:

True that, but it's hard to deny the progress Qualcomm has made with Adreno. I mean, Mali is still far behind. PowerVR hasn't seen widespread use in Android country for a while, but Apple used PowerVR until last year and it was competitive in many aspects, though from what I know those GPUs were bigger than the biggest Adreno configuration. I think there's an argument to be made that Qualcomm could go wider and maybe dial the frequency back a little to regain some of the efficiency they seem to have lost on the 600 series; however, I doubt they will, due to die space concerns. Everything Qualcomm's done so far indicates a refusal to increase die size.

OEMs want cheap: both cheap to buy and cheap to implement on a board (size, power delivery, etc.). Larger chips would be more costly to produce and more costly to actually fit on the board. And power delivery concerns (or rather, the cost required to build for them) are probably what prevent vendors such as Qualcomm from producing much wider CPU cores, as running large CPU cores at high frequencies requires robust power delivery.

 

Of course, Apple doesn't seem to give a crap about the above factors as board and SoC are designed in-house, something Qualcomm doesn't have the luxury of doing.


Next up: Intel develops a brain chip that is later found to have a huge flaw in it, resulting in decreased brain activity

Main Rig

CPU: Ryzen 2700X 
Cooler: Corsair H150i PRO RGB 360mm Liquid Cooler
Motherboard: ASUS Crosshair VII Hero
RAM: 16GB (2x8) Trident Z RGB 3200MHZ
SSD: Samsung 960 EVO NVME SSD 1TB, Intel 1TB NVME

Graphics Card: Asus ROG Strix GTX 1080Ti OC

Case: Phanteks Evolv X
Power Supply: Corsair HX1000i Platinum-Rated

Radiator Fans: 3x Corsair ML120
Case Fans: 4x be quiet! Silent Wings 3

 

 

