
Nvidia is no longer interested in making mobile SoCs

Djole123
Just now, Ryan_Vickers said:

how?

It's very glitchy on Linux, because Nvidia absolutely refuses to work with Linux on that part.

No Nvidia mobile chips means no Nvidia Optimus :P
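(As a rough illustration of how you can tell whether the dedicated GPU is actually being used on a PRIME/Optimus Linux setup: the sketch below assumes a recent proprietary driver that honours the __NV_PRIME_RENDER_OFFLOAD and __GLX_VENDOR_LIBRARY_NAME environment variables, and that glxinfo from mesa-utils is installed.)

import os
import subprocess

def renderer(extra_env=None):
    # Run glxinfo with optional extra environment variables and return the
    # "OpenGL renderer string" line, which names the GPU doing the rendering.
    env = dict(os.environ, **(extra_env or {}))
    out = subprocess.run(["glxinfo"], capture_output=True, text=True, env=env).stdout
    for line in out.splitlines():
        if "OpenGL renderer string" in line:
            return line.strip()
    return "renderer not found"

print("default :", renderer())
print("offload :", renderer({"__NV_PRIME_RENDER_OFFLOAD": "1",
                             "__GLX_VENDOR_LIBRARY_NAME": "nvidia"}))

(If the second line still names the integrated GPU, the offload path isn't working, which is exactly the sort of glitch being complained about here.)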


They were probably smart enough to realize what I've been saying ad nauseam: mobile tech is basically fucking pointless since there's next to no software at all to really take advantage of it.

 

Mobile devices are a mass market. As such, the number of people who would play graphically intensive, complex games is relatively tiny, since, well, we already have much better devices anyway.

 

So for the most part you have your basic stupid fucking idiot with a 3-minute attention span they can use to try one level of Candy Crush or flap various birds while distracting themselves in the shitter or while the light changes.



3 minutes ago, manikyath said:

It's very glitchy on Linux, because Nvidia absolutely refuses to work with Linux on that part.

No Nvidia mobile chips means no Nvidia Optimus :P

I don't know about that... I assume they'll still have switchable graphics somehow, since a desktop 1080 would probably kill your battery pretty fast.



Just now, Ryan_Vickers said:

I don't know about that... I assume they'll still have switchable graphics somehow, since a desktop 1080 would probably kill your battery pretty fast.

I should actually test how much power usage differs on my desktop between a GPU at idle and no GPU at all.


1 minute ago, manikyath said:

I should actually test how much power usage differs on my desktop between a GPU at idle and no GPU at all.

With my Sandy Bridge i7 and a 6770M, it's a good 5-10 W, so as much as a doubling of the idle drain on the machine. Plus there's the heat and the increased fan noise...
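(If you want the card's own number rather than a wall reading, a minimal sketch using Nvidia's NVML bindings can do it; this assumes an Nvidia GPU, the proprietary driver, and the pynvml package, so treat it as illustrative only.)

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU in the system
milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)  # board power draw, in milliwatts
print(f"GPU board power: {milliwatts / 1000:.1f} W")
pynvml.nvmlShutdown()

(Taking that reading at idle and comparing it against a wall meter separates the GPU's own draw from PSU overhead and the rest of the system.)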



1 minute ago, Ryan_Vickers said:

With my Sandy Bridge i7 and a 6770M, it's a good 5-10 W, so as much as a doubling of the idle drain on the machine. Plus there's the heat and the increased fan noise...

That's a 6770M though, which isn't exactly a modern GPU. I mean, if someone had told you back when you got that laptop that one day we'd have full-featured desktop GPUs in a laptop, you'd have laughed in their face :P


1 minute ago, manikyath said:

That's a 6770M though, which isn't exactly a modern GPU. I mean, if someone had told you back when you got that laptop that one day we'd have full-featured desktop GPUs in a laptop, you'd have laughed in their face :P

Believe me, I'd very, very much like to be rid of switchable graphics, but so long as it's actually netting people some extra runtime, I'll be (grudgingly) all for it! And I assume it is, or Nvidia and AMD would have done away with it by now...



2 minutes ago, Ryan_Vickers said:

Believe me, I'd very, very much like to be rid of switchable graphics, but so long as it's actually netting people some extra runtime, I'll be (grudgingly) all for it! And I assume it is, or Nvidia and AMD would have done away with it by now...

Well, maybe Nvidia has decided this is the time. Also, if you get rid of switchable graphics, you can forget about needing an iGPU, so technically, if the dedicated chip used as little power at idle as the integrated one, you'd see zero difference.

It also gets rid of the "step" of enabling your dedicated GPU, which is quite a big jump in a lot of modern laptops.


3 minutes ago, manikyath said:

Well, maybe Nvidia has decided this is the time. Also, if you get rid of switchable graphics, you can forget about needing an iGPU, so technically, if the dedicated chip used as little power at idle as the integrated one, you'd see zero difference.

It also gets rid of the "step" of enabling your dedicated GPU, which is quite a big jump in a lot of modern laptops.

Yes, that all sounds fantastic ^_^ I'd love that to happen, but I guess it's a case of "I'll believe it when I see it."



1 minute ago, Ryan_Vickers said:

Yes, that all sounds fantastic ^_^ I'd love that to happen, but I guess it's a case of "I'll believe it when I see it."

It doesn't happen until it does.



I've always preferred Apple's and Samsung's SoCs anyway.


On one hand, it's sad that another SoC maker is exiting the market. On the other hand, Nvidia's SoCs were terrible for phones, so nothing of value was lost.

 

So does this mean they will stop making SoCs for tablets and other devices too?


That's my thing. Does that mean they're no longer making SoCs for Shield devices? That really sucks; the entire Shield line is already starting to show its age.


5 minutes ago, shdowhunt60 said:

That's my thing. Does that mean they're no longer making SoCs for Shield devices? That really sucks; the entire Shield line is already starting to show its age.

Well, I heard the chips in the Shields aren't all that amazing. Maybe they'll outsource the Shield guts, or (lol, why do I even think this) make it an open system.


1 minute ago, manikyath said:

Well, I heard the chips in the Shields aren't all that amazing. Maybe they'll outsource the Shield guts, or (lol, why do I even think this) make it an open system.

Their LTE module is garbage, but otherwise they're some of the most powerful SoCs on the market. The thing is, that's a lead they're steadily losing, especially in terms of CPU power.


2 minutes ago, manikyath said:

Well, I heard the chips in the Shields aren't all that amazing. Maybe they'll outsource the Shield guts, or (lol, why do I even think this) make it an open system.

I remember when the Shield Tablet came out, it was absolutely blowing everything else away, with performance ~5x better than the next best device in some cases. To be honest, though, I haven't kept up with where the X1 falls in the current lineup of SoCs.



1 minute ago, Ryan_Vickers said:

I remember when the Shield Tablet came out, it was absolutely blowing everything else away, with performance ~5x better than the next best device in some cases. To be honest, though, I haven't kept up with where the X1 falls in the current lineup of SoCs.

The X1 isn't an apples-to-apples comparison, though. The version that's in the Shield TV runs at about 20 watts.


7 minutes ago, Ryan_Vickers said:

I remember when the Shield Tablet came out, it was absolutely blowing everything else away, with performance ~5x better than the next best device in some cases. To be honest, though, I haven't kept up with where the X1 falls in the current lineup of SoCs.

 

8 minutes ago, shdowhunt60 said:

Their LTE module is garbage, but otherwise they're some of the most powerful SoCs on the market. The thing is, that's a lead they're steadily losing, especially in terms of CPU power.

I don't know who told me about it, but while performance wasn't the issue, the backend was a compatibility DISASTER.


8 minutes ago, shdowhunt60 said:

runs at about 20 watts. 

20 watts from the wall or at the chip itself? (BIG difference)


1 minute ago, manikyath said:

20 watts from the wall or at the chip itself? (BIG difference)

That's what AnandTech measured from the wall, if I remember right.


1 minute ago, shdowhunt60 said:

That's what AnandTech measured from the wall, if I remember right.

Well, chances are Apple cheaped out and went for a 40% efficient power supply.

(Which is actually a less rare choice than you'd expect.)
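(For the sake of illustration: if a brick really were only ~40% efficient, 20 W at the wall would work out to roughly 20 × 0.4 = 8 W at the chip, while a more typical ~85% efficient adapter would put it closer to 17 W.)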


This is most disappointing. I build robots, and the Tegra chips were fantastic because they enabled lightning-fast onboard vision processing.



I'm surprised it took them this long.

Tegra was a piece of shit. Nobody wanted to use it.


16 hours ago, manikyath said:

Well, chances are Apple cheaped out and went for a 40% efficient power supply.

(Which is actually a less rare choice than you'd expect.)

Nvidia* 


