Reports that Nintendo Switches are bending/warping, possibly due to heat?

Master Disaster
21 minutes ago, GoodBytes said:

Well, it is hard to call it a "leftover", as it is currently the latest Nvidia Tegra SoC outside of the automotive lineup. If it weren't, I would imagine that the Nvidia Shield TV would have gotten the newer SoC.

 

Yup... then again, my phone has the Snapdragon 810, and I guess Microsoft was the only one to figure out that putting a heat pipe on it solves its problem. :P But yes, the Tegra is not a cool-running chip.

 

Well, it is marked as "custom", so something changed. This is all I can think of, at least at the moment:

  • The Tegra X1 chip supports 3GB of RAM; the one Nintendo got supports 4GB. Either a minor hardware change/fix was made to allow this, or Nvidia charges more to unlock a RAM limitation that was kept in place (or at least that was the initial idea) for a higher-end model.
  • Some Pascal firmware-level tech (like maybe the memory compression) was ported back to the Tegra X1, to get the most performance possible out of the chip.

    Either case would show zero difference in the X-ray images.
     

Nah. The A53 cores would not have been used in any case. The Tegra's core design is really a switching system: either the A53 cluster or the A57 cluster is active. I don't see devs running demanding games on the A53s just to reduce system heat, and the same goes for Nintendo themselves... Nvidia advertised that the chip would use the A53s for the phone/tablet home screen, general navigation, and video/music playback, and switch to the A57s for everything else.

 

Like who? So far, even the Snapdragon 835 doesn't have full OpenGL support; it only supports OpenGL ES. Plus they have virtually zero tools for the GPU, and no experience providing assistance/support to devs.

 

It's only the latest because Nvidia has slowed down their Tegra development for the mobile segment due to lack of interest and lack of competitiveness.

 

One thing is advertising, another is reality. According to those sources, the interconnect is simply broken, so the chip can only run the A57 cores, and if Nvidia hasn't reworked it, it'll also be broken on the Switch. Whether it matters depends on the use case. I'm not too familiar with what the Switch can do, but it seems quite limited in everything but gaming; for anything else, the reduced consumption of the A53 cores would come in handy.
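(For anyone unfamiliar with the cluster idea being discussed: on big.LITTLE designs like this, work runs on either the small A53 cores or the big A57 cores. Purely as an illustration, and not anything Nintendo or Nvidia actually does, here is a minimal Linux sketch of pinning a demanding thread to an assumed "big" cluster; the core numbering 0-3 = A57 is an assumption made up for the example.)

```c
/* Minimal sketch, not Switch/Tegra SDK code: pin the calling thread to an
 * assumed "big" (A57) cluster on Linux. Core numbering 0-3 = A57 is an
 * assumption for illustration; real layouts depend on the SoC and kernel. */
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int core = 0; core < 4; core++)      /* assumed A57 core IDs */
        CPU_SET(core, &set);

    /* pid 0 = the calling thread; the scheduler keeps it on these cores */
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }
    printf("Demanding work now restricted to cores 0-3\n");
    return 0;
}
```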

 

Does the Switch actually use OpenGL? That sounds ridiculous. The Snapdragon 835 wouldn't be ready anyway though.

 

I'm thinking a custom design (if they weren't going to use an off-the-shelf ARM chip). It would have been the best solution, not the cheapest, but a design like Nintendo's needs either a custom or semi-custom chip to really shine.

 

They might actually have been able to use Imagination's tech. They have MIPS and Rogue which could be used to make a decent SoC. 

It's not exactly impossible to partner with someone on this. I'm sure there would be plenty that would jump at the chance if they had the right resources to pull it off.

 

The biggest hindrance is documentation, tools, and support, but I'm sure that could be worked out. Nintendo contracting Nvidia seems an easy but desperate move: easy because Nvidia provides support and documentation and has the graphics know-how, but everything else is pretty damn bad. So I can only conclude this was the simple and cheap solution with good-enough results.


8 minutes ago, GeekJump said:

Bend-and-snap gate? (I tried to make a Legally Blonde reference with the Nintendo *snap* thing... I'll see myself out.)

 

Just like with phones bending: what do these people do that causes it? You can get away with saying you put your phone in your pocket and sat down, but a Switch (with controllers) is huge. But Nintendo's quality control is garbage, IMO. We've got three 3DSs in the house and they all look different screen-wise (because they use different panels at random): one has a yellow tint, and one has dirt under the screen. Not surprised by this bend situation and all the other things, like dead pixels, that people are reporting on the Switch.

Just wanted to say: using different panels is common across the industry, to meet production demand and keep prices low.

This is also why we say that if you want a dual-screen setup on your PC, you should buy the two or more monitors together, not separately, since a different batch could mean different panels, and that means different colors. Even if both monitors have the exact same panel brand and model, the panel manufacturer might use a different supplier for the backlight to meet demand on their side. Mass-producing things is pretty complicated; it is actually VERY difficult, with a lot of factors out of one's hands. Heck, Tesla had numerous issues when they first started mass-producing their cars, and still faces issues.


21 minutes ago, Trixanity said:

It's only the latest because Nvidia has slowed down their Tegra development for the mobile segment due to lack of interest and lack of competitiveness.

Yup. But latest is latest.

 

Quote

One thing is advertising, another is reality. According to those sources, the interconnect is simply broken, so the chip can only run the A57 cores, and if Nvidia hasn't reworked it, it'll also be broken on the Switch. Whether it matters depends on the use case. I'm not too familiar with what the Switch can do, but it seems quite limited in everything but gaming; for anything else, the reduced consumption of the A53 cores would come in handy.

There are two ways you can disable features on a chip: firmware, or blowing a fuse. Depending on what you want to disable and how the chip design works, one method is picked. Probably it was fused off for far simpler reasons: too hard to modify Android to handle it (or, for Nintendo, too hard to make their OS work with it.. and Nintendo is not generally good at making OSs; see the Wii U's lovely slow, RAM-eating design for such a simple OS. They aren't Sony and definitely not Microsoft). So, to push sales (and probably to make the Shield TV dev team's life easier; and well, that device is not power-limited), they disabled it. Maybe that is the only thing "custom" about the Tegra X1 chip Nintendo has: this feature was removed, the max CPU clock was boosted by 100MHz, and the SoC has a deeply tested, certified downclock for running on battery.

 

Quote

Does the Switch actually use OpenGL? That sounds ridiculous. The Snapdragon 835 wouldn't be ready anyway though.

Yes sir. Full OpenGL and Vulkan. That is the sales pitch of Tegra, and Nintendo made sure its OS supports them for devs.
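(Side note on what "full OpenGL" buys you over OpenGL ES: desktop-only entry points such as glPolygonMode with GL_LINE simply don't exist in ES. A minimal sketch, assuming GLFW is installed; this is only an illustration, not Switch SDK code.)

```c
/* Minimal sketch: probe the context and call a desktop-only GL function.
 * glPolygonMode(..., GL_LINE) exists in full OpenGL but was dropped from
 * OpenGL ES. Assumes GLFW; build with e.g. `cc probe.c -lglfw -lGL`. */
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   /* hidden window, just for a context */
    GLFWwindow *win = glfwCreateWindow(64, 64, "gl-probe", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    printf("GL_VERSION : %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    /* On an OpenGL ES context this entry point does not even exist. */
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    printf("glPolygonMode(GL_FRONT_AND_BACK, GL_LINE) called: desktop GL feature\n");

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```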

 

Quote

I'm thinking a custom design (if they weren't going to use an off-the-shelf ARM chip). It would have been the best solution, not the cheapest, but a design like Nintendo's needs either a custom or semi-custom chip to really shine.

That is beyond Nintendo's abilities. Even Sony, for the PS3's chip, needed heavy assistance from IBM.

 

Quote

They might actually have been able to use Imagination's tech. They have MIPS and Rogue which could be used to make a decent SoC. 

But they don't have the dev tools. Nvidia's tools are super powerful: devs can see what happens behind the doors of the GPU, see the performance and memory usage of everything that happens, line by line, and see the latency of each command. It is crazy stuff. AMD of course has similar tools for their GPUs. But not the others.
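(To give a sense of what even the baseline, non-vendor tooling looks like: standard OpenGL timer queries let you measure the GPU time of an individual command. A rough sketch, assuming GLFW and GLEW and a GL 3.3+ driver; vendor profilers like Nvidia's go far deeper than this.)

```c
/* Rough sketch of per-command GPU timing with a standard OpenGL timer query
 * (GL_TIME_ELAPSED). Vendor profilers do far more; this is only the basic idea.
 * Assumes GLFW + GLEW; build with e.g. `cc timer.c -lglfw -lGLEW -lGL`. */
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow *win = glfwCreateWindow(64, 64, "timer-query", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    if (glewInit() != GLEW_OK) { glfwTerminate(); return 1; }

    GLuint query;
    glGenQueries(1, &query);

    /* Wrap a single GPU command (a clear, standing in for a draw call). */
    glBeginQuery(GL_TIME_ELAPSED, query);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glEndQuery(GL_TIME_ELAPSED);

    GLuint64 ns = 0;
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &ns);  /* waits for the GPU */
    printf("glClear took %llu ns on the GPU\n", (unsigned long long)ns);

    glDeleteQueries(1, &query);
    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```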

 

Quote

It's not exactly impossible to partner with someone on this. I'm sure there would be plenty that would jump at the chance if they had the right resources to pull it off.

You would think! But usually it ends up as a bad relationship.. not always, but many times, with the partner taking full advantage of Nintendo: EA with the Wii U online system, Philips overcharging crazy amounts for their tech, Sony demanding too much over the SNES disc system... Yet other partnerships were great! So it is easier said than done.

 

Quote

The biggest hindrance is documentation, tools, and support, but I'm sure that could be worked out. Nintendo contracting Nvidia seems an easy but desperate move: easy because Nvidia provides support and documentation and has the graphics know-how, but everything else is pretty damn bad. So I can only conclude this was the simple and cheap solution with good-enough results.

It is desperate. Third-party support is critical; Nintendo learned that the hard way with the Wii U. Big publishers are interested in money.. I mean, it is a business: they have shareholders and investors to please, and it is all about maximizing profits. And devs themselves are tired of breaking their heads for weeks to get things working properly, instead of actually moving along with the game and enjoying what they do. And no, it couldn't be worked out: documentation would be limited to one source, and if you don't understand it, too bad. On the Nvidia/AMD side you are jam-packed with documentation, with experience explaining technical details to devs, and with the countless resources available for OpenGL and the now-growing Vulkan. Not to mention all the devs already know how to use OpenGL: no need to learn something new, or to work with a subset of OpenGL that has limitations left and right from chip quirks.

Heck, Intel still can't make their GPUs interesting today, and they still have bad drivers.

 

 


1 hour ago, GoodBytes said:

Heuuuummmm.. just want to say that something being on the WAN Show doesn't mean it is legit; plenty of mistakes get said there. The WAN Show is more of a talk show, not a place to get your only source of information. It's more that there is some news, and Luke and Linus give their opinion on it; there is usually no research done besides a quick read. If a piece of news on the WAN Show interests you, I would recommend looking deeper into it. Since we are talking about the Switch, as an example, if I recall correctly, Luke said that the console outputs 720p while docked.

My point wasn't that being on WAN makes them a reliable source; it was that they're a large tech site that reports on stuff very frequently, and that he claims he has never heard of them despite them being a regular in LTT forum news posts.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Glad I got a perfect Switch then. No scratches, no dead pixels, no Joy-Con issues, no screen issues, no bending. Haven't even bothered putting a screen protector on it, and I've docked and undocked it at least a few dozen times. Guess I'm just lucky.


1 hour ago, GoodBytes said:

Yup. But latest is latest.

 

There are two ways you can disable features on a chip: firmware, or blowing a fuse. Depending on what you want to disable and how the chip design works, one method is picked. Probably it was fused off for far simpler reasons: too hard to modify Android to handle it (or, for Nintendo, too hard to make their OS work with it.. and Nintendo is not generally good at making OSs; see the Wii U's lovely slow, RAM-eating design for such a simple OS. They aren't Sony and definitely not Microsoft). So, to push sales (and probably to make the Shield TV dev team's life easier; and well, that device is not power-limited), they disabled it. Maybe that is the only thing "custom" about the Tegra X1 chip Nintendo has: this feature was removed, the max CPU clock was boosted by 100MHz, and the SoC has a deeply tested, certified downclock for running on battery.

 

Yes sir. Full OpenGL and Vulkan. That is the sales pitch of Tegra, and Nintendo made sure its OS supports them for devs.

 

That is beyond Nintendo's abilities. Even Sony, for the PS3's chip, needed heavy assistance from IBM.

 

But they don't have the dev tools. Nvidia's tools are super powerful: devs can see what happens behind the doors of the GPU, see the performance and memory usage of everything that happens, line by line, and see the latency of each command. It is crazy stuff. AMD of course has similar tools for their GPUs. But not the others.

 

You would think! But usually it ends up as a bad relationship.. not always, but many times, with the partner taking full advantage of Nintendo: EA with the Wii U online system, Philips overcharging crazy amounts for their tech, Sony demanding too much over the SNES disc system... Yet other partnerships were great! So it is easier said than done.

 

It is desperate. Third-party support is critical; Nintendo learned that the hard way with the Wii U. Big publishers are interested in money.. I mean, it is a business: they have shareholders and investors to please, and it is all about maximizing profits. And devs themselves are tired of breaking their heads for weeks to get things working properly, instead of actually moving along with the game and enjoying what they do. And no, it couldn't be worked out: documentation would be limited to one source, and if you don't understand it, too bad. On the Nvidia/AMD side you are jam-packed with documentation, with experience explaining technical details to devs, and with the countless resources available for OpenGL and the now-growing Vulkan. Not to mention all the devs already know how to use OpenGL: no need to learn something new, or to work with a subset of OpenGL that has limitations left and right from chip quirks.

Heck, Intel still can't make their GPUs interesting today, and they still have bad drivers.

 

 

As I said: Nintendo went with the easy, cheap solution. If they went custom, they wouldn't exactly be developing the chip themselves; no game console in the last 20 years has had a chip developed in-house. Nvidia would probably not have played ball, but they could have contracted an AMD GPU for the design, possibly with standard ARM cores to save on R&D. Whether it's AMD or someone else designing it, it's possible to design such a chip if you're willing to spend the cash (it has been done multiple times before). It's easy to argue it's not worth the cash, and in all likelihood it isn't, but Nvidia has delivered a subpar product and I don't think that's a secret. I just pity those who pay money for this, both Nintendo and end users. Luckily it seems like the user experience is good (for the most part), but knowing that they could have gotten better is sad.

 

Whether you agree with that is really not my concern. 


54 minutes ago, Trixanity said:

As I said: Nintendo went with the easy, cheap solution. If they went custom, they wouldn't exactly be developing the chip themselves; no game console in the last 20 years has had a chip developed in-house. Nvidia would probably not have played ball, but they could have contracted an AMD GPU for the design, possibly with standard ARM cores to save on R&D. Whether it's AMD or someone else designing it, it's possible to design such a chip if you're willing to spend the cash (it has been done multiple times before). It's easy to argue it's not worth the cash, and in all likelihood it isn't, but Nvidia has delivered a subpar product and I don't think that's a secret. I just pity those who pay money for this, both Nintendo and end users. Luckily it seems like the user experience is good (for the most part), but knowing that they could have gotten better is sad.

Oh yeah, for sure! But what would Nintendo gain?

Making a custom chip is not free. It involves R&D, which is money... money that needs to be paid back, and that makes the system cost more.

And for what? A slightly faster and cooler-running CPU? OK, say we have that, great! But is the CPU even why the Tegra SoC runs warm, or is it the GPU? Say it isn't the GPU. Fine: does AMD have any GPU that is more power-efficient and more powerful than Nvidia's? Sure, you COULD say "Well, AMD can make it happen...", but it takes about 5 YEARS to develop a GPU, plus a few more years on top to create that custom SoC. All while crossing your fingers that the calculations and simulations were correct and actually give you a power-efficient design, which there is no guarantee of. Let alone the actual performance.

 

Microsoft's and Sony's CPUs are "custom" where the tech already existed and they knew what console they would make: the same as before, but faster. In Nintendo's case... say they started working on the Switch the day the Wii U was released. It takes 1-2 years to nail down the idea of the system, form the teams, do prototypes, improve, scrap ideas, come up with new and better ones, and so on. Then it takes about 2 years for circuit board design, testing, integration, etc. Then it takes 6 months to a year to get all the contracts with the manufacturers whose components you need. Then come the mass-manufacturing test batches, production optimizations, and defined QC testing (which is always evolving, with the Quality Assurance team working with the production company to solve issues, which is why early production runs have issues and very late ones are extremely unlikely to).

 

As we can start to see, even starting development of the Switch the day the Wii U was released (November 2012) gives only a little over four years before the March 2017 launch, which is already very tight... But yes, a lot can be done in parallel, so is an actual custom chip possible? Not to mention the documentation and tools?

 

Yes! Yes, it actually would have been perfectly possible! And yes, I am being serious; the math shows this.... that is, IF and only IF the Wii U had been a success that pushed the console's life out another two years, or a year and a half. And yeah, you are right, it could have been a kick-ass chip (or not really, and people would complain about why they didn't take a Pascal Tegra (which I am assuming would have been released by then), but say, at worst, something still better than what we have now).

 

But sadly, that is not the situation. The Wii U did poorly, and the next system had to be released now.

 

54 minutes ago, Trixanity said:

Whether you agree with that is really not my concern. 

We're just having a discussion, nothing more.


13 hours ago, Evann said:

Because of this post you don't wanna get a Switch?

There are always some people that have problems, but the majority doesn't, because customers that don't have a problem have no reason to write about it on the internet.

 

I got myself a Switch, even though I had a bad feeling about all the problems.

 

Do you know what?

 

This thing looks/feels extremely good. I have no dead pixels, I have no connection problems whatsoever, my Switch doesn't bend (yet..).

And even if it does.. you can bend any device if you use enough force.. and as the people in the thread said, they didn't even notice it until someone pointed it out.

 

The UI is one of the best I have ever seen in my life, and I don't regret the purchase at all; it's the best console I have ever played on.

 

 

And yes, Zelda is nuts. Great game.

This is probably the first time Nintendo has pushed the thermal envelope so hard. The Wii and Wii U were a generation-old hardware design on newer nodes, sidestepping thermal issues that Sony and Microsoft had long since ironed out. The 3DS opted for an extended fixed-function GPU with a pair of slow-clocked ARM11s, also negligible in the heat department (probably why they went with that instead of the then-rumored Tegra 2).

 

It wouldn't surprise me to see issues with heat, though I also won't hold it against Nintendo, as Microsoft and Sony had teething pains with this issue as well. Remember the RROD. 

My eyes see the past…

My camera lens sees the present…


14 hours ago, Evann said:

Because of this post you don't wanna get a Switch?

There are always some people that have problems, but the majority doesn't, because customers that don't have a problem have no reason to write about it on the internet.

 

I got myself a Switch, even though I had a bad feeling about all the problems.

 

Do you know what?

 

This thing looks/feels extremely good. I have no dead pixels, I have no connection problems whatsoever, my Switch doesn't bend (yet..).

And even if it does.. you can bend any device if you use enough force.. and as the people in the thread said, they didn't even notice it until someone pointed it out.

 

The UI is one of the best I have ever seen in my life, and I don't regret the purchase at all; it's the best console I have ever played on.

 

 

And yes, Zelda is nuts. Great game.

Mine's bent slightly already (you need to lay it on something flat to see for sure), and I haven't played much on it aside from BotW, so I expect it to bend more. The plastic seems cheaper than a dollar-store kids' toy. Initially when reading this I was worried it was a battery thing, but the area where the bend typically happens seems to be above the heatsink. I expect this can happen to all Switches, given they are made of the same plastic. Nintendo is replacing units for those with bad bends who've already contacted support; for me, I expect to wait until the end of the warranty period, as long as it continues to function. Given I have other issues with mine, like Joy-Con wobble, hopefully they will have a newer revision out by then for it to be replaced with. And I think for people on the fence, that's the best option: don't buy a launch SKU, lol. It just seems to me this entire console reeks of rushed production, not necessarily bad design.

 

I've also been thinking about a USB-C extension cord for the dock for a while, and I just might couple that with a fan now.

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer preditor XB241H (1080p, 144Hz Gsync), LG 1080p ultrawide, (all mounted) directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear defiant (solderless swappable switches), g600, moutned mic and other stuff. 

Laptop docking area- 2 1440p korean monitors mounted, one AHVA matte, one samsung PLS gloss (very annoying, yes). Trashy Razer blackwidow chroma...I mean like the J key doesn't click anymore. I got a model M i use on it to, but its time for a new keyboard. Some edgy Utechsmart mouse similar to g600. Hooked to laptop dock for both of my dell precision laptops. (not only docking area)

Shelf- i7-2600 non-k (has vt-d), 380t, some ASUS sandy itx board, intel quad nic. Currently hosts shared files, setting up as pfsense box in VM. Also acts as spare gaming PC with a 580 or whatever someone brings. Hooked into laptop dock area via usb switch

