Nintendo reveals more specs for the Switch except for the ones that matter? That's not good....

AlTech
1 minute ago, Master Disaster said:

desktop class

so... laptops with an 860M also have a desktop class GPU core in them? ;)

 

sorry bro, but marketing has gotten to you.


Just now, manikyath said:

so... laptops with an 860M also have a desktop class GPU core in them? ;)

 

sorry bro, but marketing has gotten to you.

Semantics dude, Maxwell is a desktop class GPU.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


1 minute ago, Master Disaster said:

Semantics dude, Maxwell is a desktop class GPU.

no, Maxwell is a generation of GPUs; every 800 series laptop GPU is Maxwell, and the sub-950 900 series chips are Maxwell.

 

if you think by any chance they'll put something anywhere near desktop class on an ARM platform, you're about 230 watts wrong.


https://clips.twitch.tv/nintendo/RelievedAlbatrossSMOrc

 

Skyrim is indeed going to the nintendo switch.  No idea if this is the appropriate topic to add this, and don't really care.  Not at all interested in the switch, or nintendo in general, myself.  Sick of seeing the same game redone over and over until the end of time.

 

Anyways, I assume zero mod support, or similar to PS4 levels of mod abilities(AKA basically nothing)


3 minutes ago, manikyath said:

no, Maxwell is a generation of GPUs; every 800 series laptop GPU is Maxwell, and the sub-950 900 series chips are Maxwell.

 

if you think by any chance they'll put something anywhere near desktop class on an ARM platform, you're about 230 watts wrong.

 

That's not what I said at all. I own a Shield TV (I'm actually watching AGDQ on it right now) so I'm more than aware of what it is capable of.

 

I've said it before but I'll repeat it, I genuinely don't believe the X1 is powerful enough for a full console experience, my Shield TV struggles playing GTA:SA on max settings at 1080p which is why I fully suspect Nintendo are using a custom built Tegra and not just the straight X1. Optimisation can only get you so far and I'm damn sure my Shield TV would render Skyrim unplayable so they must be using something with extra sauce added. Whether its Parker? I'm not convinced since Nvidia seem pretty adamant Parker is only being used in the new Nvidia autonomous driving modules but there's no chance Switch is using the same GPU as the Shield TV.



1 minute ago, Master Disaster said:

That's not what I said at all -- I own a Shield TV

these two bits..

 

yes, that's exactly what you said, scroll back if you don't believe me. and if you try to prove yourself right, maybe you should look into what exactly makes the difference between a desktop class GPU and a mobile GPU.

 

and that other bit is exactly why you're defending the Tegra so much. it's a powerful chip in the ARM family of devices, but it's nothing special; the Nvidia special sauce doesn't make it any more special than whatever else is out there, it's just waiting to be overtaken again by something else in the pile.


2 minutes ago, manikyath said:

these two bits..

 

yes, that's exactly what you said, scroll back if you don't believe me. and if you try to prove yourself right, maybe you should look into what exactly makes the difference between a desktop class GPU and a mobile GPU.

 

and that other bit is exactly why you're defending the Tegra so much. it's a powerful chip in the ARM family of devices, but it's nothing special; the Nvidia special sauce doesn't make it any more special than whatever else is out there, it's just waiting to be overtaken again by something else in the pile.

No, I said at its heart it's exactly the same GPU core as what you can buy for your desktop, which it is. Obviously it's watered down quite a lot to accommodate heat and power constraints, but the technology running its core is identical to that which runs your GTX 9x0.

 

And I'm not defending anything, I'm merely pointing out that Tegra is unique in that it's the only SoC which uses a GPU core taken from a desktop architecture and not one custom designed to run on a low power platform.

 

Your last statement is baffling. I already told you that the Tegra X1 is almost 2 years old and is still a third faster than its closest non-Apple competitor, so yes, it is something special; it's the fastest ARM SoC you can use, bar none and by a considerable margin.



11 minutes ago, Master Disaster said:

No, I said at its heart it's exactly the same GPU core as what you can buy for your desktop, which it is. Obviously it's watered down quite a lot to accommodate heat and power constraints, but the technology running its core is identical to that which runs your GTX 9x0.

correction 1: GTX 700, 800M and sub-950 900 series units (the GTs, so to say).

 

correction 2: what they do is cut off pieces of the die to reduce power consumption (in the case of the Tegra vs. its desktop "equivalent", about 80-90% of the die), then undervolt and downclock it to save the rest of their way to the power target of a tablet.

 

the result is that it is about as much a desktop GPU as a Volkswagen Beetle is a Porsche, because the engines are both Volkswagen engines, built to Volkswagen spec. I don't need to explain to you that a Beetle isn't gonna be blasting down the German autobahn in a way that even resembles a Porsche.

 

EDIT: and to talk about them being king of the hill, it's simply a matter of them being the only ones bothering with such beefy ARM cores in a mobile device.
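To put hypothetical numbers on the cut-the-die-and-downclock argument: CMOS dynamic power scales roughly as P ∝ C·V²·f, so removing most of the die area (capacitance), undervolting, and downclocking compound multiplicatively. A minimal sketch; every figure in it is an illustrative assumption, not an NVIDIA spec:

```python
# Rough sketch of CMOS dynamic power scaling: P ∝ C · V² · f.
# All numbers below are made-up illustrations, not real chip specs.

def dynamic_power(cap_rel, volts, freq_ghz):
    """Relative dynamic power from capacitance (a proxy for active
    die area), core voltage, and clock frequency."""
    return cap_rel * volts ** 2 * freq_ghz

desktop = dynamic_power(cap_rel=1.0, volts=1.2, freq_ghz=1.1)    # full desktop die
tablet  = dynamic_power(cap_rel=0.15, volts=0.9, freq_ghz=0.8)   # ~85% of die cut, undervolted, downclocked

# The three cuts multiply: ~6% of the desktop part's dynamic power.
print(f"tablet part draws ~{tablet / desktop:.0%} of desktop dynamic power")
```

Which is the Beetle-vs-Porsche point in arithmetic form: same engine design, an order of magnitude less of it actually running.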


2 minutes ago, manikyath said:

correction 1: GTX 700, 800M and sub-950 900 series units (the GTs, so to say).

 

correction 2: what they do is cut off pieces of the die to reduce power consumption (in the case of the Tegra vs. its desktop "equivalent", about 80-90% of the die), then undervolt and downclock it to save the rest of their way to the power target of a tablet.

 

the result is that it is about as much a desktop GPU as a Volkswagen Beetle is a Porsche, because the engines are both Volkswagen engines, built to Volkswagen spec. I don't need to explain to you that a Beetle isn't gonna be blasting down the German autobahn in a way that even resembles a Porsche.

But my point was they start the process with a full-fat Maxwell core and water it down.

 

To correct your analogy, a Porsche Boxster is slower than a 911 but it's still a Porsche.

 

Also just for shits and giggles...

 



57 minutes ago, Castdeath97 said:

-snip-

The only people who complain about resolution in phones not being good enough are those who somehow think that 8k on a phone will actually make a difference, and that might never be good enough for them.

That's not to say that pushing the envelope is bad... people just need to realize that there comes a point when all you're doing is comparing penis or breast size because you can, but otherwise getting no benefit from it. Much like how you only see 4k on big laptops because of the power required to drive that display resolution.
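For what it's worth, the diminishing-returns point can be put in numbers with the common ~1 arcminute visual acuity rule of thumb: at a given viewing distance there's a PPI beyond which individual pixels simply can't be resolved. The 12-inch distance below is an assumed typical phone-holding distance, not a measurement:

```python
import math

def ppi_limit(view_in):
    """PPI above which someone with ~1 arcminute visual acuity can no
    longer resolve individual pixels at this viewing distance (inches)."""
    pixel_in = view_in * math.tan(math.radians(1 / 60))  # smallest resolvable feature
    return 1 / pixel_in

print(round(ppi_limit(12)))  # phone held at ~12 inches: ~286 ppi
```

Anything much past ~300 ppi at phone distance is, by this rule of thumb, spec-sheet territory rather than visible detail.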

CPU - Ryzen 7 3700X | RAM - 64 GB DDR4 3200MHz | GPU - Nvidia GTX 1660 ti | MOBO -  MSI B550 Gaming Plus


1 minute ago, Master Disaster said:

But my point was they start the process with a full-fat Maxwell core and water it down.

 

To correct your analogy, a Porsche Boxster is slower than a 911 but it's still a Porsche.

 

Also just for shits and giggles...

 

well... that's the thing... they don't really start from a full Maxwell core, they just reuse the designs they had for the full Maxwell core, go "which piece of this could we use by itself to make a low power product?", and then make a chip from that.


1 minute ago, PocketNerd said:

The only people who complain about resolution in phones not being good enough are those who somehow think that 8k on a phone will actually make a difference, and that might never be good enough for them.

That's not to say that pushing the envelope is bad... people just need to realize that there comes a point when all you're doing is comparing penis or breast size because you can, but otherwise getting no benefit from it. Much like how you only see 4k on big laptops because of the power required to drive that display resolution.

I have no idea what the res on my Moto G is (seriously, I didn't even check when buying), but all I know is that I REALLY don't need more, because I'm gonna give myself a dose of eye-syphilis before I can see pixels anyway.


11 hours ago, manikyath said:

to be honest... I don't understand why people care about the performance of the hardware in Nintendo's stuff so much.

 

in the end, all that freaking matters is that the game's fun to play, and that it doesn't chug.

I do think the price is a bit high when you're only getting a 32 GB SD card that costs $10 and you're going to have to drop $70 on a decent controller for TV mode. That big monstrosity for plugging the two joy-cons into doesn't look like it'll be comfortable to game on. And no bundled game either. Unless the game cartridges are writable, I don't see how you're not going to have to drop another $40 or so on a 128 GB microSD card in the days of DLC and patches that can be 10-20 GB.

I completely agree that the games are more important than the performance (I go in circles with PC Master Race here defending the PS4 for the same reason), but the price just seems steep when you factor in the extras people are probably going to want to buy. I think they should have gone with at least a 64 GB memory card. DLC and day one patches are here to stay, and third party devs can't possibly be happy about seeing so little storage on the default model.

But damn, Super Mario Odyssey looks exciting.
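A quick back-of-the-envelope on the storage complaint (the 6 GB system-reserved figure and the 15 GB patch size are guesses for illustration, not Nintendo's numbers):

```python
def patches_that_fit(card_gb, reserved_gb, patch_gb):
    """How many patches/DLC drops of a given size fit on a card
    after the system reserves some space for itself."""
    return int(max(card_gb - reserved_gb, 0) // patch_gb)

# With ~15 GB patches, 32 GB holds one game's worth of updates at best.
for card in (32, 64, 128):
    print(f"{card} GB card: {patches_that_fit(card, reserved_gb=6, patch_gb=15)} x 15 GB patches")
```

Under those assumptions a 32 GB card fits a single large patch, which is why the 128 GB microSD upsell feels almost mandatory.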


8 minutes ago, manikyath said:

I have no idea what the res on my Moto G is (seriously, I didn't even check when buying), but all I know is that I REALLY don't need more, because I'm gonna give myself a dose of eye-syphilis before I can see pixels anyway.

Exactly my point. Even with my Nexus 6 (496 dpi) I can't see the pixels, despite getting the phone as close as I could before it goes out of focus. Your Moto G (329 dpi) is running at 720p, while mine is above 1080p; both are still HD.
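Those dpi figures check out against the usual pixels-per-inch formula; the panel sizes below are my assumptions (5.96" for the Nexus 6, and 4.5" for a 720p Moto G, which lands near the quoted 329 dpi):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from panel resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Nexus 6: 1440x2560 on a 5.96" panel; Moto G (assumed 4.5" 720p panel)
print(round(ppi(1440, 2560, 5.96)))  # ~493
print(round(ppi(720, 1280, 4.5)))    # ~326
```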



1 minute ago, SteveGrabowski0 said:

I do think the price is a bit high when you're only getting a 32 GB SD card that costs $10 and you're going to have to drop $70 on a decent controller for TV mode. That big monstrosity for plugging the two joy-cons into doesn't look like it'll be comfortable to game on. And no bundled game either. Unless the game cartridges are writable I don't see how you're not going to have to drop another $40 or so on a 128 GB microSD flash in the days of DLC and patches that can be 10-20 GB. I completely agree that the games are more important than the performance (I go in circles with PC Master Race here defending the PS4 for the same reason), but the price just seems steep when you factor in the extras people are probably going to want to buy. I think they should have gone with at least a 64 GB memory card. DLC is here to stay and third party devs can't possibly be happy about seeing so little storage on the default model for it.

BOM cost isn't everything. Nintendo puts a huge amount of engineering into their products (it's arguable whether that's a good idea or even a benefit), and engineering isn't cheap.

 

To tell you how expensive engineering is: a few years back I had a teacher who, in a former life, worked as an assembly developer specialized in "shrinking code". On one occasion their team got paid "in the million dollar range" to cut 5 bytes out of a program, which in turn let the company easily make millions on top of that cost because they could go for a cheaper platform to run on.

--

What engineering means in the Nintendo console department may be that they spend large in one area, allowing them to add features not conceived before, like the Switch docking and undocking (if you want to know how much of a disaster that is in terms of software, ask people with early samples of Surface Books; they bluescreened half of the time), or the whole controllers-sliding-on-and-off mess. I bet you half of the team was figuring out battery life, making the battery for this thing not a brick that half of their market share can't carry around comfortably, and so on.

 

I could probably go on for a while about the rest of the issues you brought up, but I have a speedrun to watch :P


3 minutes ago, RagnarokDel said:

Nintendo missed a great opportunity to put a 460pro in there.

and power it with what exactly?

 

(that said, putting a GPU in the dock would have been a sexy idea, though most likely an expensive one.)


1 minute ago, manikyath said:

and power it with what exactly?

 

(that said, putting a GPU in the dock would have been a sexy idea, though most likely an expensive one.)

I meant in the dock

 


1 minute ago, RagnarokDel said:

I meant in the dock

 

Do they actually have some form of extra processing in the dock, or is it just a "dumb power and display link"? Because that's screaming Surface Book issues :P


 

 

They don't sound so enthused. They lowkey think it's shit.

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


4 hours ago, Yoinkerman said:

Lol you guys are hilarious.

 

Fun is fun I don't care what the specs are.

Look, I'm not necessarily saying this is the case, but on one hand you don't care about specs, yet your signature shows off what at the time were pretty beefy specs.

 

Again, even if not you specifically (I don't know how you came to own said rig), many have one standard for PC and a separate, noticeably lower standard for Nintendo.

-------

Current Rig

-------


13 hours ago, AluminiumTech said:

I liked Nintendo games in the past. But unless they are capable of running Skyrim off a potato, I'm still gonna care about specs.

Skyrim runs on an Atom. Just saying. 

My eyes see the past…

My camera lens sees the present…


4 minutes ago, Zodiark1593 said:

Skyrim runs on an Atom. Just saying. 

The original game without mods looks like shit by the standards of 5 years later. Even then I doubt it will be a solid 30fps.

 

 



1 minute ago, Misanthrope said:

The original game without mods looks like shit by the standards of 5 years later. Even then I doubt it will be a solid 30fps.

 

 

The point being that it isn't a stretch to run Skyrim on potato hardware. 



17 minutes ago, Misanthrope said:

Look, I'm not necessarily saying this is the case, but on one hand you don't care about specs, yet your signature shows off what at the time were pretty beefy specs.

 

Again, even if not you specifically (I don't know how you came to own said rig), many have one standard for PC and a separate, noticeably lower standard for Nintendo.

My computer has gotten better since then lol

 

Computer hardware is a hobby of mine. I buy nice stuff because I can. I used to raid in WoW with a P4 and a Radeon 9000. 17fps all day.

 

My point is more that games can run and play fine without beefy specs. If it's a smooth, high frame rate and the visuals aren't jagged messes à la Halo, with terrible texture popping à la GTA, and it controls and plays well like Super Mario whatever, then it doesn't bug me.

 

We only care that an Xbone runs at a frame rate that's all over the place because long frame times create stuttering, and frame rate swings cause controls to go wonky, create motion sickness, cause unbearable tearing across the scene (tearing matters a lot more at 22fps than at 60fps), and make the overall game feel sluggish and chunky. They literally push visual "fidelity" at the expense of gameplay. For the most part, Nintendo doesn't.

 

Pokemon sells 9999999999999999999 copies and looks like crap. Because it's a well-built, fun game.

Intel 4670K /w TT water 2.0 performer, GTX 1070FE, Gigabyte Z87X-DH3, Corsair HX750, 16GB Mushkin 1333mhz, Fractal R4 Windowed, Varmilo mint TKL, Logitech m310, HP Pavilion 23bw, Logitech 2.1 Speakers

