
Community Console, How would you do it?

Noctua_Boy
18 hours ago, Noctua_Boy said:

Let's say you had the power to create a console; how would you do it? Let's assume your company has to release it by 2020 and it has to be cost-effective.

 

My Specs:

  • Nvidia next-gen GPU (midrange) - they have the upper hand in GPU tech.
  • Free online
  • Ability to connect external GPUs to improve graphics
  • Hybrid hard drives (NAND built in to improve performance)
  • Cross-platform online on all games (up to the developer)

Your specs are what you believe in, not what makes sense.

 

You won't see a console without an AMD APU any time soon. It makes no sense to move away from AMD, because you can get everything you want from them: if you ask AMD to implement a feature, they do it. See the checkerboard rendering support in the PS4 Pro GPU. Both of the refreshed consoles also sit somewhere between Polaris and Vega.

 

M$ went Intel/Nvidia with their first Xbox, then went with AMD for the 360 and stuck with them.

There was no shrink/second revision of the original Xbox because Intel and Nvidia didn't want to do it. And now both console makers use AMD hardware because there is simply no alternative...

 

And have you seen any Apple Mac product with an Nvidia GPU recently? There is a good reason for that, and it's drivers!

With AMD products, Apple can (and does) write the drivers itself; Nvidia won't let them. Nvidia also undermines the Apple platform with its OpenCL implementation - and OpenCL is something Apple itself introduced.

 

So there are more than enough reasons to avoid Nvidia at all costs and not use them at all...

 

An external graphics card is also something that would increase the cost, and it's not something most people care about. Just take a look at M$'s Kinect, Sony's Move and other add-ons: they don't sell.

"Hell is full of good meanings, but Heaven is full of good works"


18 hours ago, Ryan_Vickers said:

I never heard of the PS2 doing that, interesting. 

You needed some extra hardware for that.

Look for the LAN/modem connector, the one combined with the hard-drive interface.

And of course it only works with the old fat model that had a 3.5" hard-drive slot - but you need the network adaptor for it.

Quote

As for the PS3, they locked it down shortly after launch (in fact there's a thread about the lawsuit over that here somewhere...)

It wasn't that shortly after.

I think it was something like two years or so later...

It was around the same time the Slim PS3 came out...

 

The question is how many people actually used this feature.

I hear people complain about stuff all the time, but mostly they wouldn't use it anyway.

And with the PS4 you have a browser and YouTube anyway; what more do you want?

"Hell is full of good meanings, but Heaven is full of good works"


1 hour ago, M.Yurizaki said:

Consoles are not about the hardware. That's just a marketing point. It's about the user experience. If the system isn't Apple levels of user experience (Okay, maybe not quite that level), it can't be called a console. It'll just be another desktop PC.

I got myself a Mac recently and it's not that great, to be honest.

If you use just one screen it's totally fine and works well, but with two or more screens the problems start.

 

Anyway, with "user experience" you mean the Games you can buy for the Console.

That's why the WiiU failed -> didn't have any noteworthy software. Well, except for like a Hand full titles (Mario Wörld 3D, NSMB, Xenoblade Chronicles for some people, Tokyo Mirage FE, Hyrule Warriors).

And of course the Console was way too expensive for what you got. 370€ - when you could get a PS4 for 250€, the choice is pretty easy.... 

 

17 hours ago, Noctua_Boy said:

Intel QUADCORE I3 (Great IPC) 

NVIDIA "AMPERE" MIDRANGE GPU 

16GB GDDR5

1TB Hard Drive

You really have no idea what you are talking about, do you?
A 1TB HDD is already standard, and the One X already has 12 GiB of memory.

 

And good luck getting Intel and Nvidia to work together to make a single-chip processor...

BTW: you can get an Intel chip with AMD graphics on-package right now...

"Hell is full of good meanings, but Heaven is full of good works"


Just now, Stefan Payne said:

I got myself a Mac recently and it's not that great, to be honest.

If you use just one screen it's totally fine and works well, but with two or more screens the problems start.

I meant their consumer electronics outside of their desktop and laptop computers.

Just now, Stefan Payne said:

Anyway, with "user experience" you mean the Games you can buy for the Console.

I mean the user experience as a whole. The user interface is designed with sitting on a couch in mind (the so-called "10 foot" interface). Launching an app is simple. Installing and removing apps are equally simple. I don't have to worry about system or game updates interrupting me (a la Windows Update) or preventing me from using the app just because an update is available (a la Steam). I don't have to worry too much about updates breaking things or finding the correct one. And the experience is relatively consistent.


1. I'd offer a regular-tier console priced at $399. It runs games at 30 fps at a higher resolution like 4K. This will be great for the vast majority of gamers. Then I'd offer an enthusiast-level console at $599. It offers options for either 60 fps at lower fidelity or higher resolution at 30 fps; I'd let developers choose.

2. Full backwards compatibility.

3. Have a solid UI and every possible app.

4. Invest in exclusive games that can only be played on said console.

5. Market it like crazy.

 

 

| Ryzen R9 3900x Enermax LIQTECH II 360 | Asus ROG Crosshair VI Hero | Nvidia FE RTX 2080 Ti | Corsair Vengeance 16GB @3200MHz | Crucial 500 GB SSD | 1TB WD Blue SSD | Corsair HX 750w Platinum+ | Corsair Carbide Spec-Omega | Gigabyte Aorus 27QD 1440p 144Hz


That's another thing I'd do: prioritize framerate over resolution.  How many people can even tell the difference between 1080p and 4K at normal viewing distance?  Or even 1080p and 720p for that matter... it wouldn't be many.  But (and I know somehow this isn't true, but one can dream) in theory everyone should be able to tell the difference between 30 and 60 fps.  Sure, there's still the issue of TVs being laggy, but it's a start.
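To put rough numbers on it (simple arithmetic, assuming a steady framerate and ignoring TV processing lag): at 30 fps each frame sits on screen for 1000/30 ≈ 33.3 ms, at 60 fps it's 1000/60 ≈ 16.7 ms, so motion smoothness and input latency roughly halve. Resolution costs far more than it shows: 3840×2160 is 8,294,400 pixels versus 2,073,600 at 1920×1080, so native 4K is about 4x the shading work of 1080p for a difference many people can't see from the couch.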

 

They should also support more control types.  At least the native controller + keyboard and mouse, but others should be possible.

 

And mods would theoretically be possible by going in through the included OS and modifying the installed game files.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


On 20/03/2018 at 12:10 AM, KarathKasun said:

AMD GPU, you can leverage AMDs need for market share to get lower prices.

 

This is what Nintendo did with NV for the Switch.  NV wanted the mobile market share so they probably cut them a sweet deal.

AMD GPUs have been trash for a while; maybe that changes with the next architecture after GCN... they are planning a brand-new design.


On 20/03/2018 at 6:49 PM, Stefan Payne said:

 

 

You really have no idea what you are talking about, do you?
The 1TB HDD is already standard and the one X already has 12GiB Memory.

 

And good luck to get Intel and nV to work together to make a single Chip processor...

BTW: You can get an Intel Chip with AMD Graphics on package right now....

So the 1TB is already standard, so what? It can't continue? Do you see 2TB magically dropping in price? I said 16GB, not 12GB. Who said it was a single-chip processor? Have you seen the Intel collaboration with AMD? It's not an APU, it's three different chips - a CPU, a GPU and a stack of HBM - on a multi-chip module, quite compact.

No one necessarily has to collaborate; they just have to sell their chips at the right price, and the board manufacturer sorts out the rest.

It's a lot easier to ask AMD to do this, but AMD's performance is quite bad - Zen is alright, but the GPUs are quite bad.

If Nvidia had no interest in consoles, why would they collaborate with Nintendo?

If you're going to claim that I have no idea what I'm talking about, at least correct me with coherent answers.

 


4 hours ago, Ryan_Vickers said:

That's another thing I'd do: prioritize framerate over resolution.  How many people can even tell the difference between 1080p and 4K at normal viewing distance?  Or even 1080p and 720p for that matter... it wouldn't be many.  But (and I know somehow this isn't true, but one can dream) in theory everyone should be able to tell the difference between 30 and 60 fps.  Sure, there's still the issue of TVs being laggy, but it's a start.

 

They should also support more control types.  At least the native controller + keyboard and mouse, but others should be possible.

 

And mods would theoretically be possible by going in through the included OS and modifying the installed game files

I second that. Maybe it's my old age, but I can't tell the difference between 4K and 1080p even on my 43-inch from the couch; it's only when I go closer that things look a bit cleaner.

HDR is a different story: if properly implemented it does make a difference (GT Sport).

I would love to see more focus on high frame rates, physics and AI.


Target $500 price point.

Use new Intel/AMD single package CPU/GPU

Use a Steam Big Picture-like interface with controller remapping built into the OS (use any controller)

128GB SSD cache

2TB HDD

 

Allow dual boot with Windows/Linux

 


-Thread cleaned-

 

Please keep conversation on topic and civil.

Quote
  • Ensure a friendly atmosphere to our visitors and forum members.
  • Encourage the freedom of expression and exchange of information in a mature and responsible manner.
  • "Don't be a dick" - Wil Wheaton.
  • "Be excellent to each other" - Bill and Ted.
  • Remember your audience; both present and future.

 

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


17 hours ago, KarathKasun said:

Target $500 price point.

Use new Intel/AMD single package CPU/GPU

Use Steam Big Picture like interface with controller remapping built in to the OS (use any controller)

128GB SSD cache

2TB HDD

 

Allow dual boot with windows/linux

Too expensive.

The price point of the normal console is more in the sub-300€ range, with 500€ for the higher-end model, like it is now.

And the Intel/AMD chip is something you can expect to cost 500€ alone, if not more. It also doesn't have a unified memory model, which makes it harder for everyone to use. Right now consoles have one memory pool, and that is quite an advantage because you don't need to copy anything between CPU and GPU at all.
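To make the copy overhead concrete, here is a minimal sketch in CUDA (an assumption purely for illustration; console SDKs expose memory differently) of what a split memory model forces on you versus a single allocation both processors can touch:

// copy_vs_unified.cu - illustrative only, not how a console SDK does it
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void scale(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;                                    // trivial GPU work
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Split pools: a CPU buffer and a GPU buffer, with explicit copies both ways.
    std::vector<float> host(n, 1.0f);
    float *dev = nullptr;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host.data(), bytes, cudaMemcpyHostToDevice);   // extra bus traffic
    scale<<<(n + 255) / 256, 256>>>(dev, n);
    cudaMemcpy(host.data(), dev, bytes, cudaMemcpyDeviceToHost);   // and back again
    cudaFree(dev);

    // Unified pool: one allocation visible to CPU and GPU, no explicit copies.
    float *shared = nullptr;
    cudaMallocManaged(&shared, bytes);
    for (int i = 0; i < n; ++i) shared[i] = 1.0f;                  // CPU writes it directly
    scale<<<(n + 255) / 256, 256>>>(shared, n);                    // GPU works on the same memory
    cudaDeviceSynchronize();
    printf("%f\n", shared[0]);
    cudaFree(shared);
    return 0;
}

On a console APU with one physical pool of GDDR5, the first pattern simply never has to happen, which is the point.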

 

Dual boot with "Other OS" was done in the past, but people don't seem to care much about that.

 

22 hours ago, Ryan_Vickers said:

That's another thing I'd do: prioritize framerate over resolution. 

Most PS4 Pro enhanced games, like Horizon Zero Dawn, Knack, Monster Hunter: World and many more, offer you the choice between higher resolution and higher performance.

Some also offer better texture detail as a third option.

22 hours ago, Ryan_Vickers said:

 

They should also support more control types.  At least the native controller + keyboard and mouse, but others should be possible.

PS4 already supports that. The games don't in most cases.

 

18 hours ago, Noctua_Boy said:

AMD GPUs have been trash for a while; maybe that changes with the next architecture after GCN... they are planning a brand-new design.

Yes, because people prefer to get ripped off by Nvidia instead and don't buy AMD at all, as was the case with the GTX 970.

And they aren't as bad as you make them out to be; miners prefer AMD, and the cards are pretty potent.

The AMD GPUs are more advanced than the Nvidia ones; just take a look at the compute capabilities and asynchronous compute.

But with people not buying their stuff, and even paying more for the same or a worse chip from the competition, it doesn't look good in that segment, and the market is broken (again).

 

 

17 hours ago, Noctua_Boy said:

So the 1TB is already standard, so what? It can't continue? Do you see 2TB magically dropping in price?

That's why there are two versions of the normal PlayStation 4:

one 500GB model and one 1TB version.

The PS4 Pro only comes in 1TB.

What sense would it make not to increase the drive space?!

None, especially since there already are 2TB drives at 7mm height.

17 hours ago, Noctua_Boy said:

I said 16gb , not 12gb.

Yes, and?

It doesn't really make much more sense anyway.

Especially since you're stuck with either a 256-bit or a 512-bit memory interface.

Or they might even use HBM.

17 hours ago, Noctua_Boy said:

Who said it was a single chip processor?

Common sense, and every developer I've seen talking about it.

You save cost, and it makes development much easier.

In particular, the transfer from CPU to GPU is rather costly in terms of performance, and of power consumption as well.

And the APUs work very well; there is no reason to change that yet again. Sony noticed that it makes sense to have an easy-to-use platform, and that's what they went for after the hard-to-use PS3.

They got some flak for it, and you can bet your behind that ALL developers complained about the PS3 architecture.

 

 

17 hours ago, Noctua_Boy said:

Have you seen the Intel collaboration with AMD? It's not an APU, it's three different chips - a CPU, a GPU and a stack of HBM - on a multi-chip module, quite compact.

Yes, I know, and it is an Intel chip.

Have you seen the Radeon Control Center for that chip??
It was modified specifically for it!

 

17 hours ago, Noctua_Boy said:

No one necessarily has to collaborate; they just have to sell their chips at the right price, and the board manufacturer sorts out the rest.

Yes, they do have to.

When Sony tells them to, they either have to do it or die.

And why do you think we might ever see discrete graphics in a console again?

Especially when transferring data is exactly what is costly and drives up the power consumption. You don't want that.

With an integrated chip, you can get more performance within the same TDP.

 

17 hours ago, Noctua_Boy said:

Its a lot easier to ask AMD to do this , but AMD performance is quite bad , Zen is alright , but the GPUs are quite bad.

No, they are not.

You're also missing that companies should optimize more for the hardware they have at hand, and thus get more performance out of it.

With a good and competent developer, your statement is false.

Just look at The New Colossus, for example:

http://gamegpu.com/action-/-fps-/-tps/wolfenstein-ii-the-new-colossus-test-gpu-cpu

RX Vega beats the shit out of the 1080, even at 1080p. Only the 1080 Ti is a bit faster.

And the RX 480/580 isn't far from the 1070 either.

So the hardware is not as bad as you make it out to be...

 

17 hours ago, Noctua_Boy said:

If Nvidia had no interest in consoles, why would they collaborate with Nintendo?

1. They don't really collaborate.

2. It's an off-the-shelf Nvidia chip.

3. Nvidia has _NOTHING_ that meets the needs of a stationary higher-end console.

From a hardware standpoint, the Switch is still rather crappy and way behind - a bit faster than the Wii U, but not by that much.

And even the original Xbox One is way faster than the Switch...

 

23 hours ago, SSJGodemis said:

Full backwards compatibility

Yeah, that is kinda annoying with the PS4, but many people don't seem to care about it much.

And it wouldn't have been easy to offer that on the PS4 anyway without increasing the cost dramatically...

They basically would have had to integrate at least the PS3 processor - which would only be used for backwards compatibility...

But with the switch to x86 and AMD it's not something you should worry about much going forward. That should stay backwards compatible, because the architecture of the base system will remain similar and compatible.

With PS3 to PS4 that was not the case. It wasn't just the switch from PowerPC to x86, but also the switch from a rather strange main processor to a normal one...

"Hell is full of good meanings, but Heaven is full of good works"


8 hours ago, Stefan Payne said:

 

Quote

 

Yes, because people prefer to get ripped off by Nvidia instead and don't buy AMD at all, as was the case with the GTX 970.

And they aren't as bad as you make them out to be; miners prefer AMD, and the cards are pretty potent.

The AMD GPUs are more advanced than the Nvidia ones; just take a look at the compute capabilities and asynchronous compute.

But with people not buying their stuff, and even paying more for the same or a worse chip from the competition, it doesn't look good in that segment, and the market is broken (again).

 

The Pascal architecture is better at gaming than Vega. AMD tried to compete with the 480, which was a great card by the way, but Nvidia still had better value in the lower and mid-range segment with the 1050 Ti and 1060. After countless hours I still can't find an equivalent to my GTX 1060 6GB. Nvidia cards have lower TDPs, which also helps.

The miners are buying their stuff; the gamers prefer Nvidia because it's a better deal. If they want the gamers, then work on the architecture until it's competitive. They just did that with Zen.

 

Quote

 

That's why there are 2 Versions of the normal Playstation 4.

One 500GB Model, one 1TB Version.

The PS4 PRO Only comes in 1TB.

 

Yeah, but that wasn't available at launch.

Quote

 

Especially since you're stuck with either 256bit or 512bit Memory Interface.

Or they might even use HBM.

Common sense and every Developer I've seen talking about that.

 

Have there been any complaints about bandwidth starved games?

Quote

And the APUs work very well; there is no reason to change that yet again. Sony noticed that it makes sense to have an easy-to-use platform, and that's what they went for after the hard-to-use PS3.

They got some flak for it, and you can bet your behind that ALL developers complained about the PS3 architecture.

 

The point of this thread was to see how you would approach doing your own console (if you were in a position to do so); I wasn't asking what Sony would do.

Yes, the APU approach is very cost-effective; unfortunately I think that, as it stands, Nvidia has the best GPU tech by a mile. Consoles came out quite crippled this time around. I remember when the Xbox 360 came out, the GPU was state of the art: it was the first to feature a unified shader architecture, which ATI later brought to the PC.

Like I said, if AMD gave me a glimpse of what 2020 looks like with the new architecture, then the APU would make sense, since Zen is pretty good as it is.

 

 

Quote

 

And why do you think we might ever see discrete graphics in a console again?

Especially when transferring data is exactly what is costly and drives up the power consumption. You don't want that.

 

People don't care about that; if it looks good on screen, that's all they care about.

Quote

 

RX Vega beats the shit out of the 1080, even in 1080p. Only the 1080ti is a bit faster.

And the RX480/580 isn't far from the 1070 either.

 

Not in real gaming, I'm afraid. FPS tells the story.

 


 

Quote

 

1. They don't really collaborate.

2. It's an off-the-shelf Nvidia chip.

3. Nvidia has _NOTHING_ that meets the needs of a stationary higher-end console.

From a hardware standpoint, the Switch is still rather crappy and way behind - a bit faster than the Wii U, but not by that much.

And even the original Xbox One is way faster than the Switch...

 

 

 

1 - What's your definition of collaboration?

2 - Yep, it's an off-the-shelf part... so...?

3 - Since the Xbox One X has been compared to a GTX 1060... I would say a GTX 1060, which would be pushed even further by developers in a console scenario. Not sure about the regular consoles... maybe a 1050? The Jaguar CPU is quite appalling; even a current Pentium would run circles around it.

You mean the original Xbox back in 2002? That is not faster than the Switch.

 

 

 

Let's face it, Nvidia has the upper hand in GPU tech, and Intel has the upper hand in IPC too. What AMD has is a lot of flexibility, since they are the underdog in both fields. Competition is great for the consumer.

I would take a different direction with my console and let the public and devs decide between day-zero crippled architectures or a more powerful one with more room for growth.

 

 


1 hour ago, Noctua_Boy said:

Lets face it , Nvidia has the upper hand in GPU tech ,

No, they do not. I've provided a link to a Vulkan-based game that is very well optimized for everyone.

That shows how it looks when the developers do their job (well).

So what you claim here is false.

 

1 hour ago, Noctua_Boy said:

Intel have the upper hand in IPC too.

AMD Ryzen is more efficient.

Just look at the THG benchmarks of Ryzen.

And what is IPC? What does it mean?

1 hour ago, Noctua_Boy said:

I would take a different direction with my Console and let the public and devs decide between day zero crippled architectures or a more powerful one with more room for growth.

They will show you the middle finger and tell you that they don't want anything to do with it, because it costs money and is worse to work with (remember the PS3?? That was one thing that led developers to not do stuff on it, or not do it well).

Sorry, but you don't seem to know much about how this stuff works and are just throwing around things that do not work that way in real life.

A developer will want _ONE_ memory pool for both the graphics and the main processor, not separate ones.

That saves a lot of headaches and a lot of performance.

Just look at the PS3: there is a flaw in its transfer path that limits it to the double-digit megabytes-per-second range. And no, that is not a joke.

 

If Nvidia is so superior, as you say, why do miners mostly prefer AMD hardware??

Because you are wrong: the AMD hardware is superior and has many more features and things you can do with it.

Running two things on it at the same time? No problem! You can do compute and rendering work at the same time!

With Nvidia? Um, not really.

 

That's also a statement from a developer: the Nvidia cards only offer the bare minimum for DX11/DX12, while AMD implemented some innovative new things - asynchronous compute, just to name one.
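As a loose illustration only (CUDA streams on a PC card, not the DX12/Vulkan graphics-plus-compute queues a console title would actually use, and the kernel is made up for the example), this is the shape of submitting two independent chunks of GPU work that the hardware is free to overlap:

// streams_overlap.cu - analogy for independent queues of GPU work, not async compute itself
#include <cuda_runtime.h>
#include <cstdio>

__global__ void busy(float *buf, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float v = buf[i];
        for (int k = 0; k < 1000; ++k) v = v * 1.0001f + 0.0001f;  // keep the ALUs busy
        buf[i] = v;
    }
}

int main() {
    const int n = 1 << 20;
    float *a = nullptr, *b = nullptr;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));

    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);

    // Two independent submissions; the scheduler may run them concurrently
    // whenever execution units would otherwise sit idle - the same idea as
    // filling rendering bubbles with compute work.
    busy<<<(n + 255) / 256, 256, 0, s1>>>(a, n);
    busy<<<(n + 255) / 256, 256, 0, s2>>>(b, n);

    cudaStreamSynchronize(s1);
    cudaStreamSynchronize(s2);
    printf("done\n");

    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(a);
    cudaFree(b);
    return 0;
}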

 

And also look at the compute units. AMD hasn't changed that much since Tahiti; it's more evolution than revolution. If it wasn't superior, why on earth is Nvidia moving to a design that resembles GCN more and more?

"Hell is full of good meanings, but Heaven is full of good works"


13 hours ago, Stefan Payne said:

Too expensive.

The price point of the normal console is more in the sub-300€ range, with 500€ for the higher-end model, like it is now.

And the Intel/AMD chip is something you can expect to cost 500€ alone, if not more. It also doesn't have a unified memory model, which makes it harder for everyone to use. Right now consoles have one memory pool, and that is quite an advantage because you don't need to copy anything between CPU and GPU at all.

 

Dual boot with "Other OS" was done in the past, but people don't seem to care much about that.

Not sure what you are going on about with "too expensive". XB1 and PS4 were $400-$500 at launch and had zero problems selling. Consoles are only $200-$300 now because they are at the end of their life cycle. PS3 started at, what, $650?

 

The chip does not cost that much; that's just what you pay at retail. Intel and AMD would love to get a steady revenue stream coupled with advertising. It's not beyond Intel to pay people to use their parts either: Atom chips were literally given away when they were heavily pushing tablet and phone parts. Though in reality, it would likely be a custom Vega/Ryzen part.

 

Dual boot on the PS3 was popular enough that there was a class action lawsuit concerning the removal of that feature (and it was won AFAIK).  It was only removed because someone had figured out how to exploit it for piracy.


4 hours ago, Noctua_Boy said:

The Pascal architecture is better at gaming than Vega.

No, see the link I've posted.

That's how it looks when the developers do their job and use a lower-level API (Vulkan in this case).

And you also see how well the RX 480 performs, and that the 1060 doesn't stand a chance against it.

 

4 hours ago, Noctua_Boy said:

Have there been any complaints about bandwidth starved games?

HBM has other advantages besides bandwidth. And even that isn't really an argument, because you could go for a 512-bit memory interface.

There are other advantages that make HBM worth it. It's an on-package design, so it saves space on the PCB and allows for smaller consoles.

But that's not the best part. The best part is that it saves die space, because the PHYs are way smaller than GDDR5(X) PHYs: they need less drive strength, which also saves power.

Haven't you seen the AMD document about HBM? The one where they talk about how much power HBM saves.

And with more bandwidth you can use other effects, or just spend it on FSAA, because there are some things that more or less only cost bandwidth and not much else.

 

4 hours ago, Noctua_Boy said:

Yes, the APU approach is very cost-effective; unfortunately I think that, as it stands, Nvidia has the best GPU tech by a mile. Consoles came out quite crippled this time around. I remember when the Xbox 360 came out, the GPU was state of the art: it was the first to feature a unified shader architecture, which ATI later brought to the PC.

Like I said, if AMD gave me a glimpse of what 2020 looks like with the new architecture, then the APU would make sense, since Zen is pretty good as it is.

That isn't the only advantage, as I stated.

It makes developing for the console way easier, because you only have one memory pool. Moving data between "system memory" and "graphics memory" becomes a non-issue, and you can also choose whether you want to spend the memory mostly on textures or on other things.

With the non-unified memory pools that you get with discrete graphics chips, that is not the case. You have to deal with two different memory pools, or do some really ugly things that cost performance and increase power consumption - like using the graphics card as the northbridge and connecting the CPU only to the GPU.

The Xbox 360 did it this way, for example.

https://www.beyond3d.com/content/articles/4/3

That might be a viable option if we are talking about something like 25 GB/s, but not at around 200 GB/s, because of power limitations.

And as stated earlier, AMD has the more advanced graphics chip.

The AMD one you can use, for example, to do some AI calculations instead of using the CPU for that. With Nvidia that's not a great idea, because Nvidia chips aren't good at doing compute and graphics work at the same time.

 

4 hours ago, Noctua_Boy said:

People don't care about that; if it looks good on screen, that's all they care about.

EXACTLY!

BUT they do care about the price they have to pay!

And about the size of the console.

With an APU you can save a fair amount, maybe even as much as 50-100€, for the same performance with lower power consumption!

Because the I/O transfers are one of the things that cost power these days...

 

4 hours ago, Noctua_Boy said:

Not in real gaming, I'm afraid. FPS tells the story.

Have you taken a look at what I've linked?
Is Wolfenstein not "real gaming"?? And what FPS are you talking about?

What about Doom??

THAT is how it looks when the developers invest time and money in optimizing equally for both!

And that is the problem. You assume Nvidia is superior because of lazy and incompetent developers who do not optimize for both!

 

4 hours ago, Noctua_Boy said:

1 - What's your definition of collaboration?

2 - Yep, it's an off-the-shelf part... so...?

3 - Since the Xbox One X has been compared to a GTX 1060... I would say a GTX 1060, which would be pushed even further by developers in a console scenario. Not sure about the regular consoles... maybe a 1050? The Jaguar CPU is quite appalling; even a current Pentium would run circles around it.

You mean the original Xbox back in 2002? That is not faster than the Switch.

 

1) Two or more companies doing something together. That means both work on a chip that is better than the off-the-shelf parts.

Like the Xbox One and PlayStation 4 APUs.

AMD built both, and they are somewhat different from each other - with the PS4 Pro vs. Scorpio even more so. That's why those two went with AMD: they said what they wanted, AMD said what it would cost, and they ordered it. Nobody else does that right now!

2) ...you can buy the same parts from other companies. And in China you can use the Shield device to play Switch games...

3) That's just wishful thinking on your part.

The facts are: the One X GPU is a fully custom part from AMD that cannot be compared directly to other chips, because it sits halfway between Polaris and Vega.

Most importantly, it uses a 384-bit memory interface, so it has way more bandwidth than a GTX 1060 - almost double. And double the shader cores and texture units. So no, that's just bullshit.
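For rough context, going by the published memory clocks (so treat the exact figures as approximate): bandwidth = bus width in bytes × effective data rate. One X: 384 bit / 8 × 6.8 Gbps ≈ 326 GB/s. GTX 1060 6GB: 192 bit / 8 × 8 Gbps = 192 GB/s. That is where the "almost double" comes from.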

How well the One X GPU really does is not something anyone outside can say. A developer who has worked with it might eventually be able to, but we are not, because it is neither Polaris nor Vega and its bandwidth is so much greater than the RX 480's.

The regular PS4 has a GPU roughly equal to the Radeon HD 7870; the One is well below that, under the 7850 and somewhere between the 7770 and the 7850. But then again, that was five years ago!

 

4 hours ago, Noctua_Boy said:

What AMD has is a lot of flexibility, since they are the underdog in both fields.

No, it's because they have the semi-custom business that builds whatever you want, if they are able to do it and you have the money to pay for it...

Neither Intel nor Nvidia does custom chips for anyone!

"Hell is full of good meanings, but Heaven is full of good works"


2 hours ago, KarathKasun said:

Not sure what you are going on about with "too expensive"  XB1 and PS4 were $400-$500 at launch and had zero problems selling. 

Yes, but we are talking about at least 500€ just for the Intel/AMD low-power chip alone.

What you are proposing is more or less this thing:

https://geizhals.de/intel-nuc-kit-nuc8i7hvk-hades-canyon-boxnuc8i7hvk-a1752708.html

And it starts at 850€...

The other one starts at 700€:

https://geizhals.de/intel-nuc-kit-nuc8i7hnk-hades-canyon-boxnuc8i7hnk-a1752709.html

See the problem??

2 hours ago, KarathKasun said:

PS3 started at, what, $650?

Yes, and that's why it failed (more or less)...

and why it had such a long lifetime.

And the PS2 still sold better, because the price of the PS3 was that enormous...

2 hours ago, KarathKasun said:

The chip does not cost that much, thats just what you pay at retail. 

You don't know Intel...

250-500€ is what you pay for a normal lower-power version of the chip. So it's safe to assume that the G-series with Vega graphics will be really, really expensive - because of the niche it aims at...

 

"Hell is full of good meanings, but Heaven is full of good works"


15 minutes ago, Stefan Payne said:

Yes, but we are talking about at least 500€ just for the Intel/AMD low Power Chip alone.

What you are proposing is more or less this thing:

https://geizhals.de/intel-nuc-kit-nuc8i7hvk-hades-canyon-boxnuc8i7hvk-a1752708.html

 

And it starts at 850€...

The other one starts at 700€:

https://geizhals.de/intel-nuc-kit-nuc8i7hnk-hades-canyon-boxnuc8i7hnk-a1752709.html

 

See the problem??

yes and that's why they failed (more or less)...

And had such long lifetime.

And the PS2 still sold better because the price of the PS3 was that enormous...

You don't know Intel...

250-500€ is something you pay for a normal lower power version of the chip. So its safe to assume that the G-Series with Vega graphics will be really really expensive - because of the niche it aims at...

 

The chips do not cost that much; that's at Intel's 100% desktop profit margin.

PS2 launched at the equivalent of $450 in 2018 dollars, and had a longer lifespan than PS3.

 

You don't know Intel. Those laptop CPUs that you see at $500 MSRP sell to OEMs for under $300, sometimes below $200. This is due to the volume of the orders and the fact that the OEM is contractually obligated to advertise Intel for free. Do some research on OEM-to-supplier pricing structures.

 

The BOM for that NUC is like $350, tops.  You could make a console for $500 with the same parts + memory + HDD + controller and still make money.  Consoles target lower profit per unit and make up for that with sheer volume.  At the end of the day Intel will sell more consoles than NUCs by a factor of 1000.

 

Also, the all in one chips from Intel are effectively i3 level at this point.


19 hours ago, Stefan Payne said:
Quote

 

No, see Link I've posted.

That's how it looks when the developers do their job and use a lower level API (Vulkan in this case)


And you also see how well the RX480 performs and that the 1060 doesn't have a chance against it. 

 

Unfortunately a couple of games isn't enough for me to fork money over to AMD when the majority perform better on Nvidia.

Quote

 

HBM has other advantages besides bandwidth. And even that isn't really an argument, because you could go for a 512-bit memory interface.

There are other advantages that make HBM worth it. It's an on-package design, so it saves space on the PCB and allows for smaller consoles.

But that's not the best part. The best part is that it saves die space, because the PHYs are way smaller than GDDR5(X) PHYs: they need less drive strength, which also saves power.

Haven't you seen the AMD document about HBM? The one where they talk about how much power HBM saves.

 

 

Yeah, I know all about HBM. I know the cost of the memory itself is not the issue, it's the interposer, which HBM3 promises to remedy in order to compete with GDDR5X and GDDR6. Awesome tech, it just needs to drop in price and get better yields.


 

Quote

 

It makes developing for the console way easier because you only have one memory pool. That makes copying from "system Memory" to "Graphic Memory" darn easy. And also you can choose if you want to waste the memory mostly for textures or for other stuff.

 

 

 

I have been sold on the idea of unified memory all the way back to 2005, when I read the Xbox 360 tech docs. Maybe I should have been clearer: the 16GB was not DDR4, it was GDDR5 (or whatever is cost-effective and future-proof).

 

 

Quote

 

And as stated earlier, AMD has the more advanced graphics chip.

The AMD one you can use for example to do some AI calculations and not use the CPU for that. With nVidia that's not a great idea because nVidia Chips aren't good at doing Compute and graphics stuff.

 

Quite vague. Ever heard of CUDA?


 

Quote

And that is the problem. You assume the nVidia is superior because of Lazy and incompetent developers that do not optimize for both!

And that is the problem: you assume companies develop games as a hobby. It's all about money... money.

They look at the market, they see a majority of Nvidia cards, and bang... they are not going to put resources into optimizing a game for AMD if that is not going to make them a decent return. Obviously you have rare breeds like the Doom team, who are known to turn water into wine (yes, they made Doom run on the SNES).

Quote

 

1) 2 or more companys doing stuff together. That means that both work on a chip that is better than the off the shelves stuff.

Like the XBox One and Playstation 4 APUs.

AMD built both, they both are kinda different. With the PS4 Pro vs Scorpio even more. And that's why those two went with AMD. They said what they wanted, AMD said what that cost and they ordered it. Nobody else does that stuff right now!

 

Why optimize something that already meets their needs? It does cost money after all (engineering hours).

The APUs were a different story: there was no APU on the market that met Sony's and Microsoft's requirements, so there was a need for one. Do you remember how weak the APUs of that era were? You could barely play 720p games.

The Tegra was already one of the best chips for mobile gaming; I just wish they had waited a bit longer for Parker instead :(

 

 

Quote

 

3) That's just wishful thinking on your part.

The facts are: The one X GPU is a full custom part from AMD that can not be compared to other chips! Because its half way between Polaris and Vega.

And most importantly uses a 384bit Memory interface, thus has way way more bandwith than a GTX1060 - almost double the Bandwith. And double the Shader cores and Texture Units. So no, that's just bullshit. 

How well the one X GPU really does is not something anyone can say. A developer who worked with the GPU might eventually be able to but we are not. Because its neither Polaris nor Vega and the Bandwith is so much greater than the RX480. 

The regular PS4 has a GPU roughly equal to the Radeon HD 7870, the One is way worse than that and under the 7850 and somewhere between 7770 and 7850. But then again, that was 5 Years ago!

 

No, because they have the semi custom part that does everything you want, if they are able to do it and you have the money to pay for it...

 

Neither Intel nor nVidia does Custom Chips for anyone!

 

There are videos out there of both versions of the game, PC and Xbox One X, running at 4K with similar effects/quality levels, and they compare the FPS (never mind that the Xbox One X has dynamic resolution scaling and doesn't always run at true 4K). The consoles' weak CPUs don't help either.

The GTX 1060 3GB and GTX 1060 6GB don't use the same GPU configuration; the latter has considerably more cores.

The performance can be compared by what you see on the screen. I don't care about any corporate sales pitch, buddy; I care about results.

There are videos out there of people with budget builds, Pentiums and 750 Tis, running 1080p games at medium to high settings at high frame rates, something that even the PS4 struggles with.

You can give me all the specs you want; it's what's on the screen that counts, I'm afraid.

 

 


 

 


22 hours ago, Stefan Payne said:
Quote

 

No, they do not. I've provided a link to a Vulkan-based game that is very well optimized for everyone.

That shows how it looks when the developers do their job (well).

So what you claim here is false.

 

Now you just need to convince all the shareholders. I see the majority of games running better on Nvidia.

Quote

 

AMD Ryzen is more efficient.

Just look at THG Benches of Ryzen.

 

I never said it wasn't. Look at the tests: Intel has the upper hand in framerates.

Quote

And what is IPC? What does it mean?

Instructions per cycle... just in case... pretty sure you're just being sarcastic.

 

Quote

They will show you the middle finger and tell you that they don't want anything to do with it because it costs money, is worse to work with (remember the PS3?? That was one thing that led developers to not do stuff on it or not well).

The PS3 used a PowerPC; I'm talking about x86 architecture here. Not sure where you are going with this? Do you want me to create another topic on why the PS3 was harder to program for than the Xbox 360?

 

Quote

Sorry, but you don't seem to know much about how stuff works and are just throwing around stuff that just does not work that way in real life. 

You are entitled to your opinion; so far I have proved you wrong with very practical explanations and scenarios. I don't claim to be an expert in electronic engineering either ;)

Quote

 

A developer will want _ONE_ Memory Pool for both graphics and main processor and not separate ones.

That saves much headache and performance. 

 

I agree, I never said otherwise; not sure why you are bringing this up either.

 

Quote

Just look at the PS3, there is a fuckup in that transfer system that is in two figures megabyte/second range. And no, that is no joke.

Did I ever mention trying to replicate its architecture?

Quote

 

If nVidia is so superior as you say, why do the Miners mostly prefer AMD Hardware??

Because you are wrong and the AMD hardware is superior, has much more features and things you can do with it.

Running two things on it at the same time? No problem! You can do compute and rendering stuff at the same time!

With nVidia? Ähm, not really.

 

I don't mine, buddy; I game.

 

Quote

That's also a statement from a developer: the Nvidia cards only offer the bare minimum for DX11/DX12, while AMD implemented some innovative new things - asynchronous compute, just to name one.

That's fantastic. I'm also an avid reader and observer; I can't see any of that translating onto my monitor. I will open my wallet to the folks over at AMD when that happens. I'm not in the business of giving charity to multi-billion-dollar companies either.

 

 


@Noctua_Boy

Please fix the quotes.

It's not possible to answer it like this.


And no, you are wrong.

19 minutes ago, Noctua_Boy said:

Quite vague. Ever heard of CUDA?

Ever heard of DirectCompute from Microsoft?
Ever heard of OpenCL?!

 

Stop trying to talk your way out of it and understand what people are saying.

Yes, I know of that shit, that was not what I was saying.

 

 

And CUDA is just proprietary Garbage that is there to keep the competition out of the Market - and one of the Reasons why nVidia was dropped by Apple.

And the Chinese.

"Hell is full of good meanings, but Heaven is full of good works"


21 hours ago, Stefan Payne said:

@Noctua_Boy

Quote

 

pls fix the quotes.

Its not possible to answer that.

 

Not sure why you have been answering then? I'm confused.

Quote

And no, you are wrong.

So are all of the other gamers, journalists and engineers who find AMD GPUs power-inefficient compared to their Nvidia counterparts. Fewer frames per watt because the whole industry doesn't take the time to babysit their way of doing things? Let's all drop everything and dedicate our existence to the boys at AMD. Seriously, how about they man up and do what the Ryzen team did instead?

 

 

Quote

Ever heard of DirectCompute from Microsoft?
Ever heard of OpenCL?!

yep. 

Quote

 

Stop trying to talk your way out of it and understand what people are saying.

Yes, I know of that shit, that was not what I was saying.

 

What's with the profanity? Do you need that to get your point across? I'm not talking my way out of anything, I'm giving you my opinion; just because you don't agree, there's no need to throw a tantrum.

 

Quote

 

And CUDA is just proprietary Garbage that is there to keep the competition out of the Market - and one of the Reasons why nVidia was dropped by Apple.

And the Chinese.

 

Fair enough; it seems to work pretty well for me and for other corporations. If there are alternative compute solutions, how exactly is Nvidia keeping AMD from competing with its own open compute stack? If AMD is faster like you say, what is keeping a team of scientists from buying a bunch of those cards to do their work instead of Nvidia's?

 

 


On 24/03/2018 at 1:00 AM, Stefan Payne said:

And CUDA is just proprietary Garbage that is there to keep the competition out of the Market - and one of the Reasons why nVidia was dropped by Apple.

And the Chinese.

For the record, I think Nvidia GPUs are awesome, but I hate what they are trying to do with the recent GeForce Partner Program.

I hope they get hit with an anti-competitive lawsuit. We all know AMD is working on a successor to the GCN architecture; if that can do for its GPUs what Zen did for its CPUs, then it could pose a threat to Nvidia.

Furthermore, Intel's plans to do a proper discrete GPU are something they will have to worry about at some point.

 

 

 

 


1 hour ago, Noctua_Boy said:

For the record, I think Nvidia GPUs are awesome

Yes, that's really not hard to miss.

Especially when you're given facts that show "the others" are way better than you make them out to be.

Just look at how well Doom and Wolfenstein II perform!

 

1 hour ago, Noctua_Boy said:

, but i hate what they are trying to do with the recent GeForce Partner Program.

ORLY?!
Haven't you heard the stories about the original Xbox? Why do you think Microsoft dropped Intel _AND_ Nvidia for the 360 and made it incompatible??

And has stuck with ATI/AMD since then...

And what about the CUDA/OpenCL story??

They dropped proper OpenCL support with Kepler - right when "the other ones" got better in that area, they went for the walled-garden approach...

 

1 hour ago, Noctua_Boy said:

I hope they get hit with an anti competitive lawsuit.

OT:
It would be a start if you stopped buying every Nvidia generation...

1 hour ago, Noctua_Boy said:

We all know AMD is working on a successor to the GCN architecture; if that can do for its GPUs what Zen did for its CPUs, then it could pose a threat to Nvidia.

The Radeon HD 7970 was already equal to or better than Kepler, aka GK104.

Same with Hawaii...

And then there was the GTX 970 memory architecture...

And to be honest, a 192-bit memory interface with 3 GiB would have been better than what we actually got.

 

1 hour ago, Noctua_Boy said:

Furthermore, Intel's plans to do a proper discrete GPU are something they will have to worry about at some point.

For what?
The fourth or fifth time?!

The i740 failed, then they did only IGP stuff for a while.

Larrabee didn't work out...

And their drivers are the worst for gaming right now...

"Hell is full of good meanings, but Heaven is full of good works"

