
John Carmack Says to Take Nvidia's K1 Claims with Several Grains of Salt

qwertywarrior

Every phone has better memory than the PS3/Xbox 360; the PS3 has only 256MB of GDDR3, plus 256MB of DDR2 for the CPU.

The PS3/Xbox 360 don't even have CUDA cores/stream processors, and they don't support DX11.

You should remember the PS3 has a GeForce 7800-class GPU, which is nine generations old.

One Kepler SMX will be faster than any GeForce 7000-series card, and the Tegra K1 has one Kepler SMX.

I specifically said bandwidth, not size.

Regardless of how great this SoC is, its main bottleneck is memory bandwidth.

Memory bandwidth on modern phones is roughly 10+ GB/s, versus 20+ GB/s on the last-gen consoles.

And even if it were as fast as the last-gen consoles, these SoCs (when used in tablets/phones) will never operate at full power all the time; they throttle very fast, roughly a 50% downclock.
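To put rough numbers on that (a back-of-the-envelope sketch; the phone bandwidth is a ballpark figure for a 2014-era LPDDR3 device, and the 50% throttle is the assumption from above, not a measurement):

```python
# Ballpark memory bandwidth comparison: 2014-era phone vs. PS3 (assumed figures).
PHONE_BW = 12.8       # GB/s, e.g. dual-channel LPDDR3-1600 (assumption)
PS3_XDR_BW = 25.6     # GB/s, PS3 main memory (XDR)
PS3_GDDR3_BW = 22.4   # GB/s, PS3 video memory (GDDR3)
THROTTLE = 0.5        # ~50% sustained downclock, assumed per the post above

sustained = PHONE_BW * THROTTLE
console_total = PS3_XDR_BW + PS3_GDDR3_BW
print(f"Phone, sustained:  ~{sustained:.1f} GB/s")
print(f"PS3, both pools:   {console_total:.1f} GB/s")
print(f"Console advantage: ~{console_total / sustained:.1f}x")
```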

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


WTF, since when did the Xbox 360 have more power than the PS3? I know the PS3 was harder to code for, but it was more powerful. Who made that chart?

That chart refers to GPU and CPU "horsepower": the PS3 had a weaker GPU than the 360, but a stronger CPU (when it was actually taken advantage of).

 

I can understand the GPU part, but not so much the CPU being so much better in the 360 over the PS3 according to that.


lol

Nvidia: "hey, let's make it so that people look at this chart and see how powerful the K1 is before realizing that the chart also states 360 = 3x PS3"

"Rawr XD"


That chart refers to GPU and CPU "horsepower": the PS3 had a weaker GPU than the 360, but a stronger CPU (when it was actually taken advantage of).

I can understand the GPU part, but not so much the CPU being so much better in the 360 over the PS3 according to that.

 

According to them, the 360 has 25% more GPU power and three times more CPU power than the PS3. Interesting, because Nvidia actually makes the GPU in the PS3, which was pretty much a 7900 GTX.

"Rawr XD"


Every phone has better memory than the PS3/Xbox 360; the PS3 has only 256MB of GDDR3, plus 256MB of DDR2 for the CPU.

The PS3/Xbox 360 don't even have CUDA cores/stream processors, and they don't support DX11.

You should remember the PS3 has a GeForce 7800-class GPU, which is nine generations old.

One Kepler SMX will be faster than any GeForce 7000-series card, and the Tegra K1 has one Kepler SMX.

 

Actually, the PS3 had 256MB of XDR RAM, not DDR2. And the Xbox 360, being designed and built by Microsoft, actually did incorporate several DirectX 11 features such as tessellation support, though it's obviously not as good as decent DX11 GPUs.

 

The PS3 has an RSX GPU, which was a complete turd in comparison to the Xbox 360's Xenos GPU. The Cell CPU, on the other hand, was many times more powerful than the Xbox 360's Xenon CPU, and as such was often used to boost the graphical prowess of the GPU, which is why the console was able to have such graphically impressive games as Killzone 3, The Last of Us, Gran Turismo 6, Uncharted and others. It's also a large part of why a lot of multiplatform games failed to run as well on the PS3 as they did on the Xbox 360, not just the difficulty of development.

 

Until we can see a game as impressive as Gran Turismo 6 running at its native 1440x1080, upscaled to 1920x1080, at 60fps on a Tegra K1 like it does on the PS3, I don't care about Nvidia's 'technical specifications'. Certainly not when they're evidently completely inaccurate and obtained from an overclocked and actively cooled unit.

 

(Oooh, also, impressive of Nvidia to admit that their RSX GPU is less powerful than the ATI Xenos GPU launched almost a year earlier for the Xbox 360. Damn, guys... I mean, it's true, but still. Damn...)

"Be excellent to each other" - Bill and Ted
Community Standards | Guides & Tutorials | Members of Staff


Then why didn't they say so? Horsepower has to do with horses, and there are no horses in a GPU. Console peasants don't need to be convinced, so Nvidia just needs to use proper scientific data. That is probably why they are so vague with the performance statistics.

I don't see any horses in cars either, so... Granted, horsepower in GPUs makes no sense, since a GPU exerts no force at all.

Finally my Santa hat doesn't look out of place


WTF, since when did the Xbox 360 have more power than the PS3? I know the PS3 was harder to code for, but it was more powerful. Who made that chart?

The Xbox 360 actually has a better GPU than the PS3. As for the CPU part, they didn't include the SPEs, which basically makes the PS3 a single-core CPU.

I'm more amazed at the 5W claim than anything else. Is that real?

That's very normal for an ARM SoC. Hell, Intel has demoed an 8W TDP Haswell SoC.

Horsepower has to do with work per unit of time.

And it's a colloquialism. This is also a simplified table.

Horsepower actually comes from a measurement of how much coal a horse could pull out of a coal mine in an hour... It's a really silly way to measure anything, let alone GPU performance, since a GPU can't lift any coal... (This comparison was done to try to sell early steam engines to the mines instead of using horses, BTW.)
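For reference, the actual definition is standard physics (textbook values, not from the chart): power is work per unit of time, and Watt's unit pins one horsepower to a specific lifting rate.

```latex
P = \frac{W}{t} = \frac{F \cdot d}{t},
\qquad
1\,\mathrm{hp} = 33{,}000\,\tfrac{\mathrm{ft \cdot lbf}}{\mathrm{min}}
             = 550\,\tfrac{\mathrm{ft \cdot lbf}}{\mathrm{s}}
             \approx 745.7\,\mathrm{W}
```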

 

 

I don't see any horses in cars either, so... Granted, horsepower in GPUs makes no sense, since a GPU exerts no force at all.

"Her tsundere ratio is 8:2. So don't think you could see her dere side so easily."


Planning to make your debut here on the forums? Read Me First!


unofficial LTT Anime Club Heaven Society


The Xbox's numbers are higher because Blu-ray takes away a lot and slows the PS3 down; it needs the power. Don't quote me.


Bloody hell, they may be crap statistics, but stop whining like little kids about the term horsepower. It's a press-release deal; the graph needs to relay to the plebs in the audience that higher = better. For marketing and general information attainment, horsepower is the best term they could have used. Sure, they could have called it operations/sec or GFLOPS or any one of hundreds of ways to relate processing performance; however, not everyone knows what a FLOP is, so not everyone knows that 5000 is better than 3000. But everyone knows that more horsepower = better.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Am I the only one who thinks that Nvidia should move away from this Tegra business before it really starts dragging the company down?


I would lean toward Mr. Carmack's idea. Nvidia does need to get something going. Although their stock looks OK right now, they are in fact not OK; time is ticking for them. They have maybe a few years, then their stock could pop. You can only buy so many shares of your own stock to pump your earnings per share when your revenue is down year over year: their earnings are down $27M, yet EPS was up 3 cents. It helps when you borrow a billion dollars to purchase shares, starting last November, and they started on another billion in the middle of last month. The reason they must pump up their products is, in the end, Intel. Intel pays Nvidia $233M after taxes ($250M before) per year for the use of Nvidia graphics IP on Intel CPUs. It is thought that when this contract runs out in January 2016, Nvidia will lose a huge portion of their net income: since their net income for 2013 was $451M and the Intel payment was $233M, that is 51% of total net. They have to scramble to get some products other than video cards into their mix. If not, their $18.00 a share will be $6.00 to $7.00 sometime in 2016, because when a stock comes down hard, if all sales stay the same and only the Intel payment disappears, that leaves about 36.8 cents per year EPS; at a 15x multiplier, that is $5.52 a share as a bottom price. You think they are going to scramble to make anything and everything happen while they have the money? Absolutely.
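For what it's worth, the back-of-the-envelope math there is internally consistent; here's a quick sketch checking it (the share count is implied by the poster's own figures, not a reported number):

```python
# Sanity-check the bear-case math from the post above (all inputs from the post).
net_income = 451e6       # FY2013 net income, USD
intel_payment = 233e6    # annual after-tax Intel licensing payment, USD
eps_floor = 0.368        # poster's post-Intel EPS estimate, USD/share
pe_multiple = 15         # poster's assumed earnings multiple

remaining = net_income - intel_payment   # net income without Intel: $218M
shares = remaining / eps_floor           # implied share count: ~592M
floor = eps_floor * pe_multiple          # implied bottom price: $5.52

print(f"Remaining net income: ${remaining / 1e6:.0f}M")
print(f"Implied share count:  {shares / 1e6:.0f}M")
print(f"Implied floor price:  ${floor:.2f}")
```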


I would lean toward Mr. Carmack's idea. Nvidia does need to get something going. Although their stock looks OK right now, they are in fact not OK; time is ticking for them. They have maybe a few years, then their stock could pop. You can only buy so many shares of your own stock to pump your earnings per share when your revenue is down year over year: their earnings are down $27M, yet EPS was up 3 cents. It helps when you borrow a billion dollars to purchase shares, starting last November, and they started on another billion in the middle of last month. The reason they must pump up their products is, in the end, Intel. Intel pays Nvidia $233M after taxes ($250M before) per year for the use of Nvidia graphics IP on Intel CPUs. It is thought that when this contract runs out in January 2016, Nvidia will lose a huge portion of their net income: since their net income for 2013 was $451M and the Intel payment was $233M, that is 51% of total net. They have to scramble to get some products other than video cards into their mix. If not, their $18.00 a share will be $6.00 to $7.00 sometime in 2016, because when a stock comes down hard, if all sales stay the same and only the Intel payment disappears, that leaves about 36.8 cents per year EPS; at a 15x multiplier, that is $5.52 a share as a bottom price. You think they are going to scramble to make anything and everything happen while they have the money? Absolutely.

They have a gross income of $2.2B. If they have to, they can write off $250M out of R&D, which has a budget of $1.3B, or SG&A, also at $1.3B, or a combination of both (not ideal, I'll grant, but still an option).

http://www.marketwatch.com/investing/stock/nvda/financials

 

But you are right about them needing to firm up another department in order to maintain revenue, and this is as good as any for them, I guess.

 

 

Am I the only one who thinks that Nvidia should move away from this Tegra business before it really starts dragging the company down?

 

I don't know about the only one, but their Tegra chips are really good performers and are used in quite a few decent tablets, like the Galaxy Tab 10.1, the Tegra Note 7 and the Acer Iconia, to name a few.

I really hope they stay viable in the field, not just for them but for the competition and pricing in general.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


When they showed off the K1 at CES, the heatsink on the board was much too large for a device targeting the power envelope it was claiming. It's not uncommon for early hardware to require a heatsink, but this was a lot larger than the sort of thing you see on an ARM dev SoC, for example, which suggested it was struggling to hit the power envelopes it was targeting.

 

John Carmack has talked about the challenges of mobile development before. He mentioned, for example, that while mobile SoCs often reach really quite high compute performance, they seriously lack basic memory bandwidth, which hampers the fixed-function capabilities. You can't feed ROPs and a tessellation engine of that size with so little memory bandwidth; the balance is all on the compute side. That required them to think a lot about how to change the performance balance of their game, to try to use more compute (and low-bandwidth compute at that) and to reduce the general bandwidth usage in the fixed-function parts and other aspects of the game. It's this that still makes porting a real game to mobile hard: 4GB/s is not the 100GB/s an equivalent desktop GPU would have. They don't have the room on a mobile SoC for huge caches and other power-hungry bandwidth-limitation-hiding devices (like the Xbone has), so it leaves them starved of bandwidth.
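To make that gap concrete, here's a rough per-frame traffic budget at a 60fps target (a sketch built only on the 4GB/s and 100GB/s figures above; the render-target size is a standard 1080p RGBA8 calculation):

```python
# Rough memory-traffic budget per frame at 60 fps for the two bandwidth figures above.
def frame_budget_mb(bandwidth_gb_s: float, fps: float = 60.0) -> float:
    """MB of memory traffic available per rendered frame at a given bandwidth."""
    return bandwidth_gb_s * 1000.0 / fps

for name, bw in [("mobile SoC (4 GB/s)", 4.0), ("desktop GPU (100 GB/s)", 100.0)]:
    print(f"{name}: ~{frame_budget_mb(bw):.0f} MB of traffic per frame")

# A single 1920x1080 RGBA8 render target is 1920 * 1080 * 4 bytes = ~8.3 MB,
# so on the 4 GB/s budget a handful of full-screen passes eats most of a frame.
```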

 

I think in practice this likely means we'll see mobile chips and the K1 doing very well with particular compute effects that don't require a lot of bandwidth-hungry secondary images, and they will have to go with relatively low texture quality and avoid AA.


  • 2 weeks later...

Nvidia did this with the Tegra 4 as well; essentially, the way Nvidia presents the stats is misleading.

They show the active power or TDP of the Tegra SoC in an actual mobile product, likely a tablet.

But the performance numbers of the Tegra SoC that they present are measured in a large, actively cooled enclosure; they run the chip at significantly higher clock speeds with active cooling to showcase peak theoretical performance, rather than the actual real-world performance in an actual product with the typical power and cooling constraints of, say, a tablet or a convertible.

 

The Tegra 4 was the fastest SoC out, last I checked. It beat the Snapdragon 800.

 

Am I the only one who thinks that Nvidia should move away from this Tegra business before it really starts dragging the company down?

 

They make the best ARM SoCs.


Guys, it's the GPU here that's special, not the CPU.

 

A custom 64-bit ARMv8 with 7-way superscalar isn't big? Really?


A custom 64-bit ARMv8 with 7-way superscalar isn't big? Really?

It isn't, compared to what the GPU is.

The CPU will probably be toe to toe with the competition.

The K1 is the FIRST desktop-grade GPU ever on a mobile SoC for phones.

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


According to them, the 360 has 25% more GPU power and three times more CPU power than the PS3. Interesting, because Nvidia actually makes the GPU in the PS3, which was pretty much a 7900 GTX.

Read the fine print at the bottom: SPEs not included.


They make the best ARM SoCs.

No they don't... Especially not if you take things like the cellular connectivity parts into consideration, or power usage, or the ISP. The video you linked is very unfair, because the Shield doesn't have anywhere near as big thermal or power restrictions as the phones in the video have.

If you only take raw CPU performance into consideration, then the Tegra 4 might be the best (at least until Samsung enables HMP in one of their Exynos Octa chips). Take heat, power and all the other parts of an SoC into consideration, and the Tegra 4 is pretty crappy. There is a reason why barely anyone uses it.

 

A custom 64-bit ARMv8 with 7-way superscalar isn't big? Really?

The K1 is a quad-core Cortex-A15 CPU, just like the Tegra 4. The only difference is that it now has an ARMv8 core as the power-saving core. It's not really a big deal.

It won't be 64-bit (one of the main benefits of being ARMv8), since the other cores aren't. It won't be that powerful (it's designed around being power efficient). And I'm pretty sure it can't mix and match which cores are lit up (so as soon as you open a decently intense app like the browser, the ARMv8 core will just shut itself off).

If it had been a Cortex-A53 and Cortex-A57 big.LITTLE implementation, then it would have been awesome. As it is right now, it's not really a big deal, and it will have the same issues as the Tegra 4.


I thought before that this comparison was a bit too insane. Well, we'll be able to know once this releases. :)

Hello and Welcome to LTT Forum!


If you are a new member, please read the rules located in "Forum News and Info". Thanks!  :)


Linus Tech Tips Forum Code of Conduct           FAQ           Privacy Policy & Legal Disclaimer

