
Nvidia Accuses Qualcomm Of Mobile Chip Monopoly; Demands Compensation For Unfair Practices - $352 Million

Mr_Troll
7 minutes ago, MMKing said:

I was referring to this:

https://forums.geforce.com/default/topic/885576/nvidia-drivers-soon-only-available-through-geforce-experience/

 

Chances are I don't know enough on this topic, and I don't have the time to read through it, so I'm sorry if I'm posting false info.


24 minutes ago, Shahnewaz said:

Borderline competitive? How many design wins does Nvidia have for their Tegra chips compared to Qualcomm's Snapdragon chips? And how many of those are tablets anyway?

Because @LAwLz was right about his criticisms. Tegra was hot and power-hungry when it was demoed at CES 2014. Charlie got to take a look at the demo board itself, and it confirms what LAwLz was claiming: http://semiaccurate.com/2014/03/05/nvidias-tegra-k1-draws-shocking-number-watts/

AnandTech also got their hands on the board, and they confirmed its absurdly high power consumption way before Charlie: http://www.anandtech.com/show/7169/nvidia-demonstrates-logan-soc-mobile-kepler/2

Bonus: The Jetson TK1 board uses active cooling to keep a tablet processor cool: http://elinux.org/Jetson_TK1

You'd think that after 8 months they'd have fixed their problems, but nope.

Adding more salt to the wounds are Joel Hruska and John Carmack:

http://www.extremetech.com/gaming/177002-john-carmack-suspicious-of-nvidias-outlandish-tegra-k1-claims

And then Samsung went as far as counter-suing them for falsely representing the performance of the Tegra K1: https://blogs.nvidia.com/blog/2014/11/11/nvidia-responds-to-samsung/

 

Anyway, sales, design wins and volume are what matter in the end. And why would anyone choose Denver when, right around the same time, the 20nm Snapdragon 810 became available? (And Samsung would only use their own SoCs regardless, except in their US devices.) Not only did Denver look slower, more power-hungry and outdated, but nobody would pick such a weird and sub-optimal processor design: http://www.androidauthority.com/tegra-k1-exynos-5433-snap-805-541582/

The Nexus 9 is, to date, the only tablet that has sported Nvidia's Denver cores.

 

Even Nvidia themselves called it quits with Denver after the Tegra K1. The Tegra X1 sports plain ARM CPU cores: http://www.nvidia.com/object/tegra-x1-processor.html

 

In retrospect, Nvidia was the one who came borderline competitive with their Denver K1 offerings, but got immediately trounced by far superior solutions from Qualcomm.

 

Nvidia didn't throw their previous accusations around lightly either. In fact, they went straight for the class action lawsuit with their previous claims, compared to the court and press-conference complaints this time about how Qualcomm is doing this and that. That was a way bolder move than this one, and even that bolder move didn't bode well in court. I have reasons to doubt them, and so does everyone else.

This is the basis for Nvidia's argument: Qualcomm only has its market share because of its name and its ability to sell chips at a negative profit. Nvidia can have the better hardware, but if it's more expensive than Qualcomm's SoC, regardless of the amount, it wounds the profit margins of the phone/tablet maker. And that's generally unacceptable in any free market. Nvidia isn't winning design wins because of Qualcomm's anti-competitive behavior, or at least that is what Nvidia is now arguing in the courts. We'll see if that argument is both valid and verifiable.

 

The only cases where the power escaped 4W were under a heterogeneous computing workload, which was a bullshit metric to pull. Everyone shoved the GPU far beyond what it was intended for in that environment. The K1 Tegra under normal use conditions was far more efficient than the A8 and at far higher performance.

 

Carmack's been proven wrong in case we all missed the Shield console kicking the XBox 360 and PS3's tails and being decently competitive against the PS4 and XBOne.

 

Nvidia hasn't abandoned the Denver designs at all. It's doing something very similar to Intel's old 2-part cadence. It will come up with a new custom core every 2 generations and use vanilla ARM cores in between them. I believe the next generation is called Parker.

 

No, the patent move wasn't bold at all and held no consequences for losing. This is much more dangerous water to tread.



6 hours ago, patrickjp93 said:

Under HPC-style workloads (full-tilt GPU with full-tilt vector processors on the CPU cores too). No one gives a damn when the average is 4W! And no, the TDP, rated for the maximum of the average cases, is and was 5W.

[Citation Needed]

No, it was not 11 watts fully loaded on everything. In those scenarios (and with active cooling) it could draw 33 watts.

Here is my source. I will wait for yours.

Also, the battery tests for the Nexus 9 kind of speak for themselves (particularly the browser one).

 

6 hours ago, patrickjp93 said:

An interpreter is just a lightweight compiler that organizes/optimizes code in real time. Every language that ever goes down to machine instructions is compiled AT SOME POINT.

And you will never, not even with the best compiler on the planet, get away from the fact that code morphing adds extra time in some scenarios, which is why Nvidia included functions to allow Denver to decode ARMv8 natively. The problem is that it was shit at it: significantly slower than even a Cortex-A15. Code morphing is not an easy thing to do either (I know you will claim that you can do it in your sleep, but obviously Nvidia failed, so it was hard for them), so you can't expect it to work 100% efficiently 100% of the time. We did see huge variations in the performance of the Denver cores because of the code morphing. That is the cold, hard truth. You can argue about why it happened however much you want, but it won't change the fact that it did not work out that great in practice.
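
To illustrate why that overhead shows up where it does, here is a toy sketch in Python (hypothetical names, nothing like Nvidia's actual microcode) of the translate-once, reuse-later pattern a code-morphing core relies on; code that never gets re-executed keeps paying the slow first pass:

```python
import time

# Toy model of a code-morphing / dynamic binary translation engine.
# translate() stands in for the expensive optimization of an ARM code region
# into the core's internal format; real hardware is vastly more complex.
translation_cache = {}

def translate(region):
    time.sleep(0.01)                # pretend the optimization pass takes noticeable time
    return f"optimized({region})"

def execute(region):
    if region not in translation_cache:            # cold path: pay the morphing cost now
        translation_cache[region] = translate(region)
    return translation_cache[region]               # hot path: reuse the cached translation

t0 = time.perf_counter()
execute("branchy_ui_code")          # first run: slow, includes translation
cold = time.perf_counter() - t0

t0 = time.perf_counter()
execute("branchy_ui_code")          # repeat run: fast, translation already cached
warm = time.perf_counter() - t0

print(f"first run: {cold*1000:.2f} ms, repeat run: {warm*1000:.3f} ms")
# Loops and benchmarks hit the fast path almost every time; bursty, rarely repeated
# phone and browser code keeps hitting the slow path, hence the uneven Denver results.
```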

 

6 hours ago, patrickjp93 said:

A modem is a modem is a modem (spectrum coverage being the main distinguishing point). Most of them are no larger than the head of a small nail. Adding them to a socket on the mainboard takes practically no more space than it would have if it were directly integrated into the SoC.

But it was not just a modem that was missing. Tegra barely included anything, so you had to get a modem for cellular, and another chip for things like WiFi, and possibly a third chip for something else. NFC or whatever. Having separate chips for all these things will end up costing more and using more power. The design of the motherboard becomes more complex as well.

By the way, I think you greatly underestimate the size of these things. Here is a picture of the iPhone 6S's guts. Wanna guess which chip the modem is? Hint: It's the second largest one. The WiFi chip can be seen in this image; it's the orange one (and the yellow one next to it is for NFC). Here is the full article. They take almost as much space as the A9 itself.

It is not the end of the world, but there are big cost, space and power savings to be made when using integrated modems.

 

6 hours ago, patrickjp93 said:

maybe Lawlz is right for once

That's very rich coming from you.

 

 

The K1 with Denver cores was shit.

There might be some truth to the whole modem lawsuit. I don't know what Qualcomm sells their modems for compared to how much they cost to develop and make, but Nvidia lost their mobile market share by producing SoCs that were mediocre or bad compared to those of their competitors (which include Samsung and MediaTek, not just Qualcomm). They have only themselves to blame for that.

 

 

Edit:

3 hours ago, patrickjp93 said:

The K1 Tegra under normal use conditions was far more efficient than the A8 and at far higher performance.

[Citation Needed]

I don't want your word for that either. I want a link to a third party doing actual measurements and number crunching that proves that the K1 with Denver cores, under normal conditions, used less power than the A8 while also having much higher performance.

Don't forget to make sure it is the CPU portion that was being tested. I think we can all agree that the GPU in the K1 was a beast (I called it the only good thing about the K1).


13 hours ago, ivan134 said:

To us laymen, it seems like it shouldn't be, even if it is an assholish thing to do, but maybe there are laws against it and I want to know if there are.

I don't think so. IIRC in the last console generation, both PS3 and Xbox 360 were being sold at a loss. 



1 minute ago, FakezZ said:

I don't think so. IIRC in the last console generation, both PS3 and Xbox 360 were being sold at a loss. 

That's a completely different situation. That was lowering prices to get sales, as opposed to lowering prices below their actual market price just to price your competition out of business, which, as some have explained to me, is called predatory pricing and is illegal in most places.
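
To make the below-cost mechanic concrete, here is a rough sketch with made-up numbers (not actual Qualcomm or Nvidia figures, just an illustration of why a rival that has to break even can't follow the price down):

```python
# Illustrative only: hypothetical per-chip economics of below-cost pricing.
incumbent_unit_cost = 20.0   # what it costs the incumbent to make one SoC ($, made up)
incumbent_price     = 15.0   # below-cost price offered to phone makers ($, made up)
rival_unit_cost     = 18.0   # what it costs the rival to make a comparable SoC ($, made up)

incumbent_margin = incumbent_price - incumbent_unit_cost   # -5.00, a loss it can absorb
rival_margin     = incumbent_price - rival_unit_cost       # -3.00 if the rival matches the price

print(f"Incumbent margin per chip: {incumbent_margin:+.2f}")
print(f"Rival margin if it matches the price: {rival_margin:+.2f}")
# The incumbent can subsidize the per-chip loss from other revenue streams;
# a rival with no such cushion either loses money on every chip or loses the design win.
```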



4 hours ago, patrickjp93 said:

This is the basis for Nvidia's argument: Qualcomm only has its market share because of its name and its ability to sell chips at a negative profit. Nvidia can have the better hardware, but if it's more expensive than Qualcomm's SoC, regardless of the amount, it wounds the profit margins of the phone/tablet maker. And that's generally unacceptable in any free market. Nvidia isn't winning design wins because of Qualcomm's anti-competitive behavior, or at least that is what Nvidia is now arguing in the courts. We'll see if that argument is both valid and verifiable.

 

The only cases where the power escaped 4W were under a heterogeneous computing workload, which was a bullshit metric to pull. Everyone shoved the GPU far beyond what it was intended for in that environment. The K1 Tegra under normal use conditions was far more efficient than the A8 and at far higher performance.

 

Carmack's been proven wrong in case we all missed the Shield console kicking the XBox 360 and PS3's tails and being decently competitive against the PS4 and XBOne.

 

Nvidia hasn't abandoned the Denver designs at all. It's doing something very similar to Intel's old 2-part cadence. It will come up with a new custom core every 2 generations and use vanilla ARM cores in between them. I believe the next generation is called Parker.

 

No, the patent move wasn't bold at all and held no consequences for losing. This is much more dangerous water to tread.

If you think a SHIELD console can match the PS4 or XBONE, you are sorely mistaken.

The raw output of the most recent SHIELD SKU is slightly under HALF that of the XBONE's raw GPU output, and 1/3rd of the PS4's graphical output.

The ARM CPU may actually be faster, but unless you are playing a strictly CPU-limited game, no current-gen SHIELD product is even remotely close to beating the current-gen consoles.

 

NOT, EVEN, CLOSE.

 

If you still think so, you are outright delusional, Patrick. Sure, GFLOPS is not a surefire metric for measuring actual GPU prowess in games, but in this case it is the only argument you need.
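
For the record, here are the back-of-the-envelope numbers, using the commonly cited theoretical peak FP32 figures for each chip (treat these as assumptions about peaks, not measured game performance):

```python
# Peak FP32 GFLOPS = shader cores x clock (GHz) x 2 FLOPs per core per clock (FMA).
def gflops(cores, clock_ghz):
    return cores * clock_ghz * 2

tegra_x1 = gflops(256, 1.0)     # 256 Maxwell CUDA cores @ ~1.0 GHz -> ~512 GFLOPS
xbox_one = gflops(768, 0.853)   # 768 GCN shaders @ 853 MHz         -> ~1310 GFLOPS
ps4      = gflops(1152, 0.8)    # 1152 GCN shaders @ 800 MHz        -> ~1843 GFLOPS

print(f"Tegra X1 ~{tegra_x1:.0f} GFLOPS")
print(f"Xbox One ~{xbox_one:.0f} GFLOPS (the X1 is {tegra_x1/xbox_one:.0%} of it)")
print(f"PS4      ~{ps4:.0f} GFLOPS (the X1 is {tegra_x1/ps4:.0%} of it)")
```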

 

But if you want more arguments against you:

The PS4 and XBONE have more ROPs, TMUs, SPs and ACEs (8 for the PS4, 2 for the XBONE), and higher memory capacity and bandwidth.

 

Nvidia is good at GPUs, but Tegra SKUs are based on Kepler, and even if you used Maxwell 2, expecting a tablet GPU to match a laptop-class GPU which has more cores, more shaders, more rasterizers, more texture units, more memory and more bandwidth is just IDIOCY.

Considering the consoles, both of them, use bare-metal APIs, the overhead they have is probably equal to or lower than what you have on Android... So "drivers" aren't going to affect your delusional pipeline dream either.

 

Can a Tegra SKU show as pretty or a prettier image on screen than the consoles? YES

But that is due to pre-rendered assets in the game, NOT the Tegra GPU itself.

 

EDIT:

If we're talking power consumption: the Tegra X1 can hit nearly 20W according to AnandTech's review, which you can read here:

http://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/9

 


15 hours ago, themaniac said:

inb4 people shout saying that NVidia is also a monopoly, which is by no means true

Tell that to AMD's face.

(⌐■_■) 


I thought it was common knowledge that if you can't produce a better product than a potential or current competitor, then you merely bring them to court on mostly wild accusations.

 

Qualcomm does not have the best chips on the market; take, for example, the heat issues with the 810, which also helped suck the battery dry quicker. We do not know what facts/rumours NVidia have heard or have, but they could be legitimate or they could be a way for NVidia to receive money for doing nothing.

 


4 hours ago, Ridska said:

Tell that to AMD's face.

No problem. Both of them patent troll Intel so it can't compete in the dGPU space anyway. Nvidia is a very dominant player, but that is primarily AMD's fault from a combination of buying ATI for far more than it was worth, bad marketing and terrible driver support up until mid 2015.



11 hours ago, Curufinwe_wins said:

 

 

For example the JEDEC standards (among many) explicitly prevent companies from selling the same memory silicon at different prices to different OEMs.

Why is there a special pricing restriction for memory (compared to any other product in existence)? 



21 hours ago, DeadEyePsycho said:

Nvidia thinking they had the best SoCs when in reality their Tegra line wasn't that spectacular.

They're even recalling their SHIELDs right now. It's hilarious that this popped up in the news.


1 hour ago, djdwosk97 said:

Why is there a special pricing restriction for memory (compared to any other product in existence)? 

It's part of the agreed stipulations of being a member of the organizational group that decides the standards.

 

Contractual obligation to prevent favoritism. Also, there are numerous other things that have this sort of stipulation, but memory is the biggest headliner.



14 hours ago, patrickjp93 said:

This is the basis for Nvidia's argument: Qualcomm only has its market share because of its name and its ability to sell chips at a negative profit. Nvidia can have the better hardware, but if it's more expensive than Qualcomm's SoC, regardless of the amount, it wounds the profit margins of the phone/tablet maker. And that's generally unacceptable in any free market. Nvidia isn't winning design wins because of Qualcomm's anti-competitive behavior, or at least that is what Nvidia is now arguing in the courts. We'll see if that argument is both valid and verifiable.

Of course we'll see, but there is no harm in making predictions or doing some analysis based on the information we have so far.

14 hours ago, patrickjp93 said:

The only cases where the power escaped 4W were under a heterogeneous computing workload, which was a bullshit metric to pull. Everyone shoved the GPU far beyond what it was intended for in that environment. The K1 Tegra under normal use conditions was far more efficient than the A8 and at far higher performance.

It really doesn't matter if it did beat the Apple A8 or the Exynos 7 Octa. Apple and Samsung simply use their own chips for their devices. Its main competitor is Qualcomm, and it didn't beat the Snapdragon 810, even with the quad ARM variant.

14 hours ago, patrickjp93 said:

Carmack's been proven wrong in case we all missed the Shield console kicking the XBox 360 and PS3's tails and being decently competitive against the PS4 and XBOne.

Charlie said that it can't both beat its competitors' performance AND have lower power consumption: http://semiaccurate.com/2014/01/20/well-nvidias-tegra-k1-really-perform/

And that was the case when pitted against the Snapdragon 810. The Tegra K1 stomped the Snapdragon, but it had the disproportionate advantage of being on a developer board with a beefier PSU and active cooling: http://www.tomshardware.com/reviews/snapdragon-810-benchmarks,4053-3.html

But a real-world comparison? Not so much. The tables are turned there, and the Denver cores just manage to keep up with the Snapdragon 810: http://anandtech.com/bench/product/1366?vs=1600

That's evidence enough to believe Carmack's words. Either show top-notch performance, or show very low power consumption; doing both is simply not possible in a phone or tablet form factor. And the Shield Console is just another form of the Jetson TK1 development board: [Image: Nvidia Shield teardown]

Don't forget the lack of any integrated Wireless or Cellular modules in the Tegra K1, compared to SoCs from Qualcomm or Samsung: https://www.ifixit.com/Teardown/Nexus+9+Teardown/31425

[Image: Nexus 9 motherboard, front]

The chip on the left of the yellow dotted Samsung internal storage is the Broadcom BCM4354XKUBG MIMO 5G Wi-Fi 802.11ac/Bluetooth 4.0/FM Module.

And what's inside the Snapdragon 810? Not only does it sport integrated WiFi and Bluetooth, it also includes the following: http://www.eetimes.com/author.asp?section_id=36&doc_id=1324976

Quote

The 810 marks many milestones:

  • Qualcomm's first 20nm (TSMC) SoC
  • The industry's first multi-channel 4G LTE SoC supporting Category 9 Carrier Aggregation
  • Qualcomm's first 64-bit ARMv8 CPUs (four A57 @ ~2GHz, and four A53 @ 1.55GHz)
  • The first chip to use the Adreno 430 GPU
  • Qualcomm's first support for a dual 14-bit ISP camera
  • The first dual-channel 1600 MHz LPDDR4 memory implementation in the industry
  • The first hardware implementation of 4K HEVC/H.265 video encode
  • The first UFS 2.0 storage support
  • Qualcomm's first WCD9330 analog codec

Leaving the Tegra K1 in the dust. The K1 even needed a discrete GNSS chipset, which Qualcomm has integrated into their SoCs.

[Image: Nexus 9 motherboard, back]

14 hours ago, patrickjp93 said:

Nvidia hasn't abandoned the Denver designs at all. It's doing something very similar to Intel's old 2-part cadence. It will come up with a new custom core every 2 generations and use vanilla ARM cores in between them. I believe the next generation is called Parker.

I'll wait till they have actual silicon to show.

14 hours ago, patrickjp93 said:

No, the patent move wasn't bold at all and held no consequences for losing. This is much more dangerous water to tread.

Oh really? The last patent suit wasn't dangerous?

1) Nvidia pulled 4 of their 7 patents off the lawsuit: https://www.usitc.gov/secretary/fed_reg_notices/337/337_932_notice09012015sgl.pdf

2) Nvidia not only loses their lawsuit, but their patents get partially/fully invalidated, or found non-infringing, or both: https://www.usitc.gov/press_room/documents/337-932_signed.pdf

Some words from Charlie about the ruling: http://semiaccurate.com/2016/01/04/37902/

Quote

The ITC also effectively declared all three major mobile GPU architectures free from infringement, which removes the overwhelming majority of mobile SoCs from any of Nvidia’s further trolling ambitions. When SemiAccurate said Nvidia was ‘devastated’ by the ITC ruling, we weren’t joking. Five and some of a sixth of the seven patents were functionally invalidated, and the last was found not infringed upon, as was one of the invalidated ones. The experts were dead-on correct in 2014; the Nvidia patents were worthless, as SemiAccurate has been saying for years.

3) As if that wasn't bad enough, Nvidia was then found infringing 3 of Samsung's patents: https://www.usitc.gov/press_room/documents/337_941_id.pdf

 

Yeah, not dangerous at all. 9_9



On 06.04.2016 at 10:11 PM, kurahk7 said:

It's simple. Make a good chip and people will want it. People want the Exynos version of the S7 for the better performance and battery life, because Qualcomm is no longer able to make the best CPU.

This isn't about what people want, lol.

 

On 06.04.2016 at 10:01 PM, Kobathor said:

That's like saying nVidia has a monopoly on the GPU market, or Intel has a monopoly on the desktop CPU market. It's simply not true. Market share /= monopoly.

 

What is nVidia doing? Dumbasses.

No, this is like Nvidia selling the GTX 980 Ti below production cost, let's say at $200, so that nobody buys AMD GPUs anymore, to drive AMD out of the market on purpose.


It's like saying to AMD they should make the 390 cost as much as the 980 so nVidia can sell more 980s.

That's quite WTF to me.

