[Updated #1][Rumour] Intel and NVIDIA had an internal agreement that blocked the development of laptops with AMD Renoir and GeForce RTX 2070 and above

schwellmo92

Summary

Polish tech site purepc.pl claims that Intel and NVIDIA had an exclusive agreement under which RTX 2070 or greater mobile GPUs would only ship in laptops with Intel 10th-generation CPUs. I have not seen this reported elsewhere, and purepc has not provided a source beyond saying an OEM confirmed it, so take this with a grain of salt for now. But it would explain why we never saw an RTX 2070/2080 paired with a Ryzen 4000 mobile CPU.

 

Quotes

Quote

According to the information we have found, Intel and NVIDIA signed an agreement last year under which it was not possible to prepare laptop configurations with AMD Ryzen 4000 processors and graphics cards at the GeForce RTX 2070 level and above.

 

The Renoir APUs expose a PCIe 3.0 x8 interface, which means that a maximum of 8 PCIe 3.0 lanes are dedicated to the dGPU. Some OEMs admitted that this limitation (Intel offered 16 lanes for the graphics card) reduced the maximum performance of NVIDIA GeForce RTX 2070 graphics cards and above. For this reason, there was not a single notebook with this configuration on the market. From the information we have come across, it is clear that the stated reason was only a smokescreen and not the real one.

 

One of the OEMs finally secretly admitted that the real reason for this was an internal agreement between Intel and NVIDIA, under which the most powerful graphics cards from the Turing family could only be combined with 10th generation Intel processors.

Interestingly, this year's AMD Ryzen 5000-H (Cezanne-H) processors also have a maximum of 8 PCIe 3.0 lanes for the graphics card, so theoretically, you could use the same excuse as a year ago. 

 

My thoughts

If true, this is very dodgy on Intel and NVIDIA's part. We already know that AMD 5000 series mobile chips will be paired with up to an RTX 3080, despite the new chips having the same "limitation" that was quoted as the reason the 4000 series was capped at the RTX 2060 or below.

 

Sources

https://translate.google.com/translate?sl=auto&tl=en&u=https://www.purepc.pl/intel-oraz-nvidia-mieli-wewnetrzna-umowe-ktora-blokowala-tworzenie-laptopow-z-amd-renoir-oraz-geforce-rtx-2070-i-wyzej

 

Note: I added a rumour tag as this hasn't been verified anywhere else.

 

UPDATE:

"Nvidia's Northern Europe PR Manager Lianne Hunter reached out with the company's official position regarding the allegations: “The claim is not true. OEMs decide on their system configurations, selecting GPU and then CPU to pair with it. We support both Intel and AMD across our product stack.”"

I thought this was due to a lack of PCIe lanes making anything over a 2060 non-beneficial.

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 aorus pro

psu: cooler master mwe 650w

case: masterbox mbx520

fans: Noctua industrial 3000rpm x6

 

 

2 minutes ago, Letgomyleghoe said:

I thought this was due to a lack of PCIe lanes making anything over a 2060 non-beneficial.

Not even close. PCIe 3.0 x8 is still enough, especially for mobile GPUs, which are slower than their desktop counterparts.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White

3 minutes ago, Letgomyleghoe said:

I thought this was due to a lack of PCIe lanes making anything over a 2060 non-beneficial.

In most games, a 3080 doesn't take a significant framerate hit until PCIe *1.0* x8.

Just now, Grabhanem said:

In most games, a 3080 doesn't take a significant framerate hit until PCIe *1.0* x8.

 

1 minute ago, Morgan MLGman said:

Not even close. PCIe 3.0 x8 is still enough, especially for mobile GPUs, which are slower than their desktop counterparts.

huh, I remember reading some article on it and never bothered to check the spec, my bad.

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 aorus pro

psu: cooler master mwe 650w

case: masterbox mbx520

fans: Noctua industrial 3000rpm x6

 

 

1 minute ago, Letgomyleghoe said:

I thought this was due to a lack of PCIe lanes making anything over a 2060 non-beneficial.

The 4800H has 1x8 PCIe 3.0 lanes for a dedicated GPU, 1x4 for storage, and 1x4 reserved for other peripherals.

The i7-10875H also has 16 PCIe 3.0 lanes that can be configured as 1x16, 2x8, or 2x4 + 1x8, and they end up being configured the same way in most laptops. So no.
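To make the comparison concrete, here's a minimal sketch of the lane budgets being described (the per-platform splits follow the posts above and common laptop designs, so treat them as assumptions rather than spec guarantees):

```python
# Typical laptop PCIe 3.0 lane allocations, as described in this thread.
# These splits are assumptions based on common designs, not official spec sheets.
lane_budget = {
    "AMD Ryzen 7 4800H":    {"dGPU": 8, "storage": 4, "peripherals": 4},
    "Intel Core i7-10875H": {"dGPU": 8, "storage": 4, "peripherals": 4},  # 16 lanes on paper, usually split x8+x4+x4
}

for cpu, lanes in lane_budget.items():
    print(f"{cpu}: {lanes} -> dGPU gets x{lanes['dGPU']} either way")
```

Both platforms end up giving the dGPU an x8 link in practice, which is the point being made.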

3 minutes ago, Grabhanem said:

In most games, a 3080 doesn't take a significant framerate hit until PCIe *1.0* x8.

Is this just an assumption, or are there actual tests for this? PCIe 1.0 x8 is equivalent to PCIe 3.0 x2 in terms of speed, with a theoretical maximum of 2,000 MB/s.
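For what it's worth, the equivalence is easy to sanity-check with a little arithmetic. A minimal sketch (assumes decimal units and ignores protocol overhead beyond line encoding):

```python
# Theoretical one-direction PCIe bandwidth per generation and lane count.
# Gen 1/2 use 8b/10b encoding; gen 3/4 use 128b/130b. 1 GT/s ~ 1 Gbit/s per lane.
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130}

def bandwidth_mb_s(gen: int, lanes: int) -> float:
    gbit_per_s = GT_PER_LANE[gen] * ENCODING[gen] * lanes
    return gbit_per_s * 1000 / 8  # Gbit/s -> MB/s (decimal)

print(bandwidth_mb_s(1, 8))  # PCIe 1.0 x8 -> 2000.0 MB/s
print(bandwidth_mb_s(3, 2))  # PCIe 3.0 x2 -> ~1969.2 MB/s
print(bandwidth_mb_s(3, 8))  # PCIe 3.0 x8 -> ~7876.9 MB/s
```

So PCIe 1.0 x8 and PCIe 3.0 x2 do land within a couple of percent of each other.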

Ah yes, a random Polish website publishes a wild conspiracy theory with zero evidence, and conveniently it's one that can neither be proved nor disproved, so any and all discussion about this will be "I believe Nvidia and Intel are evil, so therefore this proves my point".

Can't wait to see a bunch of AMD fanboys use this to prove that Nvidia and Intel are evil, and AMD is their friend who needs protection.

I like how they have also shut down any argument that can disprove them in the style of a flat-earther.

 

"Couldn't the reason for the lack of high end Nvidia GPUs on AMD laptops be because of PCIe limitations?"

Nooo! That's just a lie told by Intel! You'd be a fool to believe that! It barely makes a difference in most situations so therefore it is wrong. Also please ignore that Nvidia have traditionally had very strict PCIe requirements, such as SLI not being supported on anything less than PCIe x8 motherboards despite PCIe x4 being enough at the time to get 98% of performance.

 

"How do you explain the new laptops which do have AMD CPUs and high end Nvidia cards?"

The conspiracy has been broken, and brave OEMs are now rising up against the oppressive regime of Intel and Nvidia!

 

 

Gotta love how they quote themselves as the source. I should start doing that.

 

 

This rumor seems like a load of bollocks.

Source: LAwLz

2 minutes ago, AndreiArgeanu said:

Is this just an assumption, or are there actual tests for this? PCIe 1.0 x8 is equivalent to PCIe 3.0 x2 in terms of speed, with a theoretical maximum of 2,000 MB/s.

https://www.techpowerup.com/review/pci-express-4-0-performance-scaling-radeon-rx-5700-xt/24.html

https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/

 

This should probably answer the questions

I mean, it's not like you could find a whole lot of Renoir laptops in the first place.

 

On a more serious note though, I'd definitely hold off on pulling out the pitchforks until this becomes something more than what's essentially just a baseless allegation by the looks of things. Not that the vast majority of AMD's fanbase will care about that, mind you.

 

I also can't really see why Nvidia would turn against AMD's CPU division (which is very much not the same thing as Radeon). Off the top of my head, they've had various promos featuring their GPUs in AMD-powered systems even recently. If anything, Intel entering the GPU market should push the two apart, since Intel is becoming another competitor to Nvidia.

 

10 minutes ago, Grabhanem said:

In most games, a 3080 doesn't take a significant framerate hit until PCIe *1.0* x8.

[Image: average FPS at 3840x2160 across PCIe configurations]

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC

I was trying to find out where they got this information, but they only say it came from an unnamed notebook OEM that they work with. Believe what you want ¯\_( ͡° ͜ʖ ͡°)_/¯
 

8 minutes ago, AndreiArgeanu said:

Is this just an assumption, or are there actual tests for this?

From actual tests that I've seen, the first real drop in performance occurs once you go down to PCIe 3.0 x4/PCIe 2.0 x8. Even then it was only around 10%, but I haven't seen tests done using the latest generation GPUs. I personally don't expect to see any meaningful differences though.
 

6 minutes ago, LAwLz said:

Ah yes, a random Polish website publishes a wild conspiracy theory with zero evidence, and conveniently it's one that can neither be proved nor disproved, so any and all discussion about this will be "I believe Nvidia and Intel are evil, so therefore this proves my point".

Can't wait to see a bunch of AMD fanboys use this to prove that Nvidia and Intel are evil, and AMD is their friend who needs protection.

 

Gotta love how they quote themselves as the source. I should start doing that.

Well, it is a fairly well-known website in Poland. But as I said above, they claim an OEM that they work with told them about this. No proof of any kind 😉

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White

8 minutes ago, LAwLz said:

Source: LAwLz

I'd browse this website.

Workstation Laptop: Dell Precision 7540, Xeon E-2276M, 32gb DDR4, Quadro T2000 GPU, 4k display

Wife's Rig: ASRock B550m Riptide, Ryzen 5 5600X, Sapphire Nitro+ RX 6700 XT, 16gb (2x8) 3600mhz V-Color Skywalker RAM, ARESGAME AGS 850w PSU, 1tb WD Black SN750, 500gb Crucial m.2, DIYPC MA01-G case

My Rig: ASRock B450m Pro4, Ryzen 5 3600, ARESGAME River 5 CPU cooler, EVGA RTX 2060 KO, 16gb (2x8) 3600mhz TeamGroup T-Force RAM, ARESGAME AGV750w PSU, 1tb WD Black SN750 NVMe Win 10 boot drive, 3tb Hitachi 7200 RPM HDD, Fractal Design Focus G Mini custom painted.  

NVIDIA GeForce RTX 2060 video card benchmark result - AMD Ryzen 5 3600,ASRock B450M Pro4 (3dmark.com)

Daughter 1 Rig: ASRock B450 Pro4, Ryzen 7 1700 @ 4.2ghz all core 1.4vCore, AMD R9 Fury X w/ Swiftech KOMODO waterblock, Custom Loop 2x240mm + 1x120mm radiators in push/pull 16gb (2x8) Patriot Viper CL14 2666mhz RAM, Corsair HX850 PSU, 250gb Samsung 960 EVO NVMe Win 10 boot drive, 500gb Samsung 840 EVO SSD, 512GB TeamGroup MP30 M.2 SATA III SSD, SuperTalent 512gb SATA III SSD, CoolerMaster HAF XM Case. 

https://www.3dmark.com/3dm/37004594?

Daughter 2 Rig: ASUS B350-PRIME ATX, Ryzen 7 1700, Sapphire Nitro+ R9 Fury Tri-X, 16gb (2x8) 3200mhz V-Color Skywalker, ANTEC Earthwatts 750w PSU, MasterLiquid Lite 120 AIO cooler in Push/Pull config as rear exhaust, 250gb Samsung 850 Evo SSD, Patriot Burst 240gb SSD, Cougar MX330-X Case

 

14 minutes ago, Grabhanem said:

In most games, a 3080 doesn't take a significant framerate hit until PCIe *1.0* x8.

 

7 minutes ago, Eigenvektor said:

Thanks. Turns out games do take a significant performance hit with a 3080 at PCIe 1.0 x16. There are some titles where the performance is nearly 1/3 on PCIe 1.1 x8, and in most, the 3080 just seems to turn into a 3070 when run at PCIe 1.1 x16.

 

 

 

Yeah... I really don't think that's the case at all. Sounds like the perfect fanboy bait, really.

 

If there really was such an agreement... why do it in laptop segments, which tend not to sell as well? Why do it for only one year? The fact that laptops running Cezanne can be specced with up to a 3080M (yes, I am calling it that) means whatever they claim doesn't add up for the prior-generation Renoir APUs.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1

The problem is, this is entirely believable. It would likely be an antitrust violation and illegal in most jurisdictions, it's perfectly in keeping with what many businesses (especially these two) have a history of doing, it could be effective, and we have absolutely no way to prove or disprove the charge.

 

It's top-tier hit piece material, no clue if it actually happened.

This wouldn't be Intel's first ride with s**t like this, and it wouldn't be NVIDIA's either. It's gonna be a massive s**tstorm if true.

Given how both Intel and NVIDIA are giant bags of shit when it comes to any sort of morals or ethics, I could totally see them doing this sort of a deal. That said, I won't give any weight to a single rumour from some random site, with zero evidence or any backup for their claims from other parties.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.

Whilst NVIDIA and Intel (alongside AMD) have a history of sketchy behaviour that raises plenty of ethical questions, if the goal was to suppress AMD's rise in the mobile space, there's already a giant hole in that plan: assuming OEMs are right when they remark on "much stronger than expected" demand, the Renoir APUs are fitted in the laptops that tend to sell best, especially machines like the G14, which has proven itself a strong mobile workstation without breaking the bank.

 

In fact, AMD's biggest competitor right now is... AMD itself, because it's extremely difficult to get your hands on a machine with a Renoir APU, and signs point towards Cezanne being the same. Tiger Lake may actually be a damn good product, but it's currently only in ultrabooks, with a proper 45W 8-core variant still on the way.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1

Hold on, the new i7-11370H has only x4 PCIe 4.0 lanes, which is equivalent in bandwidth to an x8 PCIe 3.0 connection, and yet it's only been paired with a maximum of a 3060 so far. Then again, it is only a 35W part, so maybe it doesn't make sense to pair anything higher with it.

8 hours ago, LAwLz said:

Ah yes, a random Polish website publishes a wild conspiracy theory with zero evidence, and conveniently it's one that can neither be proved nor disproved, so any and all discussion about this will be "I believe Nvidia and Intel are evil, so therefore this proves my point".

Can't wait to see a bunch of AMD fanboys use this to prove that Nvidia and Intel are evil, and AMD is their friend who needs protection.

I like how they have also shut down any argument that can disprove them in the style of a flat-earther.

 

"Couldn't the reason for the lack of high end Nvidia GPUs on AMD laptops be because of PCIe limitations?"

Nooo! That's just a lie told by Intel! You'd be a fool to believe that! It barely makes a difference in most situations so therefore it is wrong. Also please ignore that Nvidia have traditionally had very strict PCIe requirements, such as SLI not being supported on anything less than PCIe x8 motherboards despite PCIe x4 being enough at the time to get 98% of performance.

 

"How do you explain the new laptops which do have AMD CPUs and high end Nvidia cards?"

The conspiracy has been broken, and brave OEMs are now rising up against the oppressive regime of Intel and Nvidia!

 

 

Gotta love how they quote themselves as the source. I should start doing that.

 

 

This rumor seems like a load of bollocks.

Source: LAwLz

Intel Nvidia bad

AMD is our Lord and savior

 

Can I just say that I think Nvidia has the right to not allow their GPUs to be paired with things they don't like? If they can control it, of course.

Childish bickering, sure, but I don't think it's foul play in my book.

-sigh- feeling like I'm being too negative lately

2 hours ago, CephDigital said:

Hold on, the new i7-11370H has only x4 PCIe 4.0 lanes, which is equivalent in bandwidth to an x8 PCIe 3.0 connection, and yet it's only been paired with a maximum of a 3060 so far. Then again, it is only a 35W part, so maybe it doesn't make sense to pair anything higher with it.

* Only 4x 4.0 lanes through the CPU. The 11370H technically supports both PCIe 3.0 and 4.0: 4.0 through the CPU and 3.0 through the chipset, so you should also have some 3.0 lanes for storage devices etc. through there.

 

I think that also doesn't include the lanes used for Thunderbolt 4 connectivity (those are on a separate part of the die), so it probably has just as much overall bandwidth as last gen had (the equivalent of 8x + 4x + 4x 3.0 lanes).
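To put rough numbers on that equivalence (same back-of-the-envelope math as earlier in the thread; decimal units and spec line rates assumed):

```python
# PCIe 4.0 doubles 3.0's per-lane rate (16 GT/s vs 8 GT/s, same 128b/130b encoding),
# so four 4.0 lanes carry about the same bandwidth as eight 3.0 lanes.
gen3_lane_mb_s = 8.0 * (128 / 130) * 1000 / 8   # ~984.6 MB/s per PCIe 3.0 lane
gen4_lane_mb_s = 2 * gen3_lane_mb_s             # ~1969.2 MB/s per PCIe 4.0 lane

print(round(4 * gen4_lane_mb_s))  # 11370H CPU-to-dGPU link: ~7877 MB/s
print(round(8 * gen3_lane_mb_s))  # last-gen x8 3.0 link:    ~7877 MB/s
# Chipset-attached 3.0 lanes for storage/peripherals come on top on both platforms.
```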

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB

14 minutes ago, Caroline said:

This has been going on since the beginning of time.

Any examples?

With source of course

-sigh- feeling like I'm being too negative lately

12 minutes ago, Caroline said:

This has been going on since the beginning of time.

Even the Bible has that one passage about Jesus lamenting about his fancy, new laptop not having a beefy NVIDIA GPU in it because it was using an AMD GPU and how the three wise men tried to console him.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.

1 minute ago, Caroline said:

[Image: Intel vs AMD laptop CPU market share chart]

Can you elaborate on how this graph relates to the claim you made about the topic?

-sigh- feeling like I'm being too negative lately

8 hours ago, Mateyyy said:

I mean, it's not like you could find a whole lot of Renoir laptops in the first place.

 

On a more serious note though, I'd definitely hold off on pulling out the pitchforks until this becomes something more than what's essentially just a baseless allegation by the looks of things. Not that the vast majority of AMD's fanbase will care about that, mind you.

 

I also can't really see why Nvidia would turn against AMD's CPU division (which is very much not the same thing as Radeon). Off the top of my head, they've had various promos featuring their GPUs in AMD-powered systems even recently. If anything, Intel entering the GPU market should push the two apart, since Intel is becoming another competitor to Nvidia.

 

[Image: average FPS at 3840x2160 across PCIe configurations]

Is the scaling the same at 1080p and 1440p?

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (it includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding).

Bios database

My beautiful, but not that powerful, main PC:

