Intel Kaby Lake-G might have Polaris graphics instead of Vega.

Bharat Makhija

Source : https://www.techquila.co.in/intel-kaby-lake-g-polaris-gpu-vega/

 

Intel Kaby Lake-G (specifically the Intel Hades Canyon NUC) might not have Vega graphics in it as advertised.

Quote

AIDA64, a third-party benchmark utility, identifies the Radeon GPU on the NUC as Polaris 22.

However, SiSoftware Sandra lists the integrated GPU in the Intel Hades Canyon NUC as the Radeon RX Vega M GH.

Quote

While relying on third-party applications like AIDA64 and Sandra to identify a GPU architecture is not very reliable, there are multiple other pointers that bolster the claim, the most prominent being the DirectX 12 feature level being limited to 12_0. Third-party utilities use PCI device IDs for identification: 694C and 694E are the device IDs for Kaby Lake-G Radeon graphics, and these IDs also scream Polaris 22.
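The device-ID check described in the quote amounts to a table lookup. A minimal sketch in Python, assuming a toy ID table containing only the two IDs named in the article (real utilities ship a full device database):

```python
# Sketch of how a utility like AIDA64 might classify a GPU by its PCI
# device ID. The table is illustrative: 694C/694E are the Kaby Lake-G
# Radeon IDs mentioned in the article; the labels are assumptions.
KNOWN_IDS = {
    0x694C: "Polaris 22 (Radeon RX Vega M GH)",
    0x694E: "Polaris 22 (Radeon RX Vega M GL)",
}

def identify_gpu(device_id: int) -> str:
    """Return an architecture guess for a PCI device ID."""
    return KNOWN_IDS.get(device_id, "unknown")

print(identify_gpu(0x694C))  # → Polaris 22 (Radeon RX Vega M GH)
```

The point of the article is that these IDs fall in the range drivers associate with Polaris 22 silicon, regardless of the marketing name attached.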

To summarize, the factors supporting the argument that we might be getting the older Polaris instead of Vega are:

  • DirectX feature level 12_0 instead of 12_1.
  • No support for Rapid Packed Math, which was introduced with Vega.
  • Uncertainty over whether these can be enabled via a software patch.
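On the Rapid Packed Math point: RPM packs two FP16 values into one 32-bit register so that a single instruction operates on both halves at once, doubling FP16 throughput. A rough CPU-side illustration of the packing idea (pure Python, purely conceptual; this is not how the hardware executes it):

```python
import struct

def pack2xf16(a: float, b: float) -> bytes:
    # Two IEEE half-precision values packed into one 32-bit word,
    # the way Rapid Packed Math treats a 32-bit register.
    return struct.pack("<2e", a, b)

def packed_add(x: bytes, y: bytes) -> bytes:
    # One "instruction" operating on both FP16 lanes at once.
    a1, b1 = struct.unpack("<2e", x)
    a2, b2 = struct.unpack("<2e", y)
    return pack2xf16(a1 + a2, b1 + b2)

x = pack2xf16(1.5, 2.5)
y = pack2xf16(0.5, 0.25)
print(struct.unpack("<2e", packed_add(x, y)))  # → (2.0, 2.75)
```

Without RPM, each FP16 operation occupies a full 32-bit lane, so the missing feature is detectable as FP16 throughput equal to (rather than double) FP32 throughput.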

Giving Intel/AMD the benefit of the doubt: the graphics solution packed with the Intel Core i7-8809G in the NUC was custom-designed for Intel, so architecture modification to a certain extent is certainly possible. However, one thing can be said for sure:

Quote

The Vega in the Intel Core i7-8809G (Kaby Lake-G) is not the same Vega as the one in the Ryzen 5 2400G.

No comment from Intel/AMD as of writing this post.


Well, it makes sense to sell your opponent a product that is not as good as your own but better than his. Additionally, they didn't name it just Vega M, which would suggest a mobile Vega GPU; they named it RX Vega M, more like the RX series, which is based on the Polaris architecture.


They should test to see if it has tile based rendering. That would be better proof IMO than rapid packed math, which could be driver related.
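For reference, the usual tile-based-rendering test tracks the order in which fragments finish shading: on a tiler the completion order follows a screen-space tile grid, while an immediate-mode GPU finishes fragments in primitive/scanline order. A toy sketch of that ordering difference (the 4-pixel tile and 8x8 frame are made-up numbers for illustration):

```python
# Why a rasterization-order test can expose tile-based rendering: the same
# set of fragments is shaded either in plain scanline order (stand-in for
# an immediate-mode GPU) or grouped tile-by-tile (a tile-based GPU).
TILE = 4
WIDTH = HEIGHT = 8

def immediate_order():
    # Fragments finish in plain scanline order.
    return [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]

def tiled_order():
    # Fragments finish grouped by screen-space tile.
    order = []
    for ty in range(0, HEIGHT, TILE):
        for tx in range(0, WIDTH, TILE):
            for y in range(ty, ty + TILE):
                for x in range(tx, tx + TILE):
                    order.append((x, y))
    return order

# Same fragments, different completion order; visualizing that order
# (e.g. via an atomic counter in a shader) is what the test looks for.
assert set(immediate_order()) == set(tiled_order())
assert immediate_order() != tiled_order()
```

An actual GPU test would do this with an atomic counter in a fragment shader and color each fragment by its counter value, then look for a tile pattern in the output.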



Well... define what "Vega" means and then we can have a discussion... how AMD brands their own products is up to them. We should care about performance and features.



4 minutes ago, Sauron said:

Well... define what "Vega" means and then we can have a discussion... how AMD brands their own products is up to them. We should care about performance and features.

Also, it's a custom version for Intel, so it was likely made to their specifications.



Nothing more than a naming scheme; nothing new, really.

 

Nvidia has been doing that with their mobile chips for a long time.


15 minutes ago, Sauron said:

Well... define what "Vega" means and then we can have a discussion... how AMD brands their own products is up to them. We should care about performance and features.

The question I suppose then becomes, which gen of GCN are they using?


Just now, Dylanc1500 said:

The question I suppose then becomes, which gen of GCN are they using?

And the answer is: it really doesn't matter when we can benchmark it. After all, GCN is as much of a marketing label as Vega.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


2 minutes ago, Sauron said:

And the answer is: it really doesn't matter when we can benchmark it. After all, GCN is as much of a marketing label as Vega.

That is true, but it is at least what they use to separate the different generations of architectures, with the code names, such as "Raven Ridge", falling under that.


Continuing the "what defines Vega?" question: the APUs lack some features of the dGPU too.

Quote

Another interesting thought comes when analyzing this debate with the Ryzen 5 2400G and Ryzen 3 2200G products, both of which claim to use Vega GPUs as a portion of the APU. However, without support for HBM2 or the high-bandwidth cache controller, does that somehow shortchange the branding for it? Or are the memory features of the GPU considered secondary to its design?

https://www.pcper.com/news/Graphics-Cards/GPU-Intel-Kaby-Lake-G-More-Polaris-Vega


https://www.pcper.com/news/Processors/Intel-Announces-New-CPUs-Integrating-AMD-Radeon-Graphics

 

They actually did an article about this last year, stating that it looked more like Polaris than Vega.

 

Quote

The GPU looks to be based on the Polaris architecture which is a slight step back from AMD’s cutting edge Vega architecture.  Polaris does not implement the Infinity Fabric component that Vega does.  It is more conventional in terms of data communication.  It is a step beyond what AMD has provided for Sony and Microsoft, who each utilize a semi-custom design for the latest console chips.  AMD is able to integrate the HBM2 controller that is featured in Vega.  Using HBM2 provides a tremendous amount of bandwidth along with power savings as compared to traditional GDDR-5 memory modules.  It also saves dramatically on PCB space allowing for smaller form factors.
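The bandwidth claim in the quote can be sanity-checked with the standard formula: peak bandwidth = bus width (in bits) × per-pin data rate / 8. The figures below (one 1024-bit HBM2 stack at 1.6 Gbps per pin, as on Kaby Lake-G, versus a 256-bit GDDR5 card at 8 Gbps per pin) are typical published numbers used purely as an illustration:

```python
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: pin count x per-pin rate / 8 bits."""
    return bus_width_bits * gbps_per_pin / 8

hbm2 = bandwidth_gbs(1024, 1.6)   # one HBM2 stack, very wide but low-clocked
gddr5 = bandwidth_gbs(256, 8.0)   # a typical 256-bit GDDR5 configuration
print(hbm2, gddr5)  # → 204.8 256.0
```

A single compact HBM2 stack lands in the same ballpark as a full 256-bit GDDR5 bus, which is where the PCB-space and power-savings argument in the quote comes from.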

 

 

It makes sense that AMD agreed to supply Intel with an older-generation product, just for competition's sake (an advantage for them).


2 hours ago, Razor01 said:

It makes sense AMD agreed to Intel with an older generation product just for competition sake (advantage for them).

I don't really believe that. Considering that Intel/AMD already had something resembling a finished product (or an engineering sample), the decision to partner up likely had to be made well before then. My WAG is about a year from nothing to something. Counting back a year from November 2017, Vega was still an on-paper GPU as far as anyone was concerned.


21 minutes ago, M.Yurizaki said:

I don't really believe that. Considering that Intel/AMD already had something resembling a finished product (or an engineering sample), the decision to partner up likely had to be made well before then. My WAG is about a year from nothing to something. Counting back a year from November 2017, Vega was still an on-paper GPU as far as anyone was concerned.

 

 

They were showing Vega off a year before it was released ;) It was taped out in mid-2016, June or July if I remember correctly.


5 minutes ago, Razor01 said:

 

 

They were showing Vega off a year before it was released ;) It was taped out in mid-2016, June or July if I remember correctly.

I can't seem to find anything regarding the GPU being taped out in 2016. This is the earliest my five minutes on Google has gotten me: https://segmentnext.com/2017/02/23/amd-vega-gpu-2/

 

On the flip side, it could've been an executive decision by Intel not to go with the latest and greatest.

 

EDIT: There was a tweet in mid-2016 about a major development milestone being achieved, but no mention of a tape-out (https://www.techpowerup.com/223672/amd-vega-10-gpu-crosses-a-development-milestone)


Raja Koduri's Twitter:

That was when it taped out.

It takes about 6 months to a year from that to getting risk production back, and then finally final silicon, which is what we have, 6 months after they started showing off Vega. Then final release 6 months after that.

December was when they first showed it off, in live demos/games.
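The timeline described above can be laid out with simple date arithmetic (all dates are approximations taken from the posts in this thread, not confirmed milestones):

```python
from datetime import date

# Rough Vega timeline as described in the thread; all dates approximate.
tape_out   = date(2016, 7, 1)   # mid-2016 tape-out, per the post above
first_demo = date(2016, 12, 1)  # first live demos, roughly 6 months later
release    = date(2017, 8, 14)  # RX Vega retail launch

print((first_demo - tape_out).days // 30)   # months from tape-out to demo
print((release - first_demo).days // 30)    # months from demo to release
```

With these assumed dates the stages come out at roughly five and eight months, close to the "about 6 months per stage" cadence the post describes.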

 


This topic is now closed to further replies.