Intel and Nvidia copied AMD

CPotter

So far not impressed with the title.

"Intel and NVIDIA copied AMD" implies that they copied some CPU or GPU, not Resizable BAR, which they didn't invent anyway.

4/10 title, 7/10 video.

 

EDIT:
Have finished the video.

This feels like the "Linus was right." title all over again.

Just use Floatplane titles, please!

elephants


https://www.nvidia.com/en-us/geforce/news/geforce-rtx-30-series-resizable-bar-support/

 

GeForce RTX 30 Series Resizable BAR Supported Games
As of February 25th, 2021
Assassin's Creed Valhalla
Battlefield V
Borderlands 3
Forza Horizon 4
Gears 5
Metro Exodus
Red Dead Redemption 2
Watch Dogs: Legion

 

Assassin's Creed Valhalla is one of these games. Nvidia is individually validating games to make sure that it's a "seamless" experience in that no game loses performance with it enabled (hopefully there's a way to force it). 

 

It'd be nice to see a video on these specific games. 

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


Resizable BAR is honestly nothing more than a change to how much of the GPU's memory the CPU can address over the PCIe link.
It has upsides and downsides associated with it.

 

The downside is that each packet header becomes larger and therefore brings more overhead with each packet.

But the upside is that we do not need to send a preliminary packet before our actual packet to inform the GPU that we want to apply a new offset to where in memory we are working.

 

Whether it is an actual benefit will depend on the application.


Some applications, for example, can dump textures and other assets, even code, into graphics memory and then never really need to touch them again. The application can then mostly just linger about and buffer commands into the portion of memory it has access to. Since there is little need for the CPU itself to read a texture in vRAM, the GPU can handle that business alone. Increasing the size of our header just eats into the bandwidth of our link in this case.

Other applications might need to swap textures more often, or work with datasets that fill up a larger portion of memory, where the application can need arbitrary access to any part of the dataset. Then not having Resizable BAR can lead to slowdowns due to needing to slot in an extra packet for the offset. If we jump about a lot, the extra offset packets can add a lot more overhead than just making our packets able to carry a larger address from the get-go.
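To put rough numbers on that trade-off, here is a toy Python model. The header and payload sizes and the access patterns are purely illustrative assumptions on my part, not real PCIe figures:

```python
# Toy model (not real PCIe numbers): compare a larger per-packet header
# against extra "set offset" packets for scattered accesses.
HEADER_SMALL = 12   # bytes, assumed: header carrying a 32-bit address
HEADER_LARGE = 16   # bytes, assumed: header carrying a 64-bit address
PAYLOAD = 256       # bytes of data per write packet (assumed)

def bytes_on_wire(n_writes: int, n_offset_changes: int, header: int) -> int:
    """Every write carries header + payload; each offset change costs one
    extra header-only packet when the CPU-visible window is small."""
    return n_writes * (header + PAYLOAD) + n_offset_changes * header

# Sequential streaming: barely any offset changes -> small headers win.
print(bytes_on_wire(1000, 1, HEADER_SMALL))     # 268012
print(bytes_on_wire(1000, 0, HEADER_LARGE))     # 272000

# Scattered access: an offset change before every write -> the big BAR wins.
print(bytes_on_wire(1000, 1000, HEADER_SMALL))  # 280000
print(bytes_on_wire(1000, 0, HEADER_LARGE))     # 272000
```

Which side wins depends entirely on how often the workload jumps outside the window, which is exactly the point.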

 

Resizable BAR supports BAR sizes up to 512 GB according to PCI-SIG, which works out to 39 bits of byte address.

But there is an Expanded Resizable BAR being proposed, taking its support up to 2^63 bytes, i.e. 8 EB. (I don't know why anyone needs a BAR that large, but I guess it is future proof...)
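For reference, the size field follows a simple power-of-two encoding. A quick sketch of my reading of it, where an encoded value n selects a BAR of 2^(n + 20) bytes (so 0 = 1 MB and 19 = 512 GB); check the actual spec for the authoritative register layout:

```python
def bar_size_bytes(n: int) -> int:
    """Decode a Resizable BAR size-field value into bytes,
    assuming value n selects a BAR of 2^(n + 20) bytes."""
    return 1 << (n + 20)

print(bar_size_bytes(0) // 2**20)   # 1 (MB), the smallest encodable BAR
print(bar_size_bytes(8) // 2**20)   # 256 (MB), the classic legacy window
print(bar_size_bytes(19) // 2**30)  # 512 (GB), the current ceiling
```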

 

Choosing an adequate size for an address is honestly something one can debate about for years. After all, the IAB and IETF took the better part of a decade to settle on 128 bits for IPv6. (They did look at other sizes, like 64 and 160 bits, among others.)

 

In the end, resizable bar isn't a killer feature worth hyping about.

In some cases it brings improvements; in other cases it will be a detriment.


1 hour ago, Hymenopus_Coronatus said:

https://www.nvidia.com/en-us/geforce/news/geforce-rtx-30-series-resizable-bar-support/

 

GeForce RTX 30 Series Resizable BAR Supported Games
As of February 25th, 2021
Assassin's Creed Valhalla
Battlefield V
Borderlands 3
Forza Horizon 4
Gears 5
Metro Exodus
Red Dead Redemption 2
Watch Dogs: Legion

 

Assassin's Creed Valhalla is one of these games. Nvidia is individually validating games to make sure that it's a "seamless" experience in that no game loses performance with it enabled (hopefully there's a way to force it). 

 

It'd be nice to see a video on these specific games. 

I came to the forum after watching the video to state this.  Your testing was flawed as you showed games that do not have it enabled yet.


2 hours ago, Hymenopus_Coronatus said:

https://www.nvidia.com/en-us/geforce/news/geforce-rtx-30-series-resizable-bar-support/

 

GeForce RTX 30 Series Resizable BAR Supported Games
As of February 25th, 2021
Assassin's Creed Valhalla
Battlefield V
Borderlands 3
Forza Horizon 4
Gears 5
Metro Exodus
Red Dead Redemption 2
Watch Dogs: Legion

 

Assassin's Creed Valhalla is one of these games. Nvidia is individually validating games to make sure that it's a "seamless" experience in that no game loses performance with it enabled (hopefully there's a way to force it). 

 

It'd be nice to see a video on these specific games. 

37 minutes ago, rooker said:

I came to the forum after watching the video to state this.  Your testing was flawed as you showed games that do not have it enabled yet.

 

I'll preface this by saying that testing for this video concluded at the beginning of February, before this list was available. Sponsored videos take a while in the pipeline. The games chosen for testing were informed by which games had the best performance improvement with SAM enabled in the RX 6800/6900 reviews.

 

Yes, Nvidia is individually validating specific games, but if the theory is that resizable BAR alone is the reason for performance uplift, games that improve on AMD should also see similar improvement on Nvidia unless the driver is explicitly preventing them from using resizable BAR or otherwise interfering. The fact that AMD had no such compatibility list points to the conclusion that either Nvidia rushed this like we said, or AMD's memory management was just that bad to begin with (unlikely given Nvidia's claims of "similar" / 10% performance improvements).

Emily @ LINUS MEDIA GROUP                                  

congratulations on breaking absolutely zero stereotypes - @cs_deathmatch


37 minutes ago, GabenJr said:

I'll preface this by saying that testing for this video concluded at the beginning of February, before this list was available.

Yeah that's what I thought, this list is very recent and there was no way for you guys to know that (MSI said a similar thing as you stated in the video, but obviously you guys had no idea that Nvidia would actually create a compatibility list like this).

 

I'd be interested in seeing a follow up piece sometime in the future though

37 minutes ago, GabenJr said:

games that improve on AMD should also see similar improvement on Nvidia unless the driver is explicitly preventing them from using resizable BAR or otherwise interfering

This is something that I've been curious about. The driver only allows the games in the list to use resize BAR I believe and I don't see a "force resize bar" option. It seems like they haven't tested enough games yet and haven't allowed the other games (that benefited on AMD) to use the technology. I like this approach, but they should have held off on the release till the end of March (when the other 30-series cards get the updates) to expand this list. Or at least include a force resizable BAR option.

 

I think Nvidia probably has a similar perf uplift, but they need to add some way to force it for all games to allow proper testing of the technology. Right now it's hard to tell when it's being used and when it's not (aside from checking in Control Panel to see whether it's enabled at the driver level).
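On Linux, at least, you can sanity-check what was actually negotiated by reading the GPU's `resource` file in sysfs, where each line is `start end flags` for one BAR. A minimal sketch (the device path is a made-up example):

```python
# Compute BAR sizes from a PCI device's sysfs "resource" file on Linux.
# Each line holds "start end flags" in hex; size = end - start + 1.
# A multi-gigabyte BAR 1 on a GeForce suggests Resizable BAR is in effect.

def bar_sizes(resource_text: str) -> list:
    sizes = []
    for line in resource_text.splitlines():
        start, end, _flags = (int(x, 16) for x in line.split())
        sizes.append(end - start + 1 if end else 0)  # all-zero line = unused BAR
    return sizes

# Real usage would read e.g. /sys/bus/pci/devices/0000:01:00.0/resource
sample = ("0x00000000a0000000 0x00000000a0ffffff 0x0000000000040200\n"
          "0x0000007c00000000 0x0000007dffffffff 0x000000000014220c\n")
print([s // 2**20 for s in bar_sizes(sample)])  # [16, 8192] -> 16 MiB, 8 GiB
```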

 

[attached image: mockup of an in-game "Resizable BAR enabled" notification]

I'd like to see some kinda notification like this though, to clarify it's actually enabled as the game launches (sorry for the bad photoshop 😂)

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


41 minutes ago, Hymenopus_Coronatus said:

Yeah that's what I thought, this list is very recent and there was no way for you guys to know that (MSI said a similar thing as you stated in the video, but obviously you guys had no idea that Nvidia would actually create a compatibility list like this).

 

I'd be interested in seeing a follow up piece sometime in the future though

This is something that I've been curious about. The driver only allows the games in the list to use resize BAR I believe and I don't see a "force resize bar" option. It seems like they haven't tested enough games yet and haven't allowed the other games (that benefited on AMD) to use the technology. I like this approach, but they should have held off on the release till the end of March (when the other 30-series cards get the updates) to expand this list. Or at least include a force resizable BAR option.

 

I think Nvidia probably has a similar perf uplift, but they need to add some way to force it for all games to allow proper testing of the technology. Right now it's hard to tell when it's being used and when it's not (aside from checking in Control Panel to see whether it's enabled at the driver level).

 

[attached image: mockup of an in-game "Resizable BAR enabled" notification]

I'd like to see some kinda notification like this though, to clarify it's actually enabled as the game launches (sorry for the bad photoshop 😂)

Making an announcement when starting a game is something I can see a user wanting.

But coming at this from the hardware perspective, the announcement is a bit bogus. It is on par with an image editor stating which Huffman table it used for each block of a PNG image, i.e. totally useless information as far as the end user is concerned.

The Resizable BAR feature can be a detriment in some applications due to how it works. (Setting a larger BAR size increases the size of the packet header, and thereby the overhead associated with each packet.) Though, I already made a post about this above.

In short, the resizable bar won't give a net gain in all applications.


Tiamat edition.

 

Tiamat- Goddess of chaos. 🤔

 

NOTE: I no longer frequent this site. If you really need help, PM/DM me and my e.mail will alert me. 


Really, really not impressed with the title.

 

I very much doubt AMD actually made Resizable BAR. Contributed to its development? Probably, but make it? Nah.

 

So other than NVIDIA and Intel simply following the person who shot first...I don't get what they copied. It's not like it's the exact same implementation either, with NVIDIA choosing to go selective and only enabling it via drivers in validated games compared to AMD's "leave it open and experiment away" approach.

 

I'm probably just spit-balling here...but it really sounds like an attempt to attract the vocal and rabid minorities from the depth of the AMD fanbase, as it seems to reinforce the "AMD GOOD, INTELVIDIA BAD!11!" mantra. Am I overthinking this? Probably, but maybe I should just chill out a bit.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


1 hour ago, D13H4RD said:

Really, really not impressed with the title.

 

I very much doubt AMD actually made Resizable BAR. Contributed to its development? Probably, but make it? Nah.

 

So other than NVIDIA and Intel simply following the person who shot first...I don't get what they copied. It's not like it's the exact same implementation either, with NVIDIA choosing to go selective and only enabling it via drivers in validated games compared to AMD's "leave it open and experiment away" approach.

 

I'm probably just spit-balling here...but it really sounds like an attempt to attract the vocal and rabid minorities from the depth of the AMD fanbase, as it seems to reinforce the "AMD GOOD, INTELVIDIA BAD!11!" mantra. Am I overthinking this? Probably, but maybe I should just chill out a bit.

I do have to agree that the title of the video could have been far better, to be fair.
Something like "Intel and Nvidia introduce Resizable BAR", or "This laptop has Resizable BAR but no AMD components inside!" (if one wants to be more clickbaity.)

But to be fair, Resizable BAR isn't particularly new; it dates back to 2008 and has been used elsewhere.
Resizable BAR makes more sense in High Performance Computing applications, i.e. where the inter-node connections use PCIe (which is its own large can of worms...). But it also makes appearances elsewhere, typically where more uniform memory access is a concern in larger systems.

That we haven't seen it adopted in GPUs earlier is likely because it just doesn't make all that much of a difference, to be honest.

AMD, Intel and Nvidia have all been part of the PCI Special Interest Group, but how much of a hand in its development they have had, I don't know. Considering that PCI is used in far more than just PCs (and beyond x86), and that Resizable BAR is a more HPC-oriented feature, I would be more inclined to think it came from Cray (a large supercomputer manufacturer at the time and a PCI-SIG member), IBM (another HPC vendor), or the like, potentially Intel (which makes parts for HPC applications). AMD at the time wasn't really in that scene, and has only fairly recently gotten a foot in the door with its 32- and 64-core Epyc parts.

But resizable bar isn't a silver bullet that always gives performance boosts.
A well-developed graphics engine won't see much benefit from it, and is actually likely to see a deficit, especially on GPUs with ample amounts of vRAM, since the CPU won't have to push/pull resources back and forth as much. I.e., the 256 MB of CPU-accessible memory is sufficient for buffering all the commands needed to run the graphics. Resizable BAR only has advantages if the CPU needs to handle more of the memory, something we would primarily expect during asset loading. (If one needs to skitter around in all of vRAM just to run the graphics, then one's engine is frankly fairly abhorrent in how its memory is handled.)

Though, with the concept of streaming assets directly from storage, things could change a bit. But at the same time, this would be our GPU talking with a storage device on its own terms; calls issued by the CPU can theoretically be handled separately. (I.e., the CPU's calls are mapped to one section of vRAM, while the calls made by the storage device are mapped to another location in vRAM.)


6 minutes ago, Nystemy said:

Great stuff in here but too much to directly quote, heh. *snip*

What you said there is, in essence, exactly what I've thought for a while now. I really don't understand why people are so hyped over it. Like, sure, it's got a 10+% performance boost...in 2 games, maybe 3. In a lot of the other games, though, you won't really see a difference.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


1 minute ago, D13H4RD said:

What you said there in essence is exactly what I've thought for a while now. I really don't understand why people are so hyped over it. Like, sure it's got a 10+% performance boost...in 2 games, maybe 3. In a lot of the other games though, you won't really see a difference.

Poorly optimized games with graphics engines that don't care about how vRAM is handled will simply dump their various control variables and such all over the place. This means that each time we need to maneuver over to such a location, we first need to send a call to set a new memory offset, and then a call for the actual command.

While a good graphics engine will simply aggregate all of its control structures into our 256 MB window.
Then we only need to jump elsewhere when handling larger assets.

Though, to be honest, we can just implement a simple buffer where the CPU drops an asset into our 256 MB chunk and the GPU then relocates that asset to its more "permanent" home elsewhere in vRAM. The downside is that this technically requires us to first write to vRAM, then read from it, then write it back in its new location. Not always ideal...
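That staging idea can be sketched in a few lines of Python, with bytearrays standing in for vRAM and a 256-byte window standing in for the 256 MB BAR (purely a simulation of the data flow, not real driver code):

```python
WINDOW = 256              # stand-in for the 256 MB CPU-visible window
vram = bytearray(4096)    # stand-in for total vRAM

def upload(asset: bytes, dest: int) -> None:
    """CPU writes chunks into vram[0:WINDOW]; the 'GPU' then relocates each
    chunk to its final home -- that relocation is the extra traffic."""
    for off in range(0, len(asset), WINDOW):
        chunk = asset[off:off + WINDOW]
        vram[0:len(chunk)] = chunk                          # CPU -> staging window
        vram[dest + off:dest + off + len(chunk)] = chunk    # GPU-side relocation

asset = bytes(range(256)) * 4    # a 1 KB asset, larger than the window
upload(asset, dest=1024)
print(vram[1024:1024 + len(asset)] == asset)  # True
```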

But here one has to weigh up which route is going to be more efficient.
Should we occasionally use offset commands for assets?
Or should we buffer our assets?
Or should we use Resizable BAR and simply issue a command to temporarily increase our address range? (I haven't read too much into this, but I greatly suspect there is a penalty for address-size changes, and potentially one can't just change the address size on a whim.)

If we need to send a small asset, then buffering isn't all that bad.
For a larger asset, applying an offset, dumping it, and offsetting back to where we were isn't all that bad either.
If we need to send over tons and tons of assets, then we likely need to change the offset many times; here Resizable BAR makes sense.

Another application where Resizable BAR makes sense would be particle simulations, where all of vRAM is our dataset and we could potentially need to load any part of it at any given moment. AI workloads are somewhat similar in their memory-access requirements. And there are a million other HPC applications where having full memory access is more advantageous than not having it. (Even here, some protocols, from what I have seen, implement "quick" and "heavy" paths, where the "quick" path handles only a portion of memory for reduced overhead in latency-sensitive situations.)

In short, I don't think Resizable BAR is a feature worth hyping over, unless one does fluid-dynamics simulations, neural networks, weather forecasting, or other dataset-heavy GPU-accelerated applications. (For these it can be a noticeable performance increase, and especially a latency decrease, giving the serial portion of the application a nice "free" boost and ensuring the parallel portion doesn't have to idle as long. In short, Amdahl's law comes into play.)
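To illustrate that Amdahl's law point with made-up numbers:

```python
def amdahl_speedup(accel_frac: float, factor: float) -> float:
    """Overall speedup when a fraction accel_frac of the runtime
    becomes factor times faster (Amdahl's law)."""
    return 1.0 / ((1.0 - accel_frac) + accel_frac / factor)

# If the latency-bound serial portion is 10% of total runtime and full
# memory visibility makes it 2x faster, the whole run gains only ~5%:
print(round(amdahl_speedup(0.10, 2.0), 3))  # 1.053
```

So even in the workloads where it helps, the overall gain is bounded by how much of the runtime that serial, latency-bound portion actually represents.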


If Resizable BAR is coming to the RTX 3080 soon and it works with Intel's and AMD's latest CPUs, hopefully it will also work with Zen 2 based CPUs, since I guess it's more of a firmware/software based optimisation? It would be a real shame if it didn't support Zen 2.


6 minutes ago, Mortalheart said:

If Resizable BAR is coming to the RTX 3080 soon and it works with Intel's and AMD's latest CPUs, hopefully it will also work with Zen 2 based CPUs, since I guess it's more of a firmware/software based optimisation? It would be a real shame if it didn't support Zen 2.

Changing the size of the PCI BAR isn't something that is trivially done.
Since it is part of the PCIe controller in the CPU, it could theoretically be upgradable via firmware if the manufacturer thought it would be a useful feature. But the BAR entry is fairly hard-coded in the hardware sense of the word. (I.e., making it resizable consumes more transistors per entry and adds extra latency to the controller, reducing its peak throughput unless one invests even more transistors to circumvent some of that issue. But I am not going to dive into circuit design; this post would never finish in that case...)

Considering that Ryzen is first and foremost a consumer platform, I doubt AMD would have spent the effort supporting the Resizable BAR feature back then, since it would complicate the control logic handling the PCIe interface.

After all, it isn't uncommon for lower-end CPUs to have limited BAR entry space as it is. (One reason it is not really feasible to run a high-end GPU on the Raspberry Pi 4, despite it having PCIe.)

AMD likely only "recently" started working with Resizable BAR, probably due to more exposure to customers using it in the HPC segment, and I wouldn't be surprised if their engineers/marketing team decided the feature could serve as market differentiation. (In short, a counter to Nvidia's efforts in ray tracing.)

How AMD's Epyc and Threadripper platforms fare is a better question; they could potentially support it a fair bit further back. (Potentially, they might always have supported it. I should also note that, yes, Epyc uses the Zen architecture just like Ryzen, but only as far as the core logic is concerned; the integrated PCIe controller isn't strictly part of Zen itself, even on Ryzen.)

In the end, I don't expect Zen 2 to get support for Resizable BAR, not even through a firmware upgrade. (Though I might end up eating my own words here.)


When I saw the battery size i saw 6.66µm not 99.9Wh

 

Also, look, not even fellow YouTubers like your nonsense clickbait titles:
https://i.ytimg.com/vi/H1DApIvOCMw/hqdefault.jpg?sqp=-oaymwEbCKgBEF5IVfKriqkDDggBFQAAiEIYAXABwAEG&rs=AOn4CLAwtJvYzzX4dIhodGCGvnTw7b8Nhg

clickbait video requires a clickbait comment

