Mantle Will Boost Performance By 20-50%, Nvidia Can Add Support.

Won't this help APUs a lot? Make them a lot more viable for lower-end gaming machines?


They would have to totally redesign the cards xD

What makes you think that?


I think Nvidia really need to get Mantle up and running on their hardware ASAP so we can hopefully move away from DirectX.

System specs
  • Graphics card: Asus GTX 980 Ti (Temp target: 60°C, fan speed: slow as hell)
  • CPU: Intel 6700K @ 4.2 GHz
  • CPU Heatsink: ThermalRight Silver Arrow Extreme
  • Motherboard: Asus Maximus VIII Gene
  • RAM: 8GB of DDR4 @ 3000 MHz
  • Headphone source: O2 + ODAC
  • Mic input: Creative X-Fi Titanium HD
  • Case: Fractal Design Arc Midi R2
  • Boot Drive: Samsung 840 Pro 128GB
  • Storage: Seagate SSHD 2TB
  • PSU: Be quiet! Dark Power Pro 550W

Peripherals

  • Monitor: Asus ROG Swift PG278Q
  • Mouse: Razer DeathAdder Chroma (16.5 inch/360)
  • Mouse surface: Mionix Sargas 900
  • Tablet: Wacom Intuos Pen
  • Keyboard: Filco Majestouch Ninja, MX Brown, Ten Keyless 
  • Headphones: AKG K7xx
  • IEMs: BrainWavs S1

Do you think they could reach a deal by trading G-Sync for Mantle? :D It would be amazing if they did this; both companies would then benefit :D

PC Specs:


CPU: i5 4670K @ 4.5 GHz | CPU Cooler: H80i | GPU: EVGA GTX 780 Ti | Motherboard: MSI Z87I | Case: Bitfenix Prodigy | RAM: 8GB Avexir Venom | HDD: Seagate 1TB, 60GB Agility 3 | Monitor: 32" HD TV (LOL) | PSU: Corsair 750W


Do you think they could reach a deal by trading G-Sync for Mantle? :D It would be amazing if they did this; both companies would then benefit :D

 

The problem with AMD using G-Sync is that it talks directly with the GPU at a hardware level, not in software. What this means is that while Nvidia could in theory rewrite Mantle enough to make it benefit them (if it doesn't already), AMD would require a new GPU to communicate with the G-Sync module. As far as AMD coming out with their own G-Sync-type module, I see that as a bad idea. In theory it's great for us, but industry-wide it would cause issues with manufacturers and vendors.


The problem with AMD using G-Sync is that it talks directly with the GPU at a hardware level, not in software. What this means is that while Nvidia could in theory rewrite Mantle enough to make it benefit them (if it doesn't already), AMD would require a new GPU to communicate with the G-Sync module. As far as AMD coming out with their own G-Sync-type module, I see that as a bad idea. In theory it's great for us, but industry-wide it would cause issues with manufacturers and vendors.

Drats, one can only hope for now :D

PC Specs:


CPU: i5 4670K @ 4.5 GHz | CPU Cooler: H80i | GPU: EVGA GTX 780 Ti | Motherboard: MSI Z87I | Case: Bitfenix Prodigy | RAM: 8GB Avexir Venom | HDD: Seagate 1TB, 60GB Agility 3 | Monitor: 32" HD TV (LOL) | PSU: Corsair 750W


What makes you think that?

I heard that Nvidia cards were not GCN architecture.

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5' | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


I heard that Nvidia cards were not GCN architecture.

Kepler; it doesn't have to be GCN.


Kepler; it doesn't have to be GCN.

But Mantle is based on the GCN architecture...

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5' | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


I wonder how long until someone brings out a hack for non-AMD cards...

Mobo - Asus P9X79 LE ----------- CPU - i7 4930K @ 4.4 GHz ------ COOLER - Custom Loop ---------- GPU - R9 290X Crossfire ---------- RAM - 8GB Corsair Vengeance Pro @ 1866 --- SSD - Samsung 840 Pro 128GB ------ PSU - Corsair AX 860i ----- Case - Corsair 900D


Nvidia will never use Mantle; it would be signing its own death sentence if it did. The majority of developers would pick Mantle up if Nvidia went on board with it, and that's just not acceptable to Nvidia: it would mean giving AMD cards a significant performance advantage in the majority of PC games, and Nvidia simply would not be able to compete with that.

You have to remember that Mantle is mainly for the console guys who want to do an easy port to the PC, meaning all of their optimizations will be based on the AMD GCN architecture.


Nvidia will never use Mantle; it would be signing its own death sentence if it did. The majority of developers would pick Mantle up if Nvidia went on board with it, and that's just not acceptable to Nvidia: it would mean giving AMD cards a significant performance advantage in the majority of PC games, and Nvidia simply would not be able to compete with that.

You have to remember that Mantle is mainly for the console guys who want to do an easy port to the PC, meaning all of their optimizations will be based on the AMD GCN architecture.

 

Nvidia should be fine with adding Mantle support to their cards. That means Nvidia cards should get the same boosts as the AMD cards. Hypothetically, if their top-end card was 20% stronger than an AMD card before Mantle and Mantle increases efficiency by 50% for both, the Nvidia card's lead over the original AMD baseline grows from 20 to about 30 points (1.2 × 1.5 = 1.8 vs. 1.5). That's a simplified calculation, but it certainly doesn't hurt Nvidia in terms of "giving AMD all the advantage". As far as I understand, games can and will still be optimized per vendor instead of just for AMD. Mantle is not "mainly for the console"; Mantle is an efficiency layer that lets a game issue more draw calls per frame than DirectX can (DirectX being the limitation). Being able to port a game from PC to console (or vice versa) is another benefit of Mantle.
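A quick back-of-the-envelope check of that arithmetic (the 20% lead and the 50% Mantle gain are the hypothetical figures from the post above, not benchmark results):

```python
# Hypothetical figures, not benchmarks: AMD card = 1.0 baseline,
# Nvidia card 20% faster, Mantle boosting both by 50%.
amd_before, nvidia_before = 1.0, 1.2
mantle_gain = 1.5

amd_after = amd_before * mantle_gain        # 1.5
nvidia_after = nvidia_before * mantle_gain  # 1.8

# The relative lead is unchanged at 20%...
print(round(nvidia_after / amd_after, 2))   # 1.2

# ...but the gap over the original AMD baseline grows from 0.2 to 0.3,
# which is where the "about 30%" figure comes from.
print(round(nvidia_after - amd_after, 2))   # 0.3 (was 0.2 before Mantle)
```

In other words, a Mantle boost shared by both vendors would not erode Nvidia's relative position; if anything, the raw frame-rate gap widens.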


Nvidia should be fine with adding Mantle support to their cards. That means Nvidia cards should get the same boosts as the AMD cards. Hypothetically, if their top-end card was 20% stronger than an AMD card before Mantle and Mantle increases efficiency by 50% for both, the Nvidia card's lead over the original AMD baseline grows from 20 to about 30 points (1.2 × 1.5 = 1.8 vs. 1.5). That's a simplified calculation, but it certainly doesn't hurt Nvidia in terms of "giving AMD all the advantage". As far as I understand, games can and will still be optimized per vendor instead of just for AMD. Mantle is not "mainly for the console"; Mantle is an efficiency layer that lets a game issue more draw calls per frame than DirectX can (DirectX being the limitation). Being able to port a game from PC to console (or vice versa) is another benefit of Mantle.

 

Mantle only works as long as games are programmed for it. If AMD graphics cards are the only cards that take advantage of it, Mantle will be less of a priority for game developers, and that might kill it if the results aren't epic enough. On the other hand, NVIDIA has ShadowPlay and G-Sync, and AMD doesn't. If NVIDIA had Mantle too, then AMD wouldn't have any unique features to entice people with. However, NVIDIA would solidify Mantle's presence and help make it standardized, they would have to continually pay AMD to use the technology, and especially if it became the standard they would never be able to stop licensing it until NVIDIA or someone else came up with something to replace it. There are pros and cons whichever way NVIDIA chooses.

