BF4 Mantle boost

wtfxmitchell

I, for my part, am not expecting maximum framerates to explode and for us to suddenly be getting 200 FPS in games where we previously only got 80. What I am expecting, however, is a large increase in the MINIMUM framerate (i.e. the one you notice). When you look at a graphics card review, you always see those downward spikes when something like an explosion happens, or a horde of rabid raccoons rushes across the screen being chased by a Challenger 2 tank (man, I so want that game now). The one thing I took from that demo is that the framerate, however high it actually was, stayed quite stable throughout, as did the frame latency.
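To put a number on why the minimums matter more than the average, here's a toy sketch (my own illustration with made-up frame times, not data from the demo): a handful of spike frames barely move the average FPS, but they define the dips you actually see.

```cpp
// Toy sketch: average FPS hides the spikes; the slowest frames are the
// dips you notice. Frame times are invented for illustration.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Mostly 12 ms frames, with three 40 ms spikes (the "explosion" moments).
    std::vector<double> ms(100, 12.0);
    ms[10] = ms[50] = ms[90] = 40.0;

    double avg = std::accumulate(ms.begin(), ms.end(), 0.0) / ms.size();
    double worst = *std::max_element(ms.begin(), ms.end());

    std::cout << "average FPS: " << 1000.0 / avg << "\n";   // ~78 FPS
    std::cout << "minimum FPS: " << 1000.0 / worst << "\n"; // 25 FPS
    return 0;
}
```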

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


AMD's next line of cards will most likely not be GCN (at least I hope not), which would make Mantle very short-lived.

 

I'm thinking the opposite: everything they're making is GCN-based. Notebook and desktop GPUs and APUs are all now carrying GCN, so why would they ditch it after finally creating the platform?

Maximums - Asus Z97-K w/ i5 4690, BCLK @ 106.9 MHz × 39 = 4.17 GHz, 8GB of 2600 MHz DDR3, Gigabyte GTX 970 G1 Gaming @ 1550 MHz

 


AMD's next line of cards will most likely not be GCN (at least I hope not), which would make Mantle very short-lived. That would make no sense, especially since AMD wants many future games to use their API.

I could be entirely wrong, and I hope I am, but so far Mantle isn't looking so great.

Johan Andersson, the lead engine architect of Frostbite and the main guy developing Mantle, said multiple times that Mantle will support future AMD architectures. 

My PC: CPU: Intel Core i3 3220, MB: ASUS P8P67 LE, GPU: Palit Jetstream GTX 670, RAM: 4GB 1333mhz DDR3, Storage: 750GB Hitachi, PSU: CoolerMaster G650M 80+ Bronze, Cooling: Coolermaster Hyper 212 Plus, Case: Multirama, Display: Acer x229w 22" 1680x1050, Keyboard: Logitech K120, Mouse: Steelseries Kinzu v2, Sound: Logitech 2.1 system



Actually, the reason game devs wanted Mantle has nothing to do with increasing framerates: it's about allowing more draw calls, which means you can have more objects and physics without killing performance. It would also allow for more complex AI.

Right now it's completely useless for devs; Mantle has the same problem as PhysX. AMD should make it open source so game devs can do with it what they want.
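A toy model of the draw-call point (the numbers are invented for illustration; real per-call costs depend on the API and driver): if every draw call carries a fixed CPU-side cost, then cutting that cost, or batching more objects per call, directly frees CPU time for things like physics and AI.

```cpp
// Toy model of CPU-side draw-call overhead. The per-call cost below is
// made up; the shape of the result, not the numbers, is the point.
#include <iostream>

constexpr double kMsPerDrawCall = 0.02; // hypothetical fixed driver cost

double cpuMsPerFrame(int objects, int objectsPerDrawCall) {
    int drawCalls = (objects + objectsPerDrawCall - 1) / objectsPerDrawCall;
    return drawCalls * kMsPerDrawCall;
}

int main() {
    // 10,000 objects, one draw call each: 200 ms of CPU time per frame.
    std::cout << cpuMsPerFrame(10000, 1) << " ms\n";
    // Batch 100 objects per call: the same scene costs 2 ms.
    std::cout << cpuMsPerFrame(10000, 100) << " ms\n";
    return 0;
}
```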

 

 

AMD has said that they intend for Mantle to be open source once they have a stable and clean first version of it, which I heard will probably be around 2015 (and I have to agree; releasing an API to the public in a bad state is a real buzzkill).


AMD has said that they intend for Mantle to be open source once they have a stable and clean first version of it, which I heard will probably be around 2015 (and I have to agree; releasing an API to the public in a bad state is a real buzzkill).

Do you have a source? That would be great if it's true.

Do you have a source? That would be great if it's true.

http://www.youtube.com/watch?v=UpkVmpx2u1U

 

31:08, where it's said that a lot of Mantle's capabilities are not tied to the GCN architecture.

 

http://www.tomshardware.com/news/amd-apu-developer-conference-mantle,25079.html

 

 

Ultimately one thing was made clear during the presentations: Mantle is not simply just for AMD APUs and GPUs. The technology is reportedly open, so whether Nvidia accepts the tech or not is a different story altogether, but it's there for the GeForce company to embrace.

http://www.youtube.com/watch?v=UpkVmpx2u1U

 

31:08, where it's said that a lot of Mantle's capabilities are not tied to the GCN architecture.

 

http://www.tomshardware.com/news/amd-apu-developer-conference-mantle,25079.html

That's not what open source means...

Oh well, not being tied to GCN has its pros and cons. The good thing is that it will probably work on later architectures. The bad thing is that it can't be *that* low-level if it's not tied to GCN.


I agree; the 45% is with an APU, but a 20% increase with dedicated graphics is still huge.

 

I'd take 20% and consider myself lucky. Seriously, if that's what I get with my 7970 GHz at max settings, I'd be blown away! Can you imagine an increase of 15 fps at max settings? That's like a bump from a $300 to a $500 graphics card.
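For scale, a quick back-of-envelope (assuming the hypothetical 20% figure holds): a 20% uplift works out to +15 FPS only if you start from around 75 FPS.

```cpp
// Back-of-envelope: what a hypothetical 20% uplift means at different
// baseline framerates.
#include <iostream>

int main() {
    for (double baseFps : {30.0, 60.0, 75.0}) {
        double boosted = baseFps * 1.20; // assumed 20% gain
        std::cout << baseFps << " FPS -> " << boosted << " FPS (+"
                  << boosted - baseFps << ")\n";
    }
    return 0;
}
```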

i7 4770K @ 4.6 GHz | Gigabyte GA-Z87X-UD3H | G.SKILL Ripjaws X Series 16GB | Sapphire Vapor-X 7970 GHz | Fractal Design Arc Midi R2
Samsung 840 EVO 250GB SSD | Toshiba 7200rpm 3TB | Seasonic 660W Platinum | Corsair H100i
Logitech G400s | Ducky Shine III Cherry MX Brown


OK, and? Lots of companies show lots of things; they could have changed things around and made changes that we can't make. Don't believe a thing until you see it live and in person.

 

No need to be quite so cynical. Just enjoy the news and hope that it's true. Optimism, yay! Even if it isn't 45%, anything close to that is still great.

i5 4440, 8GB 1600 MHz, Gigabyte Z87X-UD3H, SX900 128GB SSD, 850W 80+ Gold, FD R4, 270

 


I'd take 20% and consider myself lucky. Seriously, if that's what I get with my 7970 GHz at max settings, I'd be blown away! Can you imagine an increase of 15 fps at max settings? That's like a bump from a $300 to a $500 graphics card.

 

The big leap should be in the minimum FPS, if those talks on the technical aspects of Mantle from APU'13 are anything to go by.



Some may say this performance increase is irrelevant, but more performance means your GPU works less if you enable V-Sync, and lower temps mean a longer-lasting GPU.

and you get input lag

Finally my Santa hat doesn't look out of place


and you get input lag

Not with FreeSync and triple buffering, apparently.
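For context on where the V-Sync lag comes from, a simplified model (the numbers are made up, and it deliberately ignores triple buffering and variable refresh, which are exactly the techniques being argued about here): a finished frame has to wait for the next refresh, so the delay rounds up to a whole refresh interval.

```cpp
// Simplified model of V-Sync delay on a 60 Hz panel. Render time is a
// made-up example; this is an illustration, not a measurement.
#include <cmath>
#include <iostream>

int main() {
    const double refreshMs = 1000.0 / 60.0; // ~16.7 ms per refresh
    const double renderMs  = 10.0;          // hypothetical render time

    // No sync: the frame is scanned out (with tearing) as soon as it's done.
    std::cout << "no sync: ~" << renderMs << " ms\n";

    // V-Sync: the finished frame waits for the next vblank, so the delay
    // rounds up to a whole number of refresh intervals.
    double vsyncMs = std::ceil(renderMs / refreshMs) * refreshMs;
    std::cout << "v-sync:  ~" << vsyncMs << " ms\n"; // ~16.7 ms
    return 0;
}
```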

Console optimisations and how they will affect you | The difference between AMD cores and Intel cores | Memory bus size and how it affects your VRAM usage |
How much VRAM do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820K with Corsair H110 | 32GB DDR4 RAM @ 1600MHz | XFX Radeon R9 290 @ 1.2GHz | Corsair 600Q | Corsair TX650 | Probably too much Corsair, but meh, should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


Not with FreeSync and triple buffering, apparently.

Good luck finding a FreeSync monitor. There's a reason AMD didn't have a full-featured press release for it; they would have if it were actually a competitor to G-Sync. I want it to be, but the way it's shaping up, it doesn't seem promising. It seems like AMD wanted something to go up against G-Sync and this was the best they could do.



Good luck finding a FreeSync monitor. There's a reason AMD didn't have a full-featured press release for it; they would have if it were actually a competitor to G-Sync. I want it to be, but the way it's shaping up, it doesn't seem promising. It seems like AMD wanted something to go up against G-Sync and this was the best they could do.

If you'd read the G-Sync vs FreeSync thread, you'd know that the vast majority of DisplayPort 1.3 monitors will have variable VBLANK.



If you'd read the G-Sync vs FreeSync thread, you'd know that the vast majority of DisplayPort 1.3 monitors will have variable VBLANK.

Show me a DP 1.3 monitor. The standard isn't out yet, let alone in monitors.



Show me a DP 1.3 monitor. The standard isn't out yet, let alone in monitors.

The standard is expected to be finalised in Q2. It's just like G-Sync: not available at the time of announcement due to hardware limitations.

 

The eDP (Embedded DisplayPort) standard already contains variable VBLANK, so tablets and laptops will have FreeSync available right away. That's where it's most relevant: at 60 Hz and 60 FPS the experience is already very smooth, whereas at 60 Hz and 30 FPS, which is where tablets and laptops often end up due to their limited performance, variable VBLANK will be a lot more noticeable. The vast majority of people don't notice the input lag from V-Sync; personally, I use variable V-Sync with triple buffering via RadeonPro and have a very smooth experience.
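The 60 Hz / 30 FPS effect mentioned above is the usual V-Sync quantisation. A minimal sketch (assuming classic double-buffered V-Sync and constant frame times; variable VBLANK avoids the snapping by holding the refresh until the frame is ready):

```cpp
// Under double-buffered V-Sync, a frame that misses a refresh waits for
// the next one, so delivered FPS snaps to refresh/1, refresh/2, refresh/3...
// Simplified: constant frame times, no render-ahead queue.
#include <cmath>
#include <iostream>

double displayedFps(double renderMs, double refreshHz) {
    double refreshMs = 1000.0 / refreshHz;
    double refreshesPerFrame = std::ceil(renderMs / refreshMs);
    return refreshHz / refreshesPerFrame;
}

int main() {
    std::cout << displayedFps(16.0, 60) << "\n"; // 60: just makes the refresh
    std::cout << displayedFps(17.5, 60) << "\n"; // 30: barely misses it
    std::cout << displayedFps(40.0, 60) << "\n"; // 20
    return 0;
}
```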




The standard is expected to be finalised in Q2. It's just like G-Sync: not available at the time of announcement due to hardware limitations.

 

The eDP (Embedded DisplayPort) standard already contains variable VBLANK, so tablets and laptops will have FreeSync available right away. That's where it's most relevant: at 60 Hz and 60 FPS the experience is already very smooth, whereas at 60 Hz and 30 FPS, which is where tablets and laptops often end up due to their limited performance, variable VBLANK will be a lot more noticeable. The vast majority of people don't notice the input lag from V-Sync; personally, I use variable V-Sync with triple buffering via RadeonPro and have a very smooth experience.

Yes, I know that, but I'm not excited until someone releases it and reviews it. Same thing with G-Sync.


