
Mantle Demo, AMD CPU Performance.

MbV93

Mantle aside....drivers only..

 

I personally think the driver improvements won't be as revolutionary as they were for the 7000 series.

Unlocking performance out of GCN on the 7000 series was new at the time.

It isn't now; any architectural improvements carried over from the 7000 series are obviously already in the 290/290X driver sets.

I'm just saying don't expect the gains the 7950/7970 had from new 290/290X drivers.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


If this nobody's tweets are correct, it looks like Mantle is the answer to all that.


Put in my Sempron and see what happens. Heheheheheh.

Main rig on profile

VAULT - File Server


Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C


Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)


Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


Hard to get excited over some random guy's tweets.


I personally think the driver improvements won't be as revolutionary as they were for the 7000 series.

Unlocking performance out of GCN on the 7000 series was new at the time.

It isn't now; any architectural improvements carried over from the 7000 series are obviously already in the 290/290X driver sets.

I'm just saying don't expect the gains the 7950/7970 had from new 290/290X drivers.

It's not a driver, and by next year maybe even Nvidia, Intel, etc. will use it.

I think people don't really get what Mantle is.

Mantle doesn't give you a performance boost by itself; it just allows a developer to code directly to the GPU/CPU/RAM, which isn't possible through DirectX or OpenGL.

How big the benefits are is up to the developer, and it's not only about performance.

You can even use the onboard GPU in the CPU at the same time as your normal GPU.

So you can split up the workload, which allows the GPU to do more effects.
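
To make that concrete, here is a minimal, hypothetical C++ sketch (invented names, not the real Mantle API) of the idea described above: with an explicit API the application itself records the work and decides which GPU each chunk goes to, so the onboard GPU and the discrete card can be fed at the same time.

```cpp
// Hypothetical sketch only -- these types and functions are stand-ins, not Mantle calls.
#include <cstdio>
#include <string>

struct Device { std::string name; };           // stand-in for a GPU handle
struct CommandBuffer { std::string work; };    // stand-in for recorded GPU work

// In an explicit API the application records commands and submits them itself.
CommandBuffer record(const std::string& work) { return CommandBuffer{work}; }

void submit(const Device& dev, const CommandBuffer& cb) {
    std::printf("submitting '%s' to %s\n", cb.work.c_str(), dev.name.c_str());
}

int main() {
    Device discrete{"discrete GPU"};
    Device integrated{"integrated GPU"};

    // The app splits the frame explicitly: heavy rendering on the discrete card,
    // cheaper post-processing on the integrated GPU that would otherwise sit idle.
    submit(discrete,   record("geometry + lighting pass"));
    submit(integrated, record("post-processing / particles"));
    return 0;
}
```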

 

RTX2070OC 


It's not a driver, and by next year maybe even Nvidia, Intel, etc. will use it.

 

I didn't mention Mantle at all. I know I'm in a Mantle thread, but I was talking about driver updates only.

What I said, now in other words:

I mentioned that the AMD 7000 series GCN driver improvements are already in the 290X, so DON'T expect miracle drivers this time unlocking potential the way Catalyst did starting with the Never Settle drivers.

Because they're already implemented.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Put in my Sempron and see what happens. Heheheheheh.

Semprons are single-core and Mantle is optimised for multi-core, so it wouldn't make much difference, I'd guess.
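
A rough sketch of why that matters, assuming nothing beyond standard C++ (no real graphics API): with an explicit API like Mantle, draw-call recording can be spread across all available threads, which a single-core Sempron simply cannot exploit.

```cpp
// Illustrative only: parallel command recording expressed as plain C++ threads.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct CommandBuffer { int drawCalls = 0; };   // stand-in for a per-thread command list

// Each thread records into its own buffer, with no shared driver lock in the way.
void recordChunk(CommandBuffer& cb, int calls) {
    for (int i = 0; i < calls; ++i) cb.drawCalls++;   // stand-in for real recording work
}

int main() {
    const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<CommandBuffer> buffers(threads);
    std::vector<std::thread> workers;

    for (unsigned t = 0; t < threads; ++t)
        workers.emplace_back(recordChunk, std::ref(buffers[t]), 10000);
    for (auto& w : workers) w.join();

    int total = 0;
    for (const auto& cb : buffers) total += cb.drawCalls;
    std::printf("%u threads recorded %d draw calls in parallel\n", threads, total);
    return 0;
}
```

On a single-core chip the loop above degenerates to one worker, so there is nothing for the API to spread out.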

Asrock 890GX Extreme 3 - AMD Phenom II X4 955 @3.50GHz - Arctic Cooling Freezer XTREME Rev.2 - 4GB Kingston HyperX - AMD Radeon HD7850 - Kingston V300 240GB - Samsung Spinpoint F3 1TB - Chieftec APS-750 - Cooler Master HAF912 PLUS


osu! profile


Hard to get excited over some random guy's tweets.

Please delete your account if you just made it to bash AMD. This "random guy" is a game developer. Do you make games that use Mantle, so you would actually know the performance gains?


Please delete your account if you just made it to bash AMD. This "random guy" is a game developer. Do you make games that use Mantle, so you would actually know the performance gains?

Please delete your account if you freak out over every little comment... Till I see actual proof, they've got nothing that's gonna make me drop my panties... so calm down, fanboy.


This is all completely pointless until we know for sure whether or not Mantle makes a big enough difference in performance. If it's not like the difference between having G-Sync and not having G-Sync, then Mantle is not as revolutionary as G-Sync. I do know they are different things, by the way; it's just that we need to measure the effect each technology has on the PC ecosystem. Even if Mantle improves performance by 50-100%, it will still not be as effective as G-Sync.

Hi. Have you seen G-Sync in person?

 

If G-Sync isn't as good as Nvidia makes it out to be... You're down a couple hundred dollars. If Mantle isn't as good as AMD makes it out to be... Oh well.

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


Hi. Have you seen G-Sync in person?

 

If G-Sync isn't as good as Nvidia makes it out to be... You're down a couple hundred dollars. If Mantle isn't as good as AMD makes it out to be... Oh well.

You don't trust Linus?


You don't trust Linus?

From what I remember, he's only seen demos from Nvidia (which are 100% created to show the worst case without G-Sync and the best case with it). But regardless, I trust he knows what he's talking about. I think G-Sync is going to make a difference, but not as large a difference in day-to-day gaming.

 

When huge developers like DICE or Crytek come out and say this is the future, I trust them as much as I trust Linus. It doesn't make sense to think G-Sync is the best thing ever (because someone said so) while thinking Mantle couldn't be awesome (because it's all talk for now).

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


You don't trust Linus?

It's not a matter of trusting Linus or not. Sure, it can make frame tearing and the visual effect of FPS jumping up and down feel less extreme, but until I play with it I'll argue that it won't feel smooth.

My logic is that anyone who's ever picked up an FPS on PC knows the difference between 30 FPS and 60 FPS. The best way I can explain my personal feel for it: sure, 30 FPS LOOKS smooth while playing a shooter, but it in no way FEELS smooth. At lower FPS the game feels sluggish and less responsive, even if it looks relatively smooth. So, great, G-Sync will make my random drops below 60 LOOK smooth, but will it somehow make the game FEEL smooth too? Until I can play with it, I'm straight up guessing no.

I'll take my chances with just getting pure higher FPS, thank you very much.
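
The arithmetic behind the LOOKS vs FEELS distinction is easy to check: even with perfect frame pacing, the delay between your input and the next finished frame is tied to the frame time, and only higher FPS shrinks that.

```cpp
// Frame time per FPS target: the baseline responsiveness that frame pacing alone cannot change.
#include <cstdio>

int main() {
    const double fpsValues[] = {30.0, 45.0, 60.0, 120.0};
    for (double fps : fpsValues) {
        double frameTimeMs = 1000.0 / fps;   // milliseconds spent rendering one frame
        std::printf("%6.1f FPS -> %5.1f ms per frame of baseline input delay\n",
                    fps, frameTimeMs);
    }
    return 0;
}
```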

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


Hi. Have you seen G-Sync in person?

 

If G-Sync isn't as good as Nvidia makes it out to be... You're down a couple hundred dollars. If Mantle isn't as good as AMD makes it out to be... Oh well.

Well, if Linus and everyone else says there is no tearing and the image is much, much smoother, then it is revolutionary. Text is apparently readable while it is moving, which is really helpful. G-Sync is something I am looking forward to more than anything else, even more than any game coming out this year other than Star Citizen. Fallout 4 will be the only game I have been looking forward to since Fallout: New Vegas came out.

G-Sync would enable devs to put in whatever objects they like with very little drop in smoothness. One way that devs optimise games is to cluster copies of the same object together into one single object. If we could go without this, we could have a much more diverse world in our video games, e.g. not all the walls/bins/cars would need to be the same. Cover could all look entirely different and vegetation could actually be varied.

As you can see, it isn't just the smoothness that G-Sync should provide; it's also the world environments, which matter more to me than achieving 120 FPS in a game. The next Fallout, for example, could have many more objects in the world, which would be really nice, as walking through the wasteland gets tiresome after you see the same shack and tree for the twelfth time. But that is just what I want out of my games; it doesn't mean you do. Maybe you just want sheer FPS, which is fine.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


So what I read out of those tweets is that AMD have found a way to improve the performance benchmarks of their CPUs without actually improving their CPUs. I wonder how many different ways you can do that with software?

Why can't we see how they tested and what the actual results are? I would be more impressed if they backed up their tweets with evidence.

Actually, software has always been the limping dog. Software having to work on a broad set of hardware is what causes the problems; the full performance of a hardware part is never fully realized unless you write a program to work on only that one set of hardware, which is how consoles stay relevant over 10-year lifespans. So that old Core 2 Duo you have lying around could probably still run most modern games maxed out; the problem is the software doesn't leverage it correctly. The AMD FX series loves threaded programs, and when a program is properly threaded it eats most other chips for lunch. AMD went with a CPU that depends on good software; the problem is it's not a perfect world, and it's simpler to go all out on one core than to do proper threading.

Unless coding for a broad set of hardware with many optimizations becomes easier, we will continue to see performance left on the table. I bet my old 8800GT still has some life in it if a game were optimized for that one card. The same holds true for many CPUs, which are rarely utilized to their full potential except under a benchmark.

The thing is, Intel and AMD have different ideas of how a CPU should be made and used. This is clear because you can't really use the broad term "core" anymore; the two designs are so different you really can't compare them.
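
As a generic illustration of that threading point (plain C++, nothing AMD-specific), here is the easy single-core path next to the properly threaded one: the threaded version has to split the job, run the pieces concurrently, and merge the results, which is exactly the extra work many developers skip.

```cpp
// Serial vs threaded version of the same divisible workload.
#include <cstdint>
#include <cstdio>
#include <future>
#include <vector>

// The job itself: sum a range of numbers (stand-in for any splittable workload).
std::uint64_t sumRange(std::uint64_t begin, std::uint64_t end) {
    std::uint64_t total = 0;
    for (std::uint64_t i = begin; i < end; ++i) total += i;
    return total;
}

int main() {
    const std::uint64_t n = 100000000ULL;

    // Easy path: one core does everything.
    const std::uint64_t serial = sumRange(0, n);

    // Threaded path: split into chunks, run them concurrently, merge the results.
    const unsigned chunks = 4;
    std::vector<std::future<std::uint64_t>> parts;
    for (unsigned c = 0; c < chunks; ++c) {
        const std::uint64_t lo = n / chunks * c;
        const std::uint64_t hi = (c + 1 == chunks) ? n : n / chunks * (c + 1);
        parts.push_back(std::async(std::launch::async, sumRange, lo, hi));
    }
    std::uint64_t parallel = 0;
    for (auto& p : parts) parallel += p.get();

    std::printf("serial=%llu parallel=%llu match=%s\n",
                (unsigned long long)serial, (unsigned long long)parallel,
                serial == parallel ? "yes" : "no");
    return 0;
}
```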


Please delete your account if you freak out over every little comment... Till I see actual proof, they've got nothing that's gonna make me drop my panties... so calm down, fanboy.

 

All your posts are on AMD topics saying "Mantle is nothing, G-Sync is better" and all that jazz. A performance boost is guaranteed, and people from the industry aren't random guys. -_-


It's not a matter of trusting Linus or not. Sure, it can make frame tearing and the visual effect of FPS jumping up and down feel less extreme, but until I play with it I'll argue that it won't feel smooth.

My logic is that anyone who's ever picked up an FPS on PC knows the difference between 30 FPS and 60 FPS. The best way I can explain my personal feel for it: sure, 30 FPS LOOKS smooth while playing a shooter, but it in no way FEELS smooth. At lower FPS the game feels sluggish and less responsive, even if it looks relatively smooth. So, great, G-Sync will make my random drops below 60 LOOK smooth, but will it somehow make the game FEEL smooth too? Until I can play with it, I'm straight up guessing no.

I'll take my chances with just getting pure higher FPS, thank you very much.

 

My understanding was that G-Sync does not control how many FPS your computer can produce, so it should have zero effect on your PC's performance.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


My understanding was that G-Sync does not control how many FPS your computer can produce, so it should have zero effect on your PC's performance.

Yes, this is true, but my point is that it won't make my experience any better if it still feels like it's running at low FPS. I don't see the point in dropping a few hundred on a new screen just to make my experience look smooth at low FPS when it's still going to FEEL like junk at low FPS. If I get low FPS, I'll adjust my settings to get a consistent 60+ FPS, even if it already looks visually smooth.

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


Yes, this is true, but my point is that it won't make my experience any better if it still feels like it's running at low FPS. I don't see the point in dropping a few hundred on a new screen just to make my experience look smooth at low FPS when it's still going to FEEL like junk at low FPS. If I get low FPS, I'll adjust my settings to get a consistent 60+ FPS, even if it already looks visually smooth.

It's not just about low FPS; it's about 59 FPS, about 45 FPS. It smooths it out across the board.

Motherboard - Gigabyte P67A-UD5 Processor - Intel Core i7-2600K RAM - G.Skill Ripjaws @1600 8GB Graphics Cards  - MSI and EVGA GeForce GTX 580 SLI PSU - Cooler Master Silent Pro 1,000w SSD - OCZ Vertex 3 120GB x2 HDD - WD Caviar Black 1TB Case - Corsair Obsidian 600D Audio - Asus Xonar DG


   Hail Sithis!


If your game feels like junk at low FPS, it is because your PC is struggling. There is no performance hit from G-Sync, which means it will feel smoother even while your PC is having trouble keeping up. This also means there will be fewer times you feel the need to drop your resolution or quality options in order to maintain 60+ frames.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Actually, software has always been the limping dog. Software having to work on a broad set of hardware is what causes the problems; the full performance of a hardware part is never fully realized unless you write a program to work on only that one set of hardware, which is how consoles stay relevant over 10-year lifespans. So that old Core 2 Duo you have lying around could probably still run most modern games maxed out; the problem is the software doesn't leverage it correctly. The AMD FX series loves threaded programs, and when a program is properly threaded it eats most other chips for lunch. AMD went with a CPU that depends on good software; the problem is it's not a perfect world, and it's simpler to go all out on one core than to do proper threading.

Unless coding for a broad set of hardware with many optimizations becomes easier, we will continue to see performance left on the table. I bet my old 8800GT still has some life in it if a game were optimized for that one card. The same holds true for many CPUs, which are rarely utilized to their full potential except under a benchmark.

The thing is, Intel and AMD have different ideas of how a CPU should be made and used. This is clear because you can't really use the broad term "core" anymore; the two designs are so different you really can't compare them.

Ergo, they have not improved the performance of their CPU: they say "CPU benchmarks indicate that @amdfx1997kid 8350 is equal to I7 4770K", which, as you point out, is not the case at all. They have simply optimised whatever benchmark they are using.

What I was alluding to is that software benchmarks can be coded to produce any result from any hardware for any purpose. And until they qualify their testing procedure and conditions and give us some actual numbers, I am treating all of this as standard BS marketing hype.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Ergo, they have not improved the performance of their CPU: they say "CPU benchmarks indicate that @amdfx1997kid 8350 is equal to I7 4770K", which, as you point out, is not the case at all. They have simply optimised whatever benchmark they are using.

What I was alluding to is that software benchmarks can be coded to produce any result from any hardware for any purpose. And until they qualify their testing procedure and conditions and give us some actual numbers, I am treating all of this as standard BS marketing hype.

I think they used it more to show how much performance gain you get from it.

Just think of how much power we're not using, when you can get i7 4770K DX11 performance out of a 2GHz 8350.

How much performance will we get when we use Mantle with an i7 4770K?

RTX2070OC 


I think they used it more to show how much performance gain you get from it.

Just think of how much power we're not using, when you can get i7 4770K DX11 performance out of a 2GHz 8350.

How much performance will we get when we use Mantle with an i7 4770K?

Assuming the performance benchmarks they are talking about will translate to real-world performance for us plebs, and assuming Mantle will provide the same boost for an i7.

I don't want to sound like a pessimist, but this is how AMD and Nvidia etc. hype up their markets, so I am not holding my breath.

However, when I see real-world figures, even if they are half as good, then I might start getting excited.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


If DICE are planning to release Mantle support by December, they should already have a fairly stable build to show off the gains.

Although you have to remember, this is DICE...


Assuming the performance benchmarks they are talking about will translate to real-world performance for us plebs, and assuming Mantle will provide the same boost for an i7.

I don't want to sound like a pessimist, but this is how AMD and Nvidia etc. hype up their markets, so I am not holding my breath.

However, when I see real-world figures, even if they are half as good, then I might start getting excited.

Keep in mind AMD is shifting driver support to the developer's side, so the devs can optimize a lot better per game than AMD can.

That's another boost in performance.

In all reality, it does look promising on the performance front.

Motherboard - Gigabyte P67A-UD5 Processor - Intel Core i7-2600K RAM - G.Skill Ripjaws @1600 8GB Graphics Cards  - MSI and EVGA GeForce GTX 580 SLI PSU - Cooler Master Silent Pro 1,000w SSD - OCZ Vertex 3 120GB x2 HDD - WD Caviar Black 1TB Case - Corsair Obsidian 600D Audio - Asus Xonar DG


   Hail Sithis!
