Mantle Demo

qwertywarrior

Thank you so much! Unfortunately I missed it, I came into the stream a bit late. I love you so darn much right now! Thanks! :D

Current Build: Case: Define R4 White/Window CPU: i5 3470 @4.0Ghz GPU: GTX 680 DCUII +500Mhz(Mem) Cooler: Hyper 212 EVO Monitor: Acer Monitor 1920x1080 MOBO: Asrock Z77 Extreme 4 Storage: 2TB HDD, 120GB 840 EVO (OS)

Future Build: 4670K, GTX 780 MSI TwinFrozr OC, Z87X-D3H, 8GB @1866Mhz, 120GB SSD, 1TB HDD, 750D, RM 650W, Custom Loop. White/Blue/Black Colour Scheme. I literally cannot wait *_*


WOW, can you ask for a worse demo?

Here is what I see wrong with it:

1- Physics and AI use the CPU, not the GPU. Mantle is useless here.

2- He talks about DirectX and how it is slower... he is basically saying it's only a tad slower. But with an AMD graphics card, you have many DirectX games that don't perform as well as on a similarly performing card from Nvidia (the reverse is true with OpenGL). It's never a big performance difference, but it's enough of a difference to cover the performance drop he displays.

3- Where is the comparison with and without Mantle? How does it compare with a similarly performing Nvidia card? If he showed that Mantle is faster than non-Mantle on AMD, and also against the competing Nvidia card for what is inside the system, THEN we would be cooking with some serious gas, and it would lend validity to the claims. Until then, this is marketing B.S., much like Sega's Blast Processing.

Of course he is going to say Mantle is the best thing in the universe. AMD is paying him to say this. This is nothing new in the business; it's sponsorship. Nvidia does it, Intel does it, and so does all the computer hardware and peripherals with the "MLG" tag on them, or some gamer who won a big tournament 'approving' a product.

We have to think objectively and ask questions. We can't just jump to conclusions because we like one company over another... because if you do, you'll be very disappointed one day.

Note: I am not saying that Mantle is some sort of lie. My point is that we lack important information.



You do realize that Mantle reduces the time the frame spends being processed in the API, which is CPU bound, meaning that those CPU loads represent a sort of "worst case scenario"? I don't think you understand the technical workings of Mantle. The goal is to remove the wasted time in the API where only one core of the CPU is in use, leaving a relatively unloaded CPU.
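That single-core API bottleneck is easy to picture with a toy model. The numbers below are invented purely for illustration; they are not real DirectX or Mantle timings:

```python
# Toy model of per-frame API submission time when draw-call submission is
# serialized on one core vs. spread across several cores (assuming perfect
# scaling). All figures are made up for illustration.

def frame_time_ms(batches, us_per_batch, cores):
    """API submission time per frame, in milliseconds."""
    return batches * us_per_batch / cores / 1000.0

# Hypothetical: 10,000 batches at 1 microsecond of API overhead each.
single = frame_time_ms(10_000, 1.0, cores=1)  # one core does all submission
multi = frame_time_ms(10_000, 1.0, cores=4)   # work spread over four cores

print(f"1 core: {single:.1f} ms, 4 cores: {multi:.1f} ms")
# A 60 fps frame budget is ~16.7 ms, so cutting submission from 10 ms to
# 2.5 ms frees most of the frame for physics, AI, etc.
```

Under these assumed numbers, the single-core case eats most of a 60 fps frame budget on submission alone, which is the "worst case scenario" being described.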

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


After seeing that, all I can think about is a 128-player Star Wars: X-Wing vs. TIE Fighter.

How cool would that be, dogfighting around the Death Star with 128 players.

Cool, and an absolute nightmare for whoever is hosting the server. That's a lot of load.

I do not feel obliged to believe that the same God who has endowed us with sense, reason and intellect has intended us to forgo their use, and by some other means to give us knowledge which we can attain by them. - Galileo Galilei
Build Logs: Tophat (in progress), DNAF | Useful Links: How To: Choosing Your Storage Devices and Configuration, Case Study: RAID Tolerance to Failure, Reducing Single Points of Failure in Redundant Storage , Why Choose an SSD?, ZFS From A to Z (Eric1024), Advanced RAID: Survival Rates, Flashing LSI RAID Cards (alpenwasser), SAN and Storage Networking



Oh, I understand that, but it has never been as big a problem as AMD is claiming; it's rather minimal. That is not what makes console games look better than on a similarly specced PC. It has to do with knowing the system configuration, knowing what it is good at and what it isn't, and adapting your code to it, essentially optimizing the game for that specific hardware. In addition, they have a goal, like: let's push the boundary of the console. On the PC side, games are barely optimized; we have powerful systems, and at worst we can upgrade. People will still buy the game, and at worst they can drop the visuals. The console version showcases their game, as that is the first platform it's being made for. You have the reverse as well: when a game is made for the PC first, and let's say the studio actually cares about running the game with the best visuals on low-end hardware, then even though it can never be as optimized (there are a million possible configurations), the console port won't be as good as other games, at least graphically.

As he clearly states in the middle of the video, it's only a bit of stuttering that appears here and there, and things like that can be fixed. The drivers can also be optimized further. Any developer will tell you that things can always be more optimized. If they were so sure about Mantle, they would demonstrate a complete comparison, but they don't. Not even a glimpse.



 

Worried about the MS stock you have?

 

DirectX can drop to 1/3rd of console efficiency and sometimes gets close to 30k batches as a high. Mantle will give console efficiency. This demo is showing DOUBLE the batch calls that a console does. THAT is why there is no direct comparison. You can't run that demo on DirectX. It is impossible. As he said, the driver/CPU can't handle it.

 

Is BF4 going to have 60k batches? Hell no. It is a first-generation Mantle game. BF4 is going to have near-console efficiency though, which absolutely kills DirectX.

 

How hard is this to understand? Do you really think the DirectX on an Xbox One is anything like what we have on Windows? It is a low-level API, just like Mantle. We are basically getting a better Glide for GCN cards. It is going to kill DirectX. Minimum fps will go up A LOT. Max fps not so much (not at first), but who cares. The minimum dips are the PROBLEM with DirectX.

 

Why the hell would anyone want Mantle and low-level APIs to fail? Are you happy needing 4 Titans for 4K gaming when you could be using 2? Having to OC chips just to try and get minimum fps up, when freakin' mobile chips power consoles? Buying new, crappy versions of Windows just so you get a new DirectX on the same kernel, with a GUI you hate and stupid cloud crap you don't want?

 

Screw DirectX, screw MS, and screw Windows 8. You don't need Windows 8 with Mantle. You can give a middle finger to MS and never use their crappy OS again. Why the hell would I ever run Windows again if I can run a Hackintosh with Mantle, or a Linux partition? So that I can use MS Paint and Notepad, and their crappy antivirus that even they say sucks? The only reason most people have Windows is to play games, and that is because DirectX has been a gun held to our heads. We have had no choice.

 

Die, DirectX, DIE.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.



Fair enough. But this API lets them use GCN the way developers treat console hardware.




But he says that the CPU is underclocked, and everything is running on the GPU...??


Yes, but by how much? A 100MHz downclock is nothing, if that is the case.

In addition, how does it compare with the performance without the CPU underclock? Nothing is mentioned here.

I don't think the AI runs on the GPU, as AI is filled with if conditions, and one thing GPUs are abysmal at is processing conditional branches. There were talks about the possibility of AMD using the (at the time newly acquired) ATI GPU to make an APU where the GPU passes any conditional-statement processing to the CPU, since the CPU is ridiculously fast at those. It would be cool to see this. The question is whether Nvidia will do the same for Maxwell (assuming they put a modified Tegra chip inside to do it; personally, if anything, I would expect to see it on the Tesla series GPUs only, at least this generation, and over future generations trickle down to consumer GeForce cards). Having this would be an amazing way to greatly improve gaming performance, make shader programming (DirectX or OpenGL) a breeze, open room for better and more complex visual effects, and make CUDA and OpenCL programming much easier and much more powerful.



 

http://www.youtube.com/watch?v=QIWyf8Hyjbg

 

This was a much longer video that explained it all. About 25 minutes in or so is when the fun begins. The 8350 is running at 2GHz (half its speed). The only thing I don't buy in the video is him saying that "maybe" an 8350 "might" be faster than a 4770K in Mantle. The 8350 can only compete with a 4770K in one thing, and integer math ain't used in gaming.

 

I totally believe that the 8350 can run at 2GHz and not bottleneck an R9 290X. If it can do it at 2GHz? The i7 can do it at 1.5 :). It isn't like he lied; he was just pumping up AMD's CPU, and that is fine, I get it.

 

In the end it doesn't matter at all; both chips will be complete overkill, as will an i5, and hell, even an i3 might be enough. Mantle is like a console API. The CPU on consoles is a mobile, LAUGHABLE CPU. It makes sense that any modern CPU will be awesome in Mantle, and some will be complete overkill. An i7-920 (if you still have one) should actually be pretty damn good. Don't throw it away. :)

 

I don't remember if he says it in the video, but a 30k batch count is what DirectX can spike to and what consoles run at. DirectX can also spike down as low as 10k. A console stays at a steady 30k, which is what Mantle should do in BF4. Bigger batch counts will come later, as shown in this tech demo.
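As a back-of-the-envelope check on why batch counts are capped by CPU-side submission overhead (the per-batch microsecond costs below are assumptions chosen to show the arithmetic, not measured API figures):

```python
# How many batches fit in a frame if each one costs the CPU a fixed amount
# of submission overhead? Per-batch costs here are assumed, not measured.

def max_batches(frame_budget_ms, us_per_batch):
    return int(frame_budget_ms * 1000 / us_per_batch)

budget = 1000 / 60                    # ~16.7 ms per frame at 60 fps
print(max_batches(budget, 1.5))       # heavier API overhead -> ~11k batches
print(max_batches(budget, 0.25))      # lighter overhead -> ~66k batches
```

Under these assumed costs, shaving per-batch overhead is exactly what moves the ceiling from the ~10k range into the tens of thousands, which is the shape of the claim being made.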



Interesting. Yes, I agree with you, you do raise good points.

Just to clarify, I am not trashing Mantle. If anything, this is good: it will move snail-paced OpenGL's butt, wake up the DirectX and Windows teams, and put them on their toes, if it is as proclaimed. But due to the lack of deeper information, I think it's a lot of smoke and mirrors. Not saying it's lies and crap, but it's not as good as mentioned, or in better words: hyped up to be.

Reminds me of Thunderbolt. Great technology, and it sure pushed USB 3.0. But at the end of the day, Thunderbolt is nothing more than an overpriced external HDD connector. Intel talked a lot about the possibilities of Thunderbolt, yet we see nothing from Intel showcasing their new technology (i.e., the equivalent of a killer game for a console, or a killer app for a phone OS). If you don't have that, then there is no push.



 

The issue is hype; sure, AMD did hype it, but AMD fans, of which there are many, did a lot more unrealistic hyping, and that is where there is going to be a huge letdown, I think. It was mentioned in another thread by someone (forgive me for not remembering who, but whoever you are, you get credit for this) that this big jump in fps all depends on the baseline fps. I.e., a 25% fps boost when your fps was 20 only gets you 5 fps, which is piss poor, to be frank. Without testing, no one can say how it scales from low end to high end and beyond, so no one can really say how good Mantle is. AMD is sorta hiding it, I think, because if you have this great system... show it off with many hardware configs. *shrugs* My take on that.
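That scaling point is worth spelling out with numbers; the 25% uplift below is the hypothetical figure from the post, not a measured result:

```python
# Same relative uplift, very different absolute gains.
def boosted(fps, uplift=0.25):
    return fps * (1 + uplift)

for base in (20, 40, 60):
    print(f"{base} fps -> {boosted(base):.0f} fps (+{boosted(base) - base:.0f})")
# 20 fps gains only 5 fps, while 60 fps gains 15 -- the same percentage,
# which is why it needs testing across low-end and high-end configs.
```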

 

But yeah, there is more hype from the fans of AMD than I think there is from AMD. It was also weird, I think, that AMD made the announcement about BF4, and I know people are putting it on DICE, saying they're fixing bugs, but you would think DICE would have made the announcement if that were the case, not AMD. It makes me wonder if there is something more going on with Mantle than we know about. (That is speculation, with a little research, but not nearly enough.)



 

I hype it not for what it promises at the beginning, but for what it can mean. With correct usage, this can remove most CPU overhead, and (in conjunction with the XDMA engines on the new 290 and 290X) it can treat CrossFire configurations as one GPU with 4GB of addressable memory and a ton of "cores". I see this as a promise of excellent CrossFire scaling and of making my 3570K relevant for much longer than it would be otherwise. I am not excited for the release itself, although another 25% in BF4 would let me get 60 frames at ultra on my 1440p monitor. I am excited for what it can mean if used correctly. DirectX has some obvious drawbacks, but it has also stunted progress in ways we don't really see yet, and this can remove drawbacks we don't even realize existed.



Have they announced any information about the demo being available for download in the near future? Would be great to be able to test it out once Mantle support is in the drivers.


Is....is that Homeworld :o

Da Rig : Mobo: Asus M5A97-r2.0 | Cpu: FX-8320 (4.1ghz) Stock cooler | Gpu: Sapphire R9 290 | RAM: 16GB Patriot Intel Extreme (1600mhz...for now >:]) | PSU: Antec 620w continuous | SSD: Corsair Force 60gb (boot) | HDD: WD 500GB 7200rpm  WD Blue 1TB | OS: WIndows 7 Ultimate 64-bit |



I agree with you, but Mantle has 1 major issue:

It's not on Nvidia!

If it worked on GTX 600+ and HD 7000+, we would see games with 50k-100k batches.

But as long as it's GCN-only, devs will continue to make 5k-10k games :/

This is a subject where we "NEED" Nvidia and AMD to get together and make a low-level API standard.

If not, it will never really be able to help.

 

RTX2070OC 



 

 

They will continue to make 30k-batch games, you mean, because that is what a console runs at. Mantle will just run them well, without hiccups like DirectX. 50-100k batches? We would probably only get that if Nvidia joined. They could add "Mantle-only" graphics, "flare" if you will, in Mass Effect or Battlefront; it could just be disabled on other APIs for fps reasons. Doubt they will bother, though.

 

If something like an R9 270 gets better fps lows than a 760, which costs 70 bucks more, and this might happen? You can be sure Nvidia will join Mantle or make their own. They aren't going to lose their a@# selling higher-tier cards at cheaper prices.




I use Windows because of software support, not games. There just straight-up aren't equivalents to foobar and MPC-HC on anything else. VLC objectively isn't as good on lower-end hardware for 10-bit video, and other audio programs are just a little gimped. Then there are all the examples that I don't use, like Creative Suite.

 

I also don't want to deal with Mac OS X's learning curve or have to rice Linux to get a GUI that doesn't make my eyes bleed. Windows 8 is objectively better than 7.



 

You think Mac OS has a huge learning curve and Windows 8 has a nice GUI. How is Bizarro World? Do all the MS shareholders and employees hang out there? Maybe that would explain Windows 8.




"Mac OS has a huge learning curve"

 

Yes. That's just true. Are you kidding here? 

 

"Windows 8 has a nice GUI"

 

Are you saying you prefer Aero over matte finishes?


"Mac OS has a huge learning curve"

 

Yes. That's just true. Are you kidding here? 

 

"Windows 8 has a nice GUI"

 

Are you saying you prefer aero over matte finishes?

I'd say that Mac OS is rather simple <.< And yes, for me Win7 is far superior to Win8. It could be a personal thing, but the majority of gamers and HW enthusiasts would agree that W7 > W8.

 

To be honest, the only people I've heard praising W8 over W7 are those who rushed to buy it on day 1 and those who got it with their prebuilt PCs. EDIT: and Linus, though his argument is the dual-network solution on W8, and that is not something most people use.

 

 

As to the topic, I'd love Mantle to be great so that Microsoft would move their behinds and do something with DirectX (at the very least); however, I fear what exactly Mantle will be, considering all the secrecy and PR hype without any relevant data.


WOW, can you ask for a worse demo?

Here is what I see wrong with it:

1- Physics and AI use the CPU, not the GPU. Mantle is useless here.

Quite the opposite: Mantle is crucial here. By reducing the API load on the CPU, the CPU has significantly more free resources to handle the physics and AI, which results in better performance.

2- He talks about DirectX and how it is slower... he is basically saying it's only a tad slower. But with an AMD graphics card, you have many DirectX games that don't perform as well as on a similarly performing card from Nvidia (the reverse is true with OpenGL). It's never a big performance difference, but it's enough of a difference to cover the performance drop he displays.

 

What you're talking about here is entirely different: developers optimize for Nvidia and AMD in DirectX by skewing their rendering paths to use code that runs better on the GPU architecture they're optimizing for. For example, in Batman: Arkham Origins, which is an Nvidia title, Nvidia made sure that a lot of Nvidia-optimized libraries are used for a bunch of different effects, including the lighting and depth of field; they also over-emphasized the tessellation.

In comparison, AMD emphasized DX11 DirectCompute for their TressFX hair physics because of their massive lead in compute performance; they also let developers go crazy with memory-constrained effects and textures because, on average, AMD cards have significantly more VRAM than Nvidia cards.

 

3- Where is the comparison with and without Mantle? How does it compare with a similarly performing Nvidia card? If he showed that Mantle is faster than non-Mantle on AMD, and also against the competing Nvidia card for what is inside the system, THEN we would be cooking with some serious gas, and it would lend validity to the claims. Until then, this is marketing B.S., much like Sega's Blast Processing.

Oxide Games is a developer independent of AMD and Nvidia; they can't go making AMD vs. Nvidia claims, because they have to work with both camps.

You'll have to wait for reviews to see a direct comparison.

Here is Mantle vs DirectX on an R9 270X and a 4770K/A8 7600

[Image: Mantle vs. DirectX benchmark slide from AMD's Kaveri Tech Day 2014]
