Showing results for tags 'ashes of the singularity'.
-
AMD Ryzen - Leaked Ashes Of The Singularity Benchmarks at 4.0GHz

When it comes to benchmarks of the Ryzen CPUs, we have been seeing quite a few of them, each one sketchier than the last. Recently a user with the rather ironic name "AMD_FanBoy" posted some AOTS benchmarks featuring: yes! A Ryzen CPU! (What else, right?) According to wccftech they were posted to the AOTS database, but quickly removed. Luckily for us, someone managed to screenshot them in time!

According to this benchmark we are dealing with an 8 core / 16 thread variant of the Ryzen lineup. If you look at the CPU's suffix in the benchmark, it says "40/36", which would indicate the CPU is running at a 3.6GHz base and 4.0GHz boost clock, as also stated by wccftech in their article. As for the benchmark itself, it was run on "Crazy [4K]" with the settings shown in the picture below.

What's interesting, however, is that the GPU used here is an NVidia Titan X and not an AMD card. While we can of course never be 100% sure, this means there is a good chance these benchmarks came from a third party, possibly a reviewer or AIB. Yes, this could also be a play by AMD and we will probably never know for certain, but it is a nice lead, and on top of that I really hope AMD doesn't call themselves "AMD_FanBoy" on the web (but hey, who knows :P).

Now, these benchmarks look pretty decent, right?! Well, when I saw this on wccftech I was REALLY interested to see how it would compare to some CPUs from Intel's lineup, so I set out on an epic journey across Google to find some AOTS benchmarks! This proved slightly more difficult than I hoped, since most AOTS benchmarks on Intel CPUs were run at 1080p, which is very understandable. After a good while, though, I stumbled across what might be the closest comparison we could get our hands on. It uses the same card (a Titan X) and was done on "Crazy [4K]" with almost identical settings.
The only exception is that "Half-resolution terrain" was enabled, unlike in the Ryzen benchmark. For comparison I got you guys the benchmarks on the i7 6950X, and then for the more game-oriented users I also got the 6700K, and what they showed kinda surprised me! Sadly I couldn't find 4K AOTS benchmarks for the 6900K or the 7700K, so if anyone finds them, feel free to link them! (The Intel benchmarks were taken from overclockers.ua)

6950X: 6700K:

As you can see from the benchmarks, the 8 core / 16 thread Ryzen CPU is holding its own against Intel's 10-core hyperthreaded 6950X. As a matter of fact, it is outperforming this $1500+ CPU (be it barely). Now, of course the 6950X is not really a CPU you would get for your next gaming rig, because of its price but mainly because gaming is not the market it's aimed at. Still, I thought it was quite a fun fact to mention, and with the lack of 6900K benchmarks I thought I'd throw it in here to at least show some comparison to an Intel CPU with a higher core count.

Then comes the comparison that I found most interesting, and other gamers along with me probably do too: the Ryzen vs the 6700K! If we take a look at the graph below, we can see the lineup of Ryzen CPUs that leaked out recently, this specific shot taken from Hardware.info. So, what we know is that we are dealing with an 8 core / 16 thread Ryzen CPU, and the benchmarked one is clocked at 4GHz boost speed. As we can see, there is not one CPU in that lineup running at 4GHz, so it's most likely an overclock, leaving us with the cores and threads to narrow down the possible CPUs. This means it could be any one from the AMD R7 PRO 1700 up to the R7 1800X, and according to this, that means it's meant to compete with at least the 7700 (non-K!) and at most the 6900K! But wait, what is this benchmark? The Ryzen CPU we see here is being outperformed by the 6700K. And looking at the suffix, the Intel chip is running at 4.0GHz; that is factory clock speed for the Intel vs an overclock for the Ryzen!
So here we can see that the 6700K is winning by anywhere from 16.1 all the way up to 40.9 FPS! Now sure, the Ryzen run had half-resolution terrain disabled, and that could possibly close the gap a little, but the gap is quite significant. On top of that, we are comparing against a 6700K here and not even the newer 7700K, which is one of the CPUs it's supposed to compete with according to AMD.

To conclude all this, I do want to point out that no one can ever be 100% sure about the origin of these images, and therefore we cannot guarantee that this is how the Ryzen lineup will perform on release. Many benchmarks have been leaked over the past while, and some of those show the Ryzen CPUs beating the 7700K without too much effort. However, I personally find these benchmarks quite convincing, partially because their origin is likely not AMD themselves, as is also mentioned by wccftech in their article. I am curious to see what benchmarks await us once these chips travel to reviewers all over the world for the official release, and I certainly hope I'm wrong in thinking AMD may have once again marketed their hardware higher than what it can actually reach. And yes, that opinion is partially based on these benchmarks but also on past experiences with AMD, so don't weigh it too strongly.

What do you guys think of this? Do you think these are legit, or do you think they are flawed and AMD will finally bring us something as strong as they keep telling us it is? Regardless, these benchmarks show that you can definitely get away with running 4K games on a Ryzen CPU even if it's being beaten by the 6700K. So at the very least, if these are legit, we are getting a gaming-capable CPU out of AMD this year, regardless of whether they were telling us the truth or not!
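Since the screenshots only show the two runs side by side, here's a quick sketch of how a gap like that works out in relative terms. The absolute FPS values below are invented placeholders; only the 16.1 to 40.9 FPS deltas match the range described above.

```python
# Hypothetical average FPS per batch type, the way AOTS reports them.
# Absolute values are made up; only the deltas (16.1 to 40.9 FPS)
# match the range described in the post.
ryzen = {"normal batch": 60.0, "medium batch": 45.0, "heavy batch": 30.0}
i7_6700k = {"normal batch": 100.9, "medium batch": 70.0, "heavy batch": 46.1}

for batch in ryzen:
    delta = i7_6700k[batch] - ryzen[batch]
    pct = 100 * delta / ryzen[batch]
    print(f"{batch}: 6700K ahead by {delta:.1f} FPS ({pct:.0f}%)")
```

A 16 FPS gap is a very different story at a 30 FPS baseline (over 50%) than at a 60 FPS one, which is why the raw deltas alone don't tell the whole story.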
Sources: http://wccftech.com/amd-ashes-ryzen-4-0-ghz-benchmarks/ https://www.overclockers.ua/news/hardware/2017-02-03/119681/ https://nl.hardware.info/nieuws/50984/volledige-line-up-amd-ryzen-processors-lekt-uit-topmodel-heet-ryzen-r7-1800x
-
Hey guys. So first of all, thanks for coming here and checking out my problem. I've been unable to play two of my favourite games recently due to them crashing. This only started happening a few days ago; I was able to play them fine until then. I play both games in DirectX 12. All other games have had no problems. Total War: Warhammer crashes a few seconds into the loading screen whenever I try to load a game, whether from the cloud or local drive, manual or auto save. This is from the Event Viewer:

Log Name: Application
Source: Windows Error Reporting
Date: 10/26/2016 10:40:17 PM
Event ID: 1001
Task Category: None
Level: Information
Keywords: Classic
User: N/A
Computer: James-PC
Description:
Fault bucket 470146, type 5
Event Name: InPageError
Response: Not available
Cab Id: 0

Problem signature:
P1: c000009c
P2: 00000003
P3:
P4:
P5:
P6:
P7:
P8:
P9:
P10:

Attached files:
\\?\C:\ProgramData\Microsoft\Windows\WER\Temp\WER2DC2.tmp.WERInternalMetadata.xml

These files may be available here:
C:\ProgramData\Microsoft\Windows\WER\ReportArchive\AppCrash_c000009c_205d69f6ab74fd2e308992ba3998e88b816acb_9f00b4c8_0467437c

Analysis symbol:
Rechecking for solution: 0
Report Id: f0c9cdc9-998b-4647-9559-c15c52d1f4f1
Report Status: 0
Hashed bucket: 7d2749361dc05922d6a071409bc80d44

Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Windows Error Reporting" /> <EventID Qualifiers="0">1001</EventID> <Level>4</Level> <Task>0</Task> <Keywords>0x80000000000000</Keywords> <TimeCreated SystemTime="2016-10-26T21:40:17.187066300Z" /> <EventRecordID>7647</EventRecordID> <Channel>Application</Channel> <Computer>James-PC</Computer> <Security /> </System> <EventData> <Data>470146</Data> <Data>5</Data> <Data>InPageError</Data> <Data>Not available</Data> <Data>0</Data> <Data>c000009c</Data> <Data>00000003</Data> <Data> </Data> <Data> </Data> <Data> </Data> <Data> </Data> <Data> </Data> <Data> </Data> <Data> </Data> <Data> </Data> <Data> 
\\?\C:\ProgramData\Microsoft\Windows\WER\Temp\WER2DC2.tmp.WERInternalMetadata.xml</Data> <Data>C:\ProgramData\Microsoft\Windows\WER\ReportArchive\AppCrash_c000009c_205d69f6ab74fd2e308992ba3998e88b816acb_9f00b4c8_0467437c</Data> <Data> </Data> <Data>0</Data> <Data>f0c9cdc9-998b-4647-9559-c15c52d1f4f1</Data> <Data>0</Data> <Data>7d2749361dc05922d6a071409bc80d44</Data> </EventData> </Event>

Unfortunately, nothing seems to show up in the Event Viewer for AOTS when it crashes, so I don't expect to get much help for that. I'll just say it crashes in-game, a few seconds in, on any mode.

Specs:
OS: Windows 10
CPU: i7 3770
GPU: GTX 970
Motherboard: P8H77-V LE
RAM: 2x 8GB DDR3
Storage: 1TB HDD
PSU: 750W

I think that's all the info. Thanks for reading, looking forward to replies!
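If anyone wants to script over these WER entries instead of eyeballing raw XML, the quoted event parses fine with Python's standard library. This is a minimal sketch on a trimmed copy of the event above; worth noting that an InPageError with status c000009c generally indicates a failed read from storage, so the HDD may be worth checking.

```python
# Pull the key fields out of a Windows Error Reporting event like the
# one quoted above, using only the stdlib XML parser. The XML below is
# a trimmed copy of the log; the namespace is the one in its <Event> tag.
import xml.etree.ElementTree as ET

xml_text = """<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Windows Error Reporting" />
    <EventID Qualifiers="0">1001</EventID>
  </System>
  <EventData>
    <Data>470146</Data>
    <Data>5</Data>
    <Data>InPageError</Data>
  </EventData>
</Event>"""

ns = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}
root = ET.fromstring(xml_text)
event_id = root.find("e:System/e:EventID", ns).text
event_name = root.findall("e:EventData/e:Data", ns)[2].text
print(event_id, event_name)  # 1001 InPageError
```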
- 5 replies
- Tagged with: warhammer total war, ashes of the singularity (and 1 more)
-
Someone shared the screencaps below of Ashes of the Singularity benchmark entries, showing new AMD APUs R5 2600H (3.6GHz + Vega 8) and R7 2800H (3.7GHz + Vega 11), as well as unlabelled GPUs 66A0:00 (possibly Vega 20) and 69A0:00 (possibly Vega 12). Just sharing.
-
Tom's Hardware has conducted an interview with Dan Baker of Oxide Games, the creator of the Nitrous Engine. Nitrous is one of the first engines to implement DX12 features at all levels, and is currently being used to develop the game Ashes of the Singularity. A few months ago, news broke that the Nitrous engine would eventually support not only traditional multi-GPU setups, but cross-vendor GPU setups as well. This announcement generated much attention from the tech crowd, and left the public with many unanswered questions.

First of all, will iGPU + dGPU setups be possible? Baker says that it will eventually be: although difficult, GPUs with highly different performance, including iGPUs, will be able to work together in multi-GPU setups. On the topic of cross-vendor GPU support, his answer likewise confirms that multi-vendor setups are supported.

Baker also explained why these new multi-GPU features are possible at all. It's all thanks to a new way of driving multi-GPU setups: Explicit Linked Multiadapter. What is Explicit Multiadapter? In short, the application, rather than the driver, explicitly controls each adapter and the work it gets. Baker goes on to say that Explicit Multiadapter also allows for much better scaling and better memory sharing. He also emphasized that although general driver optimizations will still have to be made, no special effort will be needed for multi-GPU setups on DX12. This means all DX12 games will have multi-GPU support.

One of the highly anticipated DX12 features, memory pooling, has not yet been implemented, and for good reason: although it's generally considered a low priority, memory pooling is still possible and is under consideration.

One of the unforeseen improvements in the Nitrous engine that brings more general performance was also detailed: Object Space Rendering. It essentially renders in the reverse order of current GPUs: it removes aliasing and noise before adding any detail to the scene.
Baker says that although it's not a DX12 feature, Object Space Rendering may bring large improvements in overall performance. Baker closes by saying this:

Personally, I am once again really excited for DX12. Although memory pooling seems to be a long shot, multi-GPU systems are finally getting the love they deserve. Looking forward to hearing more about this. Sorry for the long post; I wanted to break the article down in an easy-to-understand way.

Source: http://www.tomshardware.com/news/oxide-games-dan-baker-interview,30665.html
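To illustrate the explicit multiadapter idea with a toy analogy (this is Python, not actual D3D12 code): the application itself decides how to partition a frame across mismatched adapters, for example in proportion to their throughput. The adapter names and throughput numbers here are invented.

```python
# Toy sketch: split the scanlines of a frame across heterogeneous
# adapters in proportion to their relative throughput, the way an
# engine using explicit multiadapter might. All numbers are invented.
adapters = {"dGPU": 7.0, "iGPU": 1.0}  # relative throughput, made up

def split_work(total_lines, adapters):
    total = sum(adapters.values())
    shares, assigned = {}, 0
    names = list(adapters)
    for name in names[:-1]:
        shares[name] = int(total_lines * adapters[name] / total)
        assigned += shares[name]
    shares[names[-1]] = total_lines - assigned  # remainder to last adapter
    return shares

print(split_work(1080, adapters))  # {'dGPU': 945, 'iGPU': 135}
```

The point of the "explicit" in the name is exactly this: the split ratio is an application decision, not something the driver guesses at behind the engine's back.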
- 29 replies
- Tagged with: dx12, oxide games (and 4 more)
-
Hey, looking forward to finding a bunch of people to play Ashes of the Singularity with. I'm interested in:
- Human vs. Human
- Co-op Human vs. AI

For those that don't know what Ashes of the Singularity is: basically, it's a real-time strategy game currently in development, in the pre-beta phase. Its main feature is NO UNIT CAP !!!! http://store.steampowered.com/app/228880
-
PC Perspective has tested a new alpha build of the first DirectX 12 game, Ashes of the Singularity. DirectX 12 is the newest gaming API, exclusive to Windows 10, offering less CPU overhead, better threaded performance on CPUs, and lower-level access to GPU hardware. Ashes of the Singularity, made by the same people behind the Mantle-based Star Swarm demo, is set to be the first DirectX 12 game to reach the market later this year. PC Perspective got hold of an alpha build with built-in benchmarks to test how DX12 can benefit gaming, in this case a real-time strategy game. But nothing new without a little drama: this is not a surprise when we get into the actual benchmarks between the AMD R9 390X and NVidia's GTX 980:

NVidia GTX 980: AMD R9 390X:

Update 1: ExtremeTech has also published their own benchmarks, but using an AMD Fury X and an NVidia 980 Ti. They have also done so using MSAA, which NVidia actively called a "game bug" in their reviewer's guide, sent out to multiple review sites. The article includes an official retort from Oxide Games, developers of Ashes of the Singularity, stating that their AA implementation is not bugged. And the conclusion from ExtremeTech:

Update 2: Ars Technica has made a comparison between an NVidia GTX 980 Ti and an AMD R9 290X from 2013, with GCN 1.1. There have been accusations from people that there might be some vendor bias in the game, so I have quoted the official response from the Oxide blog. NVidia has also launched their Ashes of the Singularity optimized driver, which has been used in all of the benchmarks in this post (as it came with the reviewer's guide): http://www.geforce.com/whats-new/articles/geforce-355-60-whql-driver-released

Please bear in mind that this is one game, so it is not representative of all upcoming DX12 games, nor of the DX12 API itself, but merely one RTS game, which tends to be heavy on the CPU, and in this case on draw calls.
Sources:
PCPer article: http://www.pcper.com/reviews/Graphics-Cards/DX12-GPU-and-CPU-Performance-Tested-Ashes-Singularity-Benchmark
Official Oxide Games retort: http://oxidegames.com/2015/08/16/the-birth-of-a-new-api/
ExtremeTech article: http://www.extremetech.com/gaming/212314-directx-12-arrives-at-last-with-ashes-of-the-singularity-amd-and-nvidia-go-head-to-head
Ars Technica: http://arstechnica.co.uk/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/

My personal take: This is great news and an interesting introduction to the new API we all look forward to. RTS games could end up being the games that benefit the most from the much higher draw-call count, at least to begin with. We also know NVidia's DX11 drivers are more optimized and, more importantly, properly multithreaded, giving NVidia hardware a leg up that is not hardware-based. DX12 might just be the API to level the playing field, or even give AMD a big head start, or at least an advantage leading to king-of-the-hill status for their cards. This makes it even more interesting; I'd love to see PCPer do a 980 Ti vs Fury X benchmark battle. DX12 (and maybe Vulkan, for that matter) might shake up the GPU market quite a bit; if so, AMD could stand to win quite a bit from it.

Update 1: Wow, Oxide Games has officially debunked NVidia's claim that the AA is bugged. Like I said, we knew NVidia's DX11 drivers are multithreaded, giving better performance than AMD's, but seeing the playing field so levelled in DX12, to the point of NVidia being outright untruthful, is pretty shocking. Are they this afraid?

Update 2: Officially, NVidia has had access to the source code of this game for over a year. They've had access to DX12 for a long time too. They've even gotten their own shader optimizations implemented into the game. And they've released an official Ashes of the Singularity driver specifically for these benchmarks.
Either NVidia's hardware just isn't powerful enough to run this game much faster, and/or NVidia's own DX12 drivers simply aren't working as expected yet.
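The draw-call point is easy to demonstrate in miniature. The sketch below is a toy analogy, not graphics code: the same total work costs far more when issued as thousands of tiny calls, because fixed per-call overhead dominates; that per-call cost is exactly what low-overhead APIs like DX12 aim to reduce.

```python
# Toy analogy for draw-call overhead: submitting work as many tiny
# calls vs one batched call. The fixed cost of each call (function
# entry, list setup) plays the role of per-draw-call API overhead.
import timeit

def submit(items):
    # Stand-in for a "draw call": per-call overhead, then per-item work.
    total = 0
    for item in items:
        total += item
    return total

data = list(range(10_000))

many_calls = timeit.timeit(lambda: [submit([x]) for x in data], number=50)
one_call = timeit.timeit(lambda: submit(data), number=50)
print(f"10,000 tiny calls: {many_calls:.4f}s vs 1 batched call: {one_call:.4f}s")
```

On any machine, the many-calls path loses badly even though both do identical arithmetic, which is why an RTS pushing huge unit counts benefits so much from cheaper calls.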
- 282 replies
- Tagged with: directx 12, dx12 (and 2 more)