SixthSenseDJ


SixthSenseDJ's Achievements

  1. I know some people like that too, even in an otherwise high-end system - but all of them are music producers who don't give a rat's arse about gaming.
  2. The 270X is a 7870 rebrand with a 50 MHz overclock. The R9 290X is a 2nd gen GCN card. Its architecture did appear in the 7000 series - but only in the 7790, which was also a 2nd gen GCN card (the first one released, actually). I have actually owned all 3 of these cards, although I wasn't into overclocking back then.
  3. Hey guys, I've sold my reference, washer- and thermal-paste-modded 5700 XT after someone made me an offer I just could not refuse (over twice the price I paid for the card, 2 months old, in late 2019). In its place I installed a Sapphire R9 290X 4GB Tri-X OC, which cost me 125 euros. As I really like tinkering with tech, more so than actual gaming, I decided to do some in-depth testing to see how this beast of a card from late 2013 holds up today. This test is still a work in progress; I will be adding game benchmarks as well as my experiences with CrossFire later on, because tomorrow I'll be picking up a second, identical card.

The goal is not to see whether this card is a viable option for GPU-starved gamers in 2021 - it evidently is. It is about how to squeeze the most performance possible out of it. Once I have done more testing I will also include recommendations on what settings to use in several popular games like GTA V and Red Dead Redemption 2, since virtually all YouTube videos featuring this card test it at stock settings and offer no tailor-made graphics tweaks to optimise performance. Another goal is to see whether CrossFire is a viable option if you happen to play a lot of DX11 titles: a pair of these cards packs more raw processing power than a 5700 XT, yet sells for about half the price of a single 5700 XT reference card.

The test system:
- Ryzen 7 3700X on an ML240L RGB, 1.310 V, PBO settings manually tweaked (typical CB20 score: 5000-5050)
- Gigabyte AORUS X570 Elite
- 32 (2x16) GB G.Skill Ripjaws V 3200C16 (Samsung E-die), OC'd to 3600-18-22-22-42
- Kingston A2000 1TB
- Corsair RM850 (2019 revision) PSU
- Cooler Master MasterBox MB511 RGB, rear exhaust fan replaced with a MasterFan 120 ARGB; AIO fans on auto, rear exhaust manually set to 85%, 3 front fans at 60%
- Philips BDM4065UC 40'' 4K VA monitor (a fitting monitor for this test, as it was released when the 290X was the champion of single-card 4K gaming)
- Latest Adrenalin 21.3.1 driver
- Ambient temperature: 20 degrees

The first thing I did after thoroughly cleaning the card and applying some Noctua thermal compound was run some tests while monitoring temperatures and noise levels. After reading a few bits and pieces online about noise problems, I stuck a few rubber M3 washers between the shroud and the heatsink, which made the card less noisy. I settled on an 80% maximum fan speed with a curve set in MSI Afterburner; with the default fan curve the card easily hit the high 80s during Unigine Superposition even at stock clocks, too high for my liking. I also used an older version of GPU-Z to check the ASIC quality, which came out at 73%.

Now for overclocking. Note that this is the version with 6+8 pin PCIe power; another 4GB version with the same cooler exists with a boost clock of up to 1300 MHz, but it has two 8-pin connectors, just like the 8GB variant. I could get a 1150 MHz OC stable with a +63 mV overvolt and the power limit maxed out - a decent uplift over the stock boost, as the quick sketch below shows.
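For perspective, here's the back-of-the-envelope arithmetic in Python. Keep in mind the 1040 MHz stock boost is my assumption for this particular Tri-X OC model (a reference 290X boosts to 1000 MHz, and other AIB variants differ), so check your own BIOS:

```python
# Core overclock uplift relative to stock boost.
# ASSUMPTION: 1040 MHz stock boost for the Sapphire Tri-X OC 4GB;
# a reference 290X boosts to 1000 MHz, other AIB models vary.
stock_boost_mhz = 1040
oc_mhz = 1150  # stable at +63 mV with the power limit maxed out

uplift_pct = (oc_mhz / stock_boost_mhz - 1) * 100
print(f"Core OC: +{uplift_pct:.1f}% over stock boost")  # -> +10.6%
```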
Any more voltage would probably have let me push far higher (again, the 2x8-pin version has a 1300 MHz boost clock), but I was limited by power draw: further overvolting produced the same effective voltage and the same wattage, measured in Afterburner, GPU-Z and HWiNFO64 alike. The power delivery system simply cannot feed the GPU core enough electrical current for a higher effective overvolt.

Regarding memory overclocking: I managed 1650 MHz, which ran perfectly stable in Furmark but not in Unigine. I eventually got a fully stable 1500 MHz OC, after solving some random black-screen glitches when changing screen mode (MSI Afterburner's settings have an option for this). Using said OC and fan curve in Furmark, the card maxed out at 84 degrees. Afterwards I ran dozens of passes of the built-in benchmarks of a couple of games from the last 2-3 years. I consider this OC 100% stable, as I did not witness a single instance of artefacting - only some texture pop-ins, which are not related to the OC but are unavoidable when running a card like this in modern AAA titles (they happened at stock settings too).

Now for the benchmarks, all using the same overclock. Unigine Superposition:
- 4K Optimized (including full textures exceeding the VRAM limit): 4133 points, 30.9 FPS, with frame drops to 27 FPS. Had this been a game it would have been playable, as long as it's not a fast-paced shooter or racing sim.
- 1080p High: 6778 points, 50.7 FPS.
- 1080p Medium: 9446 points, 70.7 FPS.
- 1080p Extreme: 2972 points, 22.2 FPS.

For comparison: a GTX 1650 Super only hits the 2400-2500 mark in the 1080p Extreme test without an overclock. Given that a 1650 Super would need roughly 19% more performance from an overclock just to match my 290X, I think we can safely say it is the less powerful card. It is also considerably more power-efficient, but AIB models typically sell for in excess of 250 euros on eBay, and sometimes a lot more (one currently active auction has a highest bid of 270 GBP - 300 euros!).
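Here's the arithmetic behind that claim as a small Python sketch. Note the 2400-2500 baseline is the typical stock result I quoted above, not something I measured myself:

```python
# How big is the gap between my overclocked 290X and a stock
# GTX 1650 Super in Superposition 1080p Extreme?
# ASSUMPTION: 2400-2500 is a typical stock 1650 Super score range.
r9_290x = 2972                 # my measured, overclocked result
gtx_1650_super = (2400, 2500)  # assumed stock score range

for score in gtx_1650_super:
    gap_pct = (r9_290x / score - 1) * 100
    print(f"vs a {score}-point run: the 290X leads by {gap_pct:.1f}%")
# -> 23.8% and 18.9%; no realistic 1650 Super overclock closes that
```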
And if, like so many people, your goal is a stop-gap solution until you can get a high-end present-day card, you'll have a PSU easily capable of delivering the 300 watts a 290X needs anyway.

So far I am highly impressed with the results. The card soundly beats the 1650 Super - a decent GPU that costs considerably more in this market - in a benchmark known for favouring NVIDIA. The results in gaming, even at 1440p, also look promising, but more on that later once I am done tweaking: I don't just want to give a baseline performance per game, but also an optimal set of graphics settings for the best possible performance-to-image-quality ratio. I will be putting the card to the test in Red Dead Redemption 2, Far Cry 5, GTA V, F1 2018 and Mafia: Definitive Edition, all at 1440p, and CoD: Warzone at 1080p. I will also share my experiences running Far Cry 5 and GTA V at 4K Ultra in CrossFire, as both games are supported with dedicated profiles. My goal is to work out what settings these games should be run at, which mostly means avoiding microstutters rather than aiming for a stable 60 FPS. In F1 2018 I'll obviously be aiming for a stable 60+ FPS, as I consider that important in racing sims. In CoD: Warzone I will be aiming for an average of 120 FPS with minimal frame drops (never mind that my screen cannot even display all those frames ;)). I know it can be done given its performance at 1080p high settings; let's see what it takes to get there and how close we can get to 144...

To be continued... (By the way, in case anyone's interested: the custom backplate is from Portuguese PC mod shop ColdZero - www.coldzero.eu)
  4. So, no one has any idea? I have the memory running at 3600 MHz but need to set the subtimings so it runs properly at the advertised 19-20-20-40. Just setting the 4 main timings to those figures leads to my PC not booting and requiring a BIOS reset. Using the RAM calculator is of absolutely no use, since I have no idea what values to enter there (for example, the memory chip type) and the manufacturer's website does not specify them.
  5. What 'different in potential'? I am asking for data to set them up to reach their advertised 3600 MHz clock speed on AMD. These memory modules are made for 3600 MHz; they just have no XMP preset for an AMD config.
  6. Hey everyone, I built a new PC and found out my memory sticks have no pre-programmed XMP profile on an AMD platform. Has anybody managed to get them to work at 3600 MHz yet, and if so, with what BIOS settings?