Sophia_Borjia

Member
  • Posts

    169
  • Joined

  • Last visited

Profile Information

  • Location
    United States, Midwest

  1. 9th and 10th gen would both need a new motherboard if upgrading from 7th gen. 7th and 9th gen share the physical socket but use different chipsets; 8th and 9th gen use LGA 1151 with 300-series chipsets. 10th gen added hyper-threading. Going from an i5-7600 to an i7-9700 raises clocks and doubles the core count, with some extra IPC as well, so it should be pretty noticeable.
  2. A few games run better with 8 cores than 6, but 6 is enough for the vast majority. Then again, it wasn't that long ago that the conventional wisdom was 4 cores is enough for gaming. In 2-3 years those extra cores could be the difference between the old system still being good enough and needing an upgrade once new games start wanting them.
  3. The CPU for that socket only has 20 lanes: 4 go to the chipset (blue) and 16 lanes (green) go to the switch IC (yellow). I pulled the block diagram from the manual for the Asus Z390 Pro on the Asus website. Having all cards able to saturate x8 at the same time would need an HEDT/server platform like Intel Xeon, AMD Epyc, Intel X299, or IBM POWER9. In most cases every GPU won't need to max out its lanes at the same time, so this may not affect performance much, and mining definitely won't max all lanes of all cards at once (rough lane math in the sketch after this list).
  4. Everything is high right now. Over a year ago it was less expensive new; now Newegg has it at $270 new and $240 used. The 2700X might be over launch MSRP right now, lol. It was on Black Friday sale in 2019 for about $130. If you need it now, that could be a good price; in 6 months you may be able to get a much better one, if the supply issues with new parts are fixed and stop messing up the prices of everything else through shifted demand. The only thing I have seen good prices on recently is Intel 8-cores; 9th and 10th gen have had some good sales, and there were a few times the i9-9900K and R7 2700 were about even in price. For the use case listed, a used Xeon might be a good option: server boards with an ASPEED management chip have an onboard GPU, not for gaming, but it works for Docker and server use with no dedicated GPU. Motherboards with a "dedicated management 1GbE" port have this; the same chip handles the VGA output and the management LAN port.
  5. I was under the impression the recommended approach for a FreeNAS server with a hardware RAID HBA card was to reflash the firmware to disable hardware RAID and use software RAID, software RAID being considered more reliable. Buying the same equipment can't be cheaper, and reflashing hardware is more steps, so that can't be motivated by laziness. The only motive to do it would be that, in at least some cases, software RAID is the better option.
  6. Newegg has an RX 580 8GB for $400, and a 1050 Ti for under $180.
  7. A 5700 XT makes about $3.60-3.70 per day after electricity cost. Also, the power a GPU uses heats the room as efficiently as electric resistance heat, so in winter, if you're running electric heat anyway, the electricity is effectively free, unless racks of cards like the pro miners use draw more power than the heater would. If crypto prices don't crash first, that's less than a year to make $1000 mining on a 5700 XT (rough payback math in the sketch after this list). If coin prices don't drop, a 3060 Ti will still be way over $1000. More cards mining lowers the earnings per card, so sooner or later the power will cost more than the hashes are worth, GPU inflation will come down, and miners will start flooding the market with used cards.
  8. Might be worth keeping; one makes almost $4 per day mining right now. More mining drives down the returns per card, which eventually lowers GPU prices. So if enough gamers add to the number of cards mining, it will crash the returns for miners and make GPUs affordable again once mining is no longer as profitable.
  9. Those are near $1000 right now: $986 to $1399 for an RX 5700 XT on Newegg.
  10. Whether resolution vs. screen size is an issue depends on preference. I had a 32" 720p TV hooked up as a secondary monitor, and it wasn't bad for desktop use or games, same as a console on that TV. Reading PDFs or ebooks or using MS Office on it all day would not be a great time. Photo editing was fine; I could zoom in to see detail, I just couldn't see the full image at full res at the same time. To get an idea of how it would look: measure 3 inches past the edge of your monitor along the diagonal line through the corners of the screen, note the angle of vision that fills at your normal viewing distance, then lean in toward the monitor until the angle matches. That is how big the pixels will look on a bigger screen of the same resolution at normal distance (a quick angular-size calculation is in the sketch after this list). Not sure what R6 is indicating, an abbreviation of a game title I'm guessing. A 1050 Ti won't play most games at 144 Hz, but it might manage older or indie games with lower requirements. You should be able to benchmark it to check; the GPU will still report the higher FPS even if the monitor can't display it.
  11. Yes, the RX 6800, RX 6800 XT, and 6900 XT are all the same Big Navi die.
  12. If higher screen resolution means higher-resolution textures, then RAM use goes up with the square of the texture dimension; if mostly the same textures are used, RAM use won't change much. Note that a 4K texture is 4096x4096, not the 3840x2160 of a 4K picture. Texture sizes also vary within a game at the same screen resolution, so it is not as simple as 4K texture res for 4K screen res. The GPU can also be doing particles and physics, so the percentage of RAM used for textures can vary a lot. A 4K photoscanned texture may not compress well, while an 8K procedural one could compress to be smaller than a 2K one. It can vary hugely from game to game (see the texture-memory sketch after this list).
  13. I have heard the initial batch includes a rebate from Nvidia for the board makers, and that the cost to make the cards is very high for board partners while Nvidia keeps its own margin high. So it would not be surprising if, once availability is normal, prices sit above MSRP across the board. Not by 80%, more like 10-25%. If an RX 6800 16GB and an RTX 3070 8GB end up at the same real price when both are in stock, just wow. Interesting times...
  14. An alternative available now: the i9-9900K is $319 on Newegg right now. I would try for an RX 6800 if going with new Ryzen; the new feature that boosts performance with a matched CPU/motherboard/GPU (Smart Access Memory) should make it both cheaper and higher performance than that 3070. A few games are even recommending over 8GB of VRAM, so 8GB cards may start hitting memory issues with future games. Or at least wait for a 3070 Ti with more RAM.
  15. The 4000 is important to note along with the latency: latency is measured in clock cycles, so if the clock is faster... DDR4-4000 CL19 is the same absolute latency as CL15.2 at 3200 or CL17.1 at 3600 (worked out in the sketch after this list). Higher clocks sometimes mean loosening timings, so DDR4-4000 CL19 is probably better latency overall than 3600 CL17: if the 4000 CL19 kit were run at 3600, the timings could likely be tightened to better than CL17, while if the 3600 CL17 kit were run at 4000, the latency would probably be worse than CL19, if it could do it at all. So DDR4-4000 CL19 is good latency. My question would be: why buy such good RAM but only go with 6 cores? Most games now only use 6, but a few will use 8 well, like Watch Dogs: Legion and probably Far Cry 6. Games in the immersive-sim style and open-world games use more cores than average, since they have lots of systems and AI running in the background that the CPU handles rather than the GPU. That 6-core should play everything out now or about to come out very well, but in 2-3 years we will be seeing a lot of titles that run better on 8, and a few that even use 12. If you plan to buy Zen 5 at launch as well it won't matter, but if you want to keep using the system for the next 4-5 years, I would go for the 5900X with cheaper RAM (I paid less for 32GB of 3200 B-die last year), a PCIe 3.0 SSD, and a cheaper RX 5700 XT (I paid over $100 less almost a year ago for one) if the cost increase is too much. The extra cores should be cheaper in the long run, since they will keep the system playing new releases longer before another upgrade.
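
A rough sketch of the lane arithmetic from post 3, in Python. It assumes the 20-lane mainstream CPU split described there (4 lanes to the chipset, 16 to a switch IC feeding the GPU slots); the three-card x8 demand is an illustrative example, not something read off the Z390 Pro block diagram.

    # Rough PCIe lane budget for a mainstream (Z390-class) CPU, per post 3.
    # Assumed split: 20 CPU lanes total, 4 reserved for the chipset link,
    # the remaining 16 feeding the switch IC that the GPU slots hang off.
    CPU_LANES = 20
    CHIPSET_LANES = 4
    SWITCH_LANES = CPU_LANES - CHIPSET_LANES  # 16 lanes into the switch

    # Hypothetical worst case: three cards all wanting x8 at the same time.
    slot_demand = [8, 8, 8]
    demand = sum(slot_demand)

    print(f"lanes into switch: {SWITCH_LANES}, peak demand: {demand}")
    if demand > SWITCH_LANES:
        # The switch time-shares its 16 upstream lanes, so simultaneous
        # saturation is oversubscribed by this factor.
        print(f"oversubscribed {demand / SWITCH_LANES:.1f}x; fine for mining,"
              " needs HEDT/server lanes for sustained full-bandwidth use")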
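
A minimal payback sketch for the mining math in posts 7 and 8, assuming the $3.60-4.00/day net figures and the $1000 target mentioned there; it deliberately ignores difficulty and coin-price changes.

    # Days to earn back a GPU's cost from mining, using the numbers in the
    # posts: roughly $3.60-4.00/day net of electricity on a 5700 XT.
    def days_to_recover(card_cost_usd: float, daily_profit_usd: float) -> float:
        """Simple payback period; ignores difficulty and coin-price swings."""
        return card_cost_usd / daily_profit_usd

    for daily in (3.60, 3.70, 4.00):
        days = days_to_recover(1000, daily)
        print(f"${daily:.2f}/day -> {days:.0f} days (~{days / 30:.1f} months) to reach $1000")

    # Winter wrinkle from post 7: if the room is electrically heated anyway,
    # the card's power draw offsets heating, so the net is even better.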
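
Post 10's "lean in until the angle matches" trick, put into numbers: apparent size is just the angle the screen subtends at your eye. A sketch assuming flat-screen geometry; the 32"/24" sizes and 30" viewing distance are example values, not from the original post.

    import math

    def subtended_angle_deg(diagonal_in: float, distance_in: float) -> float:
        """Angle (degrees) a screen diagonal fills at a given viewing distance."""
        return math.degrees(2 * math.atan((diagonal_in / 2) / distance_in))

    def pixels_per_degree(diagonal_in: float, distance_in: float,
                          horiz_px: int, vert_px: int) -> float:
        """Rough apparent pixel density along the diagonal."""
        diag_px = math.hypot(horiz_px, vert_px)
        return diag_px / subtended_angle_deg(diagonal_in, distance_in)

    # Example values (assumed): 32" 720p TV vs 24" 1080p monitor, both ~30" away.
    print(f"32in 720p:  {pixels_per_degree(32, 30, 1280, 720):.1f} px/deg")
    print(f"24in 1080p: {pixels_per_degree(24, 30, 1920, 1080):.1f} px/deg")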
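
The "square of the texture size" point from post 12, as a quick sketch. It uses plain uncompressed RGBA8 plus roughly a third extra for mipmaps; real games use compressed formats, so treat these as upper bounds.

    # Uncompressed VRAM cost of a square RGBA8 texture with a full mip chain.
    # Doubling the edge length (2K -> 4K) quadruples the memory, per post 12.
    def texture_mib(edge_px: int, bytes_per_pixel: int = 4, mipmaps: bool = True) -> float:
        base = edge_px * edge_px * bytes_per_pixel
        total = base * 4 / 3 if mipmaps else base  # mip chain adds about a third
        return total / (1024 ** 2)

    for edge in (1024, 2048, 4096, 8192):
        print(f"{edge}x{edge}: ~{texture_mib(edge):.0f} MiB uncompressed")
    # 2048 -> ~21 MiB, 4096 -> ~85 MiB: one resolution step is 4x the memory.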
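
The CAS-latency conversion behind post 15, spelled out: latency in ns = CL x 2000 / transfer rate in MT/s, since DDR transfers twice per clock cycle. The kit list is just the numbers quoted in the post.

    # Absolute CAS latency in nanoseconds for a DDR4 kit.
    # Cycle time (ns) = 2000 / transfer rate (MT/s), since the clock runs at
    # half the transfer rate; first-word latency is then CL cycles.
    def cas_ns(cl: float, mt_per_s: int) -> float:
        return cl * 2000 / mt_per_s

    kits = [
        ("DDR4-4000 CL19", 19.0, 4000),
        ("DDR4-3200 CL15.2", 15.2, 3200),
        ("DDR4-3600 CL17.1", 17.1, 3600),
        ("DDR4-3600 CL17", 17.0, 3600),
    ]
    for name, cl, rate in kits:
        print(f"{name}: {cas_ns(cl, rate):.2f} ns")
    # The first three all come out to 9.50 ns; plain 3600 CL17 is 9.44 ns,
    # so in absolute terms they are effectively the same latency.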