Everything posted by jones177

  1. I am running 2 SLI systems now, one with GTX 1080s and one with GTX 1080 tis. 90% of the time these computers are only using one card, and anyone building an SLI system should understand that. The GTX 1080 ti SLI computer is not playing any games that support SLI right now, but the single 1080 ti does fine at 1440p. The two GTX 1080s are hooked up to a 4k monitor and only play SLI games, but not often. My two gaming computers use RTX 2080 tis, and at some point they will be in the same computer, but both will be replaced by 3080 tis. I have SLI games that I have 1000s of hours in and plan to do 1000s more, so SLI is worth it to me, but I have never bought a GPU just for SLI.
  2. I still use NMM on SE. I download a mod manually and then install it with NMM. Download here. https://www.nexusmods.com/site/mods/4
  3. I look like that now but it took 38 years of gaming.
  4. More is better, but at what cost? I use a 38" 3840 X 1600 ultrawide. It is an LG and has some of the features your new monitor has. I could only recommend it with an RTX 2080 ti. My ultrawide computer is a perfect match for that monitor and therefore a joy to use. My 4k computer struggles even with an RTX 2080 ti in some games, but is really nice for productivity. 4K monitors are great if you use 2 or more computers like I do, but I would not use one as my only monitor. For productivity 1440p did not work for me; text looked about the same as 1080p. In games 3440 X 1440 was great, but a GTX 1080, which has about the same power as your Vega, was only good for keeping all frames over 60 and not much else. If you like high refresh rates you would have to drop the resolution anyway. I have a 1440p 144hz monitor but I prefer using the 38" slower (75hz) ultrawide. If you were doing 3D or image editing I would say go 4k and suffer the poor quality 1440p 60hz you get from a cheap 4k monitor in games, but with all the activities you mentioned and your hardware I cannot think of a better choice.
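The trade-off described above comes down to raw pixel counts: GPU load scales roughly with the number of pixels rendered, all else being equal. A throwaway sketch comparing the resolutions mentioned (the ratios are simple arithmetic, not benchmarks):

```python
# Pixel counts for the resolutions discussed above.
# GPU load scales roughly with pixel count, all else being equal.
resolutions = {
    "1080p (1920x1080)":    1920 * 1080,
    "1440p (2560x1440)":    2560 * 1440,
    "3440x1440 ultrawide":  3440 * 1440,
    "3840x1600 ultrawide":  3840 * 1600,
    "4k (3840x2160)":       3840 * 2160,
}

base = resolutions["1080p (1920x1080)"]
for name, px in resolutions.items():
    print(f"{name}: {px:,} pixels ({px / base:.2f}x 1080p)")
```

This makes the claims above concrete: 4k is four times the pixels of 1080p, while the 3840 X 1600 ultrawide sits at roughly three quarters of a 4k panel's load.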
  5. Yes. It suits your specs very well and has nice features. I do prefer using a 16:9 for web design since resolution testing is easier, and that is where I usually screw up.
  6. I am a 4k gamer, and with the current hardest to run games I am lucky to get 60fps with an overclocked 2080 ti and a 5ghz CPU. That is 59fps average on RDR 2 and 61fps average on AC:O. I expect games to get more demanding going by the new console specs, so 4k will get harder to do. All my 4k monitors were not good at displaying 1080p, and my VA 4k monitor is not even good for 1440p, so dropping the resolution to get good performance is not an option on them. The only future-proof 4k monitor worth having is not a monitor at all but a 2019 or newer 4k LG OLED TV. It does 4k 60hz now and 1440p 120hz. It does 1080p, 1440p and 4k equally well and is G-sync compatible. They are the only displays I would buy for entertainment, and the only downside is the cost (so far).
  7. I judge open world games by CPU lows, and Witcher 3's lows were quite high compared to other open world games that came out at that time. When Witcher 3 came out I used an i7 2600k with a GTX 980. The CPU lows were in the 50s, so playing at 1440p and keeping the frame rate in the 50s produced a totally smooth experience. By contrast, with the same CPU, Bethesda open world games since Skyrim had CPU lows in the 30s, so they were never totally smooth until I had a CPU that could keep the lows above the frame rate I played at. These games also load as you go (no load screens), so what the games are stored on makes a big difference in their performance. Your CPU has a higher IPC than an i7 2600k, so staying smooth at 60fps should be easy for it. Your GPU is slightly faster than a GTX 980, so again running at 60 should be smooth. If you play at 1080p at 90fps your CPU lows will drop you to 60, and that is rough. At 1440p at 60fps the lows don't exist, so it is a better experience. Witcher 3 is still the smoothest open world game that I have played, with Outer Worlds being the second.
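The rule of thumb above reduces to a one-line comparison: if your target frame rate sits at or below the CPU lows, dips never become visible; if it sits above them, every dip is a stutter. A minimal sketch of that logic (the numbers are the ones from the post, not measurements of your system):

```python
def feels_smooth(target_fps: int, cpu_low_fps: int) -> bool:
    """Play at or below your CPU lows and dips never show;
    play above them and every dip is a visible stutter."""
    return target_fps <= cpu_low_fps

# Witcher 3 on an i7 2600k: lows in the 50s, frame rate kept in the 50s
print(feels_smooth(50, 52))   # True -> totally smooth

# Bethesda open worlds on the same CPU: lows in the 30s, 60fps target
print(feels_smooth(60, 33))   # False -> never totally smooth
```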
  8. Yes. I was a freelance 3D artist before I retired, so when one computer was rendering and I had no 2D work to do on the other I would play, pausing the game to set up the next render. Skyrim would get paused 2 to 4 hours at a time. Fallout 4 crashes to desktop if paused too long, so its hours are actual play time. It is also crazy modded, and even my i7 6700k computer below can't run it.
  9. Things seem to work well or they don't. I have had that issue in the past with Nvidia GPUs, and it always ended up being the motherboard. I have had issues between same make and model boards with only the revision numbers being different. Like you, I would put another card in and the computer would be fine. That is why I never sell my old cards. Fortunately for me there are usually four gaming PCs in the house, and the card will work in one of them. If you only have one gaming PC, getting a very different card is the right move.
  10. For 32" you may have to pay $50 more. LG 32UD59-B 32" Class 4K Ultra HD Freesync monitors cost about $350.
  11. Heaven only uses one CPU core and likes high IPC so it is a good test for GPUs with Intel CPUs. Sort of like an old video game. Your score is about normal for a blower system. The cooler the card the better the score.
  12. The last card I got for my i7 2600k was a GTX 1080 and I had no issues, so I would go 2060. GTX 1080 = RTX 2060.
  13. I use an OLED TV as a monitor, and on it one of the best features is the quality of the 1440p output. It is even better than my dedicated 1440p monitors considering the size. I am a 4k snob that thinks 1440p is low res, but I like 1440p on high end TVs. Try running at 1440p before you upgrade. My i7 6700k was fine for 4k with a GTX 1080 ti (= 2080). It did suffer a bit at lower resolutions. I replaced it with a 6 core for gaming since it is a lot smoother experience. I have 2080 tis, and I would not pair one with an i7 6700k. They like high IPC, so at a minimum I would only use an i7 8700k or an R5 3600X. Any CPU I would buy for one needs to be 5ghz plus.
  14. Could be a few things. I have had SATA ports fail on motherboards. It ended up being an Intel chipset issue. That is the main reason I retired my i7 2600k: first SATA 2 went slowly, and when SATA 3 started to go I upgraded. I have not run AMD boards long enough to have it happen on them. If firmware goes bad on any drive in a system it causes freezes, since Windows checks the drive over and over again. I have only had it happen on hard drives, but I have only bought 4 SSDs and I have bought many, many HDDs. If a CPU is at a true 100% it has no cycles left for disk access. This is normal for rendering with 3D Studio Max (Nvidia Mental Ray); I can't even access the drive over a network without about a minute of delay. Poorly written software can do this too. Check the Performance Monitor.
  15. I had a 32" 1440p (144hz) monitor beside my 32" 4k (60hz) for a while and did not like it at all. For non gaming it was too fuzzy (low res). It doesn't take many more pixels to stop looking fuzzy, since my 3840 X 1600 ultrawide manages it. For years I prepared images for large format printing. All the errors that got past me were because I did not zoom in enough. If I started doing it again I would only use 4k monitors.
  16. Right now you have a balanced system. If you get a 2080 ti it will be unbalanced and you will want to upgrade your CPU. 2080 tis like high IPC and frequency. I would hate to run a 2080 ti on an R7 2700X, but I would not mind running one on an R7 3700X. I would upgrade the CPU first so you get the frames with a 2080 ti or a next gen GPU.
  17. I used my 55" Sony from 2006 to 2019. It is only when I decided to hook up a computer to it that I started looking for a new one and only got one when a G-sync compatible TV was available. If I wasn't a 4k gamer I would have waited until it broke or 4k became the new broadcasting standard. What I recommend is that you ask your friends to buy one for you. That will shut them up.
  18. I had the same type of rig. I upgraded to a 2080 ti. I retired the i7 8700k because it could not do 5ghz cool and stable, and got an i7 8086k. If I was doing it today I would go with an i9 9900KS. I replaced my 1tb SATA SSD with a 2tb 970 EVO; if you don't mod games there is no point in that upgrade. I went from 16 to 32gb of ram, and that is the only upgrade I could have done without. I like the Cinebench R15 single core score of the i9 10900K, so that may be my next upgrade. If you are happy with the performance of your rig, don't bother. None of the upgrades were game changers. They just allowed me to turn things up a bit.
  19. In the tests that I have done, the fewer cores a game uses the more likely it will bottleneck at 1440p with a 2080 ti. My 5ghz CPUs are for my old modded 1 and 2 core games, not my modern ones. At 3440 X 1440 I have never gotten a bottleneck with a 2080 ti, even with a non overclocked i7 8700k, so I consider it the ideal resolution for a 2080 ti. There are exceptions like AC:O. It uses more CPU at 1440p but less GPU than at 4k; at 4k it uses less CPU and more GPU. So it seems to be bottlenecked, but I think it would run better with more cores than the 6 I have. I play RTX games at 1440p with most frames over 60. It takes a bit of time to get the settings right, but it is worth it. It is definitely worth getting. With the monitors I have now I really don't need a 3080 ti unless it has HDMI 2.1. With HDMI 2.1 my LG OLED TV will do 120hz at 4k; then I will need a 3080 ti. My ultrawide monitor is a 75hz 3840 X 1600 monitor. I would like to get a higher refresh rate version, but not at their current prices. My FTW3 Ultra can keep most frames over 75 even in demanding games, so it will most likely not get a GPU upgrade. You are right about driver optimization. When I got my first 2080 ti it was not much faster in games than my 1080 ti. In benches it destroyed the 1080 ti.
  20. I have been using an LG OLED as a monitor since last November. If I walk away from the OLED I switch it off. Off is really standby, so it comes back on in an instant. As I am typing this on the OLED it will dim and brighten. This is one of the many features it has to prevent burn in; two others are picture shift and pixel refresher. For Windows I set it to turn the screen off in 30 minutes, but I always turn it off with the remote. The thing to keep in mind is that it is not a monitor. My 38" monitor that is a few feet away from the OLED stays on without a screen saver all day. It cost the same as the OLED but is a totally different animal.
  21. There are seven Windows computers in my house. None of the installs are quite the same. Two of the computers have very close specs. One has issues with every other update and one has no issues at all. I install from CD. With other methods I get blue screens and random shutdowns. It was so bad that I am afraid to try another method. My new Windows 10 CDs are less stable than my old ones so I install with the old and use the keys of the new.
  22. Thanks. I have decided to replace my Corsair 860 since I think it may have a flaw. Occasionally the EVGA FTW3 Ultra is power starved. It may be an issue with the card, but it runs fine on my other PSUs. What happens is I will load a game and notice I am getting a low frame rate (stutter). I switch on MSI Afterburner and the readings seem to be stuck. The GPU temperature will be in the 40s, which is well below its operating temps. The power limit will be stuck around 61 to 67%, and the clock usually shows 2115mhz, which is what the card spikes to before it heats up under load. This condition survives a reboot, so switching off then on is the only way to get rid of it. I think it is called a bios freeze, from what I read. The Corsair did fail running two GTX 1080 tis, and that should not have happened either. Nowhere did I find a post about someone having issues with an 850 watt unit and that setup. If the FTW3 Ultra gets a bios freeze with the new PSU it will be RMAed.
  23. That is a little further than I want to go. How did you calculate your 400 watts max load?
  24. My i7 8086k at 5.1ghz uses 152 watts max. You have one, so you could test it yourself. My EVGA uses 387 watts max. That is a bit over what it states in the bios, but I posted the bios images here. The only bench/game that hits my CPU and GPU hard at the same time is AC:O, and that is how I test real world max output. All my synthetic benches hit one or the other, but none hit both at the same time. GPU-Z tells me I am using 383 watts max when playing. The ASUS EC sensor on my Hero board, read through HWiNFO, tells me the CPU power usage is 132 watts when playing. The power meter plug readings agree well with what the software is telling me, so I would like to know what is off. Is it GPU-Z, HWiNFO or my power meter? Do you think the software is catching spikes in power usage that the power meter does not see, like on the other side of the PSU (not the wall side)? All I want to do is figure out what is causing the discrepancy.
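One source of discrepancy worth ruling out: GPU-Z and HWiNFO report DC-side component power, while a plug-in meter sees the whole system through the PSU, so the wall reading should always be higher by the rest-of-system draw plus PSU conversion losses. A rough sketch using the readings quoted above; the PSU efficiency and "rest of system" figures are assumptions for illustration, not measurements:

```python
# Readings from the post (DC side, reported by software sensors)
cpu_w = 132            # HWiNFO via the ASUS EC sensor, while playing
gpu_w = 383            # GPU-Z, while playing

# Assumptions, not measurements
rest_w = 60            # assumed: motherboard, RAM, drives, fans
psu_efficiency = 0.90  # assumed: typical 80 Plus Gold at this load

dc_load = cpu_w + gpu_w + rest_w       # what the components actually pull
wall_draw = dc_load / psu_efficiency   # what a plug-in meter should show

print(f"DC load: {dc_load} W, expected wall reading: {wall_draw:.0f} W")
```

On millisecond transients: software sensors typically poll around once per second and cheap plug meters average over even longer windows, so neither reliably catches the brief power spikes modern GPUs are known for; only an oscilloscope on the 12V rails would settle that part of the question.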