Everything posted by jones177

  1. My work computers were for 3ds Max, Photoshop and V-Ray, so my needs may be a bit different. All my rendering was CPU-based, so that comes first. I would go R9 3900X or R9 3950X if the budget allows, but I would not be unhappy with your choice. Since rendering can cause heat buildup I avoid cheap motherboards, but the Prime has basically the same VRM as the Hero, so I would go with that for a non-overclocked build. For an overclocked build I would go with a high-end Gigabyte this gen. I did not use GPUs for rendering, but I did use them for Photoshop. 8GB is my VRAM minimum, so the 2070S is a good choice. Unless all your drives are 970 EVOs there is not much point in using one, since your workflow is only as fast as your slowest drive. I recommend a big SATA SSD. I stopped putting 7200rpm HDDs in my rendering computers years ago (2007). If I need slow storage I keep it remote; local HDDs tend to slow things down. That could be a 3ds Max/Photoshop thing, but to me it is 1 second vs 5 seconds on average load times (30MB file). For the PSU, save some money and go with a 1000 watt unit. My 1000 watt PSU will be running two RTX 2080 Tis later this year and I don't expect any issues.
  2. My 1440p 16:9 and 21:9 VA monitors were fine in games but poor for text. My 4K VA monitor is good in games (60Hz) and for text. When I had cheap IPS, VA and TN monitors running at the same time, I preferred the VA for gaming; neither the TN nor the IPS had enough shadow detail for me. I now use an expensive IPS monitor with HDR, and it does have enough shadow detail to make it a better experience. Right now I am using a fourth option, and that is OLED.
  3. I have two i7 8086Ks; one is at 5GHz all-core and the other at 5.1GHz. Both use Noctua NH-D15s. They idle in the low 30s and game in the low 70s. Both use der8auer's 5GHz overclock at 1.344v. With higher clocks the idle temps move into the 40s, so I stick with 5 and 5.1GHz on air. There is also no point in going higher, since the next big performance gain over 5GHz is at 5.35GHz, and that is not in 24/7 overclocking territory. Without an RTX 2080 Ti there is not much performance difference between 4.8 and 5GHz; I got only a 3 to 5 fps difference when I used GTX 1080 Tis. That changed with the 2080 Tis. I recommend running some game benches to see what the difference between 4.8 and 5GHz is before doing a swap.
  4. Some stress tests are hot. To test CPU overclocks I use Cinebench R15; if temps are in the low 80s I am happy. For "game like" temps I use CPU-Z, and I like the low 70s for that. For stability: if an overclock can get through Cinebench R15, it can get through most games. I only own 2 games that use most of my CPU. One is AC:O, and I keep it on both my gaming computers for testing. If your cooler can't maintain around 34c at idle, it is too small or you have a hot chip. The last hot chip I had was one of my i7 6700Ks. Since I had two that used the same motherboard it was easy to compare. It did need an NH-D15 to keep it cool, but with a better motherboard (ASUS PRO vs ASUS HERO VIII) it runs fine with an NH-U14S. The i9 9900K I overclocked to 5GHz uses 1.27v stable. It uses an NH-D15 with one fan. The motherboard is an AORUS Ultra; I would not use less. It idles in the low 30s, games in the low 70s, and is totally stable.
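As an aside, here is a minimal sketch of automating that kind of idle/load temp check while a stress test runs. It assumes a Linux machine with psutil installed and Intel's "coretemp" sensor; the poster is reading Windows tools like CPU-Z, so this is an illustration of the method rather than his actual setup:

```python
# Poll the hottest core temperature while a stress test (e.g. Cinebench) runs.
# Assumes Linux + psutil; the "coretemp" sensor name is Intel-specific.
import time
import psutil

def log_temps(duration_s=120, interval_s=2):
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        cores = psutil.sensors_temperatures().get("coretemp", [])
        if cores:
            reading = max(t.current for t in cores)  # hottest core this tick
            samples.append(reading)
            print(f"{reading:.1f}C")
        time.sleep(interval_s)
    if samples:
        print(f"max {max(samples):.1f}C  avg {sum(samples)/len(samples):.1f}C")

if __name__ == "__main__":
    log_temps()
```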
  5. GTX 1080s vary in performance. If you bought a non-factory-overclocked version, your score is about right. If you bought a factory-overclocked version, your score is low. Published Time Spy scores are usually overclocked to the max, so comparing a 24/7 setup to them is not practical in my view. Here is the score of my stock EVGA 1080 SC. It is paired with an i7 6700K, so the CPU score is lower. The EVGA factory overclock on the card leaves no headroom for manual overclocking. I usually test GPUs with the Heaven bench since it uses only 1 CPU core and it loops in a window. This is what it looks like. Since the R7 2700X has about the same single-core strength as a stock i7 6700K, you should be close to this score.
  6. My ex only liked it when I did dangerous things that could kill me. She did get me into programming computers, because she told me I was too stupid to do it. That started my hobby.
  7. I need my high-end PCs to do what I like. If I could do with less, I would. Before the year 2000 my PCs were average, but then I bought MS Flight Simulator 2000 and my average rig could only do about 10fps on the ground and 15fps in the air. With MS Flight Simulator you can add content, and I did for years, so when I stopped playing I was only doing 30fps in the air and 20 on the ground with a high-end PC. Then it was MS Train Simulator, and I loved my 100-plus-car coal trains with multiple engines. That is lots of physics running on one core, so I bought the fastest CPUs I could afford. I then went fantasy with Oblivion in 2007 and modded it until Skyrim came out in 2011. It needed more than a 7200rpm HDD to load in the extra content as I moved around the world, so I started using 10,000rpm drives that cost more than double what standard drives did. They removed stutter. I am now moving from SATA SSDs to NVMe M.2 (2TB 970 EVO) for the same reason. With a heavily modded Skyrim, powerful cores were not enough; it needed a powerful GPU with lots of VRAM as well. I went from a GTX 670 (2GB) to a 680 4GB just for the extra VRAM and skipped the 700 series because they only had 3GB. I now need more than the 11GB I have with the 2080 Ti, since I am using 10.5 on one of my modded games. Then in 2015 I discovered 4K and wanted to play everything at 4K. My Skyrim (with ENB) could do 60fps at 1080p with a GTX 980 but less than 20 at 4K, so the goal was to get it to 60. It took an RTX 2080 Ti to do it. Now ray tracing has come. Right now I can do 1440p 60fps in my RT games with it on, but the goal is again 4K 60, and soon that will change to 4K 120fps. Then it will be 2160p 21:9 120. Right now I am playing Space Engineers at 4K averaging 120fps. As I build more and more, the frames will go down, but I will still build more. It will never end.
  8. That won't last. It gets easier as they get older, and at some point it gets hard again. My son has gotten my computer hand-me-downs since he was 8. He will be 40 this year. He travels a lot for work, so he has a nice gaming laptop. I help him with his desktop.
  9. In recent times: a 32" 1440p 144Hz G-Sync monitor, 32GB of 3200MHz RAM and a WD 5TB HDD. I bought the 32GB (2 x 16GB) DDR4 PC4-25600 3200MHz RAM because it was cheap and I wanted to see if any of my games used it. None did. The upside was that I used the old 8GB stick on a new build. I got the monitor on a Black Friday sale in 2018. I put it beside my 32" 4K monitor and it did not look good; the text looked bad compared to the 4K monitor. In testing my games I realized none really benefited from high refresh rate over high resolution. Within about a week I gave it to my son, who loved it. In 2016 my 1TB WD Raptor (10,000rpm HDD) was dying and I needed a replacement. I read that the new large WD Black drives were fast, so I bought one (5TB). Even though my vanilla games ran fine, my modded games stuttered like crazy. The same day I ordered a 1TB SSD and everything ran smoothly on it. My son got the 5TB Black for his school work.
  10. My mobile data is slow because of poor coverage in my area and the design of the building I live in. To send an email I have to leave the phone by a window. It is pointless to use for streaming unless I drive about a mile down the road. For landline data I get what I pay for. When I ran a business I used two providers. My work was time-sensitive, so guarantees did not matter. Comcast Business was as bad as their consumer service, so I dropped it after about a year and went consumer; that was more to do with the many outages than overall speed. AT&T was better, with fewer outages, but slower overall.
  11. Yes, but I can only use the tools I have until someone shows me a better way. In the meantime I will overestimate, since it is always better to have more than you need than not enough. The difference in price is not worth the headaches if I get it wrong.
  12. If you have a better way, what is it? Saying it is inaccurate is pointless, since everything that measures is inaccurate to some degree; it all depends on how much. If you have a better, more accurate way, please let me know.
  13. Here is a Precision X1 reading that I did a while ago and a GPU-Z reading I did today. Here is also what GPU-Z says about the BIOS on my FTW3 Ultra. I am not too far off max.
  14. My GPU wattage is through Precision X1 and GPU-Z. CPU wattage is through HWiNFO64. From the wall I use a cheap power meter plug from Amazon. If you know a better way that does not cost a lot, please let me know.
  15. Going by GPU-Z and HWiNFO64, my 2080 Ti is using 387 watts and my i7 8086K is using 152 watts. That is 539 watts without taking the rest of the system into consideration. I think I am using more than 480 watts.
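For what it's worth, that arithmetic sketched out in a few lines. The overhead and PSU efficiency figures below are assumptions for illustration, not numbers from the post:

```python
# Rough power budget from the readings in the post.
gpu_w = 387             # RTX 2080 Ti, per GPU-Z
cpu_w = 152             # i7 8086K, per HWiNFO64
overhead_w = 50         # assumed: motherboard, RAM, drives, fans
psu_efficiency = 0.90   # assumed for a Gold-rated unit at this load

dc_load = gpu_w + cpu_w + overhead_w   # what the PSU must deliver
wall_draw = dc_load / psu_efficiency   # what a wall meter plug would show
print(f"DC load ~{dc_load}W, wall draw ~{wall_draw:.0f}W")
# -> DC load ~589W, wall draw ~654W
```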
  16. The text and other Windows features looked bad on my 32" 1440p and 34" 3440 x 1440 21:9 monitors. They look great on my 32" 4K monitor and 38" 3840 x 1600 21:9 monitor. As far as I am concerned, 1440p is low res. I view the 55" OLED TV that I use as a monitor at double the distance of the 32" 4K and the 38" 21:9, so the text and Windows features look fine to me there.
  17. I think they underestimate. My FTW3 Ultra 2080 Ti rig uses 550 watts from the wall playing AC:O. The EVGA calculator recommends 600 watts, the Newegg calculator recommends 512 watts and Cooler Master recommends 581 watts. I have peaked from the wall above two of those three recommendations.
  18. My 2080 Ti with only a 5.1GHz 6-core CPU can use close to 600 watts in a game or app that uses all of the CPU and GPU. That is with my benching setup; with my 24/7 overclock it is at 550 watts in a game like AC:O. My EVGA FTW3 Ultra 2080 Ti uses 387 watts max overclocked. The 6-core at 5.1GHz uses about 150 watts. I don't have RGB, so I think the rest goes to the VRMs and the rest of the system. In a game like AC:O it is at 550 watts most of the time. With the 860 watt PSU I have in that rig I feel very limited, since I cannot put another 2080 Ti in it even though I own one. With the 1000 watt PSU in my other rig that is doable. If I was building a rig with your specs I would use a 1200 watt PSU.
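The sizing logic behind that recommendation, roughly sketched. The 1.3x headroom factor and the ~300W figure for a second card are assumptions for illustration, not numbers the poster gives:

```python
# Pick the smallest standard PSU size that covers peak draw plus headroom.
def recommend_psu(peak_system_w: float, headroom: float = 1.3) -> int:
    target = peak_system_w * headroom
    standard_sizes = [650, 750, 850, 1000, 1200, 1600]
    return next(s for s in standard_sizes if s >= target)

print(recommend_psu(600))        # ~600W peak from the post -> 850W unit
print(recommend_psu(600 + 300))  # with an assumed second 2080 Ti -> 1200W unit
```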
  19. My RTX 2080 Ti does about 120fps in most games at 3440 x 1440, but RDR 2 only averages about 90fps with the settings I like. I only expect a 3080 Ti to get up to 144fps in easy-to-run AAA titles at 3440 x 1440. I am not a fan of the PG35VQ. It is too small, low-res and too expensive. I prefer 38" at 3840 x 1600 as a minimum, since it is a lot better for non-game applications and it does not feel height-impaired like 34"/35" 21:9 monitors do (I own one). LG has a nice 38" 3840 x 1600 144Hz monitor (38GL950G-B) for about $1800. It does not have HDR 1000, but HDR on Windows 10 needs work anyway. I use a 55" B9 OLED TV on my other gaming PC, so I use that for non-Windows HDR content. Getting a G-Sync-compatible LG OLED TV might be a good stopgap until the 3080 Ti comes out. They are only around $1300 and you can always stick it in a bedroom after. Mine has worked out so well that I am replacing all my 4K and 1440p monitors with them. I am just waiting for LG's 48" version to come out. I am keeping my 38" ultrawide, and I will replace that when a 48/49" 21:9 4K ultrawide comes out. If ever.
  20. My bad for testing one game. Shadow of the Tomb Raider must upscale automatically. When I tested other games like RDR 2 and Control they behaved as you described.
  21. I have a question. If I set a game to 1080p full screen on my 4K monitor, the screenshot will be at 1080p. If I set a game to 1080p full screen on my 4K OLED TV, the screenshot will be at 4K. What are the differences between these technologies?
  22. I have two gaming computers that use the same motherboard, CPU and RAM, and both have 2080 Tis. One uses a 970 EVO and the other uses two 1TB Crucial SATA SSDs. Only my modded games that use 4K and 8K textures and have more than 5 times the NPCs in them benefit from my 2TB 970 EVO right now. The difference is stutter during open-world in-game load-ins. I have my modded games with the same saves on both computers, so it is easy to tell the difference. This sort of thing is not new to me. In 2008 my modded Oblivion game was slow and stuttered. My character could not take a step or turn without the 7200rpm HDD being accessed, and that meant stutter. In desperation I tried a 10,000rpm WD Raptor that cost more than double the 7200rpm drive and was half the size. The stutter was gone. I also put them in my work computers, and a Photoshop file that took 5 seconds to load from a 7200rpm drive only took 2 seconds with the Raptor. Windows load times were about the same. The WD Raptors lasted up until 2016, when my heavily modded Skyrim started to get load-in pauses/stutter. Since I had added at least one mod almost every weekend since 2011, I was not surprised that I broke the game. A 1TB SATA SSD got the game smooth again. In a couple of years SATA SSDs will be too slow for open-world games, since they will have more content and larger textures. When it happens really depends on the speed of the SSDs in the new consoles.
  23. I did photo editing with Photoshop for years (2005 to 2018). Most of the images I edited were for print, and the average size was 42" x 96" at 150ppi; that is over ten 4K screens' worth of pixels. I used an i7 2600K from 2011 to 2018 to do it. I tried a 6-core in 2013 but found no real performance gain. What was more important than cores was HDD/SSD speed, since load times were an issue. So if I started photo editing for a living again, I would use the rig I have now but add another 2TB 970 EVO. I have tested an i9 9900K, and it is not an upgrade over an Intel or 2nd-gen Ryzen 6-core for Photoshop. It is a decent upgrade for CPU 3D rendering. To me, both CPUs are capable of doing the job. Time was money to me, so the time it would take to downgrade would not justify the savings.
  24. At 1080p you are getting bottlenecks. 1080p is sort of the realm of CPUs with more powerful cores; your CPU is better suited for 1440p and 4K. This is with an EVGA 2080 Ti XC at stock settings and an i7 8086K at 5GHz. Note the difference in the "GPU Bound" numbers.
  25. Looks like this. My room temp is 23c and the EVO idles at 29c, sitting under a Noctua NH-D15.