Everything posted by jones177

  1. I have a question. If I set a game to 1080p full screen on my 4K monitor, the screenshot will be at 1080p. If I set a game to 1080p full screen on my 4K OLED TV, the screenshot will be at 4K. What are the differences between these technologies?
  2. I have two gaming computers with the same motherboard, CPU, and RAM, and both have 2080 Tis. One uses a 970 EVO and the other uses two 1TB Crucial SATA SSDs. Right now only my modded games, which use 4K and 8K textures and have more than five times the usual number of NPCs, benefit from the 2TB 970 EVO. The difference is stutter during open-world load-ins. I keep the same saves on both computers, so it is easy to tell the difference. This sort of thing is not new to me. In 2008 my modded Oblivion game was slow and stuttered: my character could not take a step or turn without the 7200 RPM HDD being accessed, and that meant stutter. In desperation I tried a 10,000 RPM WD Raptor that cost more than double the 7200 RPM drive and had half the capacity. The stutter was gone. I also put Raptors in my work computers, and a Photoshop file that took 5 seconds to load from the 7200 RPM drive took only 2 seconds from the Raptor. Windows load times were about the same. The WD Raptors lasted until 2016, when my heavily modded Skyrim started to get load-in pauses/stutter. Since I had added at least one mod almost every weekend since 2011, I was not surprised that I broke the game. A 1TB SATA SSD got the game smooth again. In a couple of years SATA SSDs will be too slow for open-world games, since they will have more content and larger textures. Exactly when that happens depends on the speed of the SSDs in the new consoles.
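The storage anecdote above comes down to sequential read throughput. Here is a minimal Python sketch that times a sequential read of a test file; the 64 MB file size and 4 MB chunk size are arbitrary choices of mine, and note that the OS page cache will inflate the result for a file that was just written, so this is a rough illustration rather than a proper disk benchmark.

```python
import os
import tempfile
import time

def read_throughput_mb_s(path: str, chunk_size: int = 4 * 1024 * 1024) -> float:
    """Time a sequential read of `path` and return the rate in MB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return (total / (1024 * 1024)) / elapsed

# Write a 64 MB test file, then time reading it back.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(64 * 1024 * 1024))
    test_path = tmp.name

speed = read_throughput_mb_s(test_path)
print(f"Sequential read: {speed:.0f} MB/s")
os.remove(test_path)
```

Running this on a 7200 RPM HDD, a SATA SSD, and an NVMe drive in turn makes the gap the post describes visible in one number.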
  3. I did photo editing in Photoshop for years (2005 to 2018). Most of the images I edited were for print, and the average size was 42" x 96" at 150 ppi. That is over ten 4K screens' worth of pixels. I used an i7 2600K from 2011 to 2018 to do it. I tried a 6-core in 2013 but found no real performance gain. More important than cores was HDD/SSD speed, since load times were an issue. So if I started photo editing for a living again, I would use the rig I have now but add another 2TB 970 EVO. I have tested an i9 9900K, and for Photoshop it is not an upgrade over an Intel or 2nd-gen Ryzen 6-core. It is a decent upgrade for CPU 3D rendering. To me both CPUs are capable of doing the job. Time was money to me, so the time it would take to downgrade would not justify the savings.
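The "over ten 4K screens" figure above checks out; here is the arithmetic, using the print size and ppi from the post and 3840x2160 as the 4K (UHD) resolution:

```python
# Pixel-count check for the print size mentioned above.
width_in, height_in, ppi = 96, 42, 150               # 42" x 96" at 150 ppi
print_pixels = (width_in * ppi) * (height_in * ppi)  # 14400 x 6300 pixels
uhd_pixels = 3840 * 2160                             # one 4K (UHD) screen

print(print_pixels)               # 90,720,000 pixels (~90.7 MP)
print(print_pixels / uhd_pixels)  # ~10.9 "4K screens"
```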
  4. At 1080p you are getting bottlenecks. 1080p is more the realm of CPUs with more powerful cores; your CPU is better suited to 1440p and 4K. This is with an EVGA 2080 Ti XC at stock settings and an i7 8086K at 5GHz. Note the difference in the "GPU Bound" numbers.
  5. Looks like this. My room temp is 23c and the EVO idles at 29c. The EVO is under a Noctua NH-D15.
  6. I have a 2tb 970 EVO with a 5.1ghz CPU on air and a 2080 ti on air. The EVO in open world games is at 44c. I use the EVO without a heatsink since it was trapping heat.
  7. The rule of thumb is dumb. TVs are just big monitors now. My 55" OLED is 32" from my eyes. I will probably get two 48" OLEDs, one for each bedroom, if they are cheaper than the 55" C9. They will replace a 32" 1440p 144Hz monitor and a 32" 4K monitor. The monitors are not comparable to the OLED for watching TV and movies, but the OLED TV can play my kind of games just as well.
  8. I run one RTX 2080 Ti at 3840 x 1600 75Hz, and the only game so far that runs below 75fps with the settings I like is RDR 2. The other I run at 4K 60Hz, and all games stay over 60fps except RDR 2. In tests at 1440p, games like Shadow of the Tomb Raider average about 132fps, but RDR 2 averages 105fps. Surprisingly, AC:O gets the lowest frame rates, with a 76fps average at 1440p and 66fps at 4K. Definitely a CPU game.
  9. I have an EVGA XC 2080 Ti with Micron memory and a FTW3 Ultra with Samsung. The Samsung clocks to 8100MHz and crashes at 8200MHz; it crashes the bench but not the computer. The Micron clocks at 8200MHz and crashes at 8300MHz; it crashes (freezes) the computer. The Samsung crashes immediately, and the Micron crashes about halfway through a bench. The Micron likes to waste my time, so I don't like it. I also have two EVGA GTX 1080 Ti SC2s, one with Micron and one with Samsung RAM. They clock the same and are now used in SLI.
  10. It might be the extra cores as well. I only tested my 1080 Ti at stock and got 23765 with a 5GHz 6-core and 3200MHz RAM.
  11. This week. At 1440p the frame rate is really good.
  12. I now gauge the longevity of GPUs by RDR 2. If a GPU can't run it at 60fps with decent settings at 1080p, it is done. By that measure even my 2080 Ti is done for 4K gaming, since it is in the 50s using Hardware Unboxed settings. That means I have to upgrade. Thank you, Rockstar Games, for giving me an excuse to upgrade my GPUs, hopefully this year.
  13. When my hobby was astronomy way back in the 70s, I used to cook steak in beer. It was the cowboy way. We know how long it takes to evolve an intelligent species with the right conditions, so the chances are that any intelligent life out there is too busy being stupid to notice anyone else, just like us.
  14. As I said, I don't pay a fee and the person that looks after internet security in my home has a degree in it so I am not worried.
  15. No. A Titan X Maxwell is good for 3440 x 1440 60Hz and 1440p 75Hz, so unless you go 4K or 3840 x 1600 you don't need more at the moment. You should wait for next gen. There are no good upgrades for you from AMD right now. If your Titan died today it would be a good replacement; like with Nvidia, you have to wait. The driver issues, according to "Tech Deals", seem to be at 1080p in some titles. I have not seen them at higher resolutions. Freesync seems to work well with Nvidia cards; at first it did not for me, but now it just works. I play at 4K 60Hz and 3840 x 1600 75Hz ultrawide and use 2080 Tis. They would be a good upgrade for you if you could get a deal, but without a deal it is best to wait.
  16. I did not use their equipment for years, but I changed because I could not get issues resolved: they always blamed my equipment. Since their technicians are contractors, I ended up getting a different one every time I had an issue, so I had to go through "my equipment isn't the problem" over and over again. When I ordered the router I was not billed for it, since the fee was part of my condo association agreement, which is strange since I pay for internet and the association pays for TV. Now I use their equipment, and since they can check their router remotely, the technicians don't have to visit anymore.
  17. In the 80s it was true. When home computers first came out, they shipped with programming languages, so to use them you had to learn how to program them. Just about everybody I knew took computer programming classes back then. Hacking and making viruses was something programmers did for fun to other programmers; they were pranks. At some point people with ill intent learned to program, and the world changed.
  18. I use a 2080 Ti for 4K gaming, but a 2070 Super would do for WoW. The problem is that over time games get harder to run. Already I can't play RDR 2 at 4K with a 2080 Ti; I play it on a 3840 x 1600 ultrawide monitor instead and average 72fps. So I have to wait for an RTX 3080 Ti to get back to all games at 4K with all frames over 60. When I did video work I used an old computer for storage, networked to my editing computer. I don't do editing now, but I use WD Passports for storage because they are cheaper than internal HDDs.
  19. Not really enough GPU for 4K gaming, and things are going to get worse over time. If you don't mind dropping the resolution for gaming, you will be OK. I would not bother with the HDD, since it will make the EVO pointless. Get a 2TB SATA SSD, and if you need more space later, get another one. They load Windows and apps fast enough, and the overall system will feel faster than the EVO/HDD setup.
  20. I did design work for many years with 3ds Max and other 3D programs. I only had to do 5 to 10 renderings of a design, then update those renderings as the project evolved. I wanted my rendering computer to be finished with a rendering by the time I had done the 2D texture or presentation work on another computer. Either CPU would be fine for me, after checking the Autodesk forums and others for any major issues with them; it would depend more on the budget for me.

A 2080 Ti works hard if you do GPU rendering and can use up to 380 watts on its own. With a high-core-count CPU working hard as well, you might need a bigger PSU. I could add more GPUs at any time, so I like PSU headroom. The computer I use with the EVGA FTW3 Ultra draws close to 600 watts if the GPU and CPU are maxed. In games it usually draws 400 to 450 watts, since not all CPU cores are utilized.

RAM is cheap right now, so get as much as you can. In the work I did, sometimes the design had to be placed in a space like a hotel or convention center lobby, and I would get a CAD file of the area that would not load into 32GB of RAM. So even though 32GB is fine for my own work, I need more for projects that incorporate other people's work.

I have one system that uses a 2TB 970 EVO. If I needed more space I would get another 970 EVO, since a SATA SSD would just slow the system down. Windows loads in about the same time from SATA and M.2, so the advantage is in reading and writing data, and that means your system is only as fast as your slowest drive.

For a motherboard I would go with a high-end X570. I don't buy cheap motherboards for 3D work. If the VRMs are going to overheat, it will be overnight, doing a rendering job that needs to go out the next day. I learned my lesson years ago.
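The PSU advice above can be put into numbers. The 380 W GPU figure and the ~600 W maxed system draw come from the post; the CPU and "everything else" wattages and the 80% loading rule are my own assumptions for the sketch:

```python
# Rough PSU budget sketch. GPU max draw and total system draw are from
# the post; the other figures are assumptions for illustration.
gpu_w = 380    # 2080 Ti at a raised power limit (from the post)
cpu_w = 150    # assumed: high-core-count CPU under all-core render load
rest_w = 70    # assumed: motherboard, RAM, drives, fans

system_w = gpu_w + cpu_w + rest_w   # matches the ~600 W maxed draw above
loading = 0.8                       # keep the PSU at <= 80% of its rating
psu_needed = system_w / loading

print(system_w, round(psu_needed))  # 600 W draw -> ~750 W PSU
```

Adding a second GPU to this budget makes it obvious why the post favors extra PSU headroom.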
  21. Strange stuff. Must be a driver issue.
  22. Mine stopped working in some games but works on the desktop. It was working fine most of last year but did not work well when the feature was introduced. My other gaming computer, which uses a G-sync compatible TV, is working fine. Both computers use the same i7 8086Ks and 2080 Tis with the same motherboard; the main difference is the monitors. If you don't have it, download the Nvidia Pendulum demo: https://www.nvidia.com/coolstuff/demos#!/g-sync Run it in full screen and windowed mode. Make sure you check "enable settings for the selected display model" in the Nvidia control panel; mine did not work at all without it. For me, G-sync in the Pendulum demo works in windowed mode but not in full screen, while the games that didn't work were in windowed mode but work when I change to full screen. It doesn't really make sense, but at least I have a workaround. ------------ I just read on the GeForce forums that downloading a monitor driver, in my case from LG, fixes it, and it did for me. I hope you get the same result.
  23. I skipped the 770 because it did not have enough VRAM. So I went 570, 670, 680 4GB (bought when the 770 came out), then 970. People are still using the 970 today because of the VRAM.
  24. An EVGA 2080 Ti Black doesn't throttle or freeze at 80c if it is healthy. At 100% on the power target it throttles at 84c, and at 112% on the power target it throttles at 88c. Here is a video on it.