Everything posted by jones177

  1. I used one of my LG B9s as a monitor and it did get burn-in from Chrome in about 4,900 hours. It is hardly noticeable, but it can't be unseen. I used all the anti-burn-in nannies and switched it off when not using it. Now my OLEDs are for gaming and TV content only. I use a 49" LG IPS 120hz TV for browsing since I got used to big displays with the OLEDs and did not want to go back to tiny monitors.
  2. No. They are just as hard as the rest. I have been on auto notify for a 3080 Ti FTW3 on the EVGA website since launch. For the 3090 version I have been on auto notify since last year. I usually buy all EVGA, but this time around I have one MSI and one ASUS as well as one EVGA card. I got an EVGA XC3 Ultra 3080 ti on launch day on the Newegg Shuffle. I went for the FTW3 Ultra as well but no luck. The MSI 3090 and ASUS 3080 were 2020 auto notifies that popped up in the last 2 months. All the cards were MSRP.
  3. I have a dedicated editing rig in the house and it uses an LG 32" 4k VA monitor with a 28" 4k monitor pivoted 90° beside it. I am not a fan of 27" 4k monitors for editing. I used them for 2 years and learned not to like them. If you need IPS, why not this one? https://www.lg.com/us/monitors/lg-32un650-w-4k-uhd-monitor
  4. I use a 5800x with a 2080 ti. It is connected to an OLED 4k 120hz TV that I use as a monitor. I have Skyrim LE and SE, both with 250+ mods installed, and I use the Nvidia control panel to force 60fps on them. Fallout 4 runs at 120hz 99% of the time, but I have to use a mod so my character can jump as high as it did at 60fps. It runs exactly the same as it did at 60fps. Going from 60hz with 20 series cards to 120hz with 30 series cards has not been a big game changer since most of the games I play do fine at 60fps. Some do like more frames, but they are in the minority. I started doing 4k gaming in 2015. I bought 2 4k monitors for work, tried one on Skyrim, and after that experience there was no going back. I have bought 1440p (144hz) and 3440 X 1440 ultrawide monitors but I have not liked them that much. I have a 38" 3840 X 1600 ultrawide and I do like it. It is the lowest I like to go in size and resolution. The reason I went with TVs is that I have a 32" 4k monitor but I liked the width of my 38" ultrawide. Even the 55" OLEDs were cheaper than 4k monitors larger than 32" that were not junk. I did some tests with the 5800x/3080 ti setup at 4k ultra and this is what I got (there is a rough headroom sketch after this list):
     Shadow of the Tomb Raider 91fps average
     Assassin's Creed Odyssey 67fps average
     Assassin's Creed Valhalla 58fps average
     Horizon Zero Dawn 83fps average
     Far Cry 5 105fps average
     I don't play at ultra so I usually do about 10 frames more. If I was not using 4k TVs I would be using a 32" 60hz monitor. I don't like 27"/28" 4k monitors at all (I own 2). So if you are not willing to go big like I did, there are no real options better than a 32" 60hz monitor right now.
  5. I picked up GTA 5 for free on the Epic store. The computer I used at the time to play it was an i7 8086k with an EVGA FTW3 Ultra 2080 ti. The game was not smooth out of the "box", but with all the settings turned up it was totally smooth using DirectX 11. At the beginning of the year I built a 5800x gaming computer. It got an EVGA XC3 Ultra 3080 ti recently, so I installed GTA 5 and did some tests. It did stutter like you described. I decided to install GTA 5 on an i9 10900k/3090 and an i9 9900k/3080, and reinstall it on the i7 8086k/2080 ti, to see if they stuttered as well. None of them stuttered. It may be an AMD thing with some hardware, or maybe we are just unlucky.
  6. I have gotten three 30 series cards at MSRP, but 2 were auto notifies from 2020 and the other was from the Newegg Shuffle. I still need one more, so I am still looking.
  7. I recently tested my 3080 ti with a 5800x and I did use Assassin's Creed Odyssey as one of the benches. I also tested an i9 10900k with a 3090 that confirmed the 5800x/3080 ti computer was operating properly. This is with Ultra High settings.
     AC:O 1080p 98fps average (5800x) (i9 10900k/3090 103fps)
     AC:O 1440p 93fps average (5800x) (i9 10900k/3090 94fps)
     AC:O 4k 67fps average (5800x) (i9 10900k/3090 70fps)
     Things to check: I have a Corsair RM850x. It has five 8 pin PCIe connectors, 4 of them below the 24 pin, and you should be using 3 of them. Assassin's Creed Odyssey will run at much lower frames on a 5400rpm HDD or USB 3 device; it is lumpy on a 7200rpm HDD and perfect on a SATA SSD. Assassin's Creed Odyssey has a hz slider that sometimes drops to 24hz when changing resolution or other settings, which causes what looks like stutter. Assassin's Creed Odyssey also uses about 100 watts more from the wall than most games I have tested, so use GPU-Z to see if your PCIe slot and cables are delivering the correct amount of power. For the total power, go to GPU-Z Advanced/NVIDIA BIOS and look for the default power limit. Both of my three 8 pin 30 series cards are 370 watts default (no overclock); there is a rough power sketch after this list. Good luck.
  8. I have a Rift S. It is used about once or twice a month, usually to add and test mods for SkyrimVR and Fallout 4VR so when I do another playthrough I will have new content. The last game I bought was Half Life Alyx and nothing has really interested me since. I was hoping more of the older AAA titles would be converted to VR, but that has not happened. If my Rift died I would replace it with an Index since, like most, I have no interest in a Facebook product.
  9. I went with an X570 because my computers have a productivity life long after I am finished gaming on them. In its gaming form it has only one gen 4 SSD, so a B550 would have been fine, but I don't know what it will have in the future. I ended up with an Aorus Master X570 and did research on the B550 version. No reviewers mentioned a difference in performance.
  10. Yes. With the old ones the screen would go black in some game menu screens. Two of the games that did it were Far Cry 5 and Horizon Zero Dawn.
  11. I use Zeskit Maya certified cables. The ones I bought in 2019 when I got the OLED TVs said they were HDMI 2.1, but they weren't.
  12. All my Intels idle in the low 30s so I was a bit put off by the 5800x I bought. I did get it down into the low 30s but it took an EK 360mm AIO to do it.
  13. I added the 8k textures in 2017 after getting the GTX 1080 ti, so it is not a new thing with me. I only put the game with the 8k textures on the 3080 computer to test, and it was less smooth in the areas that used more than 10gb of vram. My main gaming computer has a 3080 ti, so I am getting a perfect experience even with 8k textures.
  14. My issue with the 3080 is the vram. My vanilla games run fine on it, but my modded games that use 8k textures are not smooth in areas that exceed 10gb. Even my GTX 1080 ti can run them better since they are older games.
  15. No if you are a 1440p gamer. Yes if you are a 4k gamer. I would get the 3080 ti first and then save for a 5000 series CPU.
  16. I have tested all my recent GPUs with Shadow of the Tomb Raider and Assassin's Creed Odyssey. Older GPUs like the GTX 980 and 980 ti do not produce a playable experience with these settings at 1440p, so for this test only high end 5 year old Nvidia GPUs were playable. Going by this, the RTX 3080 ti will last until 2025. I used Shadow of the Tomb Raider and Assassin's Creed Odyssey because there are lots of benches of them and they are still not easy to run. 1440p, Highest preset on SOTTR and Ultra High for AC:O. Since both games look great on lower settings, this is the worst case (there is a scaling sketch after this list).
      GTX 1080 = SOTTR 69fps average (i7 8086k), AC:O 52fps average (i7 8086k)
      GTX 1080 ti = SOTTR 84fps average (i7 8086k), AC:O 60fps average (i7 8086k)
      RTX 2080 ti = SOTTR 133fps average (i7 8086k), AC:O 81fps average (i7 8086k)
      RTX 3080 = SOTTR 144fps average (i9 9900k), AC:O 88fps average (i9 9900k)
      RTX 3080 ti = SOTTR 160fps average (5800x), AC:O 93fps average (5800x)
      RTX 3090 = SOTTR 164fps average (i9 10900k), AC:O 94fps average (i9 10900k)
  17. At the time I did the build I had the Pro 4 on a shelf, and the ram (Corsair Vengeance RGB) was on the shelf as well. So it was either buy more ram or buy another cooler. I chose to buy a 360mm AIO.
  18. I think some are hot and some are not, so start with the Dark Rock Pro 4. My 5800x is on an EK 360mm AIO and it stays cool, but it is annoying since it is louder than my Intels. I have a Dark Rock Pro 4, but the ram (RGB) I have in the computer does not fit under it, so I did not use it. If it did fit, I would have started with the DRP4.
  19. I bought a 3090 recently and my concern was not how high the boost clock was but how hot the vram got. I ended up with an MSI Gaming X Trio ($2,280) because it is a conservative card with heat pipes on the back plate. I have an RTX 3080 ti as well and it is an EVGA XC3 Ultra ($1,200), which matches Nvidia's MSRP. The difference is 3 to 6 frames in games, so not many frames for over $1,000 more. It is disappearing from my gaming PC list along with the i9 10900k and will be used for 4k and 8k video editing by my son. For testing to see if a GPU is a good upgrade I use the Shadow of the Tomb Raider built-in bench since most reviewers still use it. Your RTX 3080S does about the same as a GTX 1080 ti, so you be the judge. I also added the GTX 1080 score since that is the GPU I used 5 years ago.
      SOTTR 1440p Highest preset with TAA.
      GTX 1080 = 69fps average (i7 8086k)
      GTX 1080 ti = 84fps average (i7 8086k)
      RTX 2080 ti = 133fps average (i7 8086k)
      RTX 3080 = 144fps average (i9 9900k)
      RTX 3080 ti = 160fps average (5800x)
      RTX 3090 = 164fps average (i9 10900k)
      As for the 40 series: going by the frames I get in Shadow of the Tomb Raider, if I was a 1440p 144hz gamer I would have no reason to upgrade. I am a 4k 120hz gamer, so I still need more frames. If I have to wait an extra year again to get more frames it does not matter, since I am not unhappy with the frames I am getting now and wasn't even with the 2080 tis.
  20. Mine is with an i9 10900k. I am testing it now and in about 2 weeks it will be an editing-only rig.
  21. The RTX 3080 ti does not seem to like some of the types of anti-aliasing used in some games and will stutter. I get around it by playing at 4k with no anti-aliasing in those games. One is GTA 5. With GTA 5 your GPU usage is about right. Here is my 5800x with an RTX 3080 ti. I get about the same usage with my i7 8086k with a 2080 ti.
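
A minimal sketch of the headroom math behind post 4, assuming a 60fps target (my assumption, not from the post); the averages and the roughly +10fps gain from easing off Ultra are the ones quoted in that post.

```python
# 4k Ultra averages quoted in post 4 for the 5800x/3080 ti setup.
ultra_avgs = {
    "Shadow of the Tomb Raider": 91,
    "Assassin's Creed Odyssey": 67,
    "Assassin's Creed Valhalla": 58,
    "Horizon Zero Dawn": 83,
    "Far Cry 5": 105,
}

TARGET_FPS = 60      # assumed target, not stated in the post
SETTINGS_GAIN = 10   # "about 10 frames more" when not playing at Ultra, per the post

for game, fps in ultra_avgs.items():
    tuned = fps + SETTINGS_GAIN
    verdict = "clears" if tuned >= TARGET_FPS else "misses"
    print(f"{game}: {fps}fps Ultra, ~{tuned}fps tuned -> {verdict} {TARGET_FPS}fps")
```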
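A back-of-the-envelope version of the power check in post 7, assuming the standard PCIe ratings of 150 W per 8 pin cable and 75 W from the slot; the 370 W figure is the GPU-Z default power limit quoted in the post, and the helper name is just for illustration.

```python
# Standard PCIe power ratings: 150 W per 8 pin cable, 75 W from the slot.
PIN8_WATTS = 150
SLOT_WATTS = 75

def power_headroom(num_8pin_cables: int, default_limit_w: int) -> int:
    """Spare watts between what separate cables plus the slot can deliver
    and the card's default power limit shown in GPU-Z."""
    available = num_8pin_cables * PIN8_WATTS + SLOT_WATTS
    return available - default_limit_w

# Three separate 8 pin cables feeding a 370 W card (the GPU-Z default
# limit quoted in post 7): 525 W available, 155 W of headroom.
print(power_headroom(num_8pin_cables=3, default_limit_w=370))
```

With three separate cables the connectors can supply about 525 W against a 370 W limit, which lines up with the post's advice to run 3 of the PSU's PCIe connectors rather than daisy-chaining.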
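A rough look at the GPU scaling behind post 16, normalising the listed 1440p averages to the GTX 1080 baseline; the numbers are the ones posted, and since the CPUs differ between rows the ratios are only approximate.

```python
# 1440p averages from post 16: (SOTTR fps, AC:O fps) per GPU.
benches = {
    "GTX 1080":    (69, 52),
    "GTX 1080 ti": (84, 60),
    "RTX 2080 ti": (133, 81),
    "RTX 3080":    (144, 88),
    "RTX 3080 ti": (160, 93),
    "RTX 3090":    (164, 94),
}

base_sottr, base_aco = benches["GTX 1080"]
for gpu, (sottr, aco) in benches.items():
    print(f"{gpu}: {sottr / base_sottr:.2f}x SOTTR, {aco / base_aco:.2f}x AC:O vs the GTX 1080")
```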